Pre-screened and vetted.
Senior Data Analyst specializing in marketing, BI, and financial analytics
“Marketing analytics candidate with experience at WPP and on a global Coca-Cola campaign, focused on turning messy multi-platform media data into trusted reporting and decision systems. They combine hands-on SQL/Python pipeline building with stakeholder KPI alignment, and cite a 22% improvement in media effectiveness plus faster budget reallocation through daily automated reporting.”
Mid-level Data Analyst specializing in business analytics and BI
“Analytics professional with higher education experience at the University of Dayton, focused on turning inconsistent operational data into standardized metrics and recurring dashboards. They combine SQL, Python, and Power BI to automate reporting, improve data integrity, and reduce manual reporting by 30%, with outputs adopted in semester planning and cross-department performance tracking.”
Junior Data Engineer / Analyst specializing in AI/ML data infrastructure
“Built and deployed a compliance-sensitive LLM pipeline that extracts rebate logic from hospital–supplier medical contracts, using multi-layer redaction (regex/NER/dictionary), schema-validated structured outputs, and secure placeholder reinsertion. Hosted models on Amazon Bedrock to avoid retraining on sensitive data and improved both accuracy and cost by splitting the workflow into a lightweight section classifier plus a fine-tuned extraction model, orchestrated with LangChain and evaluated via layered, test-driven agent assessments.”
Senior AI Engineer specializing in Generative AI and RAG applications
“AI engineer who has shipped production LLM systems across customer service and marketing use cases—building a RAG app on Azure OpenAI and speeding retrieval with Redis caching tied to Okta sessions. Also implemented a LangGraph multi-agent workflow that pulls image context from Figma to generate structured HTML marketing emails, adding a verification agent to improve image-selection accuracy while optimizing solution cost for business stakeholders.”
Senior Agile/Product Delivery Leader specializing in enterprise transformation, data and cybersecurity
“Built a web-based multiplayer Sudoku game in JavaScript (supporting up to 6 teams of up to 5 players each) and demonstrates strong product/analytics orientation. Uses a KPI-driven approach (DAU/WAU, ARPU, session duration, LTV) and structured prioritization methods (MoSCoW, story mapping, cost of delay, DFV) to iterate toward targets; seeking a remote role around $70k/year.”
Mid-Level Full-Stack Software Engineer specializing in healthcare, cloud, and data platforms
“Backend/platform engineer who owned a real-time customer analytics microservice stack in Python/FastAPI with Kafka streaming into PostgreSQL, including schema enforcement (Avro) and high-throughput optimizations. Strong Kubernetes + GitOps practitioner (EKS/GKE, Helm, Argo CD) who has handled CI/CD reliability issues with automated pre-deploy checks and rollbacks, and supported major migrations (on-prem to AWS; VM to EKS) with blue-green cutover planning.”
Junior Machine Learning Engineer specializing in LLMs, NLP, and computer vision
“Built a production, agentic multi-agent pharmaceutical intelligence system for US oncology (breast cancer) conference/news intelligence, automating MSL-style information gathering and summarization for pharma and healthcare stakeholders. Uses CrewAI + LangChain orchestration, custom scraping across ~15 pharma newsrooms, and a grounding-score evaluation approach (sentence transformers/cosine similarity) to mitigate hallucinations.”
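The grounding-score idea above — scoring each generated sentence against retrieved source text via embedding cosine similarity, and flagging low scores as possible hallucinations — can be sketched as follows. This is a minimal illustration: the `embed` function here is a toy bag-of-words stand-in for the sentence-transformer model the candidate used, and the threshold is arbitrary.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a sentence-transformer encoder: a simple
    # bag-of-words vector. Real systems use dense embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def grounding_scores(summary_sentences, source_text, threshold=0.3):
    # Score each generated sentence against the retrieved source;
    # sentences below the threshold are flagged as possibly ungrounded.
    src = embed(source_text)
    results = []
    for s in summary_sentences:
        score = cosine(embed(s), src)
        results.append((s, score, score >= threshold))
    return results

source = "Drug X showed improved survival in HER2-positive breast cancer patients"
flagged = grounding_scores(
    ["Drug X improved survival in breast cancer patients",
     "Drug X cures all cancers instantly"],
    source,
)
```

The second sentence overlaps the source only on the drug name, so its score falls below the threshold and it would be routed for review rather than published.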
Mid-level Data & AI Engineer specializing in healthcare data pipelines and MLOps
“Built and deployed a production LLM-powered clinical note summarization system used by care managers to speed review of 5–20 page unstructured medical records. Implemented safety-focused validation (prompt constraints, rule-based and section-level checks, human-in-the-loop) to reduce hallucinations while maintaining low latency and meeting privacy/regulatory constraints, integrating via APIs into existing clinical tools.”
Mid-level Data Scientist/ML Engineer specializing in healthcare AI and MLOps
“Designed and deployed an enterprise LLM-powered clinical/pharmacy policy knowledge assistant at CVS Health, replacing manual searches across PDFs/Word/SharePoint with a HIPAA-compliant RAG system. Built end-to-end ingestion and orchestration (Airflow + Azure ML/Data Lake + vector index) with PHI masking, versioned re-embedding, and production monitoring (Prometheus/Grafana), and partnered closely with clinicians/compliance to ensure policy-grounded, auditable answers.”
Mid-level AI/ML Engineer specializing in healthcare ML and LLM/RAG systems
“AI/LLM engineer with recent production experience at UnitedHealth Group building an end-to-end RAG system over structured EMR data and unstructured clinical notes, including evidence retrieval, GPT/LLaMA-based reasoning, and a validation layer for reliability. Strong in orchestration (Kubeflow/Airflow/MLflow), prompt engineering for noisy healthcare text, and rigorous evaluation/monitoring with gold-standard benchmarking, plus close collaboration with clinical operations stakeholders.”
Mid-level AI/ML Engineer specializing in Generative AI and NLP
“AI/LLM engineer with production experience building secure, scalable compliance-focused generative AI systems (GPT-3/4, BERT) including RAG over internal regulatory document bases. Has delivered end-to-end pipelines on AWS with PySpark/Airflow/Kubernetes/FastAPI, emphasizing privacy controls, monitoring, and iterative evaluation (A/B testing). Also partnered closely with bank compliance officers using prototypes to refine NLP summarization/classification and reduce document review time.”
Mid-level Data Engineer specializing in scalable ETL, streaming analytics, and cloud data platforms
“At Dreamline AI, built and productionized an AWS-based incentive intelligence platform that uses Llama-2/GPT-4 to extract eligibility rules from unstructured state policy documents into structured JSON, then processes them with Glue/PySpark and serves results via Lambda/SageMaker/API Gateway. Designed state-specific ingestion connectors plus schema validation and automated checks/alerts to handle frequent policy/format changes without breaking the pipeline, and partnered with business/analytics stakeholders to deliver interpretable eligibility decisions via explanations and dashboards.”
Mid-level AI/ML Engineer specializing in MLOps and LLM-powered applications
“AI/ML engineer with production experience building a RAG-based internal analytics assistant (Databricks + ADF ingestion, Pinecone vector store, LangChain orchestration) deployed via Docker on AWS SageMaker with CI/CD and MLflow. Strong focus on real-world constraints—latency/cost optimization (LoRA ~60% compute reduction), hallucination control with citation grounding, and enterprise security/governance. Previously at Intuit, delivered an interpretable churn prediction system (PySpark/Databricks, Airflow/Azure ML) that improved retention targeting ~12%.”
Mid-level AI/ML Engineer specializing in NLP, Generative AI, and MLOps in Financial Services
“ML/LLM engineer at Charles Schwab who built a production loan-advisor chatbot integrated with internal knowledge and loan-calculator APIs, adding strict numeric validation to prevent rate hallucinations and optimizing context to control costs. Also runs ~40 Airflow DAGs orchestrating retraining/ETL/drift monitoring with an automated Snowflake→SageMaker→auto-deploy pipeline, and uses rigorous testing plus canary rollouts tied to business metrics and compliance constraints.”
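The strict numeric validation mentioned above — never letting an LLM-quoted rate reach the user unless it matches the deterministic loan-calculator API — can be sketched roughly like this. Function and parameter names are hypothetical, not the candidate's actual code.

```python
def validate_rate(llm_quoted_rate: float, calculator_rate: float,
                  tolerance: float = 1e-4) -> float:
    """Return the authoritative rate, rejecting any LLM-quoted figure
    that drifts from the calculator API's answer beyond tolerance.

    The model may paraphrase freely, but every number shown to the
    user must agree with the deterministic calculator.
    """
    if abs(llm_quoted_rate - calculator_rate) > tolerance:
        # Replace the hallucinated figure rather than surfacing it.
        return calculator_rate
    return llm_quoted_rate

exact = validate_rate(6.25, 6.25)    # matches: passed through
fixed = validate_rate(5.99, 6.25)    # drifts: overridden
```

The same pattern extends to logging every override as a hallucination event for the drift-monitoring DAGs described above.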
Senior Sales & Customer Success leader specializing in territory growth and data-driven GTM
“Enterprise healthcare GTM/CS leader who owned large accounts end-to-end (sales through renewal), including scaling a hospital network rollout to 51 of 52 facilities in ~3 months. Built an automated customer-feedback and response system (“Schism”) using n8n + ChatGPT that cut CS workload by ~50% and informed product priorities, and has experience driving cross-functional EHR-related deployments and GTM analytics (Metabase/Snowflake/Firebase).”
Mid-Level Software Engineer specializing in backend, data platforms, and FinTech systems
“Backend engineer with experience at HSBC and Machinations who has delivered major production performance wins (cutting large trade-file upload times from ~13–15s to ~2s) using chunked parallel processing with strong reliability controls. Also built and shipped an applied AI RAG workflow using Langflow + Cohere embeddings + FAISS with hosted/local LLM fallbacks (Hugging Face, Ollama) and production-grade guardrails, observability, and evaluation.”
Mid-level AI/ML Engineer specializing in GenAI agents, RAG pipelines, and MLOps
“AI/ML engineer who built a production RAG-based internal document intelligence assistant (LangChain + Pinecone) to let employees query enterprise reports in natural language. Demonstrated hands-on pipeline orchestration with Apache Airflow and tackled real production issues like retrieval grounding and latency using tuning, caching, and token optimization, while partnering closely with non-technical business stakeholders through iterative demos.”
Junior QA Automation Engineer specializing in banking and trading platforms
“QA automation engineer with Barclays digital banking experience who owned an end-to-end regression suite across UI, API, and database layers (Selenium/TestNG, REST Assured, SQL) and integrated it into CI/CD (Jenkins/GitLab). Known for preventing high-impact financial defects like duplicate transaction postings by adding backend SQL validations, negative/edge-case coverage, and converting production issues into automated regression tests; also strong in Cypress flake reduction using cy.intercept/cy.session and stable selectors.”
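A backend SQL validation of the kind described — catching duplicate transaction postings so they become an automated regression check rather than a production incident — might look like this sketch. Table and column names are hypothetical, and sqlite3 stands in for the bank's database.

```python
import sqlite3

DUPLICATE_CHECK = """
SELECT account_id, amount, reference, COUNT(*) AS n
FROM transactions
GROUP BY account_id, amount, reference
HAVING COUNT(*) > 1
"""

def find_duplicate_postings(conn: sqlite3.Connection):
    # Returns any (account, amount, reference) posted more than once --
    # the assertion a regression suite can run after each deployment.
    return conn.execute(DUPLICATE_CHECK).fetchall()

# Demo on an in-memory database with one deliberate duplicate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (account_id TEXT, amount REAL, reference TEXT)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("A1", 100.0, "ref-001"), ("A1", 100.0, "ref-001"), ("A2", 50.0, "ref-002")],
)
dupes = find_duplicate_postings(conn)
```

In CI this query would run against a seeded test database after each build, failing the pipeline if any duplicate posting slips through.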
Mid-level Java Full-Stack Developer specializing in microservices and cloud-native web apps
“Full-stack engineer who has shipped and owned production analytics dashboards using Next.js App Router + TypeScript, combining server components for data-heavy pages with client components for interactive charts/filters. Also built a Temporal-orchestrated payment reconciliation workflow with versioning, idempotency, and exponential-backoff retries, and has hands-on Postgres query/index optimization using EXPLAIN ANALYZE.”
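Temporal provides idempotency and retry policies natively; this plain-Python sketch only illustrates the two reconciliation ideas named above — an idempotency key so replays are no-ops, and exponential-backoff retries. All names here are illustrative, not the candidate's workflow code.

```python
import time

def with_backoff(fn, retries=3, base_delay=0.01):
    # Exponential-backoff retry: delays of 0.01s, 0.02s, 0.04s, ...
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

class ReconciliationLedger:
    """Idempotent apply: replaying the same payment id is a no-op."""

    def __init__(self):
        self.applied = {}

    def apply(self, payment_id: str, amount: float) -> bool:
        if payment_id in self.applied:   # idempotency key already seen
            return False
        self.applied[payment_id] = amount
        return True

ledger = ReconciliationLedger()
first = with_backoff(lambda: ledger.apply("pay-42", 19.99))
replay = with_backoff(lambda: ledger.apply("pay-42", 19.99))
```

Because the apply step is idempotent, the retry wrapper can safely re-run it after a transient failure without double-posting a payment.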
Senior Data Engineer specializing in cloud-native data platforms for finance and healthcare
“Data engineer/backend data services practitioner with Bank of America experience building real-time and batch transaction-monitoring pipelines and APIs (Kafka + databases, REST/GraphQL). Highlights include a reported 45% response-time improvement through performance optimizations and use of Delta Lake schema evolution plus CI/CD (GitHub Actions/Jenkins) and operational reliability patterns like CloudWatch monitoring and dead-letter queues.”
Mid-level Data Scientist specializing in NLP, LLMs, and RAG systems
“Built and deployed a production-style vision-language pipeline that generates structured medical reports from chest X-rays using BioViLT embeddings, an image-text alignment module, and BiGPT fine-tuned with LoRA, delivered via Streamlit and hosted on AWS EC2. Also brings collaboration experience, having presented EDA findings, feature importance, and model performance to Ford managers while working with vehicle parts data at Bimcon.”
Mid-Level Software Developer specializing in Java, Cloud, and Microservices
“Backend/Python engineer who owned an end-to-end FastAPI + AWS internal natural-language document Q&A system (Textract extraction, embeddings/vector DB, LLM integration) with strong focus on reliability and latency. Hands-on with Kubernetes + GitOps (Argo CD, Helm, rolling updates/auto-rollback) and built/optimized Kafka streaming pipelines monitored with Prometheus/Grafana. Also supported a zero-downtime on-prem to cloud migration with parallel run and gradual traffic cutover.”
Senior Data Engineer specializing in cloud data platforms and big data pipelines
“Data engineer focused on building reliable, production-grade pipelines and external data collection systems on AWS (S3/Lambda/SQS/Glue/EMR) using PySpark/SQL, serving curated datasets to Snowflake/Redshift for finance and fraud teams. Has operated a large-scale crawler ingesting millions of records/day with anti-bot tactics, schema versioning/quarantine, and CloudWatch/Datadog monitoring, and also shipped a versioned REST API with caching and query optimization.”
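The schema versioning/quarantine pattern mentioned above — routing records that no longer match the expected shape into a side channel instead of breaking the pipeline — can be sketched like this. The schema and field names are invented for illustration.

```python
# Hypothetical expected shape for one ingested record type.
EXPECTED_SCHEMA = {"id": int, "merchant": str, "amount": float}

def partition_records(records):
    # Route records matching the expected schema to the curated set;
    # anything else is quarantined for inspection and later backfill.
    good, quarantine = [], []
    for rec in records:
        ok = (set(rec) == set(EXPECTED_SCHEMA)
              and all(isinstance(rec[k], t) for k, t in EXPECTED_SCHEMA.items()))
        (good if ok else quarantine).append(rec)
    return good, quarantine

good, bad = partition_records([
    {"id": 1, "merchant": "acme", "amount": 9.5},
    {"id": "2", "merchant": "acme", "amount": 9.5},   # wrong type -> quarantine
])
```

When an upstream source changes format, only the quarantine bucket grows (and alerts fire); curated outputs to Snowflake/Redshift keep their contract.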
Mid-level ML Data Engineer specializing in MLOps and scalable healthcare data pipelines
“Data/ML platform engineer with healthcare (Cigna) experience owning an end-to-end pipeline spanning Airflow + Debezium CDC ingestion, PySpark/SQL transformations, rigorous data quality gates, and feature-store/API serving for ML training and inference. Worked at 10+ TB scale and cites a ~30% latency reduction plus stronger reliability via idempotent design, monitoring, and backfill-safe reprocessing; also built pragmatic early-stage data pipelines at Frankenbuild Ventures.”