Pre-screened and vetted.
Mid-level Machine Learning Engineer specializing in production ML, forecasting, NLP and computer vision
“Built and deployed a production LLM-powered support assistant for customer support agents using a RAG architecture over internal docs and past tickets, with human-in-the-loop review. Demonstrates strong applied LLM engineering focused on real-world constraints (hallucinations, latency, cost), addressed via routing to smaller models, reranking, caching, and rigorous evaluation/monitoring (offline eval sets, A/B tests, KPI tracking).”
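Several profiles in this list cite routing to smaller models plus caching as the main cost/latency levers. A minimal sketch of that pattern, assuming a crude length-based complexity heuristic and a hash-keyed response cache; the model tiers, the threshold, and the `call_model` callback are hypothetical illustrations, not details of the candidate's actual system:

```python
import hashlib

# Hypothetical sketch of cost-aware model routing with a response cache.
# Model tiers ("small"/"large") and the threshold are illustrative.

CACHE = {}

def complexity(query: str) -> float:
    """Crude proxy: long, multi-clause questions go to the larger model."""
    tokens = query.split()
    return len(tokens) + 5 * query.count("?") + 3 * query.count(" and ")

def route(query: str, threshold: float = 20.0) -> str:
    return "large" if complexity(query) >= threshold else "small"

def answer(query: str, call_model) -> str:
    """Check the cache first; on a miss, route and call the chosen model."""
    key = hashlib.sha256(query.strip().lower().encode()).hexdigest()
    if key in CACHE:
        return CACHE[key]
    result = call_model(route(query), query)
    CACHE[key] = result
    return result
```

In practice the complexity signal would come from a classifier or the retrieved context rather than token counts, but the cache-then-route control flow stays the same.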
Mid-level Machine Learning Engineer specializing in LLMs, RAG, and MLOps
“Built and deployed a production RAG system for financial/compliance teams using GPT-4, Claude, and local models to retrieve and summarize thousands of internal documents with strong security controls (role-based retrieval, PII masking). Drove significant operational gains (30+ hours/week saved, ~35% productivity lift, ~45% faster responses) and orchestrated end-to-end ingestion/embedding/index refresh pipelines with Airflow, S3, and SageMaker while partnering closely with compliance stakeholders on auditability and traceability.”
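The role-based retrieval and PII masking called out above are standard RAG security controls. A toy illustration of the masking step only, applied to document text before embedding/indexing; the two regex patterns (US SSN, email) are examples chosen here, not the candidate's actual rules:

```python
import re

# Illustrative PII-masking pass applied before documents are indexed.
# Patterns below (US SSN, email) are examples, not a complete rule set.

PII_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"), "[EMAIL]"),
]

def mask_pii(text: str) -> str:
    for pattern, token in PII_PATTERNS:
        text = pattern.sub(token, text)
    return text
```

A production pipeline would typically pair regexes like these with a trained PII detector and an audit log of what was masked, which is what makes the masking defensible to compliance stakeholders.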
Senior Data Engineer specializing in cloud lakehouse and streaming data platforms
“Data platform/data engineer with cross-industry experience in banking and healthcare, building cloud-native lakehouse architectures across AWS/Azure/GCP. Has owned high-volume (millions of records; TB/day) pipelines with strong data quality automation (dbt/Great Expectations), observability (Grafana/Prometheus), and real-time streaming (Kafka/Spark) for fraud monitoring; also delivered an early-stage migration from SQL Server to BigQuery with 40% batch latency reduction.”
Mid-level GenAI & Data Engineer specializing in agentic AI systems and AWS Bedrock
“At onedata, built and deployed an LLM-powered, multi-agent analytics platform on AWS Bedrock that lets users create Amazon QuickSight dashboards through natural-language conversation, cutting dashboard build time from ~30 minutes to ~5 minutes. Strong on production concerns (observability, token/cost tracking, model tradeoffs) and at bridging business and technical work, owning everything from pre-sales pitching through delivery, with an engineering-management background focused on AI product management.”
Mid-level AI/ML Engineer specializing in GenAI, MLOps, and anomaly detection
“LLM/MLOps engineer who has shipped a production RAG-based technical documentation assistant (FastAPI) cutting manual review by 45%, with deep hands-on retrieval optimization in Pinecone/LangChain (HNSW, hybrid + multi-query search, caching). Also brings healthcare domain experience—building Airflow-orchestrated EHR pipelines and delivering FDA-auditability-friendly predictive maintenance solutions using SHAP/LIME explainability surfaced in Power BI.”
Mid-level Backend/Data Engineer specializing in Python APIs and Azure data pipelines
“Backend/data engineer who builds Python (FastAPI) data-processing API services for internal analytics/reporting, emphasizing modular architecture, async performance tuning, and reliability patterns (health checks, retries, observability). Also migrated legacy on-prem ETL pipelines to Azure using ADF/Data Lake/Functions and implemented a near-real-time ingestion flow with Event Hubs plus watermarking to handle late events and deduplication.”
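The watermarking-plus-deduplication approach mentioned above is a common pattern for near-real-time ingestion. A simplified single-process sketch, assuming events carry a unique id and an event-time in seconds; the field names, the 300-second lateness bound, and the unbounded dedup set are all illustrative simplifications:

```python
from dataclasses import dataclass, field

ALLOWED_LATENESS = 300  # seconds; an assumed bound, not the real config

@dataclass
class Event:
    event_id: str
    event_time: int
    payload: str

class Ingestor:
    def __init__(self):
        self.watermark = 0      # max event_time seen so far
        self.seen_ids = set()   # dedup window (bounded in a real system)
        self.accepted = []
        self.dropped_late = []

    def process(self, ev: Event):
        # Duplicate delivery (e.g. an at-least-once source): skip silently.
        if ev.event_id in self.seen_ids:
            return
        self.seen_ids.add(ev.event_id)
        # Advance the watermark on in-order events.
        self.watermark = max(self.watermark, ev.event_time)
        # Events older than watermark - lateness go to a late-event sink.
        if ev.event_time < self.watermark - ALLOWED_LATENESS:
            self.dropped_late.append(ev)
        else:
            self.accepted.append(ev)
```

Real streaming engines bound the dedup window and emit late events to a side output for reconciliation rather than dropping them into a list.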
Mid-level Data Scientist specializing in ML, NLP, and LLM-powered solutions
“AI/NLP-focused practitioner who built a zero-/few-shot LLM event extraction system on the long-tail Maven dataset, combining prompt-structured outputs with LoRA/QLoRA fine-tuning and rigorous F1 evaluation. Also implemented entity resolution/data cleaning pipelines and embedding-based semantic search using Sentence-BERT + FAISS, and has healthcare experience delivering a multilingual speech/translation mobile prototype using HIPAA-compliant Azure Cognitive Services.”
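To make the Sentence-BERT + FAISS piece concrete: at its core, embedding-based semantic search is cosine similarity over dense vectors. A dependency-free toy version, with made-up 3-dimensional vectors standing in for real sentence embeddings (FAISS adds approximate indexing on top of this exact-scan idea):

```python
import math

def cosine(a, b):
    """Cosine similarity of two equal-length vectors; 0.0 for zero vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def search(query_vec, index, k=2):
    """Return the top-k (doc_id, score) pairs by cosine similarity."""
    scored = [(doc_id, cosine(query_vec, vec)) for doc_id, vec in index.items()]
    return sorted(scored, key=lambda t: -t[1])[:k]
```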
Mid-level Data Scientist specializing in NLP and predictive modeling
“AI/ML practitioner in healthcare/insurance (Blue Cross Blue Shield) who built and deployed a production NLP system to classify patient risk from unstructured clinical notes. Experienced in end-to-end pipeline orchestration (Airflow, AWS Step Functions/Lambda/SageMaker) and real-time latency optimization (moving from BERT to DistilBERT on AWS GPUs), with strong clinician collaboration to drive adoption.”
Mid-level Data Engineer specializing in cloud data platforms and AI agents
“Data/Backend engineer who has owned end-to-end merchant analytics systems on AWS: orchestrated multi-source ingestion (FISERV/Shopify/Clover) with Step Functions/Lambda, enforced strong data quality gates, and served curated datasets via Redshift and a FastAPI layer. Also built an early-stage Merchant Insights AI agent that converts natural language questions into SQL using OpenAI models, with full CI/CD and observability.”
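An NL-to-SQL agent like the one above typically guards model output before execution. A hypothetical validator enforcing read-only single statements and a table allowlist; the table names and rules here are invented for illustration and are not the candidate's implementation:

```python
import re

# Assumed allowlist for illustration; not the actual merchant schema.
ALLOWED_TABLES = {"merchants", "transactions", "settlements"}

def validate_sql(sql: str) -> bool:
    """Reject anything but a single SELECT over allowlisted tables."""
    stmt = sql.strip().rstrip(";")
    # Only one statement, and it must start with SELECT.
    if ";" in stmt or not re.match(r"(?is)^select\b", stmt):
        return False
    # Every table after FROM/JOIN must be on the allowlist.
    tables = re.findall(r"(?is)\b(?:from|join)\s+([a-z_][a-z0-9_]*)", stmt)
    return bool(tables) and all(t.lower() in ALLOWED_TABLES for t in tables)
```

Production systems usually go further: a real SQL parser instead of regexes, row-level permissions, and query cost limits before anything touches Redshift.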
Mid-level ML & Data Engineer specializing in GenAI, graph modeling, and fraud/risk analytics
“Built a production AI fraud/risk scoring platform at BlueArc that ingests web business/product/site data, generates text+image embeddings, and connects entities in a graph to detect reuse patterns and links to known bad actors. Optimized for scale with incremental graph re-scoring and delivered investigator-friendly explainability by surfacing the exact signals/relationships behind each score; orchestrated workflows with Airflow and GCP event-driven components (Pub/Sub, Dataflow, Cloud Run) and has recent LLM workflow orchestration experience (retrieval, prompting, scoring).”
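The incremental graph re-scoring mentioned above usually reduces to a bounded traversal: when one entity's signals change, only nodes within a few hops are re-scored instead of the whole graph. A minimal sketch of the affected-set computation, with the adjacency-dict graph shape and hop limit as assumed inputs:

```python
from collections import deque

def affected_nodes(graph, changed, max_hops=2):
    """BFS from the changed node, returning all nodes within max_hops."""
    seen = {changed: 0}  # node -> hop distance
    queue = deque([changed])
    while queue:
        node = queue.popleft()
        if seen[node] == max_hops:
            continue  # don't expand past the hop limit
        for nbr in graph.get(node, ()):
            if nbr not in seen:
                seen[nbr] = seen[node] + 1
                queue.append(nbr)
    return set(seen)
```

Only the returned set would be handed to the scorer, which is what keeps re-scoring cost proportional to the change rather than to graph size.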
Mid-level Data Engineer specializing in scalable ETL/ELT and real-time streaming pipelines
“Built and shipped a production LLM-powered customer support agent for an EV charging platform using RAG plus internal APIs, automating session/payment issues and ticket routing. Emphasizes production readiness via guardrails, schema validation, state-machine orchestration, monitoring, and continuous evals, delivering a reported 35–40% reduction in support tickets and improved customer satisfaction.”
Mid-level AI Developer & Machine Learning Engineer specializing in LLM and MLOps systems
“Built and deployed an enterprise RAG application at Centene to help clinical teams retrieve insights from large internal policy document sets, cutting manual research by 30–40%. Implemented custom domain-adapted embeddings (SageMaker + BERT transfer learning) and hybrid retrieval (BM25 + Pinecone) to drive a 22% relevance lift, and ran the system in production on AWS EKS with CI/CD, MLflow, and Prometheus monitoring (99% uptime, ~40% latency reduction).”
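Hybrid retrieval like the BM25 + Pinecone setup above needs a way to merge two rankings. The entry doesn't state which fusion method was used, so as one common choice, here is reciprocal rank fusion (RRF) over illustrative doc ids:

```python
def rrf(rankings, k=60):
    """Merge ranked lists of doc ids; lower ranks contribute more.

    k=60 is the commonly used default constant for RRF.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=lambda d: -scores[d])
```

A document ranked well by both the keyword list and the vector list rises to the top, which is the behavior a relevance-lift claim like the one above depends on.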
Mid-level AI/ML Engineer specializing in LLMs, RAG pipelines, and MLOps
“Data professional with ~4 years of experience, most recently at AIG (insurance), building ML/NLP systems for fraud detection and policy automation using transformers, CNNs, and clustering/anomaly detection. Also developed a RAG-based knowledge retrieval system, iterating across embedding models and moving to production based on precision and latency SLAs, then containerizing and deploying with SageMaker and CI/CD.”
Mid-level Data/ML Engineer specializing in Generative AI and cloud data platforms
“Built and productionized an LLM-based financial document analysis system using a RAG pipeline, including robust ingestion/chunking/embedding workflows, vector DB retrieval, and an AWS-deployed FastAPI service containerized with Docker. Demonstrates strong applied expertise in improving retrieval quality and latency at scale, plus hands-on experience debugging agentic/LLM workflows with monitoring and trace-based analysis while supporting demos and customer-facing adoption.”
Mid-level Data Engineer specializing in multi-cloud real-time data pipelines
“Data engineer with healthcare/clinical trial domain experience who owned a 100TB+/month AWS pipeline end-to-end (Glue/S3/Redshift/Airflow) and drove measurable outcomes (20% lower latency, 99.9% reliability, 40% less manual reporting). Also built production data services and API-based ingestion on GCP (Cloud Run/Functions/BigQuery) with strong validation, versioning, and safe migration practices, and launched an early-stage RAG solution (LangChain + GPT-4) for researchers.”
Mid-level Data Engineer specializing in Azure, Spark, and scalable ETL/ELT pipelines
“Data engineer with banking FP&A experience who led an end-to-end migration of 10+ TB from Teradata to Azure (ADF + Data Lake + Databricks/PySpark + Synapse). Emphasizes reliability (multi-stage validation, monitoring/alerts) and performance (Spark tuning, incremental loads, autoscaling), reporting ~99.5% pipeline reliability while supporting downstream consumers with stable schemas and clear change management.”
Senior Data Engineer specializing in cloud data platforms and real-time analytics
“Data/analytics engineer focused on finance and e-commerce integrations, building end-to-end pipelines and services across Odoo, QuickBooks, Snowflake, and Tableau. Replaced a costly third-party Walmart connector with a serverless AWS Lambda pipeline deployed via Terraform/GitHub and monitored with CloudWatch/Datadog, and shipped a bi-directional Odoo↔QuickBooks invoice sync with distributed locking plus Slack-based finance approvals.”
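The distributed locking in the Odoo↔QuickBooks sync above is what keeps the two directions from processing the same invoice concurrently and echoing updates back. A single-process sketch of a per-invoice TTL lock; the in-memory dict stands in for a shared store (e.g. Redis or DynamoDB), and the owner names and 30-second TTL are assumptions:

```python
import time

class InvoiceLock:
    """Per-invoice lock with a TTL so a crashed worker can't wedge the sync."""

    def __init__(self, ttl_seconds=30.0, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock  # injectable for testing
        self._locks = {}    # invoice_id -> (owner, expiry)

    def acquire(self, invoice_id, owner):
        now = self.clock()
        held = self._locks.get(invoice_id)
        if held and held[1] > now and held[0] != owner:
            return False  # another direction holds a live lock
        self._locks[invoice_id] = (owner, now + self.ttl)
        return True

    def release(self, invoice_id, owner):
        held = self._locks.get(invoice_id)
        if held and held[0] == owner:
            del self._locks[invoice_id]
```

With a real shared store the acquire would be a single atomic operation (e.g. a conditional write) rather than a read-then-write, but the owner/TTL semantics carry over.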
Mid-level Data Scientist & AI/ML Engineer specializing in GenAI, NLP, and predictive modeling
Mid-level Prompt Engineer specializing in Generative AI and RAG systems
Senior QA Automation Engineer specializing in web, API, and enterprise systems testing
Mid-level Generative AI & Machine Learning Engineer specializing in LLMs, RAG, and multimodal AI
Mid-level Data Engineer specializing in cloud data pipelines, analytics, and AI/ML
Mid-level Data Engineer specializing in cloud lakehouse, streaming, and Snowflake/Databricks