Pre-screened and vetted.
Mid-level AI/ML Engineer specializing in Generative AI and data engineering
“IBM engineer who built and deployed a production RAG-based LLM assistant using LangChain/FAISS with a fine-tuned LLaMA model, served via FastAPI microservices on Kubernetes, achieving 99%+ uptime. Demonstrates strong practical expertise in reducing hallucinations (semantic chunking + metadata-driven retrieval) and managing latency, plus mature MLOps practices (Airflow/dbt pipelines, MLflow tracking, monitoring, A/B and shadow deployments) and effective collaboration with non-technical stakeholders.”
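To make the hallucination-reduction claim concrete: the profile names semantic chunking with metadata-driven retrieval. This is a generic sketch of that pattern, not the candidate's code; the `embed()` function here is a placeholder (a real pipeline would call a sentence-embedding model), and the threshold value is illustrative.

```python
import math

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def embed(sentence):
    # Placeholder embedder for illustration only; swap in a real
    # sentence-embedding model in practice.
    vec = [0.0] * 16
    for i, ch in enumerate(sentence.lower()):
        vec[i % 16] += ord(ch) / 1000.0
    return vec

def semantic_chunks(sentences, source, threshold=0.8):
    """Group consecutive sentences into chunks; start a new chunk when the
    embedding similarity to the previous sentence drops below `threshold`,
    and tag each chunk with metadata so retrieval can filter by source."""
    chunks, current = [], [sentences[0]]
    prev_vec = embed(sentences[0])
    for sent in sentences[1:]:
        vec = embed(sent)
        if cosine(prev_vec, vec) < threshold:
            chunks.append({"text": " ".join(current),
                           "meta": {"source": source, "chunk_id": len(chunks)}})
            current = []
        current.append(sent)
        prev_vec = vec
    chunks.append({"text": " ".join(current),
                   "meta": {"source": source, "chunk_id": len(chunks)}})
    return chunks
```

The metadata dict is what enables "metadata-driven retrieval": at query time the vector store can restrict the search to chunks whose `source` matches the user's context before ranking by similarity.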
Intern Data Scientist specializing in AI, analytics, and cloud data engineering
“Built a production multimodal LLM-based vendor risk assessment platform that ingests SOC reports and other documents, uses a strict RAG pipeline with grounded evidence (page/paragraph citations), and dramatically reduces analyst review time. Experienced with LangGraph/LangChain/AutoGen for stateful, fault-tolerant agent workflows, and emphasizes reliability (schema validation, guardrails) plus low-latency delivery (~1–2s) through hybrid retrieval, reranking, caching, and model tiering.”
Senior Solutions Engineer specializing in blockchain governance and compliance analytics
“Consulting background (Accenture) delivering technically complex solutions involving on-chain data and strict government security standards, including building isolated sandbox environments to move from PoC to production. Experienced in debugging agentic/LLM-style workflows (e.g., document scanning issues) with deterministic guardrails, preprocessing, and strong logging/monitoring. Has led large-scale crypto wallet workshops (including for the CFTC) and helped win business via clear, layered technical demos; also built internal marketing taxonomy tooling and drove adoption through cross-functional alignment.”
Mid-level DevOps/Cloud Engineer specializing in AWS & Azure infrastructure and CI/CD automation
“Infrastructure engineer with hands-on ownership of a scaled IBM Power/AIX estate (AIX 7.x, VIOS, HMC; 2 frames/20+ LPARs) supporting critical middleware and database workloads, including live DLPAR changes and VIOS/SAN outage recovery. Also brings modern DevOps/IaC experience building GitHub Actions pipelines for Docker/Kubernetes deployments and provisioning AWS environments with Terraform (EKS/RDS/VPC/IAM) using modular, review-driven workflows.”
Mid-level Data Engineer specializing in cloud data platforms and scalable ETL pipelines
“Data engineer (~4 years) with full-stack delivery experience (Next.js App Router/TypeScript + React) building a real-time operations monitoring dashboard backed by Kafka and orchestrated data pipelines. Strong production focus: Airflow + CloudWatch monitoring, automated Python/SQL validation (99.5% accuracy), and CI/CD with Jenkins/Docker; has delivered measurable improvements in latency, pipeline reliability, and query performance (Postgres/Redshift).”
Mid-level Full-Stack Developer specializing in React/Node, GraphQL, and Databricks lakehouse
“Full-stack engineer currently at Southern Glazer’s who built and owned a real-time commercial finance expense analytics dashboard end-to-end (Next.js App Router + TypeScript), including post-launch monitoring, data quality checks, and stakeholder-driven iteration. Strong data/analytics backend experience (Postgres modeling and Databricks Delta Lake pipelines) with demonstrated performance wins—e.g., cutting a key reconciliation query from 8–12s to <400ms and improving frontend load time ~40% with a 25% bounce-rate drop at Verizon.”
Mid-level Data Engineer specializing in cloud ETL/ELT and lakehouse architecture
“Data engineer focused on sales/marketing analytics pipelines, owning ingestion from CRMs/ad platforms through warehouse serving and dashboards at ~hundreds of thousands of records/day. Built reliability-focused systems including dbt/SQL/Python data quality gates with alerting, a resilient web-scraping pipeline (retries/backoff, anti-bot tactics, schema-change detection, backfills), and a versioned internal REST API with caching and strong developer usability.”
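The resilient-scraping claims above (retries/backoff, schema-change detection) correspond to a well-known pattern. A minimal sketch of it, not the candidate's implementation; the `EXPECTED_FIELDS` set is a hypothetical schema for illustration.

```python
import time
import random

EXPECTED_FIELDS = {"id", "name", "price"}  # hypothetical expected schema

def fetch_with_backoff(fetch, max_retries=5, base_delay=1.0):
    """Retry a flaky fetch with exponential backoff plus jitter,
    re-raising only after the final attempt fails."""
    for attempt in range(max_retries):
        try:
            return fetch()
        except Exception:
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))

def check_schema(record):
    """Flag upstream schema drift loudly instead of silently ingesting
    malformed rows into the warehouse."""
    missing = EXPECTED_FIELDS - record.keys()
    extra = record.keys() - EXPECTED_FIELDS
    if missing or extra:
        raise ValueError(f"schema change detected: missing={missing}, extra={extra}")
    return record
```

In a real pipeline the `ValueError` would route to alerting, and a backfill job would re-ingest the affected date range once the schema mapping is updated.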
Mid-level Data Engineer specializing in real-time streaming and cloud data platforms
“Data engineer with Wells Fargo experience owning an end-to-end lakehouse ETL pipeline on Databricks/Azure Data Factory, processing ~480GB daily and implementing robust data quality/reconciliation across 40+ tables to reach ~99.3% reliability. Strong in performance optimization (cut runtime 5.5h→3.8h), CI/CD and monitoring, and resilient external/API ingestion with retries, schema validation, and backfills.”
Mid-Level Full-Stack Software Engineer specializing in AWS cloud and Python/Java
“Accenture consultant who shipped a production LLM-based solution during a client cloud migration to parse application code and identify only the database objects actually in use, cutting migration time by 30% and accelerating the realization of cloud cost savings. Emphasizes production robustness with timeouts/retries/fallback routing, validation, observability, and a disciplined eval/monitoring loop that turns failures into regression tests.”
Mid-level Software Engineer specializing in AWS cloud infrastructure and data platforms
“Backend/infra-focused software engineer who built an autonomous Python API-orchestration agent using asyncio with strong reliability and observability (trace IDs, structured logs, retries/timeouts) and a containerized dev workflow. Experienced deploying Python services to Kubernetes with Helm and running GitOps CI/CD via ArgoCD, plus leading an AWS IAM-to-Identity Center migration using CloudTrail-driven least-privilege role design. Also built and debugged a Kafka/SnapLogic bidirectional pipeline syncing Redshift and HBase, resolving missing-record issues via Kibana-driven investigation.”
Mid-level AI/ML Engineer specializing in GenAI and financial risk & compliance analytics
“Built and deployed a production LLM-powered financial risk and compliance platform to reduce manual trade exception handling and speed up insights from regulatory documents. Implemented a LangChain multi-agent workflow with structured/unstructured data integration (Redshift + vector DB) and emphasized hallucination reduction for regulatory safety using Amazon Bedrock. Strong MLOps/orchestration background across Kubernetes, Airflow, Jenkins, and monitoring/testing with MLflow, Evidently AI, and PyTest.”
Intern Data Scientist specializing in ML engineering and LLM agentic workflows
“Built an agentic, multi-step LLM system that generates full-stack code for API integrations using LangChain orchestration, Pinecone/SentenceBERT RAG, and a human-in-the-loop feedback loop for iterative code refinement. Also collaborated with non-technical content writers and PMs during a Contentstack internship to deliver a Slack-based AI workflow that generates and brand-checks articles with one-click approvals.”
Mid-level Full-Stack Software Engineer specializing in AI and data applications
“Analytics-focused candidate with experience building SQL/Python pipelines and dashboards for donor, campaign, and website performance reporting. They have worked with messy multi-source data, standardized metric definitions, and delivered automated reporting that reportedly reduced manual effort by about 80%.”
Mid-level Data Analyst specializing in business intelligence and cloud data platforms
“Healthcare analytics professional with TCS/Humana experience turning messy claims and eligibility data into reliable reporting assets using SQL and Python. They combine strong data engineering and analytics execution with stakeholder management, including automating monthly claims reporting from half a day to under 5 minutes and driving a provider outreach effort that reduced claim rejection rates by about 20%.”
Mid-level Full-Stack Software Engineer specializing in cloud and data engineering
“Backend engineer with experience at Cigna evolving REST API services backed by PostgreSQL, emphasizing reliability/correctness, scalability, and observability. Has hands-on production experience with FastAPI (contract-first design, Pydantic schemas), performance tuning (indexes, caching), and secure auth patterns (OAuth/JWT, RBAC, row-level security via Supabase), plus low-risk incremental rollouts using feature flags and dual writes.”
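The low-risk rollout pattern named above (feature flags plus dual writes) can be sketched generically; this is an illustration of the technique, not this candidate's code, with in-memory dicts standing in for the old and new data stores.

```python
class DualWriteRepo:
    """Migration pattern: write to both the old and new stores behind a
    feature flag, and read from whichever store the flag designates, so a
    rollout can be reversed without data loss."""

    def __init__(self, old_store, new_store, flags):
        self.old, self.new, self.flags = old_store, new_store, flags

    def save(self, key, value):
        # Old store always receives the write; the new store only when
        # the dual-write flag is enabled.
        self.old[key] = value
        if self.flags.get("dual_write", False):
            self.new[key] = value

    def load(self, key):
        # Flip reads to the new store once backfill and dual writes have
        # converged; fall back to the old store for any gaps.
        if self.flags.get("read_from_new", False):
            return self.new.get(key, self.old.get(key))
        return self.old.get(key)
```

The order of flag flips is the point: enable `dual_write`, backfill, verify parity, then enable `read_from_new`; reversing either flag rolls the migration back safely.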
Mid-level Data Scientist specializing in LLMs, RAG, and document intelligence
“LLM/ML engineer who has shipped production systems in legal/financial-risk domains at Wolters Kluwer, including a hybrid OCR+deterministic+LLM extraction pipeline that structured UCC filings at massive scale and drove $6M+ in revenue. Also built LangGraph-based multi-agent ‘Deep Research’ workflows with model routing, tool calls (MCP), persistence, and human-in-the-loop review, and partnered closely with policy writers to deliver LLM summarization that cut writing time by ~60%.”
Mid-level Machine Learning Engineer specializing in LLMs, RAG, and MLOps
“LLM/agentic systems engineer who built a production ‘Agentic AI Diagnostic Assistant’ for network engineers, using a multi-agent Llama 2 + LangChain architecture with RAG over telemetry/incident data in DynamoDB and confidence-based deferrals to reduce hallucinations. Also has strong MLOps/orchestration experience (Airflow, EventBridge, Spark, Docker, SageMaker/ECS) at multi-terabyte/day scale and delivered multilingual NLP analytics (fine-tuned BERT/spaCy) for support operations through hands-on stakeholder workshops.”
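The confidence-based deferral mentioned above is a simple but effective hallucination control. A generic sketch of the routing logic, not the candidate's system; the threshold and the notion of "confidence" (e.g., a retrieval score or model self-estimate in [0, 1]) are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Diagnosis:
    answer: str
    confidence: float  # assumed to be normalized to [0, 1]

def route_diagnosis(diag, threshold=0.75):
    """Return the model's answer only when confidence clears the bar;
    otherwise defer to a human engineer rather than risk a hallucinated fix."""
    if diag.confidence >= threshold:
        return {"action": "auto_respond", "answer": diag.answer}
    return {"action": "defer_to_human",
            "answer": None,
            "note": f"confidence {diag.confidence:.2f} below {threshold}"}
```

Deferred cases go to an engineer's queue; their resolutions can then be fed back as labeled examples, which is how this kind of assistant improves over time.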
Mid-level Machine Learning Engineer specializing in NLP, LLMs, and MLOps
“Built a production internal LLM/RAG assistant at CVS Health to cut time spent searching long policy and clinical guideline PDFs, combining fine-tuned BERT/GPT models with FAISS retrieval and a FastAPI service on AWS. Demonstrates strong real-world reliability work (document cleanup, hallucination controls, monitoring/drift tracking with MLflow) and close collaboration with non-technical clinical operations teams via demos and feedback-driven iteration.”
Mid-level Data Scientist & AI/ML Engineer specializing in GenAI and cloud ML
“GenAI/LLM engineer who recently built a production compliance assistant at State Farm for KYC/AML and regulatory teams, using AWS Bedrock + LangChain with Textract/Lambda pipelines to extract fields, tag risk, and summarize long documents. Implemented RAG, strict structured outputs, and human-in-the-loop guardrails, and reports automating ~80% of documentation work while reducing review time by ~40%.”
Mid-level Data Analyst specializing in cloud ETL, BI, and machine learning
“Data/ML practitioner with experience at UnitedHealth Group building a fraud claims detection solution combining structured claims data and unstructured notes, validated with compliance stakeholders to improve actionable accuracy. Also applied embeddings, vector databases, and fine-tuned language models in a Bank of America capstone to detect threats/anomalies in financial documents, with production-minded Python ETL workflows using Airflow.”
Senior Data Scientist specializing in NLP and explainable machine learning
“NLP/ML practitioner who built an explainable, clinician-aligned system to detect cognitive decline (Alzheimer’s/stroke-related) from audio responses, achieving 97% accuracy on only a few hundred data points. Also has experience with healthcare claims entity resolution and prototyped a word2vec-based patent search vector database in Elasticsearch, with strong emphasis on testing, interpretability, and scalable Python data workflows.”
Mid-level Data Engineer specializing in AWS cloud data platforms
“Data engineer with Charter Communications experience modernizing large-scale AWS data lake pipelines: ingesting S3 data, validating against legacy systems, transforming with PySpark/Spark SQL, and serving via Iceberg/Delta tables. Worked at 50M–300M record scale, delivered >99.5% data match, and built monitoring/alerting (CloudWatch/SNS) plus retry orchestration (Step Functions) and data quality gates (Great Expectations).”
Mid-level Backend Software Engineer specializing in cloud-native microservices and FinTech
“Backend-focused engineer with Mastercard experience building and operating high-volume transaction-processing microservices. Has owned customer-facing banking services end-to-end and built an internal on-call analytics tool that centralized logs/metrics with real-time filtering to speed root-cause analysis and reduce incident investigation time.”
Mid-level Full-Stack Developer specializing in React and scalable web applications
“Backend/data engineer with hands-on production experience across FastAPI microservices and AWS data platforms. Has delivered serverless and Glue/EMR-based ETL pipelines with strong observability (Prometheus/Grafana/Sentry, CloudWatch/SNS), schema-evolution resilience, and measurable SQL performance wins (cutting query time from 5 min to under 30 sec). Open to onsite meetings in the Bethesda, MD area and flexible on remote arrangements.”