Pre-screened and vetted.
Mid-level Data Scientist specializing in ML, MLOps, and Generative AI
“ML/NLP engineer who built a RAG-based technical assistant for Caterpillar field engineers, transforming PDF keyword search into intent-based semantic retrieval across manuals, logs, sensor reports, and technician notes. Strong in productionizing data/ML systems (Airflow, PySpark) with rigorous preprocessing, entity resolution, and evaluation, delivering measurable gains in accuracy and relevance along with reduced duplicates.”
Senior Data Scientist specializing in data engineering and analytics
“Data/NLP practitioner with experience in both financial services (Truist) and government (USDA), including an NLP-driven analysis of EU regulations to anticipate US regulatory focus and a major redesign/cleaning of complex pathogen lab-test public datasets. Built production data-quality pipelines with Dagster, Pandera, and Azure Synapse, and is comfortable validating hypotheses with historical backtesting and SME-driven quality controls.”
Mid-level Analytics Professional specializing in marketing and business intelligence
“Analytics professional at TIAA with hands-on experience combining SQL, Python, and statistical modeling to unify complex marketing, product, finance, and customer datasets. Has worked on advisor-tool adoption analysis, 10-year wealth diagnostics, forecasting, cohort analysis, and escalation-risk modeling, with findings used by marketing and contact-center stakeholders.”
Mid-level Machine Learning Engineer specializing in NLP, computer vision, and LLMs
“Wayfair ML/AI engineer who has shipped and operated production LLM systems for both internal analytics and customer-facing assistants. Stands out for combining strong RAG/retrieval engineering with production-grade platform work—improving trust, reducing latency by ~30%, and cutting ad hoc reporting demand by ~50%.”
Mid-level Full-Stack Developer specializing in cloud-native enterprise platforms
“Built Nexthire-AI, shipping an end-to-end LLM-powered resume–job description matching product (React + Node.js) that uses embeddings and retrieval to generate match scores and skill-gap recommendations. Improved post-launch engagement by making the product's match feedback cleaner and more actionable, and added production guardrails (validation, timeouts, fallbacks) to handle messy resume formats and AI API instability.”
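As a rough illustration of the embedding-based matching such a product relies on, a cosine-similarity score can be mapped to a 0–100 match score. This is a generic sketch, not Nexthire-AI's actual Node.js implementation; the vectors and the 0–100 scaling are assumptions.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_score(resume_vec: list[float], jd_vec: list[float]) -> int:
    """Map similarity in [-1, 1] to an illustrative 0-100 match score."""
    return round((cosine_similarity(resume_vec, jd_vec) + 1) / 2 * 100)
```

In practice the vectors would come from an embeddings API, and the skill-gap output would compare top-ranked job-description terms missing from the resume.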
Mid-level Data Engineer specializing in cloud data platforms
“Built an AI-powered internal support assistant at CVS Health using GPT-4, LangChain, and Pinecone, applying RAG, validation, and monitoring to reduce repetitive support tickets while protecting sensitive healthcare data. Stands out for a pragmatic approach to AI engineering: using multi-agent and LLM workflows to accelerate development while keeping systems constrained, observable, and production-friendly.”
Mid-Level Software Engineer specializing in AI/ML and distributed systems
“Software engineer with production experience building a serverless monolith and multi-layer video pipeline at easyML, plus hands-on integration of multiple LLM providers (Grok/Claude/OpenAI) into a full-stack app. Interested in robotics via computer vision (OpenCV/OpenMMLab), with a strong real-time systems mindset around SLOs, latency, determinism, and reliability; also has low-level OS experience writing a keyboard device driver.”
Mid-level Data Engineer specializing in cloud data pipelines and analytics platforms
“Data engineer with healthcare and enterprise experience (Molina Healthcare, Dell Technologies) building and operating high-volume batch + streaming pipelines across AWS and Azure. Strong focus on data quality (schema validation, fail-fast checks), reliability (monitoring/alerts, retries), and performance tuning (Spark/partitioning), with measurable runtime reduction and improved downstream trust.”
Mid-level Data Engineer specializing in cloud data pipelines and financial services warehousing
“Data engineer (Charles Schwab) who took ownership of an unstable, ambiguous nightly financial data pipeline and rebuilt it into a reliable, incremental AWS Glue/Airflow/Redshift system feeding Power BI. Created a custom Python data-quality framework with hard-stop gating and schema drift detection, improving data integrity to 99.9%, cutting runtime by ~20%, and reducing incidents and tickets (35% fewer schema-related dashboard incidents; 30% fewer investigations).”
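For flavor, a hard-stop schema-drift gate of the kind described can be very small: compare an incoming batch's columns against an expected schema and fail before bad data reaches the warehouse. The column names, types, and exception class below are hypothetical, not taken from the candidate's actual Schwab framework.

```python
# Illustrative hard-stop schema gate; schema contents are made up.
EXPECTED_SCHEMA = {"account_id": "string", "trade_date": "date", "amount": "decimal"}

class SchemaDriftError(Exception):
    """Raised to hard-stop the pipeline when the batch schema drifts."""

def validate_schema(observed: dict[str, str],
                    expected: dict[str, str] = EXPECTED_SCHEMA) -> None:
    missing = expected.keys() - observed.keys()
    extra = observed.keys() - expected.keys()
    changed = {c for c in expected.keys() & observed.keys()
               if expected[c] != observed[c]}
    if missing or extra or changed:
        raise SchemaDriftError(
            f"missing={sorted(missing)} extra={sorted(extra)} "
            f"type_changed={sorted(changed)}"
        )
```

The "hard-stop" property comes from raising (rather than logging and continuing), so downstream loads and dashboards never see the drifted batch.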
Mid-level Data Engineer specializing in cloud lakehouse and streaming platforms
“Data engineer focused on building production-grade pipelines on AWS (Kafka/Kinesis/Glue/S3) through to curated serving layers in Snowflake and Delta Lake. Emphasizes automated data quality validation (PySpark + CI/CD), modular dbt transformations for analytics (customer spending, risk metrics), and operational reliability with CloudWatch and DLQs; data consumed by BI tools and ML pipelines for fraud detection and risk analytics.”
Mid-level Data Engineer specializing in multi-cloud real-time and batch data pipelines
“Data engineer with healthcare domain experience who owned 100M+ record pipelines end-to-end (Kafka/Kinesis/ADF → PySpark/dbt validation → Spark SQL transforms → Snowflake/Power BI serving). Built production-grade reliability practices (Airflow orchestration, CloudWatch/Grafana monitoring, pytest + contract/regression tests, idempotent ingestion/backfills) and delivered measurable improvements: 35% lower latency and 40% better query performance.”
Mid-level Backend Engineer specializing in Python services, PostgreSQL performance, and LLM integrations
“Backend engineer with hands-on experience building secure Python/Flask services (sessions, JWT, RBAC) and optimizing PostgreSQL/SQLAlchemy performance, including custom SQL using CTEs/window functions profiled via EXPLAIN ANALYZE. Also integrates LLM features via OpenAI/Azure into backend systems and improves scalability with RabbitMQ-driven async processing, caching, and multi-tenant data isolation patterns.”
Junior Data Analyst specializing in financial and operational analytics
“Analytics professional with experience at KPMG transforming messy operational and financial data from SQL Server and AWS S3 into clean reporting datasets through automated Python workflows. They combine SQL, Python, Power BI, and experimentation methods to deliver stakeholder-aligned KPI dashboards and marketing performance insights with a strong focus on data integrity and reproducibility.”
Senior Data Analyst specializing in cloud data platforms, experimentation, and predictive analytics
“Healthcare data/ML practitioner with experience at UnitedHealth Group building production ETL and streaming pipelines (Python, BigQuery, Kafka) that unify EHR, IoT device, and lab data for patient risk prediction. Also implemented embedding-based semantic search/linking for noisy clinical notes via domain adaptation and rigorous validation with clinical stakeholders; previously built churn prediction at DirecTV using XGBoost.”
Mid-level Data Engineer specializing in AWS/Azure pipelines and streaming analytics
“Data engineer with experience across healthcare and geospatial risk systems, owning end-to-end pipelines from ingestion through serving on AWS/Azure stacks. Built HIPAA-compliant data quality gates and CDC for millions of daily claims, and also delivered a real-time wildfire risk platform with 20-minute refresh cycles and a 60% data accuracy lift. Strong in streaming (Kafka), Spark performance tuning, and production-grade orchestration/CI/CD (Airflow, Docker, Jenkins, GitHub Actions, Terraform).”
Senior AI Engineer specializing in forward-deployed voice agents and incident-response automation
“Forward-deployed engineer at Bland.ai and founder of Fi (incident-response agent) who routinely takes LLM/agentic concepts from prototype to production. Has hands-on experience reverse-engineering undocumented systems to deliver integrations, building LLM testbeds for voice-agent reliability, and rapidly shipping RAG/semantic search solutions (e.g., Confluence runbooks) after deep customer discovery with DevOps/SRE teams.”
Mid-level Data Analyst specializing in healthcare and financial analytics
“Built and productionized an LLM-powered clinical documentation and insights pipeline at Cardinal Health using LangChain + GPT-4 with RAG to summarize long clinical notes, extract medication/dosage entities, and generate structured SQL-ready outputs for downstream analytics. Emphasizes clinical reliability via labeled benchmarking (precision/recall/F1), shadow deployments, clinician human-in-the-loop review, and ongoing monitoring/orchestration with Airflow, Lambda, S3, Postgres, and Power BI.”
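The entity-extraction benchmarking this entry mentions (precision/recall/F1 against labeled clinical data) reduces to a small calculation over predicted vs. gold entity sets. The medication strings in the test are invented for illustration and are not from the Cardinal Health benchmark.

```python
def precision_recall_f1(predicted: set[str], gold: set[str]) -> tuple[float, float, float]:
    """Entity-level precision/recall/F1 against a labeled benchmark set."""
    tp = len(predicted & gold)                     # true positives: exact matches
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1
```

Shadow deployments then compare these metrics between the live and candidate pipelines before clinician-facing rollout.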
Mid-level AI/ML Engineer specializing in MLOps, LLMs, and real-time inference in FinTech
“ML/LLM engineer who has deployed a production LLM-powered assistant for intent classification and query routing (order recommendation/support deflection), combining BERT fine-tuning with an embedding-based retrieval layer and optimizing for low-latency inference. Experienced with end-to-end reliability practices—Airflow-orchestrated ETL, data validation/alerting, MLflow experiment tracking, and iterative improvements driven by user feedback and monitoring.”
Mid-level Software Engineer specializing in machine learning and real-time data systems
“Hands-on, implementation-focused engineer with experience owning cloud deployments and putting LLM/RAG workflows into production. They stand out for combining customer-facing deployment ownership with practical AI systems work, including retrieval tuning, hallucination mitigation, production incident response, and document-processing pipelines for messy real-world inputs.”
Executive growth leader specializing in AI-powered SaaS, marketplaces, and e-commerce
“Growth leader with strong zero-to-one and systems-building experience across e-commerce and retail media. Most notably, they proposed and launched a new retail media division from scratch, presold demand before product build, and scaled it to $1.5M ARR with 85% margin, while also building data-driven lifecycle and acquisition systems that materially improved activation and CAC efficiency.”
Mid-level Data Engineer specializing in cloud ETL/ELT and healthcare analytics
“Healthcare-focused data engineer/ML practitioner with experience at Lightbeam Health Solutions and Humana building production entity-resolution and semantic similarity pipelines across EMR, lab, and claims data. Uses NLP/ML (spaCy, scikit-learn, BioBERT/LightGBM) plus Snowflake/Airflow and vector search (Pinecone) to improve linkage accuracy (reported 90%) and semantic match quality (reported +12–15%), while reducing manual cleanup by 40%+.”
Senior AI/ML Engineer specializing in Generative AI, RAG, and agentic systems
“GenAI/LLM ML engineer (currently at Webprobo) building an enterprise GenAI platform with document intelligence and automation on AWS, with blockchain components. Has hands-on experience with RAG, LLM evaluation tooling, and orchestrating production LLM workflows with Apache Airflow, plus deep exposure to reliability challenges in globally distributed/edge deployments. Also partnered with business/marketing stakeholders at a banking client to deliver an AI-driven customer retention insights solution.”
Engineering leader specializing in FinTech ML/AI platforms
“Engineering Manager/player-coach leading Data Infrastructure, ML/DS, and AI Engineering pods who recently shipped multiple production agentic GenAI features. Built privacy-preserving LLM workflows (PII redaction via Microsoft Presidio) and drove an AI expense-approval agent from ambiguous ask to GA, cutting approval time from ~2.5 days to <4 hours with >85% accuracy. Also owned a major LLM cost overrun incident and implemented cost observability plus circuit breakers to prevent runaway agent loops.”
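A circuit breaker against runaway agent loops, as described in this entry, can be as simple as tracking cumulative spend and step count per agent run and tripping past a budget. The dollar and step thresholds below are invented placeholders, not the team's actual limits.

```python
class CostCircuitBreaker:
    """Trip when cumulative LLM spend or step count for one agent run
    exceeds its budget, stopping runaway loops. Thresholds illustrative."""

    def __init__(self, max_cost_usd: float = 5.0, max_steps: int = 25):
        self.max_cost_usd = max_cost_usd
        self.max_steps = max_steps
        self.cost_usd = 0.0
        self.steps = 0

    def record(self, call_cost_usd: float) -> None:
        """Record one LLM call; raise once the budget is exhausted."""
        self.cost_usd += call_cost_usd
        self.steps += 1
        if self.cost_usd > self.max_cost_usd or self.steps > self.max_steps:
            raise RuntimeError(
                f"circuit open: ${self.cost_usd:.2f} over {self.steps} steps"
            )
```

Pairing this with per-run cost observability (emitting spend as a metric on every call) is what turns a one-off incident, like the cost overrun mentioned above, into an enforced guardrail.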
Senior Data Analyst specializing in data pipelines, web scraping, and legal data enrichment
“Data engineer focused on reliable, scalable analytics pipelines and external data collection. Has owned end-to-end pipelines processing 5–10M records/day, serving Snowflake data marts to Power BI/Tableau, and reports ~99% reliability through strong validation/monitoring. Also shipped versioned REST APIs for curated data with query optimization and caching.”