Pre-screened and vetted.
Mid-level AI/ML Engineer specializing in MLOps, Databricks Lakehouse, and GenAI RAG systems
Mid-level Data Engineer specializing in streaming data pipelines and cloud data platforms
Mid-level Data Engineer specializing in cloud lakehouse and streaming pipelines
Senior Data Engineer specializing in cloud data platforms and ETL/streaming pipelines
Mid-level Data Engineer specializing in cloud data platforms and big data pipelines
Mid-level AI/ML Engineer specializing in production ML systems and agentic workflows
Senior AI/ML Engineer specializing in Generative AI, Agentic AI, and RAG systems
Senior Data Engineer specializing in real-time pipelines, cloud data platforms, and healthcare analytics
Senior Data Engineer specializing in AWS cloud data platforms and streaming analytics
Senior Data Engineer specializing in multi-cloud lakehouse architectures and privacy/AI governance
Senior Software Engineer specializing in FinTech and distributed systems testing
Senior AI Python Engineer specializing in Generative AI and MLOps
Mid-level Software Engineer specializing in Cloud Infrastructure and Full-Stack Platforms
“Built and shipped a production LLM-powered grading platform that automates rubric-aligned scoring and feedback, with strong guardrails (RAG grounding, structured JSON output, validation/retries) and operational rigor (metrics, drift monitoring). Experienced in orchestrating multi-agent workflows end-to-end with CrewAI and in validating quality via gold-set benchmarking against human graders, with regression testing on every prompt or model change.”
Senior Data Engineer specializing in cloud data platforms and ML pipelines
“Data engineer focused on AWS-based enterprise data platforms, owning end-to-end pipelines from multi-source batch/stream ingestion (Glue/Kinesis/StreamSets/Airflow) through PySpark transformations into curated datasets for Redshift/Snowflake. Emphasizes production reliability with strong monitoring/observability and data quality gates; reports ~30% performance improvement, tighter SLA adherence, and reduced latency after optimization.”
Mid-level Backend Python Engineer specializing in APIs, microservices, and data pipelines
“Backend engineer (Marsh McLennan) who evolved a high-volume claims automation pipeline in Python, emphasizing thin APIs with background job processing, strong validation/retries, and production-grade observability. Experienced in secure FastAPI design (centralized JWT/RBAC), multi-tenant Postgres/Supabase-style row-level security, and low-risk refactors using parallel runs and feature flags; targeting founding-engineer-scope roles.”
Mid-level Data Engineer specializing in AWS lakehouse platforms and scalable ETL/ELT
“Data engineer focused on reliable, production-grade pipelines and data services: has owned end-to-end ingestion-to-serving workflows processing millions of records/day, using Airflow, Python/SQL, and PySpark. Demonstrates strong operational rigor (monitoring, retries, idempotency, backfills) and measurable outcomes (98% stability, ~30% faster processing), plus experience exposing curated warehouse data via versioned REST APIs.”
Mid-level Data Engineer specializing in cloud data platforms and lakehouse architectures
“Data engineer in a banking context who has owned end-to-end Azure lakehouse pipelines ingesting financial/vendor data from APIs, Azure SQL, and flat files into Databricks/Delta (bronze-silver-gold). Emphasizes production reliability via schema-drift validation, data quality controls, monitoring/alerting, retries/checkpointing, and Spark/Delta performance tuning, with outputs served to BI/reporting teams (e.g., Tableau).”
Mid-level Data Scientist specializing in ML, NLP, and Generative AI
“Data engineering/ML practitioner with experience at MetLife building transformer-based sentiment analysis over large unstructured datasets and productionizing pipelines with Airflow/PySpark/Hadoop (reported 52% efficiency gain). Also implemented embedding-based semantic search using Pinecone/Weaviate to improve retrieval relevance and enable RAG for customer-support and document-matching use cases.”
Mid-level Full-Stack Developer specializing in FinTech and enterprise platforms
“Engineer with a pragmatic, production-focused approach to AI-assisted development, using tools like Copilot and ChatGPT to accelerate coding while maintaining strict validation for correctness, security, and performance. Particularly notable for building a multi-agent incident-resolution workflow for a financial platform, with specialized agents for log analysis, root cause identification, fix suggestions, and test generation.”
Senior Data & Platform Engineer specializing in cloud-native streaming and distributed systems
“Financial data engineer who has built and operated high-volume batch + streaming pipelines (200–300 GB/day; 5–10k events/sec) using AWS, Spark/Delta, Airflow, Kafka, and Snowflake, with strong emphasis on data quality and reliability. Demonstrated measurable impact via 99.9% SLA adherence, major reductions in bad records/nulls, MTTR improvements, and significant latency/runtime/query performance gains; also built a distributed web-scraping system processing 5–10M records/day with anti-bot and schema-drift defenses.”