Pre-screened and vetted.
Senior Data Engineer specializing in cloud-native data platforms for finance and healthcare
“Data engineer/backend data services practitioner with Bank of America experience building real-time and batch transaction-monitoring pipelines and APIs (Kafka + databases, REST/GraphQL). Highlights include a reported 45% response-time improvement through performance optimizations, Delta Lake schema evolution, CI/CD (GitHub Actions/Jenkins), and operational reliability patterns such as CloudWatch monitoring and dead-letter queues.”
Senior Data Engineer specializing in cloud data platforms and big data pipelines
“Data engineer focused on building reliable, production-grade pipelines and external data collection systems on AWS (S3/Lambda/SQS/Glue/EMR) using PySpark/SQL, serving curated datasets to Snowflake/Redshift for finance and fraud teams. Has operated a large-scale crawler ingesting millions of records/day with anti-bot tactics, schema versioning/quarantine, and CloudWatch/Datadog monitoring, and also shipped a versioned REST API with caching and query optimization.”
Mid-level Data Engineer specializing in cloud ETL/ELT and big data pipelines
“Data engineer focused on production-grade pipelines and data services: ingests millions of records/day into S3, performs SQL/Python quality validation and PySpark/SQL transformations, and serves curated datasets via Athena/Redshift. Has experience hardening external data collection with retries/rate-limit handling and shipping versioned internal data APIs with backward compatibility, monitoring, and CI/CD in early-stage environments.”
Mid-level ML Data Engineer specializing in MLOps and scalable healthcare data pipelines
“Data/ML platform engineer with healthcare (Cigna) experience owning an end-to-end pipeline spanning Airflow + Debezium CDC ingestion, PySpark/SQL transformations, rigorous data quality gates, and feature-store/API serving for ML training and inference. Worked at 10+ TB scale and cites a ~30% latency reduction plus stronger reliability via idempotent design, monitoring, and backfill-safe reprocessing; also built pragmatic early-stage data pipelines at Frankenbuild Ventures.”
Senior Backend Software Engineer specializing in microservices, Kafka, and cloud-native AWS platforms
“LLM/agent engineer with production experience in the insurance claims domain, integrating OpenAI + LangChain into a claims platform to automate unstructured document extraction/classification and cut manual effort by 35%. Built reliable, fault-tolerant AWS/Kubernetes microservices with CloudWatch monitoring plus circuit breakers/retries/fallbacks, and implemented multi-step Spring Boot orchestration with schema validation, confidence gating, and human-in-the-loop handling for low-confidence cases.”
Mid-level Software Engineer specializing in cloud microservices and data pipelines
“Data engineer/platform builder who has owned production pipelines end-to-end processing millions of records/day, with strong emphasis on data quality (quarantine workflows) and reliability (monitoring, retries, incremental loads). Also designed large-scale external data collection/crawling with anti-bot handling and backfills, and shipped versioned REST data services optimized for performance and developer usability in an early-stage environment.”
“Built an AI-driven insurance policy summarization platform at Marsh, taking it end-to-end from messy PDF ingestion/OCR and custom extraction through LLM fine-tuning and AWS SageMaker deployment. Delivered measurable impact (25% reduction in manual review time, 99% uptime) and demonstrated strong production MLOps/LLMOps practices with Airflow/Step Functions orchestration, rigorous evaluation (ROUGE + human review), and continuous monitoring for drift, latency, and hallucinations.”
Junior Machine Learning Engineer specializing in LLM evaluation and GenAI pipelines
“LLM/agent engineer who built a production LangGraph multi-agent orchestrator connecting GitHub and APM/observability signals with a chain-of-verification loop for root-cause analysis. Emphasizes pragmatic architecture (start simple with state summaries), performance tuning (async LLM calls, Docker), and rigorous evaluation (LLM-as-judge, adversarial testing, hallucination/instruction adherence metrics, tool-call tracing) while iterating with non-technical stakeholders via A/B testing.”
Mid-level Data Engineer specializing in cloud lakehouse, streaming, and MLOps
“Data engineer at AT&T focused on large-scale telecom (5G/IoT) data platforms, owning end-to-end pipelines from Kafka/Azure ingestion through Databricks/Delta Lake transformations to serving analytics and ML. Has operated at very high volumes (~50+ TB/day) and delivered measurable performance gains (25–30% faster processing) plus improved reliability via Airflow monitoring, robust data quality checks, and resilient external data collection patterns (rate limiting, retries, dynamic schemas).”
Mid-level .NET Developer specializing in microservices and cloud-native FinTech/Healthcare systems
“Backend engineer with healthcare and financial services experience (Humana, PNC) who owned production-grade, high-volume ingestion-to-API pipelines end-to-end in C#/.NET and SQL. Strong focus on data quality, handling out-of-order/partial upstream records, and improving reliability/observability via structured logging and telemetry, plus significant SQL performance tuning to reduce peak-load issues.”
Mid-level Data Engineer specializing in cloud data platforms and streaming pipelines
“Data engineer currently at American Airlines who built and owned end-to-end flight operations and booking data pipelines (batch + real-time) using Azure Data Factory, Kafka, Spark/Databricks, Synapse, and Snowflake—processing hundreds of GBs/day. Strong focus on reliability and data quality (idempotency, checkpointing, retries, validation/alerts) and delivered near-real-time analytics powering Power BI dashboards; previously helped stand up an early-stage data platform at Sysco on AWS (Glue/S3/Redshift) with Airflow and Jenkins CI/CD.”
Executive Engineering Leader specializing in SaaS data platforms, integrations, and risk & compliance
“Founding engineer and eventual CTO at 2Plus2 Partners (roughly 25 years ago), with additional experience at two private-equity-backed companies (apex analytix and HICX). Interested in helping build one more company before retirement; comfortable with entrepreneurial risk but unable to self-fund significant capital.”
Junior Business & Operations Analyst specializing in banking analytics
“Analytics professional with Bank of America experience analyzing the digital card replacement journey at scale, transforming 20M event records across 900K clients into actionable reporting and recommendations. Stands out for combining SQL/Python data engineering, customer journey analytics, and executive-facing storytelling strong enough to get senior leadership to adopt most of their proposed app improvements.”
Mid-level Data Analyst specializing in business intelligence and customer analytics
“Healthcare-focused data analyst with hands-on experience at Molina Healthcare building SQL and Python workflows for retention and churn analytics. Combined enrollment, CRM, and claims data into Power BI reporting, automated predictive churn analysis, and tied the work to measurable outcomes, including faster processing, better reporting accuracy, and reduced churn.”
Mid-level Software Engineer specializing in FinTech trading platforms
“Built and deployed internal trading tools at Wells Fargo that reduced manual production-support dependency for trader configuration workflows. Brings hands-on experience in financial systems, data quality, and production incident resolution, including building 400+ SQL validation rules and designing an internal RAG assistant for engineering documentation.”
Mid-level Software Engineer specializing in backend microservices and cloud platforms
“Backend engineer in healthcare data systems who has owned production pipelines end-to-end, from ingesting patient and claims data to serving it through secure APIs. Brings a strong mix of Python, SQL, microservices, cloud deployment, and data reliability practices, with measurable performance gains and experience building resilient integrations with external data sources.”
Senior AI/ML & Data Engineer specializing in Generative AI and RAG systems
“GenAI/RAG engineer who has deployed a production policy/regulatory search assistant for a financial client using LangChain + Vertex AI, FastAPI, Docker/Kubernetes, and Airflow-orchestrated data pipelines. Demonstrated measurable impact with 50–60% latency reduction and 70% fewer pipeline failures, plus KPI-driven grounding evaluation (90%+ target) and strong cross-functional collaboration with compliance/business teams.”
Mid-level AI/ML Engineer specializing in Generative AI and NLP
“Built an end-to-end GenAI underwriting copilot at TD Bank for complex financial documents, combining RoBERTa-based risk classification with Azure OpenAI RAG to deliver grounded, citation-based insights. Drove a 40–50% reduction in manual underwriting review time and created reusable FastAPI ML services that cut integration effort for other teams by 30–40%.”
Junior Data Scientist specializing in machine learning and reinforcement learning
Mid-level Data Analyst specializing in banking analytics and machine learning
Mid-level AI/ML Engineer specializing in Generative AI and LLM solutions
Mid-level Data Engineer specializing in cloud ETL and data platforms (AWS/Azure)
Mid-level Data Engineer specializing in scalable cloud data pipelines and API-driven data services
Senior Data Engineer specializing in cloud data platforms and LLM/RAG solutions