Pre-screened and vetted.
Senior Data Engineer specializing in cloud data platforms and regulated analytics
“Data engineer at Capital One building AWS-based real-time and batch pipelines and backend data services for financial/fraud use cases. Has owned end-to-end pipelines processing millions of records/day, implemented dbt/Great Expectations quality gates, and tuned Redshift/Snowflake workloads (cutting query latency ~22–25% and reducing pipeline failures ~30–40%) while supporting 15+ downstream consumers.”
Mid-level Data Engineer specializing in cloud data platforms and big data pipelines
“Healthcare data engineer with hands-on ownership of claims/member data pipelines on a cloud analytics platform, spanning batch and streaming ingestion (Airflow/Kafka/Spark/Databricks) through serving for reporting. Emphasizes reliability and data quality via embedded validation, schema-drift detection, deduplication, and operational monitoring/incident response, plus pragmatic CI/CD and observability setup in early-stage/ambiguous projects.”
Mid-level Data Engineer specializing in cloud ETL pipelines (Azure, AWS, GCP)
“Data engineer/backend developer who owned end-to-end pipelines and external data collection systems, including API ingestion and large-scale web scraping. Worked at ~50M records/month scale, improving processing speed by 20% and reducing reporting errors by 15%, and shipped a Rust-based internal data API with versioning, caching, and strong validation/observability practices.”
Senior Engineering Manager specializing in platform, data/ML, and identity/access systems
“Senior engineering leader from Goodyear’s AndGo startup-like division who scaled the org from 12 to 30+ across pod-based teams and introduced an Architect Guild/ARD governance model. Led a 4-month Europe launch requiring AWS regional infrastructure, GDPR compliance, i18n/l10n, and new EMEA reporting pipelines, and has hands-on depth in API performance, incident response, and GraphQL/Hasura adoption to boost product velocity.”
Mid-Level Full-Stack Software Engineer specializing in enterprise AI, data pipelines, and scalable APIs
“Forward-deployed engineer/tech lead who built an end-to-end demand planning and forecasting application for a major US steel manufacturer, integrating Snowflake data into the C3 platform with batch/MapReduce workflows, monitoring, and a React/TypeScript UI. Also productionized an enterprise LLM integration with structured outputs and authorization guardrails, reporting +30% stakeholder engagement and broad adoption across customer deployments.”
Mid-level Software Engineer specializing in cloud data platforms and LLM applications
“LLM/agent builder with experience shipping production LLM features at an early-stage ed-tech mental wellness startup (conversation analysis + structured feedback via FastAPI, OpenAI API, Render, CI/CD). Also built a multi-step dining concierge agent using OpenSearch over Yelp data with fallback query relaxation, and has enterprise data engineering experience at Capgemini migrating databases to Snowflake with robust ETL normalization and data-quality handling.”
Mid-level Data Engineer specializing in lakehouse ETL and analytics engineering
“Data engineer with strong end-to-end ownership of production lakehouse pipelines (Snowflake + Databricks + Airflow + dbt + Great Expectations), handling 8M+ records/month and 500K+ daily CDC updates. Delivered measurable reliability and efficiency gains (41% cost reduction, freshness improved from 4h to 30m, 35% fewer downstream incidents) and has experience building a lakehouse platform from scratch across 12 source systems.”
Executive CTO specializing in digital health platforms, cloud & AI, and FHIR/HL7 interoperability
“Healthcare diagnostics/health tech founder building Casandra.ai, an API-driven lab test catalog and ordering platform designed to standardize fragmented test catalogs and integrate into provider workflows via FHIR. Bootstrapped and built a deploy-ready product, drawing on prior startup experience and accelerator participation (Health Box, DreamIt Ventures).”
Senior Business Analytics Consultant specializing in BI, data engineering, and predictive analytics
“Healthcare analytics candidate with hands-on experience turning messy claims, enrollment, and reference data into trusted SQL reporting layers and reproducible Python workflows. They emphasize metric standardization, stakeholder alignment, and operational impact, including ~40% reduction in manual reporting effort and improved forecasting/resource prioritization through high-risk patient segmentation.”
Mid-level Data Analyst specializing in banking and product analytics
“Analytics engineer/data analyst with Bank of America experience turning fragmented financial data across SQL Server, PostgreSQL, Kafka, and flat files into trusted Snowflake/dbt reporting models. Stands out for unifying disputed business definitions like churn and payment success rate, automating manual analysis in Python, and pairing strong data quality rigor with stakeholder adoption through self-service dashboards.”
Senior AI/ML Engineer specializing in predictive analytics and NLP
“ML/AI engineer with hands-on experience building production healthcare AI systems across predictive modeling and GenAI. They built an end-to-end patient risk prediction platform and a RAG-based clinical summarization feature, combining strong NLP/LLM skills with AWS deployment, monitoring, drift detection, and reusable Python service design to deliver measurable clinical and operational impact.”
Mid-level AI/ML Engineer specializing in scalable ML, NLP, and MLOps
“ML/AI engineer with strong production depth across classical ML, MLOps, LLM/RAG, and scalable Python data platforms, with experience at Cisco and Accenture. Stands out for tying technical decisions to measurable business outcomes, including $1.2M annual savings, 40% faster support resolution, and broad internal adoption of shared engineering frameworks.”
Mid-level AI/ML Engineer specializing in Generative AI and MLOps
“AI engineer and current tech lead building a RAG-based multi-agent QA platform for financial document analysis at significant scale (40,000–50,000 documents). They combine Python, CrewAI, FastAPI, Hugging Face embeddings, Pinecone, and AWS SageMaker to deliver retrieval, calculation, summarization, forecasting, and visualization workflows, while leading a small cross-functional team.”
Executive CTO specializing in AI, Cloud, and digital engineering transformation
“CTO-for-equity founder who partners with non-technical co-founders to bring AI-powered product ideas to market, emphasizing financial viability and go-to-market planning over ‘just an MVP’. Currently working with two other founders (CEO/Sales and CPO/industry expert) to launch a SaaS platform, leveraging AI coding tools (Cursor/Windsurf) for extremely fast, production-ready iteration with daily releases.”
Mid-level AI/ML Engineer specializing in computer vision, NLP, forecasting, and GenAI
“Built and productionized a secure internal RAG-based AI assistant (LangChain/FastAPI/FAISS on GCP), tackling real-world issues like latency, retrieval speed, and hallucinations—delivering 25% faster retrieval and 99.9% uptime. Also implemented scalable, reliable ML retraining orchestration with AWS Step Functions/SageMaker/Lambda, and partners closely with compliance analysts to iteratively refine prompts and outputs to meet governance standards.”
Senior Data Scientist specializing in GenAI, fraud/credit risk, and cloud MLOps
Mid-level AI/ML Engineer specializing in fraud detection and Generative AI
Mid-level Data Engineer specializing in scalable ETL pipelines and data quality automation
Mid-Level Software Engineer specializing in distributed systems and ML pipelines
Mid-level Machine Learning Engineer specializing in NLP, MLOps, and risk/fraud analytics
Executive Technology Leader (CTO) specializing in SaaS modernization, cloud, AI, and cybersecurity
Executive Technology Leader specializing in digital transformation, data platforms, and cybersecurity