Pre-screened and vetted.
Senior Support Engineer specializing in AI Search, LLMs, and Document Intelligence
Senior Data Scientist specializing in marketing analytics, attribution, and revenue forecasting
Junior Full-Stack Software Engineer specializing in data-driven web apps and cloud platforms
Senior ETL/Data Engineer specializing in cloud data platforms and AI/ML-ready pipelines
Senior Data Engineer specializing in AWS serverless and real-time data platforms
Mid-level Data Engineer specializing in cloud lakehouse and streaming platforms
Senior Data Engineer specializing in multi-cloud data platforms and real-time analytics
Mid-level Data Engineer specializing in financial data engineering and scalable pipelines
Senior Data Engineer specializing in cloud data platforms and real-time streaming
“Data engineer focused on building reliable, production-grade data systems end-to-end: batch and real-time pipelines (Airflow/Kafka/Spark) with strong data quality, monitoring/alerting, and incident response. Has experience integrating external API/web data with retries, throttling, and schema-change handling, and serving curated datasets to analytics (Power BI) and backend consumers with performance optimizations like Redis caching.”
Mid-level Data & AI Engineer specializing in data engineering, analytics, and LLM/RAG apps
“Built a production RAG-based “unified assistant” that consolidates siloed company documents into a single chatbot while enforcing fine-grained access control via RBAC/metadata filtering with OAuth2/JWT. Experienced orchestrating LLM workflows with LangChain/LangGraph + FastAPI (async + caching) and measuring performance via retrieval accuracy and response-time SLAs. Also delivered a churn analytics solution with dashboards and automated retention campaigns using n8n.”
Mid-level AI/ML Engineer specializing in Generative AI and healthcare data
“Built and deployed a production RAG-based document Q&A system on Azure OpenAI to help business teams search thousands of PDFs/Word files, using Qdrant vector search, MongoDB, and a Flask API. Demonstrates strong production engineering (streaming large-file ingestion, parallel preprocessing, monitoring/retries) plus systematic prompt/embedding/chunking experimentation to improve accuracy and reduce hallucinations, and has hands-on orchestration experience with ADF/Airflow/Databricks/Synapse.”
Senior Data Scientist specializing in data engineering and analytics
“Data/NLP practitioner with experience in both financial services (Truist) and government (USDA), including an NLP-driven analysis of EU regulations to anticipate US regulatory focus and a major redesign/cleaning of complex pathogen lab-test public datasets. Built production data-quality pipelines with Dagster, Pandera, and Azure Synapse, and is comfortable validating hypotheses with historical backtesting and SME-driven quality controls.”
Mid-level AI/ML Engineer specializing in healthcare analytics and MLOps
“AI/ML engineer at Cigna Healthcare building a production, HIPAA-compliant LLM-powered clinical insights platform that summarizes unstructured medical notes using a fine-tuned transformer + RAG on AWS. Demonstrates strong end-to-end MLOps and cloud optimization (distillation, Spot/Lambda/Auto Scaling) with quantified outcomes (~28% accuracy lift, ~40% less manual review, ~25% lower ops cost) and strong clinician-facing explainability via SHAP and dashboards.”
Executive Technology Leader (CTO) specializing in IoT, enterprise systems, and digital transformation
“Founder of an LLC operating as a consulting firm that provides fractional CTO services to startups, gaining parallel exposure to multiple early-stage companies. Has direct experience with MVP development, building org structures from scratch, and supporting early fundraising, and is exploring a pivot from consulting into a scalable product business while staying engaged with the VC/accelerator ecosystem.”
Senior Data Engineer specializing in Palantir Foundry and Snowflake for regulated industries
“Data engineer focused on high-volume transaction pipelines (2M+ per day) using Snowflake/Snowpipe, Spark/PySpark, Kafka, and Airflow, with a strong emphasis on schema/data-quality enforcement and reliability improvements. Also built a greenfield compliance-focused RAG solution, using CloudWatch monitoring and adding ingestion validation to prevent malformed OCR documents from degrading search quality.”
Senior GenAI/ML Engineer specializing in LLMs, RAG, and multimodal generative AI
“LLM/RAG engineer with production deployments in highly regulated domains (Frost Bank and GE Healthcare). Built secure, explainable document-grounded Q&A systems using LoRA fine-tuning, strict RAG with confidence thresholds, and citation-based responses; also established evaluation/monitoring (golden QA sets, hallucination tracking, drift) and achieved ~40% latency reduction through retrieval/prompt tuning.”
Mid-level Machine Learning Engineer specializing in LLM agents, RAG, and MLOps
“Built a production AI-driven contract/document extraction system combining OCR, normalization, and LLM schema-guided extraction, orchestrated with PySpark and Azure Data Factory and loaded into PostgreSQL for analytics. Emphasizes reliability at scale—using strict JSON schemas, confidence scoring, targeted retries, and multi-layer validation to control hallucinations while processing thousands of PDFs per hour—and partners closely with non-technical business teams to refine fields and deliver usable dashboards.”
Mid-level Data Engineer specializing in real-time pipelines and cloud data platforms
“Backend engineer with hands-on experience building secure Python/Flask services (sessions, JWT, RBAC) and optimizing PostgreSQL/SQLAlchemy performance, including custom SQL using CTEs/window functions profiled via EXPLAIN ANALYZE. Also integrates LLM features via OpenAI/Azure into backend systems and improves scalability with RabbitMQ-driven async processing, caching, and multi-tenant data isolation patterns.”
Mid-level Data Analyst specializing in financial risk and healthcare analytics
“AI/ML engineer focused on real-time, production-grade LLM systems, with a robotics-adjacent mindset around latency/accuracy tradeoffs and modular pipelines. Built a scalable RAG-based assistant orchestrated as microservices on Kubernetes with Kafka async messaging, ONNX/quantization optimizations, and monitoring (Prometheus/Grafana), citing a ~35% hallucination reduction; has also experimented with ROS Noetic and Gazebo to build familiarity with core robotics concepts.”
Mid-level Data Scientist specializing in MLOps, LLM/RAG applications, and deep learning
“Built and deployed a production compliance automation RAG system (at Citi) that generates citation-backed, schema-validated risk summaries for regulatory document review. Emphasizes regulated-environment reliability with retrieval-only grounding, abstention, confidence thresholds, and immutable audit logging, plus orchestration using LangChain/LangGraph and Airflow. Reported ~60% reduction in compliance review effort while maintaining high precision and traceability.”
Mid-level Data Engineer specializing in cloud data pipelines and analytics platforms
“Data engineer with healthcare and enterprise experience (Molina Healthcare, Dell Technologies) building and operating high-volume batch + streaming pipelines across AWS and Azure. Strong focus on data quality (schema validation, fail-fast checks), reliability (monitoring/alerts, retries), and performance tuning (Spark/partitioning), with measurable runtime reduction and improved downstream trust.”
Mid-level Data Engineer specializing in cloud lakehouse and streaming platforms
“Data engineer focused on building production-grade pipelines on AWS (Kafka/Kinesis/Glue/S3) through to curated serving layers in Snowflake and Delta Lake. Emphasizes automated data quality validation (PySpark + CI/CD), modular dbt transformations for analytics (customer spending, risk metrics), and operational reliability with CloudWatch and DLQs; data consumed by BI tools and ML pipelines for fraud detection and risk analytics.”
Senior Python Developer specializing in data engineering, MLOps, and cloud platforms
“Backend/data engineer with production experience building secure Django/DRF APIs (JWT RS256 + rotating refresh tokens), background processing with Celery, and strong reliability practices (timeouts, retries/backoff, structured logging, audit trails). Has delivered AWS solutions spanning Lambda + ECS with IaC/CI-CD and built Glue/PySpark ETL pipelines with schema evolution and data-quality quarantine patterns; also modernized a legacy SAS pipeline to Python/PySpark with parallel-run parity validation and phased rollout.”