Pre-screened and vetted.
Senior Data Engineer specializing in cloud data platforms and real-time streaming pipelines
Senior Data Engineer specializing in multi-cloud data platforms and real-time analytics
Senior Data Scientist specializing in healthcare analytics and scalable ML pipelines
Senior Technical Product Manager specializing in data instrumentation and analytics platforms
“Technical Product Manager with hands-on experience shipping and running live free-to-play games (including Cartoon Network BMX Champions) across web and mobile, with exposure to Roblox and some console development. Focuses on data-driven live ops (daily rewards, streaks, push notifications, and limited-time events) paired with A/B testing and funnel/retention analytics to improve engagement and IAP performance.”
Senior Data Engineer specializing in cloud data platforms and real-time streaming
“Data engineer focused on building reliable, production-grade data systems end-to-end: batch and real-time pipelines (Airflow/Kafka/Spark) with strong data quality, monitoring/alerting, and incident response. Has experience integrating external API/web data with retries, throttling, and schema-change handling, and serving curated datasets to analytics (Power BI) and backend consumers with performance optimizations like Redis caching.”
Director-level growth marketer specializing in DTC e-commerce and CPG
“Lifecycle/CRM marketing leader from the wine and beverage space who has owned high-volume email/SMS programs end-to-end and translated CRM insights into broader revenue strategy. Stands out for combining rigorous process design, cross-channel automation, and personalization testing to drive measurable gains, including a 45% lift in incremental email revenue and $1.6M in incremental on-site upsell/cross-sell revenue.”
Mid-Level Game Designer specializing in systems and economy design
“Game economy/progression designer with hands-on ownership of end-to-end systems and live-ops tuning on major mobile titles (Star Trek Fleet Command, Marvel Contest of Champions). Builds spreadsheet-based simulations and telemetry-driven tuning loops to prevent inflation and reduce progression friction, including a multi-iteration optimization that lifted D7 retention ~7% while maintaining monetization targets.”
Mid-level AI/ML Engineer specializing in Generative AI and healthcare data
“Built and deployed a production RAG-based document Q&A system on Azure OpenAI to help business teams search thousands of PDFs/Word files, using Qdrant vector search, MongoDB, and a Flask API. Demonstrates strong production engineering (streaming large-file ingestion, parallel preprocessing, monitoring/retries) plus systematic prompt/embedding/chunking experimentation to improve accuracy and reduce hallucinations, and has hands-on orchestration experience with ADF/Airflow/Databricks/Synapse.”
Mid-level Data Scientist specializing in fraud detection and healthcare ML
“Applied NLP/ML in healthcare and financial services, including fine-tuning BERT on unstructured EHR text and building embedding-based similarity search for clinical concepts. Also redesigned a Wells Fargo fraud detection data pipeline using modular Python + AWS Glue/Step Functions, cutting runtime ~40% with improved monitoring and reliability.”
Senior Data Engineer specializing in cloud data platforms and real-time streaming
“Data engineer in healthcare (HCA) who owned end-to-end Azure-based pipelines at very large scale (50M+ daily claims/patient records). Strong focus on reliability: schema-drift fail-fast validation, quarantine layers, and Python/SQL data quality checks that reduced issues ~25%, plus performance tuning in Databricks/PySpark and versioned serving in Synapse for downstream consumers.”
Mid-level AI Solutions Engineer specializing in enterprise GenAI and automation
“Built and shipped multiple production LLM/agentic systems, including an agentic RAG NL-to-SQL analytics app that cut manual reporting from 9 hours/week to 15 minutes by grounding on schema-aware retrieval and robust fallback/monitoring. Also implemented a LangChain supervisor-orchestrated enterprise IT automation agent that routes requests for search, identity validation, and action execution, and created a RAG search tool spanning Jira/Confluence/SharePoint for operations stakeholders.”
Intern IT & Data Analytics professional specializing in automation, cloud operations, and dashboards
“AppSec-focused engineer with experience spanning Accenture and a digital operations support internship, emphasizing secure SDLC and CI/CD security automation (SAST/DAST/SCA). Has hands-on troubleshooting experience using logs/metrics/APM traces (e.g., resolving DAST timeouts caused by rate limiting) and designs AWS/Kubernetes scanning integrations with least-privilege IAM, private networking, secrets management, and observability.”
Mid-level Data Engineer specializing in cloud data platforms
“Built an AI-powered internal support assistant at CVS Health using GPT-4, LangChain, and Pinecone, applying RAG, validation, and monitoring to reduce repetitive support tickets while protecting sensitive healthcare data. Stands out for a pragmatic approach to AI engineering: using multi-agent and LLM workflows to accelerate development while keeping systems constrained, observable, and production-friendly.”
Senior GenAI/ML Engineer specializing in LLMs, RAG, and multimodal generative AI
“LLM/RAG engineer with production deployments in highly regulated domains (Frost Bank and GE Healthcare). Built secure, explainable document-grounded Q&A systems using LoRA fine-tuning, strict RAG with confidence thresholds, and citation-based responses; also established evaluation/monitoring (golden QA sets, hallucination tracking, drift) and achieved ~40% latency reduction through retrieval/prompt tuning.”
Senior Healthcare Operations Leader specializing in value-based care and payer-provider partnerships
“Healthcare operator/strategic program lead currently at Time Care working directly with national health insurance payer partners to translate executive priorities into operational initiatives (e.g., patient marketing, data interoperability, in-home member access). Previously led the build-out and day-to-day execution of a new value-based performance risk division at Stellar Health, combining metrics-driven operations with long-term strategy and executive alignment.”
Mid-level Data Analyst specializing in healthcare and finance analytics
“Built an end-to-end Alexa smart-home IoT application controlling a Wi-Fi bulb, including ESP32 firmware (MQTT) and an AWS serverless backend (IoT Core/Device Shadow, Lambda, DynamoDB) with a REST API. Demonstrates strong real-time scalability patterns (streaming ingestion, stateless processing, partition-key design) and full-stack delivery with Spring Boot + React (JWT auth, CORS, data-heavy dashboards).”
Mid-level Data Scientist specializing in MLOps, LLM/RAG applications, and deep learning
“Built and deployed a production compliance automation RAG system (at Citi) that generates citation-backed, schema-validated risk summaries for regulatory document review. Emphasizes regulated-environment reliability with retrieval-only grounding, abstention, confidence thresholds, and immutable audit logging, plus orchestration using LangChain/LangGraph and Airflow. Reported ~60% reduction in compliance review effort while maintaining high precision and traceability.”
Mid-level Data Analyst specializing in financial risk and healthcare analytics
“AI/ML engineer focused on real-time, production-grade LLM systems, with a robotics-adjacent mindset around latency/accuracy tradeoffs and modular pipelines. Built a scalable RAG-based assistant orchestrated as microservices on Kubernetes with Kafka async messaging, ONNX/quantization optimizations, and Prometheus/Grafana monitoring, citing a ~35% hallucination reduction; has also experimented with ROS Noetic and Gazebo simulation.”
Mid-level Data Engineer specializing in cloud lakehouse and streaming platforms
“Data engineer focused on building production-grade pipelines on AWS (Kafka/Kinesis/Glue/S3) through to curated serving layers in Snowflake and Delta Lake. Emphasizes automated data quality validation (PySpark + CI/CD), modular dbt transformations for analytics (customer spending, risk metrics), and operational reliability with CloudWatch and DLQs; data consumed by BI tools and ML pipelines for fraud detection and risk analytics.”
Mid-level Data Engineer specializing in multi-cloud real-time and batch data pipelines
“Data engineer with healthcare domain experience who owned 100M+ record pipelines end-to-end (Kafka/Kinesis/ADF → PySpark/dbt validation → Spark SQL transforms → Snowflake/Power BI serving). Built production-grade reliability practices (Airflow orchestration, CloudWatch/Grafana monitoring, pytest + contract/regression tests, idempotent ingestion/backfills) and delivered measurable improvements: 35% lower latency and 40% better query performance.”
Junior Data Analyst specializing in financial and operational analytics
“Analytics professional with experience at KPMG turning messy operational and financial data from SQL Server and AWS S3 into clean reporting datasets and automated Python workflows. Combines SQL, Python, Power BI, and experimentation methods to deliver stakeholder-aligned KPI dashboards and marketing performance insights, with a strong focus on data integrity and reproducibility.”
Principal Software Engineer/Consultant specializing in cloud, geospatial, and enterprise platforms
“Runs two lean real estate companies remotely by building local on-the-ground contact networks and leveraging free-tier technology to keep total annual business costs under $100. Brings a cost-elimination and MVP/validation-first mindset, preferring to join an established company unless a clearly viable business idea emerges.”
Senior Python Developer specializing in data engineering, MLOps, and cloud platforms
“Backend/data engineer with production experience building secure Django/DRF APIs (JWT RS256 + rotating refresh tokens), background processing with Celery, and strong reliability practices (timeouts, retries/backoff, structured logging, audit trails). Has delivered AWS solutions spanning Lambda + ECS with IaC/CI-CD and built Glue/PySpark ETL pipelines with schema evolution and data-quality quarantine patterns; also modernized a legacy SAS pipeline to Python/PySpark with parallel-run parity validation and phased rollout.”