Pre-screened and vetted in California.
Mid-level Data Engineer specializing in ML, OCR, and cloud-native pipelines
Mid-level Data Engineer specializing in ML-driven pipelines and cloud microservices
Principal Data Engineer specializing in petabyte-scale Spark pipelines on GCP
Mid-level AI/ML Engineer specializing in MLOps, real-time data platforms, and generative AI
Intern Data Analyst / Data Developer specializing in retail analytics and healthcare data science
Senior AI/ML Engineer specializing in Generative AI and MLOps
Junior Data & Analytics Analyst specializing in data engineering and customer insights
Mid-level AI & Full-Stack Engineer specializing in data engineering and real-time streaming
Executive AI Product & Controls Engineering Leader specializing in agentic video editing and EV systems
“Startup builder (MagicSeven) who designed and implemented a browser-based, agentic video editor end-to-end, including an event-driven multimodal LLM "indexing" pipeline on AWS and an orchestration LLM agent for searching and manipulating footage. Demonstrates deep video file/codec knowledge plus practical production hardening of LLM workflows (format validation, plan/execute separation, S3-based state for debuggability).”
Mid-level Data & Machine Learning Engineer specializing in production ML and data platforms
“Built and deployed a production LLM system that scraped Google Maps menu photos, extracted structured prices via OpenAI, and cross-validated them against website-scraped data to automate data-quality verification at scale (replacing costly manual contractor checks). Demonstrates strong reliability instincts—precision-first prompting, output gating with image-quality metadata, and fuzzy matching/RAG techniques—plus solid orchestration (Dagster/Airflow) and observability (Sentry, Prometheus/Grafana).”
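The cross-validation step in this profile hinges on fuzzy-matching extracted item names before comparing prices. A minimal stdlib sketch of that idea, not the candidate's actual pipeline — the function names, tuple format, and 0.85 threshold are illustrative assumptions:

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase and collapse whitespace so cosmetic differences don't block a match."""
    return " ".join(name.lower().split())

def match_items(menu_items, site_items, threshold=0.85):
    """Cross-validate menu prices extracted from photos against website-scraped data.

    menu_items / site_items: lists of (item_name, price) tuples.
    Returns (verified, flagged): items whose names fuzzy-match above the
    threshold AND whose prices agree are verified; everything else is flagged
    for review instead of being trusted (precision-first behavior).
    """
    verified, flagged = [], []
    for name, price in menu_items:
        # Find the best-scoring website item by name similarity.
        best = max(
            site_items,
            key=lambda s: SequenceMatcher(None, normalize(name), normalize(s[0])).ratio(),
            default=None,
        )
        if best is None:
            flagged.append((name, price, None))
            continue
        score = SequenceMatcher(None, normalize(name), normalize(best[0])).ratio()
        if score >= threshold and abs(price - best[1]) < 0.01:
            verified.append((name, price))
        else:
            flagged.append((name, price, best[0]))
    return verified, flagged
```

Flagging rather than auto-correcting on disagreement is the key design choice: mismatches go to a human queue, so only high-confidence pairs bypass the manual contractor check.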
Senior AI/ML Engineer specializing in Generative AI and RAG
“ML/NLP practitioner at Morf Health focused on unifying fragmented healthcare data by linking structured patient/encounter records with unstructured clinical notes. Has hands-on experience with transformer embeddings, vector databases, and domain fine-tuning, plus rigorous evaluation (precision/recall) and human-in-the-loop validation with clinical SMEs to make pipelines production-grade.”
Mid-level AI/ML Engineer specializing in MLOps and LLM-powered applications
“AI/ML engineer with production experience building a RAG-based internal analytics assistant (Databricks + ADF ingestion, Pinecone vector store, LangChain orchestration) deployed via Docker on AWS SageMaker with CI/CD and MLflow. Strong focus on real-world constraints: latency/cost optimization (LoRA fine-tuning, ~60% compute reduction), hallucination control with citation grounding, and enterprise security/governance. Previously at Intuit, delivered an interpretable churn-prediction system (PySpark/Databricks, Airflow/Azure ML) that improved retention targeting by ~12%.”
Mid-level Software/Data Engineer specializing in AI-driven data platforms and cloud ETL
Mid-level Data Engineer specializing in Azure, Databricks, and Lakehouse ETL
Mid-level Data Engineer specializing in cloud-native ETL and real-time data pipelines
Senior AWS Data Engineer specializing in data lakehouse, ETL/ELT, and streaming pipelines
Mid-level Data Engineer specializing in AI/ML data pipelines and lakehouse platforms
Mid-level AI/ML Engineer specializing in GenAI, MLOps, and anomaly detection
Junior Software Engineer specializing in machine learning and data science
“Python backend engineer who built a personal LLM-powered AI code review tool that parses code into context-preserving diff chunks and uses the OpenAI API to analyze and summarize changes. Has hands-on Kubernetes deployment experience (replicas, rolling updates, ConfigMaps/Secrets, health probes) and follows GitOps-style, declarative CI/CD workflows; also has experience designing streaming/event-style processing with attention to reliability and observability.”
Mid-level Solutions Architect/Engineer specializing in AI and data integrations
“Solutions Engineer specializing in taking LLM copilots from demo to production, with a strong emphasis on enterprise security (RBAC/OAuth), grounded RAG behavior (cite-or-refuse), and operational readiness (eval loops, logging, runbooks). Experienced in real-time diagnosis of agentic/LLM workflow failures and in partnering with Sales/CS to run integration-first POCs that clear security and reliability concerns and accelerate rollout.”
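A "cite-or-refuse" policy like the one named here can be reduced to a small gate between the LLM and the user: release an answer only if every citation resolves to a document actually retrieved for the query. A toy sketch under that assumption — the function name, signature, and refusal message are all illustrative:

```python
def cite_or_refuse(answer: str, citations: list, retrieved_ids: set) -> str:
    """Grounding gate for a RAG copilot: pass the answer through only when it
    cites at least one source and every cited id was actually retrieved;
    otherwise return a refusal instead of an ungrounded answer."""
    if citations and all(c in retrieved_ids for c in citations):
        return answer
    return "I can't answer that from the indexed documents."
```

In production this gate is also a natural logging point: every refusal is a labeled eval example for the feedback loop the profile mentions.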
Mid-level Data Engineer specializing in AWS cloud data platforms
“Data engineer with Charter Communications experience modernizing large-scale AWS data lake pipelines: ingesting S3 data, validating against legacy systems, transforming with PySpark/Spark SQL, and serving via Iceberg/Delta tables. Worked at 50M–300M record scale, delivered >99.5% data match, and built monitoring/alerting (CloudWatch/SNS) plus retry orchestration (Step Functions) and data quality gates (Great Expectations).”
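A data-quality gate of the kind this profile describes (>99.5% match against the legacy system before promoting a table) can be sketched in plain Python; the row format, key field, and threshold here are illustrative assumptions, not Charter's actual checks:

```python
def match_rate(legacy_rows, migrated_rows, key):
    """Fraction of legacy records that appear unchanged in the migrated table,
    joined on a unique key (e.g. a record id)."""
    migrated = {row[key]: row for row in migrated_rows}
    matched = sum(1 for row in legacy_rows if migrated.get(row[key]) == row)
    return matched / len(legacy_rows) if legacy_rows else 1.0

def quality_gate(legacy_rows, migrated_rows, key="id", threshold=0.995):
    """Block promotion of the new table unless the validated match rate clears
    the threshold; returns (passed, rate) so the caller can alert on failure
    (e.g. publish to SNS) or trigger a retry instead of shipping bad data."""
    rate = match_rate(legacy_rows, migrated_rows, key)
    return rate >= threshold, rate
```

At the 50M–300M record scale mentioned above, the same join-and-compare would run in Spark rather than in-memory Python, but the gate logic (compute a match rate, compare to a threshold, fail loudly) is unchanged.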