Pre-screened and vetted in California.
Senior AWS Data Engineer specializing in data lakehouse, ETL/ELT, and streaming pipelines
Mid-level Data Scientist specializing in ML, GenAI, and data engineering for finance and PropTech
Intern Data Scientist specializing in LLMs, RAG, and computer vision
Mid-level Data Engineer specializing in AI/ML data pipelines and lakehouse platforms
Mid-level Business Systems Analyst specializing in enterprise SaaS, HRIS, and ERP implementations
Mid-level Data Scientist specializing in real-time fraud detection and MLOps
“ML/NLP engineer with experience at Charles Schwab building an NLP + graph (Neo4j) entity-resolution system to unify fragmented user/device/transaction data and improve downstream model quality and analyst querying. Has applied embeddings (SentenceTransformers + FAISS) with domain fine-tuning to boost hard-case matching recall by ~12% while maintaining precision, and has a track record of hardening scalable Python/Spark pipelines and productionizing fraud models via A/B tests and shadow-mode monitoring.”
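The embedding-plus-nearest-neighbour matching described in this profile can be sketched without any ML dependencies. Below, a toy character-trigram `embed` stands in for a SentenceTransformers encoder, and a linear scan stands in for a FAISS index; all names, strings, and the 0.5 threshold are illustrative, not the candidate's actual system.

```python
import math

# Toy stand-in for a SentenceTransformers encoder: the real system would
# call model.encode(text) to get a dense vector; character-trigram counts
# are enough to demonstrate the matching logic without ML dependencies.
def embed(text: str) -> dict[str, int]:
    text = text.lower()
    vec: dict[str, int] = {}
    for i in range(len(text) - 2):
        gram = text[i:i + 3]
        vec[gram] = vec.get(gram, 0) + 1
    return vec

def cosine(a: dict[str, int], b: dict[str, int]) -> float:
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def best_match(query: str, candidates: list[str], threshold: float = 0.5):
    # A FAISS index would answer this nearest-neighbour query over millions
    # of vectors; a linear scan shows the same precision/recall trade-off:
    # raise the threshold for precision, lower it for recall.
    q = embed(query)
    score, match = max((cosine(q, embed(c)), c) for c in candidates)
    return match if score >= threshold else None
```

Entity resolution then reduces to calling `best_match` on each incoming record against the canonical set; records below the threshold are left unmatched rather than force-linked.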
Junior Product Manager / APM specializing in data tools, CMS platforms, and AI-enabled products
“Data Software Tools Analyst who supported Q.ai through rapid growth and a $2B Apple acquisition, building an internal CMS for participant/PII workflows using Next.js (App Router) + FastAPI/Postgres with strong security controls (JWT + Postgres RLS). Also drove a major frontend architecture shift toward React Server Components, reporting ~4x faster page loads, and has experience building durable realtime collaboration systems with Supabase/SvelteKit and server-centric state management.”
Senior AI/ML Engineer and Data Scientist specializing in Generative AI and MLOps
“ML/NLP practitioner focused on financial-services document intelligence and compliance workflows—built an end-to-end pipeline to classify documents and extract financial entities from loan applications, emails, and statements stored in S3/internal databases. Strong in entity resolution/record linkage and in productionizing pipelines with GitHub Actions CI/CD, testing, data validation, and Docker, plus semantic search using OpenAI embeddings and a vector database.”
Principal Data Scientist specializing in Generative AI, NLP, and MLOps
“ML/NLP practitioner with banking experience (M&T Bank) who has built a GPT-4 RAG system using LangChain and Pinecone to connect unstructured customer data with internal knowledge bases, improving accuracy and reducing manual lookup time by 50%+. Strong in entity resolution and productionizing scalable Python data workflows, including major performance wins by migrating bottleneck joins from Pandas to Dask.”
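The Pandas-to-Dask migration this profile mentions works because Dask hash-partitions a large merge so each piece fits in memory. A minimal pure-Python sketch of that idea (the row data and key name are invented; Dask's shuffle join applies the same routing across workers):

```python
from collections import defaultdict

def hash_partition(rows, key, n_parts):
    # Route each row to a partition by hashing its join key, so rows with
    # matching keys from both tables always land in the same partition.
    parts = [[] for _ in range(n_parts)]
    for row in rows:
        parts[hash(row[key]) % n_parts].append(row)
    return parts

def partitioned_join(left, right, key, n_parts=4):
    # Join partition by partition instead of building one giant hash table,
    # which is how a Dask-style merge keeps peak memory bounded.
    out = []
    for lpart, rpart in zip(hash_partition(left, key, n_parts),
                            hash_partition(right, key, n_parts)):
        index = defaultdict(list)
        for r in rpart:
            index[r[key]].append(r)
        for l in lpart:
            for r in index[l[key]]:
                out.append({**l, **r})
    return out
```

The bottleneck in a single-process Pandas join is the one monolithic hash table; splitting it into `n_parts` independent joins is what makes the workload spill-friendly and parallelizable.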
Junior Full-Stack Developer specializing in Django/React and data-driven product support
Mid-level Data Scientist specializing in ML, RAG chatbots, and analytics
Mid-level Software/Data Engineer specializing in FinTech and real-time data platforms
Mid-level AI/ML Engineer & Data Scientist specializing in MLOps, LLMs, and anomaly detection
Mid-level Data Engineer specializing in cloud data platforms and real-time streaming
Mid-level Data Scientist specializing in NLP, deep learning, and compliance automation
Mid-level AI/ML & Data Engineer specializing in MLOps and Generative AI
Junior Data Engineer specializing in Azure, ETL, and applied ML
Principal Software Engineer specializing in cloud platforms and data analytics
Mid-level Data Scientist specializing in production ML systems, GenAI, and AWS
Senior Data Engineer specializing in forecasting, analytics platforms, and BI
Intern Data Analyst specializing in data pipelines and LLM/RAG applications
“Built and deployed LLM-powered analytics and reporting systems, including a RAG-based assistant over Snowflake that let business users ask questions in plain English instead of writing SQL. Experienced in orchestrating LLM agents (LangChain) and serverless reporting pipelines (AWS Lambda/S3/RDS), with a strong focus on grounded outputs, monitoring/evaluation, and data quality—used daily by non-technical finance and operations teams at Cigna.”
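A RAG assistant like the one above retrieves relevant warehouse context before any generation step, and grounds the prompt in only that context. A dependency-free sketch of the retrieval-and-grounding half (snippet texts are invented; a real system would embed Snowflake schema/docs and pass the retrieved context to an LLM):

```python
def retrieve(question: str, snippets: list[str], k: int = 1) -> list[str]:
    # Score each snippet by word overlap with the question; a production
    # RAG system would rank by embedding similarity instead.
    q_words = set(question.lower().split())
    scored = sorted(snippets,
                    key=lambda s: len(q_words & set(s.lower().split())),
                    reverse=True)
    return scored[:k]

def grounded_prompt(question: str, snippets: list[str]) -> str:
    # Only retrieved context enters the prompt, which is what keeps the
    # assistant's answers grounded in actual warehouse metadata.
    context = "\n".join(retrieve(question, snippets))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

Monitoring and evaluation then amount to logging which snippets were retrieved for each question and spot-checking that answers cite only that context.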
Junior Data Scientist specializing in agentic AI and RAG pipelines
“LLM/agentic systems builder who shipped production workflows at Angel Flight West and Eureka AI, combining LangGraph + RAG (Postgres/pgvector) with strong observability (LangSmith/Langfuse). Delivered large operational gains (address lookup cut from 10 minutes to 60 seconds; accuracy to 92%) and has a track record of quickly stabilizing customer-critical pipelines (Pydantic-enforced JSON for ETL) while partnering with sales/ops to drive adoption.”
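“Pydantic-enforced JSON for ETL” means validating every record against a schema before it enters the pipeline, so a malformed upstream payload fails loudly instead of corrupting downstream tables. A stdlib-only sketch of the same pattern (the `Flight` fields are invented; in practice a Pydantic `BaseModel` with typed fields does this coercion and raises `ValidationError`):

```python
import json
from dataclasses import dataclass

@dataclass
class Flight:
    # Invented example record; a Pydantic BaseModel would declare the same
    # typed fields and perform the validation below automatically.
    flight_id: str
    origin: str
    passengers: int

def parse_record(raw: str) -> Flight:
    data = json.loads(raw)
    # Reject missing or mistyped fields up front, mirroring Pydantic's
    # ValidationError, so bad payloads never reach the load step.
    if not isinstance(data.get("flight_id"), str):
        raise ValueError("flight_id must be a string")
    if not isinstance(data.get("origin"), str):
        raise ValueError("origin must be a string")
    if not isinstance(data.get("passengers"), int):
        raise ValueError("passengers must be an integer")
    return Flight(data["flight_id"], data["origin"], data["passengers"])
```

Stabilizing a customer-critical pipeline this way is mostly about moving failures to the ingest boundary, where they are cheap to catch and attribute.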
Mid-level Data Engineer specializing in multi-cloud real-time data pipelines
“Data engineer with healthcare/clinical trial domain experience who owned a 100TB+/month AWS pipeline end-to-end (Glue/S3/Redshift/Airflow) and drove measurable outcomes (20% lower latency, 99.9% reliability, 40% less manual reporting). Also built production data services and API-based ingestion on GCP (Cloud Run/Functions/BigQuery) with strong validation, versioning, and safe migration practices, and launched an early-stage RAG solution (LangChain + GPT-4) for researchers.”