Pre-screened and vetted.
Senior Product Manager / Project Manager specializing in data platforms, BI, and cloud transformation
Mid-level Data Scientist & AI Engineer specializing in NLP, computer vision, and MLOps
Mid-level Software Engineer specializing in full-stack web and cloud applications
Senior Backend Python Engineer specializing in cloud-native APIs and data platforms
Mid-level Machine Learning Engineer specializing in healthcare and enterprise analytics
Mid-level Data Engineer specializing in AWS data platforms and streaming pipelines
Principal Cloud & Data Architect specializing in AI-enabled AWS platforms
Mid-level Data Scientist specializing in GenAI, RAG, and forecasting
“ML/NLP engineer focused on large-scale data linking for e-commerce-style catalogs and customer records, combining transformer embeddings (BERT/Sentence-BERT), NER, and FAISS-based vector search. Has delivered measurable lifts (e.g., +30% matching accuracy, Precision@10 62%→84%) and built production-grade, scalable pipelines in Airflow/PySpark with strong data quality and schema-drift handling.”
Mid-level GenAI Engineer specializing in RAG systems and AI agents
“LLM/agentic systems builder who has deployed production solutions for a resource management firm, using an MCP-driven architecture with Neo4j + Elasticsearch and a ChatGPT frontend to generate candidate/company ‘SmartPacks’ and answer entity Q&A. Also built a LangGraph/LangSmith-orchestrated multi-agent workflow that automates data-infra change requests end-to-end (impact analysis, SQL + tests, and PR creation), and delivered a ~60% latency reduction through TTL-based context caching while improving accuracy via a business data dictionary.”
Mid-level Data Engineer specializing in cloud data pipelines and big data platforms
“Data engineer with ~4 years of experience building Python-based data ingestion/processing services and real-time streaming pipelines (Kafka/PubSub + Spark Structured Streaming). Has deployed containerized data applications on Kubernetes with GitLab CI/Jenkins pipelines and applied GitOps to cut deployment time by ~40% while reducing config drift. Also supported a legacy on-prem data warehouse/backend migration to GCP using phased migration and parallel validation to meet strict reliability/SLA needs.”
Mid-level Data Engineer specializing in cloud data pipelines and analytics engineering
“Built and deployed a production LLM-powered demand and churn forecasting system for an e-commerce client, combining open-source LLMs (LLaMA/Mistral) and Sentence-BERT embeddings to generate business-friendly explanations of forecast drivers. Strong focus on data quality and model trust (validation, baselines, segmented monitoring) and production reliability via Airflow-orchestrated pipelines with readiness checks, retries, and ongoing drift monitoring and A/B testing.”
Mid-level Data Scientist specializing in ML, data engineering, and real-time analytics
Mid-level MLOps/ML Engineer specializing in LLMs and financial risk modeling
Mid-level Data Scientist specializing in Generative AI, RAG systems, and MLOps
Mid-level Full-Stack Developer specializing in automation and AI pipelines
Senior Data Scientist / AI-ML Engineer specializing in LLMs, NLP, and MLOps
Mid-level Machine Learning Engineer specializing in healthcare and financial AI
Junior Data Engineer specializing in cloud ETL/ELT and lakehouse platforms
Mid-level Software Engineer specializing in full-stack web and data engineering
“Backend/ML engineer who has built both enterprise data pipelines and real-time AI products: modular Python (Flask/FastAPI) services integrating automation scripts, low-latency ML inference (MediaPipe, PyTorch), and OpenAI-powered feedback. Demonstrated measurable performance wins (~30% faster HR workflows; ~40% faster AWS pipelines across 100+ Oscar Health feeds) and strong multi-tenant/data-isolation patterns (schema-based isolation, RBAC, microservices).”
Mid-level Business Analyst specializing in data analytics and BI
“Healthcare analytics professional with hands-on experience turning messy claims, eligibility, and utilization data into validated BI-ready models using SQL and Python. Combines strong data engineering and KPI design skills with stakeholder-facing delivery, including Power BI prototyping, retention-metric operationalization, and analyses that supported care management interventions and cost-control decisions.”