Mid-level AI/ML Engineer specializing in LLMs, NLP, and AWS MLOps
“Recent master’s graduate in robotics with applied experience across reinforcement learning and ROS 2 autonomy stacks. Built an RL-based drone vertiport traffic controller (PPO) with a focus on reward design and simulation integration, and has hands-on ROS 2 navigation experience including LiDAR preprocessing, SLAM/path planning, and stabilizing a TurtleBot3 wall-follower. Also brings deployment experience containerizing robotics nodes and scaling them with Kubernetes on AWS.”
“Built and operated end-to-end legal-document data pipelines fed by hundreds of scraper sources, emphasizing data quality validation, reliability (CloudWatch monitoring/alerting, retries, backfills), and serving enriched legal data via serverless AWS APIs (Lambda/API Gateway). Experienced in keeping API contracts stable with additive versioning practices and shipping MVPs quickly with CI/CD and observability in place.”
Mid-level Data Engineer specializing in cloud data pipelines and big data platforms
“Data engineer with ~4 years of experience building Python-based data ingestion/processing services and real-time streaming pipelines (Kafka/Pub/Sub + Spark Structured Streaming). Has deployed containerized data applications on Kubernetes with GitLab CI/Jenkins pipelines and applied GitOps practices to cut deployment time by ~40% while reducing config drift. Also supported the migration of a legacy on-prem data warehouse/backend to GCP, using a phased approach with parallel validation to meet strict reliability/SLA requirements.”
Mid-level Data Engineer specializing in cloud data pipelines and analytics engineering
“Built and deployed a production LLM-powered demand and churn forecasting system for an e-commerce client, combining open-source LLMs (LLaMA/Mistral) with Sentence-BERT embeddings to generate business-friendly explanations of forecast drivers. Strong focus on data quality and model trust (validation, baselines, segmented monitoring) and on production reliability via Airflow-orchestrated pipelines with readiness checks, retries, and ongoing drift monitoring and A/B testing.”
Mid-level Data Scientist specializing in Generative AI, RAG systems, and MLOps
Senior Software Engineer specializing in AWS serverless, APIs, and data/ETL platforms
Mid-level AI/ML Data Engineer specializing in secure ML pipelines and AI governance
Mid-level Software Engineer specializing in AI and cloud-native data platforms
Senior Full-Stack Software Engineer specializing in AWS, .NET, and data/telemetry platforms
Mid-level Data Engineer specializing in FinTech and AI-ready data platforms
Mid-level Machine Learning Engineer specializing in LLMs, Generative AI, and MLOps
Senior Data Scientist and Machine Learning Researcher specializing in NLP, LLMs, and MLOps
Mid-level Data Analyst specializing in BI, ETL, and cloud analytics
Executive AI Architect specializing in low-power edge/embedded AI systems
Junior Data Engineer specializing in cloud ETL/ELT and lakehouse platforms
Senior Full-Stack Engineer specializing in Python, cloud-native microservices, and APIs
Mid-level Generative AI & ML Engineer specializing in production LLM and RAG systems
“AI/ML engineer who shipped a production product for blood-test report interpretation and personalized supplement recommendations, built on a LangGraph multi-agent pipeline running on AWS serverless, with OCR via Bedrock and RAG over vetted clinical research. Also built end-to-end recommender-system pipelines at ASANTe using Airflow (ingestion, embeddings/features, training, registry, batch scoring/monitoring) with KPI reporting to Tableau, and a strong focus on safety, evaluation, and measurable reliability.”
Mid-level Full-Stack Engineer specializing in cloud-native, event-driven data platforms
“Backend/data engineer with hands-on production experience building Python (FastAPI/Flask) data enrichment services secured with Okta OAuth2 and monitored via Splunk/Dynatrace. Has delivered AWS event-driven and data-migration solutions (Lambda + Kafka on EKS; Glue from on-prem Oracle to an S3 data lake) and modernized Informatica match/merge logic into cloud services using parallel-run parity validation and stakeholder sign-off.”
Mid-level Business Analyst specializing in data analytics and BI
“Healthcare analytics professional with hands-on experience turning messy claims, eligibility, and utilization data into validated, BI-ready models using SQL and Python. Combines strong data engineering and KPI-design skills with stakeholder-facing delivery, including Power BI prototyping, operationalizing retention metrics, and analyses that supported care-management interventions and cost-control decisions.”
Mid-level Full-Stack Software Engineer specializing in Java/Spring, React, and AWS
“Backend/data engineer with production experience across event-driven Python ingestion services on AWS (EventBridge/SQS/MongoDB), serverless APIs (Lambda/API Gateway), and analytics ETL (Glue → Redshift). Has modernized legacy reporting into Node.js/React systems and demonstrated measurable SQL performance wins (minutes to seconds) plus strong incident ownership with validation, DLQs, and alerting.”
Junior Machine Learning Engineer specializing in NLP, data pipelines, and LLM workflows
“Built and shipped a production LLM-powered decision system that replaced a slow, inconsistent manual review process by turning messy text into structured, auditable outputs behind an API. Demonstrates strong end-to-end ownership of reliability and operations (schema validation, retries/fallbacks, latency/cost controls, monitoring for drift) and a disciplined approach to evaluation and regression testing. Experienced collaborating with non-technical reviewers to define success criteria and deliver interpretable outputs that get adopted.”
Mid-level Full-Stack Engineer specializing in data automation, cloud, and AI
“JavaScript engineer who maintains an internal, open-source-style React/Node.js shared library used by multiple teams, owning API stability, semantic versioning, CI/testing, logging, and documentation. Demonstrates strong cross-team debugging and change-management skills (schema-driven refactors, feature flags, validation layers) to ship new features without breaking existing workflows, plus a profiling- and benchmarking-driven approach to performance.”