Junior Solutions Engineer / Full-Stack Engineer specializing in AI-native SaaS and APIs
“Worked at easybee ai building a production-grade ‘voice of the customer’ LLM intake agent, hardening a fragile sandbox prototype with JSON-schema-constrained outputs, Python/FastAPI validation middleware, and automated retries. Strong in real-time debugging of agentic workflows (snapshot isolation, modular tracing) and in implementing safety/compliance guardrails, such as content-moderation middleware, to support enterprise adoption.”
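As a hypothetical illustration of the pattern this profile describes (schema-constrained outputs validated in middleware, with automated retries), the core loop might look like the sketch below. The schema, the `fake_llm` stub, and all function names are assumptions for illustration, not the candidate's actual code:

```python
import json

# Hypothetical minimal schema for a "voice of the customer" intake record:
# required keys mapped to their expected Python types.
SCHEMA = {"customer_issue": str, "severity": str, "follow_up_needed": bool}

def validate(record: dict) -> bool:
    """Check that every required key is present with the expected type."""
    return all(isinstance(record.get(k), t) for k, t in SCHEMA.items())

def fake_llm(prompt: str, attempt: int) -> str:
    """Stand-in for the model call: malformed output on the first attempt."""
    if attempt == 0:
        return '{"customer_issue": "late delivery"}'  # missing required fields
    return ('{"customer_issue": "late delivery", '
            '"severity": "high", "follow_up_needed": true}')

def constrained_intake(prompt: str, max_retries: int = 3) -> dict:
    """Call the model, validate against the schema, and retry on failure."""
    for attempt in range(max_retries):
        try:
            record = json.loads(fake_llm(prompt, attempt))
        except json.JSONDecodeError:
            continue  # unparseable output: retry
        if validate(record):
            return record  # schema-valid: accept
    raise ValueError("model never produced a schema-valid record")
```

In a real deployment the validation step would typically sit in FastAPI middleware and the schema would be a full JSON Schema document rather than a flat type map.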
Mid-level Data Engineer specializing in cloud ETL/ELT and healthcare analytics
“Healthcare-focused data engineer/ML practitioner with experience at Lightbeam Health Solutions and Humana building production entity-resolution and semantic similarity pipelines across EMR, lab, and claims data. Uses NLP/ML (spaCy, scikit-learn, BioBERT/LightGBM) plus Snowflake/Airflow and vector search (Pinecone) to improve linkage accuracy (reported 90%) and semantic match quality (reported +12–15%), while reducing manual cleanup by 40%+.”
Mid-level QA Automation Engineer / SDET specializing in Financial Services and Healthcare IT
“QA automation engineer with end-to-end ownership of a loan-processing automation suite spanning UI, API, and database validations (Selenium/Playwright/TestNG/REST Assured; Java/Python). Caught and prevented high-impact financial defects (e.g., risk-calculation rounding errors) through CI-driven nightly regressions and API-to-DB checks, and has implemented maintainable Cypress patterns with flake reduction plus GitLab CI gating and Allure reporting.”
Mid-Level Full-Stack Software Engineer specializing in cloud-native microservices and DevSecOps
“Backend-leaning product engineer with DevSecOps depth who has shipped real-time, Kafka-driven data pipelines and AI-enabled customer-facing features to production on AWS. Built a Spring Boot API layer serving real-time predictions at 100K+ requests/day, reducing latency by 35% and improving user task completion by ~25%, and delivered a React/TypeScript dashboard plus a Postgres audit/history model optimized for search and large event volumes.”
Intern Software Engineer specializing in backend, cloud data platforms, and microservices
“Full-stack engineer who shipped a group scheduling SaaS feature with live availability updates using Next.js App Router + TypeScript, owning production reliability after launch (auth debugging, monitoring, polling/backoff tuning). Has hands-on experience with Postgres schema/index design and query optimization (EXPLAIN ANALYZE) and building durable orchestrated backend workflows with retries and idempotency.”
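The "durable orchestrated backend workflows with retries and idempotency" mentioned above can be sketched roughly as follows. This is a minimal in-memory illustration with assumed names (`run_step`, the `_completed` ledger); a production version would persist the ledger in a database:

```python
import time

# Hypothetical in-memory ledger of completed steps, keyed by idempotency key.
# In production this would live in durable storage (e.g. a Postgres table).
_completed: dict = {}

def run_step(key: str, fn, max_attempts: int = 3, backoff: float = 0.01):
    """Run a workflow step at most once per idempotency key, retrying
    transient failures with exponential backoff."""
    if key in _completed:            # idempotency: finished work is skipped
        return _completed[key]
    delay = backoff
    for attempt in range(1, max_attempts + 1):
        try:
            result = fn()
        except Exception:
            if attempt == max_attempts:
                raise                # out of retries: surface the failure
            time.sleep(delay)
            delay *= 2               # exponential backoff between attempts
        else:
            _completed[key] = result  # record success so re-runs are no-ops
            return result
```

The idempotency key makes the workflow safe to re-run after a crash: steps that already succeeded return their cached result instead of executing again.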
Senior AI/ML Engineer specializing in Generative AI, RAG, and agentic systems
“GenAI/LLM ML engineer (currently at Webprobo) building an enterprise GenAI platform with document intelligence and automation on AWS, with blockchain components. Has hands-on experience with RAG, LLM evaluation tooling, and orchestrating production LLM workflows with Apache Airflow, plus deep exposure to reliability challenges in globally distributed/edge deployments. Also partnered with business/marketing stakeholders at a banking client to deliver an AI-driven customer retention insights solution.”
Senior Data Analyst specializing in data pipelines, web scraping, and legal data enrichment
“Data engineer focused on reliable, scalable analytics pipelines and external data collection. Has owned end-to-end pipelines processing 5–10M records/day, serving Snowflake data marts to Power BI/Tableau, and reports ~99% reliability through strong validation/monitoring. Also shipped versioned REST APIs for curated data with query optimization and caching.”
Executive Technology Leader/CTO specializing in data platforms, AI agents, and e-commerce/payments
“Engineering leader with hands-on coding time who has driven major commerce and data-platform transformations: defined goop’s omnichannel strategy, unified payments on Square, and rebuilt real-time NetSuite inventory flows plus forecasting tools. Recently reorganized engineering into Product/Data/Support teams to hit aggressive seasonal roadmaps, and led a data-lake/medallion ELT refactor feeding embedded analytics (Tinybird) with improved reliability and cost efficiency; also accelerates onboarding via AI coding tools in a serverless, event-driven architecture.”
Mid-level Generative AI & Machine Learning Engineer specializing in agentic LLM systems
“Built and deployed a production agentic LLM knowledge assistant that answers complex questions over internal documents, APIs, and databases using a RAG architecture (FAISS/Pinecone) and LangChain/LangGraph orchestration. Emphasizes production-grade reliability and hallucination control through grounding, confidence thresholds, validation, retries/fallbacks, and full observability (logging/metrics/traces) with continuous evaluation and feedback loops.”
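The hallucination controls this profile highlights (grounding, confidence thresholds, fallbacks) follow a common shape, sketched below with assumed names and stubbed retrieval/generation callables; the real system's FAISS/Pinecone and LangChain/LangGraph wiring would replace the stubs:

```python
def answer_with_grounding(question, retrieve, generate, min_score=0.75):
    """Answer only when retrieval confidence clears a threshold; otherwise
    return an explicit fallback instead of risking a hallucinated answer.

    retrieve(question) -> list of (score, chunk_text) pairs
    generate(question, chunks) -> answer string grounded in the chunks
    """
    chunks = retrieve(question)
    # Keep only chunks whose similarity score clears the threshold.
    grounded = [text for score, text in chunks if score >= min_score]
    if not grounded:
        # Confidence too low: fall back rather than generate ungrounded text.
        return {"answer": None, "fallback": "insufficient grounding"}
    return {"answer": generate(question, grounded), "sources": grounded}
```

The observability piece would then log the scores, the chosen branch, and the cited sources on every call, feeding the evaluation loop the profile describes.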
Intern Site Reliability Engineer specializing in Kubernetes, AWS, and observability
“Backend/data engineering candidate specializing in Python/Flask services and ML-enabled systems, deploying containerized workloads on AWS ECS/EKS with strong observability (Prometheus/Grafana) and PostgreSQL performance tuning. Built multi-tenant architectures with row- and schema-level isolation and optimized a Kubernetes-based Airflow + Spark nightly ETL pipeline for an e-commerce client, improving performance by 250%+ and reliably beating morning reporting deadlines; also contributed to Apache Airflow (SQLAlchemy/PostgreSQL area).”
Junior Data Engineer specializing in Snowflake and investment data platforms
“Private markets/private credit data engineer owning core Snowflake/AWS data infrastructure (S3 → ActiveBatch → Snowflake) with automated iceDQ quality checks and curated datasets for internal Power BI/React reporting. Drove major reliability and delivery improvements, including cutting DB CI/CD deploy time 50% and reducing downstream table errors by 90%+, and also built an internal React/FastAPI app to visualize the team’s data infrastructure in an ambiguous early-stage environment.”
Mid-level Data Engineer specializing in big data pipelines and real-time streaming
“Data engineer who has owned end-to-end production pipelines processing a few million records/day, using Python/Airflow/SQL/PySpark with Snowflake serving to BI (Power BI). Built resilient external web data collection systems (anti-bot, schema-change detection, backfills) and shipped versioned REST APIs for internal consumers, improving pipeline success rates to 99% through monitoring, retries, and idempotent design.”
Mid-Level Data Engineer specializing in cloud data platforms and governed analytics
“Data engineer with Optum experience building end-to-end healthcare data pipelines for HL7/FHIR, processing millions of records daily across Kafka streaming and Databricks/Spark batch. Strong focus on data quality (schema enforcement/validations), reliability (Airflow monitoring/alerts), and analytics-ready serving in Snowflake powering Power BI/Tableau, with CI/CD via Git and Jenkins.”
Junior Software Engineer specializing in LLMs, ML, and full-stack development
“Built and shipped a production LLM-driven data harmonization/record-matching pipeline for pharmaceutical datasets, combining normalization, embeddings/vector search, and an LLM validation step. Emphasizes production reliability via guardrails, confidence thresholds, idempotent/retryable stages, and human-in-the-loop fallbacks, with monitoring focused on manual review and error rates to reduce false positives.”
Mid-level Cloud Data Engineer specializing in Azure/AWS pipelines and medallion architecture
“Data engineer focused on reliability and data quality, owning end-to-end pipelines processing ~100k–300k records/day. Implemented robust validation and monitoring that cut reporting issues by ~30%, and built stable external data collection with anti-bot measures, backfills, and schema-change detection while maintaining backward-compatible internal data services.”
Senior Data Analyst specializing in marketing, BI, and financial analytics
“Marketing analytics candidate with experience at WPP and on a global Coca-Cola campaign, focused on turning messy multi-platform media data into trusted reporting and decision systems. They combine hands-on SQL/Python pipeline building with stakeholder KPI alignment, and cite a 22% improvement in media effectiveness plus faster budget reallocation through daily automated reporting.”
Mid-level Data Analyst specializing in business analytics and BI
“Analytics professional with higher education experience at the University of Dayton, focused on turning inconsistent operational data into standardized metrics and recurring dashboards. They combine SQL, Python, and Power BI to automate reporting, improve data integrity, and reduce manual reporting by 30%, with outputs adopted in semester planning and cross-department performance tracking.”
Mid-level Software Engineer specializing in backend systems and real-time analytics
“Full-stack engineer at BigCommerce who combines customer-facing deployment ownership with hands-on AI/LLM systems work. Built and launched merchant analytics and predictive inventory workflows using React, TypeScript, FastAPI, Kafka, AWS, and RAG-style architectures, and has real production experience debugging non-deterministic AI issues caused by data pipeline freshness and event-ordering problems.”
Junior Full-Stack Software Engineer specializing in AI, FinTech, and e-commerce
“Built both traditional internal tooling and LLM-powered systems during an internship, including a React/Python/AWS calculator onboarding platform and a production-style ROS2 RAG assistant over 10K+ documents. Stands out for combining full-stack delivery, stakeholder coordination, and practical AI reliability work like retrieval tuning, source-grounded answers, and low-confidence fallbacks.”
Mid-level Full-Stack Java Developer specializing in cloud-native enterprise systems
“Backend/full-stack engineer with Blue Cross Blue Shield experience building a reactive, event-driven claims processing microservice platform on AWS (ECS, SNS/SQS) with Terraform-based IaC and strong observability (Dynatrace/CloudWatch). Demonstrated measurable production impact (32% less downtime, 24% higher processing efficiency) and deep database performance/migration expertise across MongoDB and Postgres.”
Mid-level Full-Stack Python Developer specializing in cloud, data engineering, and AI/ML
“Full-stack Python developer who actively integrates AI coding assistants into day-to-day engineering work, including code generation, debugging, testing, and documentation. Has also coordinated multi-agent workflows across backend, frontend, testing, and code review, showing an applied, productivity-focused approach to AI-enabled software delivery.”
Mid-level Software Engineer in Test specializing in AI and healthcare platforms
“QA/data pipeline engineer with hands-on AI product building experience, spanning enterprise AWS migration testing for the Belgian postal service and personal multi-agent systems in fintech and recruiting. Stands out for combining rigorous validation and production stability work with modern LLM orchestration, guardrails, and messy-document normalization workflows.”
Junior Data Scientist / Big Data Engineer specializing in ML, LLMs, and analytics platforms
“Backend/data platform engineer who led a major redesign of a hybrid streaming+batch analytics platform processing 10+ TB/day (Airflow/Hive/BigQuery) with strong data-quality automation. Also built a production RAG PDF assistant with concrete mitigations for hallucinations and prompt injection (re-ranking, grounding, verifier step) and has deep experience executing low-risk migrations (dual-write, blue-green, rapid rollback) and implementing JWT-based row-level security.”