Pre-screened and vetted.
Mid-level Data Engineer specializing in AWS, Snowflake, Databricks, and PySpark
Senior Unity Developer specializing in AI/LLM systems and multiplayer VR
“Backend/data engineer focused on AWS-native Python systems: built a FastAPI microservice on ECS/Fargate serving real-time analytics at millions of requests per day, with strong reliability and security controls (OAuth2/JWT, retries/timeouts, correlation IDs) and autoscaling. Also delivered Glue/PySpark ETL pipelines to curated S3 Parquet/Athena with schema evolution and data quality controls, owned Airflow pipeline incidents, and has a track record of measurable performance and cost optimizations (e.g., ~80% query-latency reduction; reduced logging/NAT/Fargate spend).”
Mid-level Data Scientist specializing in Generative AI and MLOps
“GenAI/LLM engineer with production experience at Allstate building an end-to-end document intelligence workflow for insurance operations—automating document intake, classification, and risk signal extraction. Emphasizes high-reliability design for regulated/high-stakes outputs using schema enforcement, confidence thresholds, validation rules, and human-in-the-loop routing, with metric-driven offline evaluation and production monitoring.”
Senior Backend Python Engineer specializing in cloud-native APIs and data platforms
Director-level Talent Analytics & Recruiting Strategy leader specializing in workforce optimization
Principal Cloud & Data Architect specializing in AI-enabled AWS platforms
Senior Data Engineer specializing in AWS-based data pipelines and multi-tenant SaaS
Senior Python Developer specializing in AWS, microservices, and data pipelines
“Backend/data engineer with strong AWS production experience spanning serverless APIs and containerized workers (Lambda, API Gateway, ECS) plus data pipelines (Glue, S3, Athena/Redshift). Has modernized legacy SAS/cron batch systems into Python/AWS with parallel-run parity validation and low-risk cutovers, and has owned ETL incidents end-to-end (CloudWatch detection, backfills, and preventative controls). Targeting $130k–$150k base and strongly prefers remote, with occasional Bethesda onsite acceptable.”
Executive CTO and VP Engineering leader specializing in SaaS, AI, and cloud platforms
“Repeat founder/CTO with hands-on experience raising capital from friends and family, angels, corporate investors, private equity, and venture capital, as well as securing federal grants. Built a startup in a software business incubator, later sold the company, and went on to serve as an Engineering Manager at the acquirer inside the Plug and Play accelerator ecosystem.”
Mid-level Data Scientist specializing in credit risk, fraud detection, and ESG analytics
“AI/LLM practitioner who has deployed production chatbots across e-commerce, HRMS, and real estate, focusing on retrieval-first workflows for factual tasks like product and property search. Optimized intent understanding and significantly improved latency by using lightweight embeddings and tuning the inference pipeline on Groq (Llama 3.3), while applying modular orchestration and measurable production evaluation.”
Mid-level AI/ML Engineer specializing in LLMs, NLP, and AWS MLOps
“Recent master’s graduate in robotics with applied experience across reinforcement learning and ROS 2 autonomy stacks. Built an RL-based drone vertiport traffic controller (PPO) focused on reward design and simulation integration, and has hands-on navigation work in ROS 2 including LiDAR preprocessing, SLAM/path planning, and stabilizing TurtleBot3 wall-following. Also brings deployment experience containerizing robotics nodes and scaling them with Kubernetes on AWS.”
“Built and operated end-to-end legal-document data pipelines fed by hundreds of scraper sources, emphasizing data quality validation, reliability (CloudWatch monitoring/alerting, retries, backfills), and serving enriched legal data via serverless AWS APIs (Lambda/API Gateway). Experienced in keeping API contracts stable with additive versioning practices and shipping MVPs quickly with CI/CD and observability in place.”
Mid-level Data Engineer specializing in cloud data pipelines and big data platforms
“Data engineer with ~4 years of experience building Python-based data ingestion/processing services and real-time streaming pipelines (Kafka/PubSub + Spark Structured Streaming). Has deployed containerized data applications on Kubernetes with GitLab CI/Jenkins pipelines and applied GitOps to cut deployment time ~40% while reducing config drift. Also supported migrating a legacy on-prem data warehouse and backend to GCP, using a phased approach with parallel validation to meet strict reliability/SLA needs.”
Mid-level Machine Learning Engineer specializing in LLMs, Generative AI, and MLOps
Mid-level Data Analyst specializing in BI, ETL, and cloud analytics
Senior Performance Marketing & Marketing Operations Manager specializing in automation and analytics
Junior Data Engineer specializing in cloud ETL/ELT and lakehouse platforms
Mid-level Generative AI & ML Engineer specializing in production LLM and RAG systems
“AI/ML engineer who shipped a production blood-test report understanding and personalized supplement recommendation product, using a LangGraph multi-agent pipeline on AWS serverless with OCR via Bedrock and RAG over vetted clinical research. Also built end-to-end recommender system pipelines at ASANTe using Airflow (ingestion, embeddings/features, training, registry, batch scoring/monitoring) with KPI reporting to Tableau, with a strong focus on safety, evaluation, and measurable reliability.”
Mid-level Business Analyst specializing in data analytics and BI
“Healthcare analytics professional with hands-on experience turning messy claims, eligibility, and utilization data into validated BI-ready models using SQL and Python. Combines strong data engineering and KPI design skills with stakeholder-facing delivery, including Power BI prototyping, retention metric operationalization, and analyses that supported care management interventions and cost-control decisions.”
Senior Frontend Engineer specializing in React/Next.js and large-scale media platforms
“Backend/data engineer with hands-on AWS serverless and data platform experience: built a Lambda-based reCAPTCHA/token validation + SES subscription workflow and designed Glue ETL pipelines producing Parquet datasets for Athena. Also led a high-traffic multilingual media CMS modernization at Radio Free Asia by migrating to Arc XP using parallel runs, phased rollouts, feature flags, and rollback plans.”
“At Liberty Mutual, built a production underwriting decision assistant combining LLM reasoning with quantitative models and strong auditability. Implemented a claims-based response verification pipeline that cut hallucinations from 18% to 3% and materially improved user trust/validation scores. Experienced orchestrating ML/LLM workflows end-to-end with Airflow, Kubeflow Pipelines, and Jenkins, including SLA-focused pipeline hardening.”