Pre-screened and vetted.
Mid-level QA Engineer specializing in AI/ML model validation and data quality
“ML practitioner with a QA background who has built end-to-end ML pipelines for a health risk prediction use case (lifestyle + demographics), emphasizing robustness through strict data validation, leakage prevention, and cross-validation. Collaborated with a dietitian to sanity-check predictions and refine feature interpretation for real-world practicality; has not yet deployed LLM/AI systems to production and has no hands-on orchestration framework experience but is willing to learn.”
Mid-level Machine Learning Engineer specializing in NLP, LLMs, and MLOps
“Built and deployed an LLM-powered financial/regulatory document analysis platform at State Street, combining fine-tuned transformer models with a RAG pipeline over internal knowledge bases. Owned the productionization stack (FastAPI, Docker, SageMaker, Terraform, CI/CD) plus monitoring for drift/latency/hallucinations, delivering ~40% faster analyst review and improved reliability through chunking/embeddings and grounding.”
Mid-level AI/ML Engineer specializing in Generative AI and data engineering
“IBM engineer who built and deployed a production RAG-based LLM assistant using LangChain/FAISS with a fine-tuned LLaMA model, served via FastAPI microservices on Kubernetes, achieving 99%+ uptime. Demonstrates strong practical expertise in reducing hallucinations (semantic chunking + metadata-driven retrieval) and managing latency, plus mature MLOps practices (Airflow/dbt pipelines, MLflow tracking, monitoring, A/B and shadow deployments) and effective collaboration with non-technical stakeholders.”
Mid-Level Game Designer specializing in live ops and content design
“Game/content designer with experience at EA and on a mobile title (Ultimate Fishing), owning PvP mode UX/gameplay improvements and building UE5 Blueprint systems (state machine + modular weapon/ability architecture) to support multi-developer teams. Also designed College Football 2025/2026 challenge content grounded in real football culture and research, repaired broken data-driven toolsets through spreadsheet formula work, and partnered with the economy team to rebalance rewards by rapidly shipping additional challenges.”
Mid-level Data Engineer specializing in cloud data platforms and scalable ETL pipelines
“Data engineer (~4 years) with full-stack delivery experience (Next.js App Router/TypeScript + React) building a real-time operations monitoring dashboard backed by Kafka and orchestrated data pipelines. Strong production focus: Airflow + CloudWatch monitoring, automated Python/SQL validation (99.5% accuracy), and CI/CD with Jenkins/Docker; has delivered measurable improvements in latency, pipeline reliability, and query performance (Postgres/Redshift).”
Mid-Level Full-Stack Software Engineer specializing in AI/ML and cloud-native systems
“At BondiTech, built and deployed customer-facing backend improvements for enterprise dashboards handling 1M+ records, redesigning a .NET/Entity Framework API with server-side pagination/filtering and feature-flagged rollout to cut latency from ~15s to ~2s. Experienced integrating customer systems into existing APIs, including stabilizing a legacy CRM sync by normalizing inconsistent IDs, handling strict rate limits with batching, and adding DLQs plus reconciliation reporting.”
Mid-level Data Engineer specializing in cloud ETL/ELT and lakehouse architecture
“Data engineer focused on sales/marketing analytics pipelines, owning ingestion from CRMs/ad platforms through warehouse serving and dashboards at ~hundreds of thousands of records/day. Built reliability-focused systems including dbt/SQL/Python data quality gates with alerting, a resilient web-scraping pipeline (retries/backoff, anti-bot tactics, schema-change detection, backfills), and a versioned internal REST API with caching and strong developer usability.”
Mid-level Data Engineer specializing in real-time streaming and cloud data platforms
“Data engineer with Wells Fargo experience owning an end-to-end lakehouse ETL pipeline on Databricks/Azure Data Factory, processing ~480GB daily and implementing robust data quality/reconciliation across 40+ tables to reach ~99.3% reliability. Strong in performance optimization (cut runtime 5.5h→3.8h), CI/CD and monitoring, and resilient external/API ingestion with retries, schema validation, and backfills.”
Mid-Level Software Engineer specializing in backend APIs and distributed systems
“JavaScript engineer with Walmart experience contributing to the Yup validation library—reproduced a nested-object validation bug, fixed merge logic, and added test coverage. Strong in systematic debugging/performance isolation (DevTools + timing logs), plus end-to-end ownership including documentation, monitoring, and issue triage.”
Mid-level AI/ML Engineer specializing in GenAI and financial risk & compliance analytics
“Built and deployed a production LLM-powered financial risk and compliance platform to reduce manual trade exception handling and speed up insights from regulatory documents. Implemented a LangChain multi-agent workflow with structured/unstructured data integration (Redshift + vector DB) and emphasized hallucination reduction for regulatory safety using Amazon Bedrock. Strong MLOps/orchestration background across Kubernetes, Airflow, Jenkins, and monitoring/testing with MLflow, Evidently AI, and PyTest.”
Senior Data Engineer specializing in Spark, Kafka, and Databricks Lakehouse platforms
“Data engineer at Fidelity who built and operated a real-time financial transactions lakehouse on AWS/Databricks, processing millions of records daily with Kafka streaming. Demonstrated strong reliability and data quality practices (watermarking, idempotent Delta writes, validation/reconciliation, observability) and delivered measurable improvements (~30% faster jobs and ~30% fewer data issues) while enabling trusted gold-layer analytics for downstream teams.”
Junior Robotics Data Engineer specializing in multi-sensor perception datasets
“Robotics software engineer focused on perception data pipelines and multi-robot coordination. Built ROS 2 (rclpy) nodes for synchronized RGB/ToF/pose processing and scaled a perception training data generation pipeline from single-object to multi-object while preserving backward compatibility. Also has strong DevOps experience deploying containerized APIs on Kubernetes with Kustomize and automated releases via GitHub Actions.”
Senior Full-Stack Developer specializing in Python, AWS serverless, and data workflows
“Backend/data engineer from ALDI Tech Hub who modernized legacy analytics (Excel/SAS) into production-grade Python services on AWS serverless (FastAPI on Lambda behind API Gateway with Step Functions). Strong in reliability and operations (Cognito auth, retries/timeouts, structured logging, CloudWatch alarms) and data pipelines (Glue ETL with schema evolution); delivered measurable SQL tuning gains (30s to 2s, 70% CPU reduction).”
Mid-level Full-Stack Software Engineer specializing in AI and data applications
“Analytics-focused candidate with experience building SQL/Python pipelines and dashboards for donor, campaign, and website performance reporting. They have worked with messy multi-source data, standardized metric definitions, and delivered automated reporting that reportedly reduced manual effort by about 80%.”
Mid-level Data Analyst specializing in financial and healthcare analytics
“Analytics professional with experience at JPMorgan and Deloitte, focused on financial and risk data. They stand out for building scalable SQL/Python data pipelines, KPI and forecasting dashboards, and retention/cohort metrics that improved reporting reliability, forecast accuracy, and planning speed.”
Mid-level Data Analyst specializing in business intelligence and cloud data platforms
“Healthcare analytics professional with TCS/Humana experience turning messy claims and eligibility data into reliable reporting assets using SQL and Python. They combine strong data engineering and analytics execution with stakeholder management, including automating monthly claims reporting to cut turnaround from half a day to under 5 minutes and driving a provider outreach effort that reduced claim rejection rates by about 20%.”
Mid-level Business Analyst specializing in healthcare and enterprise technology
“Analytics professional with healthcare experience at United Health Group, focused on turning messy claims and transaction data into reliable reporting assets. They combine SQL, Python, and Power BI to automate analysis, define operational KPIs, and build dashboards that improved stakeholder visibility and helped reduce processing time by about 22%.”
Senior AI/ML Engineer specializing in Generative AI, NLP, and regulated industries
“Built end-to-end ML and GenAI systems at Northern Trust, including a production RAG-based document intelligence platform for financial reports and contracts. Stands out for combining strong MLOps execution with practical product judgment—improving forecast accuracy by 22%, document review accuracy by 38%, and cutting deployment time by 45% while keeping latency and reliability production-ready.”
Junior Software Engineer specializing in backend distributed systems
“Backend engineer in airport operations who built a highly customizable BFF-based system connecting airport staff workflows to a baggage sortation engine. Their architecture cut per-airport customization from 100-150 engineering hours to 1-5 hours, improved long-running operation performance by 45%, and shipped in 4 months instead of 6. They also explored AI-assisted backend customization with human validation and test-based safeguards.”
“Engineer with a thoughtful, hands-on approach to AI-assisted software development, treating AI as a force multiplier for debugging, prototyping, and large-codebase work rather than a substitute for judgment. Particularly strong in multi-agent coding workflows, contract-driven development, and maintaining consistency across backend, frontend, and testing through shared schemas and OpenAPI-based coordination.”
Mid-level AI/ML Engineer specializing in NLP and Generative AI
“Built and deployed a production LLM-powered RAG assistant for healthcare teams (care managers/support) to answer questions from clinical and policy documentation, emphasizing trustworthiness via improved retrieval, reranking, and strict grounding prompts to reduce hallucinations. Also has hands-on orchestration experience with Apache Airflow for end-to-end ETL/ML workflows and applies rigorous testing/metrics (hallucination rate, tool-call accuracy, latency, cost) to ensure reliable AI agent behavior.”
Mid-level Full-Stack Developer specializing in web platforms and cloud (AWS)
“Full-stack engineer with financial services experience (Lincoln Financial) who owned a customer-facing financial portal end-to-end using TypeScript/React and Node/Express. Has hands-on microservices and RabbitMQ event-driven workflows, addressing scale issues such as retries and duplicate messages with idempotency and traceable logging, and built an internal real-time ops/support dashboard to improve monitoring and incident response.”
Mid-Level Software Engineer specializing in FinTech and cloud microservices
“Backend/platform engineer with hands-on ownership of high-stakes data migrations in regulated domains (core banking and insurance), including a Python ETL framework that migrated 100,000+ customer records within a cutover window. Strong DevOps/GitOps background deploying Python and Spring Boot microservices to Kubernetes with Helm and ArgoCD, plus real-time Kafka transaction streaming for fraud/analytics with reliability patterns (DLQs, retries, partition tuning).”
Mid-level Data Scientist specializing in predictive modeling, NLP/LLMs, and RAG search systems
“Built production LLM/RAG platforms for financial services to enable natural-language Q&A over large policy/compliance document sets stored in Snowflake and SharePoint. Strong in MLOps and orchestration (Airflow, ADF, Step Functions, MLflow) and in solving real production issues like stale embeddings and model performance, including an incremental Snowflake Streams sync that cut processing time from hours to minutes.”