Pre-screened and vetted.
Mid-level Business Analyst specializing in BI, reporting, and data analytics
“Finance data and reporting professional with PwC experience who bridges accounting and technology, especially around GL-related reconciliations, reporting accuracy, and close support. While not a direct PeopleSoft GL owner, they bring strong SQL-driven troubleshooting, ETL/data mapping remediation, and process automation experience that helped shorten close cycles and improve audit readiness.”
Junior Business Analyst specializing in pricing and data analytics
“Analytics candidate with hands-on experience turning messy pricing and competitor data into reporting-ready SQL tables, plus building Python automation workflows that replaced manual processing across 40,000 images at roughly 89% accuracy. They also led a price elasticity analysis that informed differentiated pricing strategies and supported reporting through Power BI dashboards.”
Mid-level Data Analyst specializing in healthcare and financial analytics
“Analytics professional with Deloitte experience building SQL and Python workflows for revenue, pipeline, and opportunity analytics at scale. They combine strong data engineering and modeling skills with business-facing delivery, citing impacts including 8–10% conversion improvement, ~$700K revenue protected, 12% YoY project acquisition growth, and 15% retention improvement in financial services.”
Junior Data and Product Analyst specializing in machine learning and analytics
“Senior at the University of Michigan who led most of the technical build for a real client-facing Medicare fraud detection system with explainable ML and an analyst-ready Streamlit dashboard. Also builds practical LLM tools independently, including a market sentiment pipeline over Reddit/news data and a resume parser/grader, showing strong product instinct alongside applied ML and data engineering depth.”
Electrical Engineering Intern and Robotics Researcher specializing in automation and embedded systems
“Robotics researcher/team lead from ASU’s Robotics and Intelligent Systems Lab who rebuilt a failing soft-robotics test bench into a modular 8-Arduino, ROS 2/Python-controlled data-collection system, doubling sampling performance and dramatically reducing downtime. Currently doing thesis work on physics-informed neural network (PIRNN/RNN) modeling of a pneumatically actuated soft robotic arm using experimentally collected trajectory/pressure data.”
Mid-level AI/ML Engineer specializing in fraud detection and risk analytics in Financial Services
“At JPMorgan Chase, built and deployed a production LLM-powered RAG knowledge assistant to help fraud investigators and risk analysts quickly navigate regulatory updates and internal policies, reducing investigation delays and compliance risk. Strong focus on secure retrieval (RBAC filtering), reliability (layered testing + observability), and production constraints (latency/SLOs), with Airflow-orchestrated, auditable ML pipelines.”
Mid-level GenAI/ML Engineer specializing in LLM applications and enterprise automation
“Built and shipped a production LLM-powered healthcare support agent at UnitedHealth Group, using LangChain + FAISS RAG on AWS SageMaker with CloudWatch monitoring and human-in-the-loop fallbacks for safety. Strong focus on reliability engineering (confidence gating, retries/timeouts, caching) and continuous evaluation loops; reported ~40% improvement in query resolution efficiency while reducing manual support workload.”
Mid-level Data Engineer specializing in streaming and cloud data platforms for financial services
“Data engineering-focused candidate (internship/project experience) who built end-to-end pipelines processing a few million transactional records/day for fraud detection and reporting, using Airflow, Python/SQL, and PySpark with strong emphasis on data quality gates, idempotency, and monitoring. Also implemented an external web/API data collection system with anti-bot tactics and schema-change quarantine, and shipped a versioned Flask API to serve curated warehouse data.”
Mid-level Data Engineer specializing in cloud data warehousing and analytics
“Data engineer at American Express who owned end-to-end pipelines for transaction and customer data used in finance reporting and risk analytics, processing ~5–8M records/day. Built Airflow-orchestrated ingestion (including external APIs/web sources) with strong data quality controls, monitoring/alerts, and resilient backfill/retry patterns, and also shipped a versioned REST API serving aggregated metrics to analytics teams.”
Senior Data Engineer specializing in Databricks, Spark, and AWS for government healthcare data systems
“Python/AWS engineer focused on batch-processing and data workflows, including building reusable S3/boto3 utilities with reliability features and IAM-based auth. Has led low-risk legacy modernizations using parity testing plus a month of parallel production runs, and has owned production issues end-to-end (including fixing a client-side Excel macro) while contributing to significant AWS cost reductions (~$10k/month).”
Senior Business Analyst specializing in financial and research analytics
“Analytics professional with experience spanning HSBC and the University at Buffalo, combining banking risk/portfolio analytics with reproducible Python and SQL reporting workflows. Stands out for cleaning complex multi-source data, standardizing business metrics across dashboards, and delivering measurable impact including an 80% reduction in manual reporting and an estimated GBP 1.2M profit contribution from scorecard redevelopment.”
Mid-level AI/ML Engineer specializing in FinTech risk and fraud systems
“Senior AI/ML engineer focused on production LLM systems, combining RAG, fine-tuning, distributed training, and AI safety to ship scalable real-time moderation and conversational AI platforms. Stands out for pairing deep AWS/Kubernetes MLOps expertise with measurable impact: 40% lower latency/cost, 30–50% fewer hallucinations, and major reliability gains through observability and automation.”
Mid-level AI/ML Engineer specializing in Generative AI and financial services
“ML/AI engineer with hands-on experience shipping regulated financial AI systems at JPMC and Capgemini, spanning credit risk, fraud detection, and generative AI assistants. Stands out for combining modern LLM/RAG architectures with strong MLOps, real-time infrastructure, and explainability/compliance practices, while delivering measurable business impact in latency, accuracy, cost, and risk reduction.”
Executive transformation leader specializing in AI-enabled operations and Lean Six Sigma
“Pre-seed founder (pre-product) building a new enterprise control layer for AI-enabled organizations focused on governed autonomy and decision-cycle compression; thesis formalized in their book ‘The Decision Velocity Era.’ Former CEO of Evaluation Solutions who built an internal proprietary operating system, proved it in live operations, commercialized it as SaaS, and enabled growth to $120M+ in contracted revenue.”
Senior Full-Stack/Backend Software Engineer specializing in cloud-native automation and microservices
“Backend/data engineer with strong AWS production experience across containers (ECS) and serverless (API Gateway/Lambda/SQS), plus Glue-based ETL to Parquet for Athena/Redshift. Demonstrates hands-on reliability and security depth (Cognito OAuth2/JWT with JWKS rotation, idempotency/DLQs, monitoring) and measurable performance wins (Redis caching + query tuning), along with legacy-to-services modernization using parallel-run parity and feature-flagged cutovers.”
AI/ML Engineer specializing in RAG and agentic LLM systems for semiconductor manufacturing
“Built and deployed a production LLM-powered RAG assistant for semiconductor manufacturing failure analysis, reducing engineer triage effort by grounding outputs in retrieved evidence and gating responses with SPC + ML signals (LSTM anomaly scores, XGBoost probabilities). Experienced with LangChain/LangGraph to ship reliable, observable multi-step agents with branching/fallback logic, and evaluates impact using both technical metrics and business KPIs like mean time to triage and downtime reduction.”
Mid-level Data/ML Engineer specializing in NLP, GenAI, and scalable data pipelines
“AI/ML engineer with production experience building LLM-powered document intelligence and customer support systems in healthcare/insurance, emphasizing high-accuracy RAG, long-document processing, and robust monitoring/fallback mechanisms. Also automates and scales ML lifecycle workflows using Apache Airflow and Kubeflow, and partners closely with non-technical operations stakeholders to drive adoption.”
Mid-level Data Engineer specializing in Analytics & AI/ML
“Data engineer with experience at Sony and Walmart building high-volume, near-real-time analytics and ingestion systems. Has owned end-to-end pipelines from Kafka/Spark streaming through S3/Parquet and Redshift/Looker, emphasizing data quality (Great Expectations), observability (CloudWatch/Azure Monitor), and reliability (Airflow SLAs, retries, checkpointing), including measurable performance and latency improvements.”
Senior Data Engineer specializing in cloud lakehouse platforms and streaming analytics
“Data engineer focused on fraud and banking analytics who has owned end-to-end batch + streaming pipelines at very large scale (hundreds of millions of records/day). Built robust data quality/observability layers (schema validation, anomaly detection, alerting) and delivered low-latency serving via AWS Lambda/API Gateway with DynamoDB + Redis, plus external data ingestion/scraping pipelines orchestrated in Airflow with anti-bot protections.”
Mid-level Business Analyst specializing in BI, reporting, and data insights
“Healthcare analytics professional with experience at UnitedHealth Group, focused on turning messy claims, eligibility, and provider data into clean reporting datasets and Power BI dashboards. Combines SQL and Python automation with strong stakeholder alignment around KPI definitions, helping operations teams improve claim turnaround visibility and cost efficiency.”
Mid-level Software Engineer specializing in cloud-native distributed systems
“Backend/platform engineer who has built and run production Python/Flask + Kafka microservices processing RFID and camera/RFID fusion streams for near-real-time retail cart updates at ~4–5M events/day. Strong in reliability/performance debugging (p99 latency, Kafka lag, Cosmos DB RU hot partitions) with measurable impact including ~30% database cost reduction, and has also shipped an end-to-end vulnerability scanning workflow with DynamoDB-backed state, idempotency, and robust retry/verification guardrails.”
Mid-level Data Engineer specializing in scalable pipelines, Spark, and cloud data warehousing
“Backend/data platform engineer who recently owned an end-to-end large-scale financial data platform delivering real-time decision support for finance and operations. Has hands-on experience modernizing legacy batch pipelines into AWS cloud-native ELT with parallel-run cutovers, strong data quality controls (dbt-style tests, reconciliation), and measurable improvements in runtime, cost, and SLA compliance. Also builds scalable, secure FastAPI microservices using Docker, ALB-based horizontal scaling, Redis caching, and managed auth with Cognito/Supabase plus Postgres RLS.”
Mid-level Generative AI Engineer specializing in decision intelligence and RAG for regulated enterprises
“Healthcare GenAI engineer who built a HIPAA-compliant, auditable RAG-based claims decision support system at Molina Healthcare, processing 3M claims and delivering major impact (48% faster manual reviews, 43% higher decision accuracy). Deep hands-on experience with LangChain orchestration, vector search (ChromaDB/FAISS), embedding fine-tuning, and safety controls (confidence scoring, rule validation, human-in-the-loop escalation) for clinical workflows.”
Mid-level Data Engineer specializing in cloud data pipelines and real-time streaming
“Data engineer with PNC Bank experience owning high-volume financial transaction pipelines end-to-end (Kafka/REST ingestion through Spark/Glue transformations to Redshift serving) for risk and fraud analytics. Built strong reliability and data quality practices (Great Expectations, reconciliation, Airflow alerting, idempotent retries, incremental/windowed processing), reporting 40% ingestion efficiency gains and ~99.9% data accuracy.”