Pre-screened and vetted.
Mid-level Software Engineer specializing in FinTech payments and event-driven microservices
“Backend/data engineer focused on fintech payments and fraud systems, owning real-time Kafka-based reconciliation pipelines end-to-end (~13k transactions/day). Built audit-friendly validation and reconciliation (SQL + Python), kept pipeline lag to seconds, and cut error rates by ~20%, while also shipping Spring Boot APIs with Redis caching and strong idempotency/versioning. Has early-stage startup experience standing up payment services on AWS with Docker + GitHub Actions and production monitoring/incident handling.”
Mid-level Data Analyst specializing in healthcare and financial analytics
“Healthcare analytics candidate with hands-on experience turning messy claims and CRM data into validated reporting tables, automating monthly reporting in Python/Airflow, and operationalizing churn metrics in SQL and Tableau. They appear especially strong in stakeholder-aligned metric design and delivered a reported ~10% churn reduction through cohort analysis, segmentation, and at-risk member targeting.”
Mid-level Business Intelligence Analyst specializing in SAP and healthcare reporting
“Analytics professional with hands-on experience turning messy SAP enterprise data into trusted reporting layers and building end-to-end Python/Tableau analytics products. Stands out for combining technical rigor with business alignment—improving report accuracy by 30%, cutting refresh times by 25%, and independently delivering a CLV segmentation project across 96,000 customers that informed retention strategy.”
Intern Data Scientist specializing in robotics localization and SLAM
“Robotics/embodied-AI practitioner who built a TurtleBot3 LiDAR-fingerprint localization pipeline end-to-end (autonomous data collection + multi-head NN), achieving ~30 cm error in a 10 m × 10 m space. Also has industry experience at Infineon building large-scale production data/AI pipelines and rapidly fixing a deployed recommendation system by correcting upstream data normalization, improving accuracy by more than 20%.”
Mid-level GenAI & Data Engineer specializing in agentic AI systems and AWS Bedrock
“At onedata, built and deployed an LLM-powered, multi-agent analytics platform on AWS Bedrock that lets users create Amazon QuickSight dashboards through natural-language conversation, cutting dashboard build time from ~30 minutes to ~5 minutes. Strong in production concerns (observability, token/cost tracking, model tradeoffs) and in bridging business + technical work, owning pre-sales pitching through delivery with an engineering management background focused on AI product management.”
Mid-level AI/ML Engineer specializing in fraud detection and healthcare predictive analytics
“ML/AI engineer with production experience in high-scale banking fraud detection at Truist, building an end-to-end pipeline (Airflow/AWS Glue/Snowflake, PyTorch/sklearn) with automated retraining and Kubernetes-based deployment; delivered measurable gains (22% fewer false positives, 15% higher recall) and reduced manual operations by ~40%. Also partnered with clinicians at Kellton to deploy an LLM system for summarizing and classifying clinical notes, improving review time and decision speed.”
Mid-level Technical Support Engineer specializing in backend troubleshooting and SQL/API diagnostics
“Technical support engineer with hands-on experience troubleshooting customer-reported data issues across APIs and SQL, coordinating with engineering on hotfixes, and translating risk to non-technical stakeholders. Has supported application security workflows using Veracode by generating reports, driving remediation via Jira, and tracking exposure metrics; also assisted customers with SSO setup (client ID/secret configuration).”
Mid-level AI/ML Engineer specializing in MLOps, NLP, and scalable model deployment
“Built and deployed a production autonomous AI data analyst agent (LangChain + GPT + Streamlit on AWS) that turns natural-language questions into validated SQL, visualizations, and insights, cutting manual analysis time by ~50%. Emphasizes reliability and MLOps: schema-aware validation/guardrails to prevent hallucinations, scalable large-data processing, and Azure DevOps CI/CD + MLflow for automated deployment and experiment tracking.”
Intern Data Analyst specializing in data pipelines and LLM/RAG applications
“Built and deployed LLM-powered analytics and reporting systems, including a RAG-based assistant over Snowflake that let business users ask questions in plain English instead of writing SQL. Experienced orchestrating LLM agents (LangChain) and serverless reporting pipelines (AWS Lambda/S3/RDS), with a strong focus on grounded outputs, monitoring/evaluation, and data quality—used daily by non-technical finance and operations teams at Cigna.”
Junior Full-Stack Software Engineer specializing in cloud-native microservices
“Backend/data engineer with experience at Assurant and Capgemini, focused on reliability and performance at scale. Improved high-latency backend APIs by adding and iterating on a Redis caching layer driven by CloudWatch/monitoring metrics, and built scalable BI pipelines that normalize messy multi-source enterprise data with strong observability and error handling. Familiar with LLM/RAG architecture and practical guardrails, though has not yet shipped an LLM feature to production.”
Mid-level Implementation Engineer specializing in enterprise integrations and IAM/PAM
“Data/ML engineer with end-to-end ownership of donor-data deployments for a university foundation, delivering major performance and data-quality gains across 500K+ records (processing time cut from 24h to 6h; duplicate rate reduced from 5% to 1%). Has put an LLM-assisted enrichment workflow into production with retrieval-grounded business rules, versioned outputs for traceability, and strong operational rigor around validation, logging, and CI/CD.”
Junior Data Analyst specializing in BI, ETL, and reporting
“Analytics professional with hands-on experience building SQL and Python workflows across SAP, Oracle, and internal operational systems, processing roughly 5 million records per month. They combine strong data quality rigor with stakeholder-friendly Power BI reporting, and cite a concrete impact of cutting reporting turnaround time from four days to two while surfacing cost anomalies for business teams.”
Mid-level Business Analyst specializing in analytics, operations, and supply chain
“Analytics candidate with hands-on experience improving enterprise reporting and operational decision-making at Reliance and Wendy’s. They combine SQL optimization, Python automation, sentiment analysis, and dashboarding to deliver measurable impact, including cutting report runtimes from 3 minutes to 1 minute, improving model accuracy from 70% to 80%, and reducing supplier past-due balances by 30%.”
Senior Backend Developer specializing in Python and AWS cloud-native systems
“Backend/data engineer with production experience building Python FastAPI services and AWS-native data pipelines. Has delivered containerized and serverless workloads (ECS/EKS/Lambda) with Terraform-based IaC, strong reliability patterns (JWT/RBAC, retries/circuit breakers, observability), and AWS Glue ETL into S3/Redshift. Demonstrated measurable SQL performance wins (40–50s to <4s) and owned real pipeline incidents through detection, mitigation, and prevention.”
“Backend/data engineer who builds Python (FastAPI) data-processing API services for internal analytics/reporting, emphasizing modular architecture, async performance tuning, and reliability patterns (health checks, retries, observability). Also migrated legacy on-prem ETL pipelines to Azure using ADF/Data Lake/Functions and implemented a near-real-time ingestion flow with Event Hubs plus watermarking to handle late events and deduplication.”
Director-level Marketing & Communications leader specializing in internal comms and change management
“Marketing leader who has repeatedly been the first marketing hire, building the function from scratch while aligning sales, product, and leadership around shared positioning and goals. Strong in marketing ops and executive analytics—integrated multiple data platforms to track the full customer lifecycle (CLV, renewals, attribution) and drive KPI-focused decision-making while reducing reliance on vanity metrics.”
Mid-level Data Scientist specializing in NLP and predictive modeling
“AI/ML practitioner in healthcare/insurance (Blue Cross Blue Shield) who built and deployed a production NLP system to classify patient risk from unstructured clinical notes. Experienced in end-to-end pipeline orchestration (Airflow, AWS Step Functions/Lambda/SageMaker) and real-time optimization (BERT to DistilBERT on AWS GPUs), with strong clinician collaboration to drive adoption.”
Junior Machine Learning Engineer specializing in NLP, computer vision, and MLOps
“ML/LLM engineer with Meta experience building production AI systems for near real-time user-report classification and summarization under strict latency (<250ms), safety, cost, and privacy constraints. Has hands-on MLOps/orchestration experience (Airflow, Spark, MLflow, Kubernetes, Docker, GitHub Actions) plus observability (Prometheus/Grafana) and applies rigorous evaluation, staged rollouts, and A/B testing to keep agent workflows reliable in production.”
Junior Data & Insights Analyst specializing in BI, dashboards, and automation
“Worked on taking an LLM-based system at Soundmakr from prototype to production by adding prompt constraints, validation/guardrails, deterministic ranking, and robust logging/monitoring with feedback loops. Also partnered with product/marketing during an internship on Thea: Study Smart to analyze onboarding drop-offs and run A/B tests on AI-driven flows, translating results into actions that improved retention and conversion.”
Mid-level Data Engineer specializing in cloud data platforms and AI agents
“Data/Backend engineer who has owned end-to-end merchant analytics systems on AWS: orchestrated multi-source ingestion (FISERV/Shopify/Clover) with Step Functions/Lambda, enforced strong data quality gates, and served curated datasets via Redshift and a FastAPI layer. Also built an early-stage Merchant Insights AI agent that converts natural language questions into SQL using OpenAI models, with full CI/CD and observability.”
Mid-level AI/ML Engineer specializing in predictive modeling and cloud ML pipelines
“LLM engineer/data engineer who has deployed production RAG systems for internal-document Q&A, building end-to-end ingestion, embedding, vector search, and FastAPI serving while actively reducing hallucinations and latency through rigorous retrieval tuning and caching. Also experienced in orchestrating cloud data pipelines (Airflow, AWS Glue, Azure Data Factory) and partnering with non-technical business teams to deliver AI solutions like automated document review.”
Mid-level Data Engineer and Business Analyst specializing in cloud ETL and analytics
“Data analyst with cross-industry experience spanning insurance analytics at L&T Infotech and experimental imaging analytics at Mylyser. Stands out for building scalable SQL/PySpark data pipelines, standardizing business-critical metrics like claims lifecycle and policy retention, and delivering measurable impact such as 50%+ faster query performance and a 15% reduction in claims settlement time.”
Intern Software Engineer specializing in cloud, full-stack, and AI systems
“Built a production LLM-assisted workflow for customer configuration data migrations, combining agentic parsing with deterministic validation and fail-safe pipeline design. Stands out for turning messy ERP and operational data into reliable, repeatable transformations while improving accuracy and cutting manual effort by more than 80%.”