Pre-screened and vetted.
Mid-level Data Engineer specializing in cloud-native healthcare and enterprise data platforms
“Data Engineer (TCS) who owned an end-to-end CRM analytics pipeline for Bayer’s eSalesWeb integration, ingesting from Salesforce APIs/databases/S3 and serving analytics-ready datasets via PostgreSQL/S3 for Tableau. Drove measurable outcomes: ~60% reduction in manual data-quality effort, ~30% lower latency through SQL optimization, and ~35% improved stability via monitoring, retries, and idempotent processing.”
Mid-level Data Analyst specializing in financial and customer analytics
“Analytics professional with experience at KPMG and Robosoft Technologies, working across financial and customer engagement data. They combine SQL, Python, experimentation, and BI dashboards to turn messy multi-source data into decision-ready insights, including a pricing test that improved conversion rates by 9%.”
Senior AI/ML Engineer specializing in healthcare AI and MLOps
“Healthcare AI engineer with hands-on ownership of production ML and LLM systems at McKesson, spanning clinical risk prediction and RAG-based documentation tools. Stands out for combining deep clinical-data experience, HIPAA-aware deployment practices, and measurable impact through reduced readmissions, clinician workflow gains, and 20–30% faster ML delivery for engineering teams.”
Mid-level Data Scientist specializing in MLOps and Generative AI
“Robotics software/ML engineer who built perception and navigation-related ML systems for autonomous supermarket carts, including object detection, shelf recognition, and obstacle avoidance. Strong ROS/ROS2 practitioner who optimized real-time performance (reported 50% latency reduction) and deployed containerized ROS/ML pipelines at scale using Docker, Kubernetes, and CI/CD.”
Senior Python Backend Engineer specializing in scalable APIs and cloud-native microservices
“Backend/data platform engineer who has built and operated a cloud-native media ingestion/processing platform in Python (Django/DRF, FastAPI) with Kafka, Postgres, and Redis, emphasizing multi-tenant security and reliability. Delivered AWS production systems combining EKS and Lambda with Terraform + GitHub Actions/Helm, and built Glue-based ETL pipelines with strong schema-evolution and data-quality practices; also modernized SAS analytics into Python on AWS. Seeking fully remote roles with a $120K–$140K base range.”
Mid-level Data Scientist specializing in ML, NLP, and Generative AI
“GenAI/ML engineer with production experience at Cognizant and Ally Financial, building end-to-end LLM/RAG systems and ML pipelines. Delivered a domain chatbot trained from 90k tickets and 45k docs, improving intent accuracy (65%→83%), scaling to 800+ concurrent users with 99.2% uptime and sub-150ms latency, and driving +14% customer satisfaction. Strong in Azure ML + DevOps CI/CD, Dockerized deployments, and explainable/PII-safe modeling using SHAP/LIME to build stakeholder trust and meet GDPR requirements.”
Mid-level Full-Stack Java Developer specializing in cloud-native microservices
“Software engineer with deep healthcare claims domain experience who has owned customer-facing portals end-to-end (Java/Spring Boot + React/TypeScript) and improved usability/performance based on real user feedback. Built microservices using REST and RabbitMQ with strong observability (Splunk/cloud metrics), and delivered an internal claims investigation dashboard that streamlined operations through centralized data, search, and filtering.”
Senior Data & Platform Engineer specializing in cloud-native streaming and distributed systems
“Financial data engineer who has built and operated high-volume batch + streaming pipelines (200–300 GB/day; 5–10k events/sec) using AWS, Spark/Delta, Airflow, Kafka, and Snowflake, with strong emphasis on data quality and reliability. Demonstrated measurable impact via 99.9% SLA adherence, major reductions in bad records/nulls, MTTR improvements, and significant latency/runtime/query performance gains; also built a distributed web-scraping system processing 5–10M records/day with anti-bot and schema-drift defenses.”
Mid-level Data Engineer specializing in multi-cloud data platforms for healthcare and finance
“Data engineer with Cigna experience building and operating an end-to-end AWS-based healthcare claims pipeline processing ~2TB/day, using Glue/Kafka/PySpark/SQL into Redshift. Strong focus on data quality and reliability (schema validation, monitoring/alerting, retries/checkpointing/backfills), reporting improved accuracy (~99%) and reduced latency, plus experience serving real-time Kafka/Spark data to downstream analytics with documented data contracts.”
Mid-level Data Engineer specializing in cloud ETL and real-time streaming
“Data engineer focused on AWS + Spark/Databricks pipelines, including an end-to-end nightly loan-data ingestion flow (~2.2M records) from Postgres/S3 through Glue and Databricks into a DWH with layered validation and alerting. Also built real-time streaming with Kafka + Spark Structured Streaming and a master’s project streaming Reddit data for sentiment analysis under ambiguous requirements and tight budget constraints.”
Mid-level AI/ML Engineer specializing in FinTech risk, fraud detection, and GenAI/RAG systems
“Built and productionized Azure-based LLM/RAG systems for regulatory/compliance use cases, including automating analyst research and compliance report generation across large unstructured document sets. Demonstrates strong practical depth in hallucination mitigation, hybrid retrieval tuning (BM25 + embeddings), and production MLOps (Databricks, Cognitive Search, AKS, Airflow/MLflow), plus a proven ability to deliver auditable, explainable solutions in partnership with non-technical compliance teams.”
Mid-level Data Analyst specializing in financial services and fraud analytics
“Analytics candidate currently at Facteus with hands-on experience turning messy transactional data into trusted reporting layers in Snowflake and Power BI. They combine SQL and Python automation with strong validation, performance tuning, and stakeholder-facing metric design, including cohort-based retention and segmentation work that improved trust and adoption of analytics.”
Executive IT and Operations leader specializing in digital transformation and security
“Candidate is very familiar with the venture capital and broader investment landscape, but is not interested in founding a company. They have worked with several TPG-backed or TPG-owned organizations, helping drive business scaling, cost reduction, and execution against investor governance requirements.”
Entry-level Full-Stack Software Engineer specializing in FinTech and web applications
“Built end-to-end internal and user-facing automation/data features, including a Selenium-based BU course scraper with around 1,800 users and a CSV export system that became the company standard at Triple. Shows initiative in ambiguous environments, working directly with business stakeholders and resolving production infrastructure issues involving AWS and Terraform.”
Mid-level Data Engineer specializing in Lakehouse, Streaming, and ML/LLM data systems
“Built and productionized an enterprise retrieval-augmented generation platform for internal knowledge over large unstructured corpora, emphasizing trust via strict citation/grounding and hybrid retrieval (BM25 + FAISS + cross-encoder re-ranking). Demonstrates strong scaling and cost/latency optimization through incremental indexing/embedding and index partitioning, plus disciplined evaluation/observability practices. Has experience operationalizing pipelines with Airflow/Databricks/GitHub Actions and partnering closely with risk & compliance stakeholders on auditability requirements.”
Mid-level AI/ML Engineer specializing in cloud data engineering and GenAI
“AI/LLM engineer with production experience in legal tech: built a GPT-4 + LangChain RAG summarization system at Govpanel that reduced legal case-file review time by 50%+. Previously at LexisNexis, orchestrated end-to-end Airflow data/AI pipelines processing 5M+ legal documents daily, improving ETL runtime by 35% with robust validation, monitoring, and SLAs.”
Mid-level Data Scientist specializing in Generative AI and NLP for financial risk
“Built and shipped production generative AI/RAG assistants in regulated financial contexts (S&P Global), automating compliance-oriented Q&A over earnings reports/filings with grounded answers and citations. Experienced across the full stack—AWS-based ingestion (PySpark/Glue), vector retrieval + LangChain agents, GPT-4/Claude model selection, and production reliability (monitoring, caching, retries) plus rigorous evaluation and regression testing.”
Mid-level Data Engineer specializing in cloud data platforms and real-time analytics
“Customer-facing data engineering professional who builds and deploys real-time reporting/dashboard solutions, gathering reporting and compliance requirements through direct stakeholder engagement. Experienced with Google Cloud IAM governance and secure integrations (encryption, audit logging), and adept at fast production troubleshooting of ETL/pipeline failures, with follow-on monitoring and automated-recovery improvements; motivated by hands-on, travel-oriented customer work.”
Senior Data Engineer specializing in data infrastructure and marketing/CRM analytics
“Salesforce-focused implementation/solutions engineer from Full Circle Insights who owned end-to-end campaign attribution and reporting deployments for 3–5 customers concurrently, including sandbox testing, KPI monitoring, and rollback-safe migrations from legacy reporting. Also builds personal multi-agent workflows and uses Claude Code to rapidly scaffold data/analytics scripts, such as an advertising-optimization parser over CSV/XLSX inputs.”
Mid-level Data Analyst/Data Engineer specializing in BI, ETL pipelines, and cloud analytics
“Data engineer focused on marketing/web analytics and external API pipelines, handling ~10M records/week. Built Azure-based ingestion and PySpark transformations with rigorous data quality checks, then served curated datasets into Synapse/Redshift for Power BI. Also designed an Airflow-orchestrated crypto REST API pipeline with monitoring, retries/exponential backoff, schema-change detection, and backfill-friendly reprocessing.”
Mid-level AI/ML Engineer specializing in GenAI, NLP, and financial systems
“GenAI/ML engineer with hands-on experience building production financial intelligence and document summarization systems at Citibank. Stands out for combining LLM fine-tuning, hybrid RAG, multi-agent workflows, and strong MLOps/observability practices to deliver measurable business impact, including 60% faster analyst retrieval, 31% higher precision, and 99%+ uptime.”
Mid-level AI & Machine Learning Engineer specializing in FinTech
“ML/AI engineer with hands-on experience building production systems in financial services, including a real-time underwriting analytics platform at Hartford Financial Services. Stands out for combining classic ML, low-latency API deployment, monitoring, and emerging LLM/RAG design patterns, with measurable impact including 20% better decision accuracy, sub-200ms latency, and 5M+ records processed daily.”
Senior AI/ML Engineer specializing in Generative AI, LLMs, and MLOps
“Telecom (Verizon) AI/ML practitioner who built a production multimodal system that ingests messy customer issue reports (calls, chats, emails, screenshots, videos) and turns them into confidence-scored incident summaries with reproducible steps and evidence links. Also built KPI/alarm-to-ticket correlation to rank likely root-cause domains (RAN/Core/Transport), cutting triage from hours to minutes and improving MTTR.”