Pre-screened and vetted.
Mid-level AI Engineer specializing in generative AI, multimodal evaluation, and agentic RAG systems
“Built and productionized an agentic LLM automation system for an insurance client to determine medication eligibility, combining prompt chaining with a RAG pipeline over policy rules and deploying it serverlessly on AWS (Lambda, Step Functions, Bedrock). Addressed major data/schema mismatches via a semantic matching pipeline and validated performance through human agreement scoring, A/B testing, KPI monitoring, and confidence-based human-in-the-loop review.”
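As a rough illustration of the prompt-chaining-plus-RAG pattern described above (a generic sketch, not the candidate's actual code — the function names, rules, and token-overlap scoring are all hypothetical stand-ins; a production system would rank with embeddings):

```python
def retrieve_rules(question: str, rules: list[str], k: int = 2) -> list[str]:
    """Step 1 of the chain: rank policy rules by naive token overlap
    with the question (Jaccard similarity over lowercased words)."""
    q = set(question.lower().split())

    def score(rule: str) -> float:
        r = set(rule.lower().split())
        return len(q & r) / (len(q | r) or 1)

    return sorted(rules, key=score, reverse=True)[:k]

def eligibility_prompt(question: str, rules: list[str]) -> str:
    """Step 2 of the chain: ground the LLM call in the retrieved rules."""
    context = "\n".join(f"- {r}" for r in rules)
    return (
        "Using only the policy rules below, decide medication eligibility.\n"
        f"Rules:\n{context}\nQuestion: {question}\nAnswer:"
    )

rules = [
    "Medication X requires prior authorization for members under 18",
    "Generic substitutions are covered at tier 1",
    "Claims over 90 days old are ineligible",
]
top = retrieve_rules("Is medication X eligible with prior authorization", rules)
prompt = eligibility_prompt("Is medication X eligible?", top)
```

The retrieved rules become the grounding context for the eligibility call, which is what keeps the chain's answers tied to actual policy text.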
Mid-level AI/ML Engineer specializing in MLOps, NLP, and real-time ML pipelines
“Built a production, real-time insurance claims document-understanding and fraud-detection pipeline using TensorFlow + fine-tuned BERT, deployed on AWS (SageMaker/Lambda/API Gateway) with automated retraining via MLflow and Jenkins. Addressed noisy documents and latency using augmentation and model distillation (3x faster), cutting manual claims review by ~50% and reducing fraudulent payouts.”
Senior Data Engineer specializing in cloud lakehouse and streaming data platforms
“Data platform/data engineer with cross-industry experience in banking and healthcare, building cloud-native lakehouse architectures across AWS/Azure/GCP. Has owned high-volume (millions of records; TB/day) pipelines with strong data quality automation (dbt/Great Expectations), observability (Grafana/Prometheus), and real-time streaming (Kafka/Spark) for fraud monitoring; also delivered an early-stage migration from SQL Server to BigQuery with 40% batch latency reduction.”
Senior Cloud Infrastructure Engineer specializing in GCP databases and hybrid architecture
“GCP-focused database/infrastructure engineer with hands-on production support for Cloud SQL and Firestore, spanning provisioning, IAM, scaling, backups, and performance tuning. Also supported a hybrid GCP architecture for a monolithic on-prem PostgreSQL workload and resolved a major latency incident by tracing cascading failures and fixing indexing issues.”
Junior AI/ML Engineer specializing in LLM agents and RAG systems
“Backend/data engineer who built a production-ready multi-agent financial intelligence system (Mycroft) that orchestrates specialized AI agents to analyze real-time market data using FastAPI and Pinecone vector search. Brings strong security/reliability instincts (rate limiting, JWT/OAuth2, retries/backoff, health checks) and has caught high-impact data integrity issues in financial migrations (timezone normalization across global legacy systems).”
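The retries/backoff reliability pattern mentioned in this blurb can be sketched in a few lines (a generic illustration, not the candidate's implementation; the function name and the `attempts`/`base`/`jitter` parameters are hypothetical):

```python
import random
import time

def call_with_backoff(fn, attempts=4, base=0.1, jitter=0.05, sleep=time.sleep):
    """Call fn(), retrying on failure with exponential backoff plus jitter.

    `sleep` is injectable so tests can skip the real delays.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries: surface the original error
            sleep(base * (2 ** attempt) + random.uniform(0, jitter))

# Example: a flaky call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = call_with_backoff(flaky, sleep=lambda _: None)
```

The jitter term spreads retries out so many clients recovering from the same outage don't hammer the service in lockstep.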
Principal Data Scientist specializing in cybersecurity ML and MLOps
“ML/NLP engineer (Beyond Identity) who built production semantic search and entity-resolution systems over internal security documentation, using LDA + BERT embeddings with FAISS/Pinecone to cut search time by 30%. Also scaled a real-time anomaly detection pipeline to millions of events/day with Spark and AWS Lambda, with strong emphasis on measurable validation (Precision@k, MRR, F1, ARI).”
Mid-level AI/ML Engineer specializing in LLMs, RAG, and MLOps for financial services
“Built and deployed a production Llama 3-based RAG document Q&A system using FAISS, addressing context-window limits through chunking and keeping retrieval accurate by regularly refreshing embeddings. Has hands-on orchestration experience with LangChain and LlamaIndex for multi-step LLM workflows (including memory management) and collaborates with non-technical teams (e.g., marketing) to deliver AI solutions like recommendation systems.”
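The chunking approach to context-window limits described above can be sketched as a sliding window with overlap (the window sizes and function name here are illustrative, not the candidate's actual settings):

```python
def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping word windows so no passage straddles
    a chunk boundary without context on both sides."""
    assert 0 <= overlap < size
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - overlap, 1), step)]
```

Each chunk is embedded separately for FAISS retrieval; the overlap is what keeps a sentence split across two windows retrievable from either.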
Principal Enterprise Architect specializing in AI, cloud modernization, and cybersecurity
“Senior technologist (25 years of experience) who served as chief architect/CTO for a patented software startup that was acquired. Strong at building scalable, robust, technology-agnostic systems and translating technical value into investor-ready narratives (forecasts, roadmaps, documentation). Currently prefers joining an existing founding team as a key technical leader/mentor rather than founding a company solo.”
Mid-level AI/ML Engineer specializing in healthcare ML and generative AI
“AI/LLM engineer at Humana who built and deployed a HIPAA-aware RAG system for clinical record retrieval, cutting search time dramatically and improving retrieval efficiency by 30%. Experienced with Spark-scale data preprocessing, QLoRA fine-tuning, LangChain orchestration, and MLflow+SageMaker integration, with a strong testing/evaluation discipline (A/B tests, human eval) to hit 95%+ accuracy and production latency targets.”
Mid-level Backend/Data Engineer specializing in Python APIs and Azure data pipelines
“Backend/data engineer who builds Python (FastAPI) data-processing API services for internal analytics/reporting, emphasizing modular architecture, async performance tuning, and reliability patterns (health checks, retries, observability). Also migrated legacy on-prem ETL pipelines to Azure using ADF/Data Lake/Functions and implemented a near-real-time ingestion flow with Event Hubs plus watermarking to handle late events and deduplication.”
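The watermarking-plus-deduplication idea in this blurb can be sketched as follows (a simplified illustration with integer event times and an in-memory seen-set; a real Event Hubs consumer would checkpoint this state, and the class and method names are hypothetical):

```python
class LateEventDeduper:
    """Track a watermark trailing the max event time seen; drop events
    older than the watermark and deduplicate by event id."""

    def __init__(self, allowed_lateness: int):
        self.allowed_lateness = allowed_lateness
        self.max_event_time = None
        self.seen_ids = set()

    def process(self, event_id: str, event_time: int) -> str:
        # Advance the high-water mark of observed event time.
        if self.max_event_time is None or event_time > self.max_event_time:
            self.max_event_time = event_time
        watermark = self.max_event_time - self.allowed_lateness
        if event_time < watermark:
            return "dropped-late"       # arrived past the lateness window
        if event_id in self.seen_ids:
            return "dropped-duplicate"  # already processed (at-least-once delivery)
        self.seen_ids.add(event_id)
        return "accepted"
```

The watermark bounds how long state must be kept: anything older can be dropped, which is what makes the pipeline's memory footprint predictable under late arrivals.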
Mid-level Data Engineer specializing in cloud data platforms and AI agents
“Data/Backend engineer who has owned end-to-end merchant analytics systems on AWS: orchestrated multi-source ingestion (FISERV/Shopify/Clover) with Step Functions/Lambda, enforced strong data quality gates, and served curated datasets via Redshift and a FastAPI layer. Also built an early-stage Merchant Insights AI agent that converts natural language questions into SQL using OpenAI models, with full CI/CD and observability.”
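For the natural-language-to-SQL agent described above, a common safeguard (shown here as a generic sketch, not the candidate's code) is a read-only validator applied before any model-generated SQL is executed:

```python
import re

# Keywords that indicate a mutating or DDL statement.
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|truncate|grant|create)\b",
    re.IGNORECASE,
)

def is_safe_readonly_sql(sql: str) -> bool:
    """Accept only a single SELECT statement; reject anything mutating."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # more than one statement
        return False
    if not re.match(r"(?is)^select\b", stripped):
        return False
    return not FORBIDDEN.search(stripped)
```

A keyword filter like this is a coarse first gate; real deployments typically also run the query under a read-only database role so the guarantee doesn't rest on string matching alone.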
Mid-level Data Engineer and Business Analyst specializing in cloud ETL and analytics
“Data analyst with cross-industry experience spanning insurance analytics at L&T Infotech and experimental imaging analytics at Mylyser. Stands out for building scalable SQL/PySpark data pipelines, standardizing business-critical metrics like claims lifecycle and policy retention, and delivering measurable impact such as 50%+ faster query performance and a 15% reduction in claims settlement time.”
Mid-level Data Analyst specializing in FinTech and e-commerce analytics
“Analytics professional with experience spanning financial services and e-commerce, including NAV reconciliation/reporting at State Street and revenue optimization work on checkout and retention. Stands out for combining strong SQL/Python data engineering with business-facing analytics, delivering measurable gains like 3–5% conversion improvement and 8–10% payment success uplift.”
Mid-level AI/ML Engineer specializing in healthcare imaging and GenAI/LLM systems
“Built and deployed a production LLM/RAG clinical document understanding and summarization system for healthcare, focused on reducing manual review time while meeting strict accuracy, latency, and compliance needs. Demonstrates strong MLOps/orchestration depth (Airflow, Kubernetes, Azure ML Pipelines) and a rigorous approach to hallucination mitigation through layered, source-grounded safeguards and stakeholder-driven requirements with physicians/compliance teams.”
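One layer of the source-grounded hallucination safeguards described above can be sketched as a lexical grounding check (an illustrative sketch only — a real system would layer in entailment models or embedding similarity, and the threshold and names here are assumptions):

```python
def is_grounded(sentence: str, source_chunks: list[str],
                threshold: float = 0.5) -> bool:
    """Flag a summary sentence as grounded only if enough of its tokens
    appear in at least one retrieved source chunk."""
    tokens = set(sentence.lower().split())
    if not tokens:
        return False
    for chunk in source_chunks:
        overlap = tokens & set(chunk.lower().split())
        if len(overlap) / len(tokens) >= threshold:
            return True
    return False
```

Sentences that fail the check can be stripped from the summary or routed to human review, which is how a layered safeguard degrades gracefully instead of shipping unsupported claims.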
Mid-level Backend Engineer specializing in Python/Django, Kubernetes GitOps, and Kafka streaming
“Python backend engineer who owned an end-to-end Django/DRF authentication and account-management module (JWT, RBAC, email verification) and optimized token validation performance. Has hands-on Kubernetes + Helm delivery with GitOps via ArgoCD (multi-environment app-of-apps, drift detection/rollback) and has supported a cloud-to-on-prem migration using staged testing and phased cutover. Also built and scaled a Kafka-based real-time user activity tracking pipeline with reliability and backpressure controls.”
Mid-level Data Engineer specializing in multi-cloud real-time data pipelines
“Data engineer with healthcare/clinical trial domain experience who owned a 100TB+/month AWS pipeline end-to-end (Glue/S3/Redshift/Airflow) and drove measurable outcomes (20% lower latency, 99.9% reliability, 40% less manual reporting). Also built production data services and API-based ingestion on GCP (Cloud Run/Functions/BigQuery) with strong validation, versioning, and safe migration practices, and launched an early-stage RAG solution (LangChain + GPT-4) for researchers.”
Mid-level Data Engineer specializing in Azure, Spark, and scalable ETL/ELT pipelines
“Data engineer with banking FP&A experience who led an end-to-end migration of 10+ TB from Teradata to Azure (ADF + Data Lake + Databricks/PySpark + Synapse). Emphasizes reliability (multi-stage validation, monitoring/alerts) and performance (Spark tuning, incremental loads, autoscaling), reporting ~99.5% pipeline reliability while supporting downstream consumers with stable schemas and clear change management.”
Mid-level Data Engineer specializing in cloud ETL and streaming data pipelines
“Data engineer in healthcare/clinical data platforms (HarmonCare) who built and operated an end-to-end lakehouse pipeline ingesting HL7/FHIR at ~2–3M records/day on AWS (Glue/Lambda/S3/Spark) and serving trusted datasets in Snowflake. Implemented strong validation/reconciliation gates and a data quality framework that reduced discrepancies ~40%, plus CI/CD (GitHub Actions/Terraform) and monitoring (Airflow/CloudWatch).”
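The validation/reconciliation gates mentioned above commonly include a source-versus-target row-count check; a minimal sketch (the function name and tolerance value are illustrative assumptions, not HarmonCare's actual gate):

```python
def counts_reconcile(source_rows: int, target_rows: int,
                     tolerance: float = 0.001) -> bool:
    """Pass the gate only if the loaded row count is within a relative
    tolerance of the source count (exact match when the source is empty)."""
    if source_rows == 0:
        return target_rows == 0
    return abs(source_rows - target_rows) / source_rows <= tolerance
```

A gate like this runs after each load and blocks promotion of the Snowflake dataset on failure, which is the mechanism behind "trusted datasets" in the blurb.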
Mid-level Data Engineer specializing in multi-cloud data pipelines and real-time analytics
Mid-level Data Scientist specializing in ML, NLP, and cloud data platforms
Mid-level AI/ML Engineer specializing in LLM fine-tuning, RAG, and MLOps
Mid-level Data Engineer specializing in cloud lakehouse, streaming, and Snowflake/Databricks
Mid-level Software Engineer specializing in Python, cloud, and ML applications