Pre-screened and vetted.
Mid-level Data Scientist specializing in Generative AI and NLP for financial risk
“Built and shipped production generative AI/RAG assistants in regulated financial contexts (S&P Global), automating compliance-oriented Q&A over earnings reports/filings with grounded answers and citations. Experienced across the full stack: AWS-based ingestion (PySpark/Glue), vector retrieval + LangChain agents, GPT-4/Claude model selection, and production reliability (monitoring, caching, retries), plus rigorous evaluation and regression testing.”
Mid-level AI/ML Engineer specializing in cloud data engineering and GenAI
“AI/LLM engineer with production experience in legal tech: built a GPT-4 + LangChain RAG summarization system at Govpanel that reduced legal case-file review time by 50%+. Previously at LexisNexis, orchestrated end-to-end Airflow data/AI pipelines processing 5M+ legal documents daily, improving ETL runtime by 35% with robust validation, monitoring, and SLAs.”
Mid-level Data Engineer specializing in cloud data platforms and real-time analytics
“Customer-facing data engineering professional who builds and deploys real-time reporting/dashboard solutions, gathering reporting and compliance requirements through direct stakeholder engagement. Experienced with Google Cloud IAM governance, secure integrations (encryption, audit logging), and fast production troubleshooting of ETL/pipeline failures with follow-on monitoring and automated recovery improvements; motivated by hands-on, travel-oriented customer work.”
Senior Full-Stack Java Developer specializing in capital markets and trading systems
“Backend/data engineer with production experience in payment initiation/processing services built in Python/FastAPI, emphasizing reliability patterns (JWT/RBAC, timeouts, retries, circuit breakers). Has delivered AWS deployments on ECS (ALB, autoscaling, CI/CD to ECR) plus Lambda-based reporting, and built AWS Glue ETL pipelines with schema evolution and CloudWatch monitoring. Also modernized a legacy SAS reporting platform to Python/PostgreSQL with regression parity testing and parallel-run migration, and achieved a 70% SQL performance improvement.”
Senior Data Engineer specializing in data infrastructure and marketing/CRM analytics
“Salesforce-focused implementation/solutions engineer from Full Circle Insights who owned end-to-end campaign attribution and reporting deployments for 3–5 customers concurrently, including sandbox testing, KPI monitoring, and rollback-safe migrations from legacy reporting. Also builds personal multi-agent workflows and uses Claude Code to rapidly scaffold data/analytics scripts, such as an advertising optimization parser over CSV/XLSX inputs.”
Mid-level AI Engineer specializing in LLMs, MLOps, and healthcare NLP
“Built a production, real-time clinical documentation system at HCA that converts doctor–patient conversations into structured clinical summaries using speech-to-text, LLM summarization, and RAG. Demonstrated measurable gains from medical-domain fine-tuning (clinical concept recall +18%, ROUGE-L from 0.62 to 0.74) while meeting HIPAA constraints via PHI anonymization and encryption, and deployed via Docker/FastAPI with CI/CD and monitoring.”
Mid-level GenAI Engineer specializing in LLM fine-tuning, RAG, and MLOps
“Healthcare-focused LLM engineer who deployed a production triage and clinical knowledge retrieval assistant using RAG and LangGraph-orchestrated multi-agent workflows. Emphasizes clinical safety and compliance with robust hallucination controls, HIPAA/PHI protections (tokenization, encryption, audit logging, zero-retention), and human-in-the-loop escalation; reports a 75% latency reduction in a healthcare agent system.”
“Built a production multi-agent orchestration platform to automate healthcare claims and HR workflows, combining LangChain/CrewAI/AutoGPT with RAG (FAISS/Pinecone) and fine-tuned open-source LLMs (LLaMA/Mistral/Falcon) in private Azure ML environments to meet HIPAA requirements. Emphasizes rigorous agent evaluation/observability (trajectory eval, adversarial testing, LLM-as-judge, drift monitoring) and reports measurable outcomes including 35% faster claims processing and 40% fewer chatbot errors.”
Senior AI/ML Engineer specializing in Generative AI, LLMs, and MLOps
“Telecom (Verizon) AI/ML practitioner who built a production multimodal system that ingests messy customer issue reports (calls, chats, emails, screenshots, videos) and turns them into confidence-scored incident summaries with reproducible steps and evidence links. Also built KPI/alarm-to-ticket correlation to rank likely root-cause domains (RAN/Core/Transport), cutting triage from hours to minutes and improving MTTR.”
Mid-level AI/ML Engineer specializing in Generative AI and data engineering
“IBM engineer who built and deployed a production RAG-based LLM assistant using LangChain/FAISS with a fine-tuned LLaMA model, served via FastAPI microservices on Kubernetes, achieving 99%+ uptime. Demonstrates strong practical expertise in reducing hallucinations (semantic chunking + metadata-driven retrieval) and managing latency, plus mature MLOps practices (Airflow/dbt pipelines, MLflow tracking, monitoring, A/B and shadow deployments) and effective collaboration with non-technical stakeholders.”
Intern Data Scientist specializing in AI, analytics, and cloud data engineering
“Built a production multimodal LLM-based vendor risk assessment platform that ingests SOC reports and other documents, uses a strict RAG pipeline with grounded evidence (page/paragraph citations), and dramatically reduces analyst review time. Experienced with LangGraph/LangChain/AutoGen for stateful, fault-tolerant agent workflows, and emphasizes reliability (schema validation, guardrails) plus low-latency delivery (~1–2s) through hybrid retrieval, reranking, caching, and model tiering.”
Mid-level Data Engineer specializing in cloud data platforms and scalable ETL pipelines
“Data engineer (~4 years) with full-stack delivery experience (Next.js App Router/TypeScript + React) building a real-time operations monitoring dashboard backed by Kafka and orchestrated data pipelines. Strong production focus: Airflow + CloudWatch monitoring, automated Python/SQL validation (99.5% accuracy), and CI/CD with Jenkins/Docker; has delivered measurable improvements in latency, pipeline reliability, and query performance (Postgres/Redshift).”
Mid-level Data Engineer specializing in cloud ETL/ELT and lakehouse architecture
“Data engineer focused on sales/marketing analytics pipelines, owning ingestion from CRMs/ad platforms through warehouse serving and dashboards at hundreds of thousands of records per day. Built reliability-focused systems including dbt/SQL/Python data quality gates with alerting, a resilient web-scraping pipeline (retries/backoff, anti-bot tactics, schema-change detection, backfills), and a versioned internal REST API with caching and strong developer usability.”
Mid-level Data Engineer specializing in real-time streaming and cloud data platforms
“Data engineer with Wells Fargo experience owning an end-to-end lakehouse ETL pipeline on Databricks/Azure Data Factory, processing ~480GB daily and implementing robust data quality/reconciliation across 40+ tables to reach ~99.3% reliability. Strong in performance optimization (cut runtime 5.5h→3.8h), CI/CD and monitoring, and resilient external/API ingestion with retries, schema validation, and backfills.”
Intern Data Scientist specializing in ML engineering and LLM agentic workflows
“Built an agentic, multi-step LLM system that generates full-stack code for API integrations using LangChain orchestration, Pinecone/SentenceBERT RAG, and a human-in-the-loop feedback loop for iterative code refinement. Also collaborated with non-technical content writers and PMs during a Contentstack internship to deliver a Slack-based AI workflow that generates and brand-checks articles with one-click approvals.”
Senior Full-Stack Developer specializing in Python, AWS serverless, and data workflows
“Backend/data engineer from ALDI Tech Hub who modernized legacy analytics (Excel/SAS) into production-grade Python services on AWS serverless (FastAPI on Lambda behind API Gateway with Step Functions). Strong in reliability and operations (Cognito auth, retries/timeouts, structured logging, CloudWatch alarms) and data pipelines (Glue ETL with schema evolution); delivered measurable SQL tuning gains (30s to 2s, 70% CPU reduction).”
Mid-level AI/ML Engineer specializing in GenAI and financial risk & compliance analytics
“Built and deployed a production LLM-powered financial risk and compliance platform to reduce manual trade exception handling and speed up insights from regulatory documents. Implemented a LangChain multi-agent workflow with structured/unstructured data integration (Redshift + vector DB) and emphasized hallucination reduction for regulatory safety using Amazon Bedrock. Strong MLOps/orchestration background across Kubernetes, Airflow, Jenkins, and monitoring/testing with MLflow, Evidently AI, and PyTest.”
Mid-level Data Analyst specializing in business intelligence and cloud data platforms
“Healthcare analytics professional with TCS/Humana experience turning messy claims and eligibility data into reliable reporting assets using SQL and Python. They combine strong data engineering and analytics execution with stakeholder management, including cutting monthly claims reporting from half a day to under 5 minutes through automation and driving a provider outreach effort that reduced claim rejection rates by about 20%.”
“Senior AI/ML engineer focused on production ML, LLMs, and MLOps, with concrete experience shipping fraud detection and enterprise RAG systems. They combine strong deployment and monitoring discipline with measurable business impact, including 31% precision improvement in fraud detection and 37% better answer relevance in a financial-document QA system.”
Mid-level Full-Stack Software Engineer specializing in cloud and data engineering
“Backend engineer with experience at Cigna evolving REST API services backed by PostgreSQL, emphasizing reliability/correctness, scalability, and observability. Has hands-on production experience with FastAPI (contract-first design, Pydantic schemas), performance tuning (indexes, caching), and secure auth patterns (OAuth/JWT, RBAC, row-level security via Supabase), plus low-risk incremental rollouts using feature flags and dual writes.”
Senior Full-Stack Software Engineer specializing in modern web apps and cloud platforms
“Backend/data engineer focused on production-grade Python microservices and AWS platforms, including a hybrid Lambda + ECS Fargate architecture managed with Terraform and CI/CD. Has hands-on reliability experience (JWT/OAuth, timeouts, retries, centralized error classification) and built AWS Glue/PySpark ETL pipelines consolidating PostgreSQL/RDS, MongoDB, and S3 sources into curated partitioned Parquet datasets. Demonstrated measurable SQL tuning impact (8 minutes to 25 seconds) and disciplined legacy-to-modern migrations with parity validation and UAT sign-off.”
Mid-level AI/ML Engineer specializing in healthcare NLP and MLOps
“ML/AI engineer with healthcare payer experience (Signal Healthcare, Cigna) who has shipped production fraud/claims prediction systems using Python/TensorFlow and exposed them via FastAPI/Flask microservices integrated with EHR and Salesforce. Emphasizes operational reliability and trust: Airflow-orchestrated pipelines with data quality gates plus SHAP-based interpretability, A/B testing, and drift/debug workflows, backed by reported outcomes of 22% lower false payouts and 17% higher model accuracy.”
Mid-level Data Scientist specializing in predictive modeling, NLP/LLMs, and RAG search systems
“Built production LLM/RAG platforms for financial services to enable natural-language Q&A over large policy/compliance document sets stored in Snowflake and SharePoint. Strong in MLOps and orchestration (Airflow, ADF, Step Functions, MLflow) and in solving real production issues such as stale embeddings and model performance regressions, including an incremental Snowflake Streams sync that cut processing time from hours to minutes.”