Pre-screened and vetted.
“Built an AI-driven insurance policy summarization platform at Marsh, taking it end-to-end from messy PDF ingestion/OCR and custom extraction through LLM fine-tuning and AWS SageMaker deployment. Delivered measurable impact (25% reduction in manual review time, 99% uptime) and demonstrated strong production MLOps/LLMOps practices with Airflow/Step Functions orchestration, rigorous evaluation (ROUGE + human review), and continuous monitoring for drift, latency, and hallucinations.”
Mid-level Data Scientist specializing in ML, MLOps, and customer analytics
“ML/NLP practitioner focused on insurance/claims analytics for a large financial firm, working with millions of fragmented structured and unstructured records. Built production-grade pipelines for entity extraction, entity resolution, and semantic search using Sentence-BERT + vector DB, including fine-tuning with contrastive learning (reported ~15% recall lift) and scalable ETL/containerized deployment on Kubernetes.”
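The embed-and-rank retrieval pattern this candidate describes can be sketched as follows; the toy `embed` function (bag-of-words) and the sample claim records are stand-ins for a real Sentence-BERT encoder and vector DB, kept minimal to show only the ranking mechanics:

```python
import math
from collections import Counter

def embed(text: str) -> dict[str, float]:
    # Stand-in for a Sentence-BERT encoder: a bag-of-words count
    # vector, enough to demonstrate cosine-similarity retrieval.
    return Counter(text.lower().split())

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def semantic_search(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank all documents by similarity to the query embedding.
    qv = embed(query)
    scored = sorted(docs, key=lambda d: cosine(qv, embed(d)), reverse=True)
    return scored[:k]

claims = [
    "water damage claim for basement flooding",
    "auto collision claim rear bumper",
    "flood damage to basement drywall",
]
print(semantic_search("basement flood damage", claims, k=2))
```

In a production system the lexical `embed` would be replaced by a fine-tuned dense encoder, and the linear scan by an approximate-nearest-neighbor index in the vector DB.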
Entry-level AI/ML Engineer specializing in LLMs, RAG, and DevOps automation
“Built and owned a production-scale AI-driven software release/version intelligence platform orchestrated via GitHub Actions that tracks 1000+ upstream repositories and automatically generates SLA-bound JIRA upgrade tickets for hardened container images. Replaced brittle regex/PEP440 parsing with an LLM-based semantic filtering layer plus deterministic validation to handle noisy/inconsistent GitHub tags at scale, with monitoring for coverage, latency, and correctness validated against upstream ground truth.”
Mid-Level Full-Stack Software Engineer specializing in cloud microservices and real-time analytics
“Software engineer who built a reusable React component package (UI modules, auth helpers, API client wrappers) for an AI SaaS background-removal project, emphasizing performance (tree shaking/dynamic imports) and reliability (Jest + Storybook). Also delivered a unified REST API for Samsung Big Data Portal, resolving cross-team issues by standardizing schemas, improving validation/logging, and operating effectively amid shifting requirements.”
Mid-level Data Engineer specializing in cloud lakehouse, streaming, and MLOps
“Data engineer at AT&T focused on large-scale telecom (5G/IoT) data platforms, owning end-to-end pipelines from Kafka/Azure ingestion through Databricks/Delta Lake transformations to serving analytics and ML. Has operated at very high volumes (~50+ TB/day) and delivered measurable performance gains (25–30% faster processing) plus improved reliability via Airflow monitoring, robust data quality checks, and resilient external data collection patterns (rate limiting, retries, dynamic schemas).”
Junior Data Analyst specializing in analytics, BI, and machine learning
“Analytics-focused candidate with experience owning end-to-end data projects across AI transcription, retail forecasting, and transportation revenue analytics. Combines strong SQL/Python pipeline skills with dashboarding and stakeholder alignment, citing measurable impact including 60% lower ETL latency, 18% better forecast accuracy, and 25% operational efficiency gains.”
Senior Applications Engineer specializing in legal technology and eDiscovery
“Early-stage founder candidate exploring an AI-enabled legal tech startup focused on document intelligence, secure workflows, and enterprise automation. Brings a rare blend of technical architecture fluency and product/business thinking, with clear firsthand insight into legal and document-heavy operational pain points.”
Mid-level Machine Learning & GenAI Engineer specializing in LLMs, RAG, and NLP
“Built and deployed an LLM-powered customer support assistant (“Notable Assistant”) focused on automating common customer queries while maintaining multi-turn context and meeting scalability/latency needs. Experienced with production orchestration and operations using Kubernetes and Apache Airflow (DAG-based ETL, scheduling, monitoring/alerts), and has partnered closely with customer service stakeholders to align chatbot behavior with brand voice through iterative testing.”
Senior AI/ML & Data Engineer specializing in Generative AI and RAG systems
“GenAI/RAG engineer who has deployed a production policy/regulatory search assistant for a financial client using LangChain + Vertex AI, FastAPI, Docker/Kubernetes, and Airflow-orchestrated data pipelines. Demonstrated measurable impact with 50–60% latency reduction and 70% fewer pipeline failures, plus KPI-driven grounding evaluation (90%+ target) and strong cross-functional collaboration with compliance/business teams.”
Senior Engineering Manager specializing in Big Data and Cloud Data Platforms
“Engineering leader focused on developer platforms and open-source frameworks/SDKs, with strong community and release-engineering chops. Drove major reliability and DX improvements (30–50% faster release cycles; 2–3x repeat contributors; ~50% faster onboarding) and led an incremental migration of a Python monolith to a TypeScript event-driven architecture, using Protobuf contracts, feature flags, and a plugin architecture to preserve backward compatibility.”
Mid-Level Software Engineer specializing in backend systems and applied machine learning
Mid-level AI/ML Engineer specializing in risk modeling, NLP, and generative AI (RAG/LLMs)
Mid-level Data Analyst specializing in banking analytics and machine learning
Mid-level Data Engineer specializing in financial risk, compliance, and real-time streaming
Mid-level Full-Stack Software Engineer specializing in cloud microservices (FinTech & Healthcare)
Mid-level AI/ML Engineer specializing in NLP, recommender systems, and Generative AI
Mid-level AI/ML Engineer specializing in scalable ML, NLP, and time-series forecasting
Junior Full-Stack Software Engineer specializing in APIs, microservices, and AI applications
Mid-level Full-Stack Java Developer specializing in cloud-native microservices
Mid-level AI/ML Engineer specializing in LLMs, RAG systems, and MLOps
Mid-level Machine Learning Engineer specializing in NLP, Computer Vision, and LLMs
Mid-level Data Engineer specializing in cloud-native data pipelines and streaming