Pre-screened and vetted.
Junior Data Engineer / Analyst specializing in AI/ML data infrastructure
“Built and deployed a compliance-sensitive LLM pipeline that extracts rebate logic from hospital–supplier medical contracts, using multi-layer redaction (regex/NER/dictionary), schema-validated structured outputs, and secure placeholder reinsertion. Hosted models on Amazon Bedrock to avoid retraining on sensitive data and improved both accuracy and cost by splitting the workflow into a lightweight section classifier plus a fine-tuned extraction model, orchestrated with LangChain and evaluated via layered, test-driven agent assessments.”
Principal Data Scientist & Software Engineer specializing in space mission data systems
“Space/heliophysics ML engineer who built a PyTorch GRU model to propagate solar wind from L1 to the magnetopause with probabilistic outputs for uncertainty quantification, achieving ~25% better CRPS than standard approaches. Also developed production-grade Python ETL and an open-source telemetry processing package for a mission (LEXI), using Docker and GitHub Actions CI/CD and iterating with scientist/engineer stakeholders.”
Mid-level Data Engineer specializing in scalable ETL, streaming analytics, and cloud data platforms
“At Dreamline AI, built and productionized an AWS-based incentive intelligence platform that uses Llama-2/GPT-4 to extract eligibility rules from unstructured state policy documents into structured JSON, then processes them with Glue/PySpark and serves results via Lambda/SageMaker/API Gateway. Designed state-specific ingestion connectors plus schema validation and automated checks/alerts to handle frequent policy/format changes without breaking the pipeline, and partnered with business/analytics stakeholders to deliver interpretable eligibility decisions via explanations and dashboards.”
Mid-level AI/ML Engineer specializing in NLP, Generative AI, and MLOps in Financial Services
“ML/LLM engineer at Charles Schwab who built a production loan-advisor chatbot integrated with internal knowledge and loan-calculator APIs, adding strict numeric validation to prevent rate hallucinations and optimizing context to control costs. Also runs ~40 Airflow DAGs orchestrating retraining/ETL/drift monitoring with an automated Snowflake→SageMaker→auto-deploy pipeline, and uses rigorous testing plus canary rollouts tied to business metrics and compliance constraints.”
Mid-level AI/ML Engineer specializing in GenAI agents, RAG pipelines, and MLOps
“AI/ML engineer who built a production RAG-based internal document intelligence assistant (LangChain + Pinecone) to let employees query enterprise reports in natural language. Demonstrated hands-on pipeline orchestration with Apache Airflow and tackled real production issues like retrieval grounding and latency using tuning, caching, and token optimization, while partnering closely with non-technical business stakeholders through iterative demos.”
Junior QA Automation Engineer specializing in banking and trading platforms
“QA automation engineer with Barclays digital banking experience who owned an end-to-end regression suite across UI, API, and database layers (Selenium/TestNG, REST Assured, SQL) and integrated it into CI/CD (Jenkins/GitLab). Known for preventing high-impact financial defects like duplicate transaction postings by adding backend SQL validations, negative/edge-case coverage, and converting production issues into automated regression tests; also strong in Cypress flake reduction using cy.intercept/cy.session and stable selectors.”
Senior Data Engineer specializing in cloud-native data platforms for finance and healthcare
“Data engineer/backend data services practitioner with Bank of America experience building real-time and batch transaction-monitoring pipelines and APIs (Kafka-based streaming backed by relational and NoSQL stores, exposed via REST/GraphQL). Highlights include a reported 45% response-time improvement through performance optimizations, use of Delta Lake schema evolution, CI/CD (GitHub Actions/Jenkins), and operational reliability patterns like CloudWatch monitoring and dead-letter queues.”
Junior Machine Learning Engineer specializing in LLM evaluation and GenAI pipelines
“LLM/agent engineer who built a production LangGraph multi-agent orchestrator connecting GitHub and APM/observability signals with a chain-of-verification loop for root-cause analysis. Emphasizes pragmatic architecture (start simple with state summaries), performance tuning (async LLM calls, Docker), and rigorous evaluation (LLM-as-judge, adversarial testing, hallucination/instruction adherence metrics, tool-call tracing) while iterating with non-technical stakeholders via A/B testing.”
Mid-level Data Scientist specializing in ML, MLOps, and customer analytics
“ML/NLP practitioner focused on insurance/claims analytics for a large financial firm, working with millions of fragmented structured and unstructured records. Built production-grade pipelines for entity extraction, entity resolution, and semantic search using Sentence-BERT + vector DB, including fine-tuning with contrastive learning (reported ~15% recall lift) and scalable ETL/containerized deployment on Kubernetes.”
Mid-Level Software Developer specializing in Java, Cloud, and Microservices
“Backend/Python engineer who owned an end-to-end FastAPI + AWS internal natural-language document Q&A system (Textract extraction, embeddings/vector DB, LLM integration) with a strong focus on reliability and latency. Hands-on with Kubernetes + GitOps (Argo CD, Helm, rolling updates/auto-rollback), and built and optimized Kafka streaming pipelines instrumented with Prometheus/Grafana. Also supported a zero-downtime on-prem to cloud migration with parallel run and gradual traffic cutover.”
Senior Data Engineer specializing in cloud data platforms and big data pipelines
“Data engineer focused on building reliable, production-grade pipelines and external data collection systems on AWS (S3/Lambda/SQS/Glue/EMR) using PySpark/SQL, serving curated datasets to Snowflake/Redshift for finance and fraud teams. Has operated a large-scale crawler ingesting millions of records/day with anti-bot tactics, schema versioning/quarantine, and CloudWatch/Datadog monitoring, and also shipped a versioned REST API with caching and query optimization.”
Intern Full-Stack/Software Engineer specializing in web apps, cloud, and data/ML systems
“Built and productionized LLM-driven content intelligence/SEO agents for a high-traffic media platform, automating tagging/summarization/metadata with FastAPI + async orchestration and strict JSON-schema outputs. Demonstrated measurable impact (40% faster publishing, +20% organic traffic in 3 months) and strong reliability practices (offline evals, shadow mode, canaries, fallbacks, idempotency, and monitoring).”
Mid-level Software Engineer specializing in LLM agents and ERP-integrated workflow automation
“Built and shipped a production LLM-powered agent that automated purchasing and inventory operations by integrating with live ERP data and returning structured, machine-readable outputs usable by downstream systems. Emphasizes real-world reliability through orchestration, strict schemas/validation, confidence-based fallbacks with human handoff, and monitoring/evaluation feedback loops to reduce silent failures and make issues observable.”
Mid-level Data Engineer specializing in cloud ETL/ELT and big data pipelines
“Data engineer focused on production-grade pipelines and data services: ingests millions of records/day into S3, performs SQL/Python quality validation and PySpark/SQL transformations, and serves curated datasets via Athena/Redshift. Has experience hardening external data collection with retries/rate-limit handling and shipping versioned internal data APIs with backward compatibility, monitoring, and CI/CD in early-stage environments.”
Mid-level ML Data Engineer specializing in MLOps and scalable healthcare data pipelines
“Data/ML platform engineer with healthcare (Cigna) experience owning an end-to-end pipeline spanning Airflow + Debezium CDC ingestion, PySpark/SQL transformations, rigorous data quality gates, and feature-store/API serving for ML training and inference. Worked at 10+ TB scale and cites a ~30% latency reduction plus stronger reliability via idempotent design, monitoring, and backfill-safe reprocessing; also built pragmatic early-stage data pipelines at Frankenbuild Ventures.”
Senior Backend Software Engineer specializing in microservices, Kafka, and cloud-native AWS platforms
“LLM/agent engineer with production experience in the insurance claims domain, integrating OpenAI + LangChain into a claims platform to automate unstructured document extraction/classification and cut manual effort by 35%. Built reliable, fault-tolerant AWS/Kubernetes microservices with CloudWatch monitoring plus circuit breakers/retries/fallbacks, and implemented multi-step Spring Boot orchestration with schema validation, confidence gating, and human-in-the-loop handling for low-confidence cases.”
Mid-level Data Engineer specializing in cloud lakehouse, streaming, and MLOps
“Data engineer at AT&T focused on large-scale telecom (5G/IoT) data platforms, owning end-to-end pipelines from Kafka/Azure ingestion through Databricks/Delta Lake transformations to serving analytics and ML. Has operated at very high volumes (50+ TB/day) and delivered measurable performance gains (25–30% faster processing) plus improved reliability via Airflow monitoring, robust data quality checks, and resilient external data collection patterns (rate limiting, retries, dynamic schemas).”
Mid-level Data Engineer specializing in cloud data platforms and streaming pipelines
“Data engineer currently at American Airlines who built and owned end-to-end flight operations and booking data pipelines (batch + real-time) using Azure Data Factory, Kafka, Spark/Databricks, Synapse, and Snowflake—processing hundreds of GBs/day. Strong focus on reliability and data quality (idempotency, checkpointing, retries, validation/alerts) and delivered near-real-time analytics powering Power BI dashboards; previously helped stand up an early-stage data platform at Sysco on AWS (Glue/S3/Redshift) with Airflow and Jenkins CI/CD.”
Junior Business & Operations Analyst specializing in banking analytics
“Analytics professional with Bank of America experience analyzing the digital card replacement journey at scale, transforming 20M event records across 900K clients into actionable reporting and recommendations. Stands out for combining SQL/Python data engineering, customer journey analytics, and executive-facing storytelling that persuaded senior leadership to adopt most of the proposed app improvements.”
Mid-level Data Analyst specializing in business intelligence and customer analytics
“Healthcare-focused data analyst with hands-on experience at Molina Healthcare building SQL and Python workflows for retention and churn analytics. Combined enrollment, CRM, and claims data into Power BI reporting, automated predictive churn analysis, and tied the work to measurable outcomes including faster processing, better reporting accuracy, and reduced churn.”
Mid-level NetSuite Consultant specializing in ERP implementations and integrations
“Enterprise NetSuite Financials professional with hands-on ownership of GL configuration, month-end close, reporting, and production support in a multi-subsidiary environment at Bank of America. Particularly strong in stabilizing finance operations under deadline pressure, improving close efficiency, and building audit-ready reporting and controls across NetSuite and integrated systems.”
Intern Software Engineer specializing in full-stack development and AI/ML
“Built and maintains an AI Finance Tracker end-to-end as a solo full-stack product owner, from Figma designs and React frontend to Flask APIs, Firestore, auth, deployment, and AI insights. Stands out for combining product instinct with pragmatic engineering decisions like pre-aggregating financial data to control LLM costs and adding OCR receipt scanning based on real user feedback.”
Mid-level Salesforce Developer specializing in CRM automation and integrations
“Salesforce-focused engineer with hands-on depth across Service Cloud, OmniScript/DataRaptor, LWC, Aura, and Apex. Particularly strong in building metadata-driven workflows that let operations teams change intake flows without developer tickets, while still making disciplined architecture decisions around when custom code is actually warranted.”
Mid-level Software Engineer specializing in FinTech trading platforms
“Built and deployed internal trading tools at Wells Fargo that reduced manual production-support dependency for trader configuration workflows. Brings hands-on experience in financial systems, data quality, and production incident resolution, including building 400+ SQL validation rules and designing an internal RAG assistant for engineering documentation.”