Pre-screened and vetted.
Mid-level Business Intelligence Analyst specializing in SAP and healthcare reporting
“Analytics professional with hands-on experience turning messy SAP enterprise data into trusted reporting layers and building end-to-end Python/Tableau analytics products. Stands out for combining technical rigor with business alignment—improving report accuracy by 30%, cutting refresh times by 25%, and independently delivering a CLV segmentation project across 96,000 customers that informed retention strategy.”
Mid-level Data Engineer specializing in cloud data platforms and AI/ML analytics
“Backend/data engineer in healthcare who built an AWS-based clinical analytics platform from scratch (DynamoDB/S3/Airflow/dbt) with sub-second clinician query goals, 99.9% uptime, and HIPAA-grade controls (KMS encryption, IAM RBAC, audit trails). Also modernized ML delivery by replacing a manual 4-hour deployment with a 30-minute Docker/GitHub Actions CI/CD pipeline using parallel runs, parity testing, and rollback, and caught critical EHR data edge cases (date formats/timezones) that could have impacted patient care.”
Senior Python Developer specializing in FastAPI, Django, and cloud-native web applications
“Backend engineer working on Plumas Bank’s digital modernization, building a FastAPI-based loan origination/processing system with OAuth2/JWT security, AWS Lambda-driven PDF document generation to S3, and MongoDB integration. Has led a legacy workflow migration to a new microservice using dual-write/dual-read and monitoring, and emphasizes multi-tenant isolation via layered API controls plus row-level security.”
Junior AI/ML Engineer specializing in LLM agents and RAG systems
“Backend/data engineer who built a production-ready multi-agent financial intelligence system (Mycroft) that orchestrates specialized AI agents to analyze real-time market data using FastAPI and Pinecone vector search. Brings strong security/reliability instincts (rate limiting, JWT/OAuth2, retries/backoff, health checks) and has caught high-impact data integrity issues in financial migrations (timezone normalization across global legacy systems).”
Mid-level AI & Machine Learning Engineer specializing in Generative AI and MLOps
“Built a production GPT-4/LangChain/Pinecone RAG “AI Copilot” at Northern Trust to automate financial report generation and analyst Q&A over internal structured (SQL warehouse) and unstructured policy data. Focused on real-world production challenges—grounding and latency—achieving major speed gains (seconds to milliseconds) via MiniLM embedding optimization and Redis caching, and implemented rigorous testing/evaluation with MLflow-backed metrics while aligning compliance and finance stakeholders for deployment.”
Entry-Level Software Engineer specializing in distributed systems and backend infrastructure
“Built and operated an end-to-end customer-facing “Record Platform” web product as both engineer and primary user, focusing on reliability and correctness in core flows like search and checkout. Implemented a TypeScript/React frontend with a multi-service backend and Kafka-based event-driven architecture, and created internal tooling to automate risky ops like Kubernetes TLS certificate rotation with k6 load/chaos testing (including HTTP/2 and HTTP/3 validation).”
Intern AI Researcher specializing in NLP, LLMs, and knowledge graphs
“Built and shipped “LabMate,” a production AI assistant specialized in laboratory hardware, using a weighted multi-source RAG pipeline with reranking and reasoning-focused query decomposition to handle complex user questions. Deployed on a local GPU cluster with vLLM and NVIDIA MPS (plus OCR/VLM components), and established evaluation using synthetic + public reasoning datasets while collaborating weekly with non-technical admins to align requirements and resource constraints.”
Senior Backend Engineer specializing in Python microservices and cloud-native systems
“Backend/data platform engineer who owned a FastAPI + Kafka microservice in Verizon’s billing pipeline, handling high-volume usage ingestion/validation/enrichment with strong observability and CI/CD on AWS EKS. Demonstrated measurable performance gains (latency down to ~120–150ms; Kafka throughput +30–40%; DB CPU -25%) and led an on-prem ETL-to-AWS migration using Terraform, parallel validation, and phased cutover with zero downtime.”
Mid-level QA Testing Analyst specializing in healthcare claims adjudication and PBM workflows
“QA automation engineer with strong Cypress/JavaScript experience in healthcare claims and eligibility systems, owning end-to-end regression suites that combine UI, API, and SQL/database validations. Known for catching subtle pricing/benefit calculation defects (copay/deductible/accumulator issues) before release, stabilizing flaky CI tests via API synchronization, and shaping requirements early to improve testability and reduce downstream rework.”
Mid-level Software Engineer specializing in Healthcare IT & HL7 FHIR interoperability
“Backend/platform engineer with Optum experience owning a production FHIR Member Access API aligned to CMS interoperability requirements. Built and scaled Spring Boot/HAPI FHIR microservices on AWS (Docker/Kubernetes) with zero-downtime CI/CD, and operated them with strong observability (Dynatrace, logs/metrics, alerting) and incident response. Also implemented a Kafka-based FHIR bulk data pipeline with schema versioning, idempotent processing, and reliable backfills/replays.”
Software Engineering Intern specializing in real-time analytics and distributed systems
“Built a production AI legal search platform that uses a retrieval-first, source-grounded LLM pipeline with confidence-based fallbacks and structured, traceable outputs to reduce hallucinations and improve trust. Also has experience at Discover Excellence building real-time analytics and identity stitching systems, emphasizing conservative data validation, idempotent processing, and fault-tolerant queue-based workflows.”
Mid-Level Full-Stack Software Engineer specializing in cloud-native data platforms and AI apps
“Software engineer who has owned customer-facing/internal platforms end-to-end, emphasizing fast iteration through small releases backed by monitoring and rollback safety. Built SurveyAI with reusable React/TypeScript components and a stateless Node.js REST backend with clear API contracts/validation, and created an internal Airflow + AWS Lambda automation tool integrated with Slack alerts to reduce manual work and improve response time.”
Entry Mechatronics Engineering student specializing in robotics and embedded control
“Hands-on builder with projects spanning IoT/backend data systems and robotics: created a flow meter pipeline that sends readings over WiFi to a database and web server with real-time averaging and easy data download. Also diagnosed a hard-to-catch AGV/AMR reliability issue during long autonomous runs (battery voltage drop causing sensor failures) and mitigated it with voltage monitoring and a buffer circuit; familiar with Python/FastAPI API best practices (Pydantic, async, OAuth2/JWT).”
Senior Full-Stack Software Engineer specializing in cloud-native systems and AI/ML
“Backend engineer who significantly evolved an internal Resource Manager platform, moving from a monolith to microservices and improving onboarding speed while reducing integration errors. Has hands-on experience building reliable and secure Python/FastAPI APIs (Pydantic schemas, circuit breakers, caching, metrics/alerts) and leading zero-downtime migrations with strong data integrity patterns (dual writes, idempotency, reconciliation checks).”
Mid-level Full-Stack Developer specializing in cloud data engineering and analytics
“Software developer with hands-on experience owning customer-facing work end-to-end (requirements, implementation, testing, and feedback-driven iteration) using Python and React.js. Also remodeled an internal legacy page/tool to improve performance and accuracy, with exposure to microservices, RabbitMQ, and ETL-based system work.”
Mid-level AI & Data Scientist specializing in LLMs, RAG, and healthcare NLP
“Built a production LLM/RAG solution for healthcare operations teams to query large policy and care-guideline repositories in natural language. Improved domain alignment using vector retrieval plus parameter-efficient fine-tuning and prompt optimization, validated through internal user testing and metrics, cutting manual lookup time by ~40%. Also has hands-on experience orchestrating automated ML pipelines with Apache Airflow.”
Intern Data Engineer specializing in Snowflake pipelines and AI/ML analytics
“Built and operated an end-to-end TypeScript/Node AI agent platform for high-volume financial data that generates explainable investment signals and automates execution via resilient Playwright browser automation. Uses Postgres + pgvector/Prisma for RAG retrieval, Redis for async orchestration, Zod-based boundary validation as a circuit breaker, and OpenTelemetry for tracing/latency monitoring; also designed a TypeScript SDK with semver, scoped bearer-token auth, CLI key rotation, and interactive Swagger docs.”
Mid-level Implementation Engineer specializing in enterprise integrations and IAM/PAM
“Data/ML engineer with end-to-end ownership of donor-data deployments for a university foundation, delivering major performance and data-quality gains across 500K+ records (processing time cut from 24h to 6h; duplicate rate reduced from 5% to 1%). Has put an LLM-assisted enrichment workflow into production with retrieval-grounded business rules, versioned outputs for traceability, and strong operational rigor around validation, logging, and CI/CD.”
Junior Data Analyst specializing in BI, ETL, and reporting
“Analytics professional with hands-on experience building SQL and Python workflows across SAP, Oracle, and internal operational systems, processing roughly 5 million records per month. Combines strong data-quality rigor with stakeholder-friendly Power BI reporting, and cites a concrete impact of cutting reporting turnaround time from four days to two while surfacing cost anomalies for business teams.”
Mid-level Business Analyst specializing in analytics, operations, and supply chain
“Analytics candidate with hands-on experience improving enterprise reporting and operational decision-making at Reliance and Wendy’s. Combines SQL optimization, Python automation, sentiment analysis, and dashboarding to deliver measurable impact, including cutting report runtimes from 3 minutes to 1 minute, improving model accuracy from 70% to 80%, and reducing supplier past dues by 30%.”
Senior Backend Developer specializing in Python and AWS cloud-native systems
“Backend/data engineer with production experience building Python FastAPI services and AWS-native data pipelines. Has delivered containerized and serverless workloads (ECS/EKS/Lambda) with Terraform-based IaC, strong reliability patterns (JWT/RBAC, retries/circuit breakers, observability), and AWS Glue ETL into S3/Redshift. Demonstrated measurable SQL performance wins (40–50s to <4s) and owned real pipeline incidents through detection, mitigation, and prevention.”
Mid-level Data Engineer specializing in Python APIs and Azure data pipelines
“Backend/data engineer who builds Python (FastAPI) data-processing API services for internal analytics/reporting, emphasizing modular architecture, async performance tuning, and reliability patterns (health checks, retries, observability). Also migrated legacy on-prem ETL pipelines to Azure using ADF/Data Lake/Functions and implemented a near-real-time ingestion flow with Event Hubs plus watermarking to handle late events and deduplication.”
Senior QA Automation Engineer specializing in test automation and CI/CD quality gates
“QA automation engineer focused on end-to-end quality for a CMS lien registration workflow, owning a Playwright-based regression suite covering high-risk paths (creation, amendments, cancellation, batch file validation). Demonstrated impact by catching a UI change that bypassed required-field validation pre-release, stabilizing flaky CI tests using network-response signals, and driving clearer acceptance criteria and observability improvements (request IDs in logs) through cross-functional collaboration.”