Pre-screened and vetted.
Director-level Performance Marketing leader specializing in growth strategy and behavioral science
Director of Enterprise Analytics specializing in AI/ML for healthcare and insurance
Staff AI & Data Engineer specializing in LLM systems and real-time data platforms
Junior Data Scientist & Data Engineer specializing in ML and scalable data pipelines
Director-level Business Operations & GTM Strategy leader specializing in analytics and performance
“ZS consultant/product owner who repeatedly turns vague GTM performance goals into decision-centric analytics products and operating rhythms. Has scaled analytics initiatives from pilots into $1M+ platforms adopted by 100+ leaders, and tied measurement tools to material business outcomes (e.g., $10M+ incremental revenue impact) through change management and cross-functional alignment.”
Mid-level Software Engineer specializing in cloud data platforms and distributed systems
“Backend/data engineer with production experience building FastAPI services with strong reliability patterns (circuit breakers, rate limiting, caching, graceful degradation) and JWT/OAuth2 auth. Has delivered AWS EKS deployments via Terraform with Secrets Manager/IRSA and HPA autoscaling, and built Glue/Spark ETL pipelines on S3 Parquet with schema evolution handling and idempotent reruns; also demonstrated measurable SQL tuning impact (queries cut from 20–30 s to under 10 s).”
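The circuit-breaker pattern this blurb names can be sketched in a few lines. This is a minimal illustration, not the candidate's actual implementation; the class name and thresholds are hypothetical, and a real FastAPI service would wrap outbound dependency calls with something like it (or a library such as a retry/breaker middleware).

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: opens after N consecutive failures,
    rejects calls while open, and allows a trial call after a cooldown."""

    def __init__(self, failure_threshold=3, reset_timeout=30.0):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: call rejected")
            # Cooldown elapsed: half-open, let one trial call through.
            self.opened_at = None
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold:
                self.opened_at = time.monotonic()  # trip the breaker
            raise
        self.failures = 0  # success resets the failure count
        return result
```

The point of the pattern is graceful degradation: once a downstream dependency is failing repeatedly, the breaker fails fast instead of queuing more doomed calls against it.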
Intern Software Engineer specializing in data engineering and LLM/RAG systems
“Built and productionized enterprise LLM/RAG systems, including a Boeing internal solution that gave 400+ program managers conversational access to 1M+ rows of schedule data, with strong emphasis on governance, reliability, and reducing hallucinations in tabular domains. Also has experience teaching developer-focused workshops (computer architecture at UC Berkeley) and partnering with customer-facing stakeholders to drive adoption of a compliance-sensitive, SEC-aligned NLP product at Penserra.”
Mid-level Backend & Reliability Engineer specializing in AWS, Kubernetes, and automation
“Meta engineer focused on reliability/operations tooling who built a unified real-time health dashboard and scalable telemetry pipelines (AWS + Datadog) for thousands of devices. Also shipped an internal LLM-powered knowledge assistant using RAG over wikis/runbooks/logs with strong guardrails and a rigorous eval loop that drove measurable accuracy improvements via automated doc ingestion and embedding updates.”
Director of Enterprise Architecture specializing in finance systems, data platforms, and AI
“Architect/engineering leader who built a multi-tenant AI platform end-to-end, including a secure FastAPI orchestrator (JWT, RBAC, tenant isolation, auditing) and an extensible MCP tool-routing layer, then productionized it via fully containerized microservices (Docker, Postgres/pgvector, Redis). Also has strong governance and compliance experience (ARB with security/privacy/SOX) and has owned high-severity incidents through mitigation and RCA/RCCA, plus prior high-volume payments/accounting data pipeline design with audit-grade integrity checks.”
Junior AI Engineer specializing in healthcare analytics and compliance AI
“Built and shipped a production LLM-driven multi-agent platform (ciATHENA) at CustomerInsights.AI to automate analytics/ML/compliance workflows in healthcare and life sciences. Implemented LangGraph/LangChain orchestration with strong backend-style rigor (schemas, Pydantic validation, retries, auditability) and optimized latency/cost while keeping the system usable for non-technical users via guided natural-language interactions and structured/visual outputs.”
Senior Full-Stack Software Engineer specializing in workflow automation and healthcare AI
“Backend/data engineer who has owned production Python APIs and high-throughput async workflows on AWS (FastAPI, Docker, ECS/EKS/Lambda) with mature reliability practices like idempotency, bounded retries, circuit breakers, and strong observability. Also built AWS Glue ETL into an S3/Redshift lakehouse and modernized legacy batch systems via parallel-run parity testing and feature-flagged migrations, including a SQL tuning win cutting a multi-minute query to under 10 seconds.”
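The idempotency practice this blurb cites pairs naturally with bounded retries: duplicate deliveries return the cached result instead of repeating the side effect, and transient failures are retried a fixed number of times rather than forever. A minimal sketch under assumed names (the in-memory `_processed` store and `process_once` helper are hypothetical; production systems use a durable store such as DynamoDB or Redis):

```python
# Hypothetical in-memory record of already-processed request keys.
_processed = {}

def process_once(idempotency_key, payload, max_attempts=3):
    """Apply the side effect at most once per key, retrying transient
    failures up to a bound instead of indefinitely."""
    if idempotency_key in _processed:
        # Duplicate delivery: return the cached result, no second side effect.
        return _processed[idempotency_key]
    last_err = None
    for attempt in range(max_attempts):
        try:
            result = f"handled:{payload}"   # stand-in for the real side effect
            _processed[idempotency_key] = result
            return result
        except Exception as err:            # bounded retry on transient errors
            last_err = err
    raise RuntimeError(f"gave up after {max_attempts} attempts") from last_err
```

In an async AWS workflow (SQS, Lambda), this is what makes at-least-once delivery safe: replays of the same message hit the key check and become no-ops.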
Director-level Data Engineering & MDM leader specializing in enterprise data platforms
“Former Xendit (YC-backed fintech) operator applying outcome-driven, scalable product-building experience to a new startup: a real-time, personalized real-estate investment intelligence platform. Has begun prototyping, completed an ROI analysis, and validated the space with US market research and broker interviews in Chicago and Los Angeles, aiming to differentiate from Zillow/Redfin with goal-based, ranked investment recommendations.”
VP of Regulatory Reporting specializing in FR Y-14 and SEC filings
“Finance/real-estate professional currently at Citi supporting FRB exam work (transaction testing, data-quality root cause analysis) and coordinating cross-functional regulatory responses with executive communications. Previously at JPMorgan Global Real Estate, automated an Excel-based depreciation forecast for a commercial office portfolio using complex formulas to handle overlapping project-phase data.”
Senior Data Engineer specializing in cloud data platforms and large-scale ETL
Mid-level Software Engineer specializing in backend microservices, data pipelines, and QA
Mid-level Backend/API Software Engineer specializing in identity and data observability SaaS
Mid-level Software Engineer specializing in cloud-native microservices and FinTech
Mid-level Machine Learning Engineer specializing in LLM inference and MLOps
Director of Product specializing in AI-powered data platforms for connected vehicles and MLOps
Mid-Level Python Developer specializing in Django, data pipelines, and automation
Senior Backend/Platform Engineer specializing in Python and AWS
“Backend/data engineer with hands-on production experience across Python/FastAPI services and AWS (Lambda, API Gateway, SQS, ECS) delivered via Terraform and GitHub Actions. Built Glue-to-Redshift ETL pipelines with Step Functions retry/catch patterns, schema evolution safeguards, and data quality checks; also modernized a legacy SAS monthly reporting system into Python microservices with rigorous side-by-side parity validation. Demonstrated strong SQL tuning skills with a reported improvement from 5 minutes to 15 seconds.”
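The Step Functions retry/catch pattern this blurb mentions is declared in the state machine definition (Amazon States Language), not in pipeline code. A hedged sketch of what such a state might look like; the state name, job name, and downstream `NotifyFailure` state are hypothetical:

```json
{
  "LoadToRedshift": {
    "Type": "Task",
    "Resource": "arn:aws:states:::glue:startJobRun.sync",
    "Parameters": { "JobName": "nightly-etl" },
    "Retry": [
      {
        "ErrorEquals": ["Glue.AWSGlueException", "States.Timeout"],
        "IntervalSeconds": 30,
        "MaxAttempts": 3,
        "BackoffRate": 2.0
      }
    ],
    "Catch": [
      {
        "ErrorEquals": ["States.ALL"],
        "Next": "NotifyFailure"
      }
    ],
    "End": true
  }
}
```

The `Retry` block absorbs transient Glue failures with exponential backoff, while the `Catch` block routes anything unrecoverable to an alerting state instead of silently failing the run.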