Pre-screened and vetted.
Mid-level Full-Stack Software Engineer specializing in FinTech and cloud platforms
“Software engineer who built and launched an end-to-end Ad Scheduler that automated campaign creation across Google Ads and Meta using Azure Functions/Service Bus, PostgreSQL, and a React frontend—reducing manual marketing ops work. Also shipped a production internal RAG chatbot leveraging a data warehouse + Cube semantic layer, Gemini embeddings, vector search, and Claude, with Langfuse tracing and brand-based access controls; work was cut short due to layoffs.”
Director-level Data Science & Analytics Leader specializing in cloud data platforms and AI/ML
“Candidate states they are very familiar with the venture capital/studio/accelerator landscape and expresses strong willingness to pursue entrepreneurship "at all costs," but did not provide details on a current startup, business plan, fundraising, or prior accelerator/VC involvement during the interview.”
Mid-level Data Engineer specializing in cloud data pipelines for healthcare and financial services
“Data engineer with ~4 years of experience (Cigna) building and operating Azure Data Factory pipelines for healthcare claims/member/provider data at 2–3M records/day. Emphasizes reliability and downstream safety via schema/data-quality validation, quarantine workflows, idempotent processing, and backfills; also improved runtime ~20% through SQL optimization and served curated datasets through versioned views and well-documented, analyst-friendly interfaces.”
Mid-level Data Engineer specializing in multi-cloud data platforms for healthcare and finance
“Data engineer with Cigna experience building and operating an end-to-end AWS-based healthcare claims pipeline processing ~2TB/day, using Glue/Kafka/PySpark/SQL into Redshift. Strong focus on data quality and reliability (schema validation, monitoring/alerting, retries/checkpointing/backfills), reporting improved accuracy (~99%) and reduced latency, plus experience serving real-time Kafka/Spark data to downstream analytics with documented data contracts.”
Mid-level Data Engineer specializing in cloud ETL and real-time streaming
“Data engineer focused on AWS + Spark/Databricks pipelines, including an end-to-end nightly loan-data ingestion flow (~2.2M records) from Postgres/S3 through Glue and Databricks into a DWH with layered validation and alerting. Also built real-time streaming with Kafka + Spark Structured Streaming and a master’s project streaming Reddit data for sentiment analysis under ambiguous requirements and tight budget constraints.”
Mid-level Data Analyst specializing in financial services and fraud analytics
“Analytics candidate currently at Facteus with hands-on experience turning messy transactional data into trusted reporting layers in Snowflake and Power BI. They combine SQL and Python automation with strong validation, performance tuning, and stakeholder-facing metric design, including cohort-based retention and segmentation work that improved trust and adoption of analytics.”
Executive IT and Operations leader specializing in digital transformation and security
“Candidate is very familiar with the venture capital and broader investment landscape, but is not interested in founding a company. They have worked with several TPG-backed or TPG-owned organizations, helping drive business scaling, cost reduction, and execution against investor governance requirements.”
Mid-level AI/ML Engineer specializing in cloud data engineering and GenAI
“AI/LLM engineer with production experience in legal tech: built a GPT-4 + LangChain RAG summarization system at Govpanel that reduced legal case-file review time by 50%+. Previously at LexisNexis, orchestrated end-to-end Airflow data/AI pipelines processing 5M+ legal documents daily, improving ETL runtime by 35% with robust validation, monitoring, and SLAs.”
Mid-level Supply Chain Analyst specializing in strategic sourcing and supply chain analytics
“Sourcing/procurement professional focused on electronic components, leading end-to-end RFQ and supplier selection through negotiation and delivery. Demonstrates strong data-driven cost management (Power BI modeling, benchmark pricing) and measurable results including ~18% cost reduction, avoiding production delays during shortages, and automating RFQ comparisons to cut cycle time ~40%.”
Mid-level Data Analyst specializing in financial operations and reconciliation reporting
“Analytics professional with Northern Trust experience focused on investment portfolio reconciliation and reporting. They combine SQL, Python, and Power BI to clean and validate high-volume financial data, automate manual processes, and align operations and accounting teams on shared metrics—driving roughly 20% improvement in reconciliation accuracy.”
Mid-level Data Analyst specializing in BI, analytics, and healthcare data
“Analytics professional at Optum with hands-on experience turning messy healthcare claims data from SQL, Excel, and CRM systems into validated reporting datasets and Power BI dashboards. They also built reproducible Python workflows for claims analysis and owned an end-to-end project focused on improving claims processing efficiency through metric design, segmentation, and stakeholder-driven operational improvements.”
Mid-level AI/ML Engineer specializing in Generative AI and data engineering
“IBM engineer who built and deployed a production RAG-based LLM assistant using LangChain/FAISS with a fine-tuned LLaMA model, served via FastAPI microservices on Kubernetes, achieving 99%+ uptime. Demonstrates strong practical expertise in reducing hallucinations (semantic chunking + metadata-driven retrieval) and managing latency, plus mature MLOps practices (Airflow/dbt pipelines, MLflow tracking, monitoring, A/B and shadow deployments) and effective collaboration with non-technical stakeholders.”
Senior Solutions Engineer specializing in blockchain governance and compliance analytics
“Consulting background (Accenture) delivering technically complex solutions involving on-chain data and strict government security standards, including building isolated sandbox environments to move from PoC to production. Experienced in debugging agentic/LLM-style workflows (e.g., document scanning issues) with deterministic guardrails, preprocessing, and strong logging/monitoring. Has led large-scale crypto wallet workshops (including for the CFTC) and helped win business via clear, layered technical demos; also built internal marketing taxonomy tooling and drove adoption through cross-functional alignment.”
Mid-level Data Engineer specializing in cloud data platforms and scalable ETL pipelines
“Data engineer (~4 years) with full-stack delivery experience (Next.js App Router/TypeScript + React) building a real-time operations monitoring dashboard backed by Kafka and orchestrated data pipelines. Strong production focus: Airflow + CloudWatch monitoring, automated Python/SQL validation (99.5% accuracy), and CI/CD with Jenkins/Docker; has delivered measurable improvements in latency, pipeline reliability, and query performance (Postgres/Redshift).”
Mid-level Data Engineer specializing in cloud ETL/ELT and lakehouse architecture
“Data engineer focused on sales/marketing analytics pipelines, owning ingestion from CRMs/ad platforms through warehouse serving and dashboards at hundreds of thousands of records/day. Built reliability-focused systems including dbt/SQL/Python data quality gates with alerting, a resilient web-scraping pipeline (retries/backoff, anti-bot tactics, schema-change detection, backfills), and a versioned internal REST API with caching and strong developer usability.”
Mid-level Data Engineer specializing in real-time streaming and cloud data platforms
“Data engineer with Wells Fargo experience owning an end-to-end lakehouse ETL pipeline on Databricks/Azure Data Factory, processing ~480GB daily and implementing robust data quality/reconciliation across 40+ tables to reach ~99.3% reliability. Strong in performance optimization (cut runtime 5.5h→3.8h), CI/CD and monitoring, and resilient external/API ingestion with retries, schema validation, and backfills.”
Senior Full-Stack Software Engineer specializing in .NET, Python, and cloud-native systems
“Full-stack engineer who owned an end-to-end production feature for a Piraeus Bank stock exchange module, spanning React/TypeScript, backend services, and cloud operations with Docker + CI/CD, delivering a reported 90% improvement in API response times and improved uptime. Also built a Smartwound research MVP on AWS, building a Python image-processing/scoring pipeline and shipping despite unclear image-analysis specs.”
Mid-level Full-Stack & Data Engineer specializing in AWS cloud and real-time streaming
“Backend engineer with experience at Cigna evolving REST API services backed by PostgreSQL, emphasizing reliability/correctness, scalability, and observability. Has hands-on production experience with FastAPI (contract-first design, Pydantic schemas), performance tuning (indexes, caching), and secure auth patterns (OAuth/JWT, RBAC, row-level security via Supabase), plus low-risk incremental rollouts using feature flags and dual writes.”
Mid-level Software Engineer specializing in backend APIs and distributed systems
“JavaScript engineer with Walmart experience contributing to the Yup validation library—reproduced a nested-object validation bug, fixed merge logic, and added test coverage. Strong in systematic debugging/performance isolation (DevTools + timing logs), plus end-to-end ownership including documentation, monitoring, and issue triage.”
Staff RPA & Automation Engineer specializing in Financial Services
“Blue Prism RPA developer in a small FinTech-aligned team who owned ~20 production bots and drove both delivery and reliability. Built a shared VDI/locking design that cut infrastructure cost ~20–30% and routinely handled ServiceNow-driven production incidents end-to-end, including hotfixes and longer-term SDLC fixes. Also acted as a player-coach, training junior hires and maintaining high bot success rates (up to 99% within SLA).”
Senior Data Engineer specializing in Spark, Kafka, and Databricks Lakehouse platforms
“Data engineer at Fidelity who built and operated a real-time financial transactions lakehouse on AWS/Databricks, processing millions of records daily with Kafka streaming. Demonstrated strong reliability and data quality practices (watermarking, idempotent Delta writes, validation/reconciliation, observability) and delivered measurable improvements (~30% faster jobs and ~30% fewer data issues) while enabling trusted gold-layer analytics for downstream teams.”
Mid-level Data Analyst specializing in financial and healthcare analytics
“Analytics professional with experience at JPMorgan and Deloitte, focused on financial and risk data. They stand out for building scalable SQL/Python data pipelines, KPI and forecasting dashboards, and retention/cohort metrics that improved reporting reliability, forecast accuracy, and planning speed.”
Mid-level Data Engineer specializing in AWS cloud data platforms
“Data engineer with Charter Communications experience modernizing large-scale AWS data lake pipelines: ingesting S3 data, validating against legacy systems, transforming with PySpark/Spark SQL, and serving via Iceberg/Delta tables. Worked at 50M–300M record scale, delivered >99.5% data match, and built monitoring/alerting (CloudWatch/SNS) plus retry orchestration (Step Functions) and data quality gates (Great Expectations).”
Junior AI/ML Engineer and Instructor specializing in deep learning, computer vision, and NLP
“Computer-vision practitioner and educator who built a real-time license plate recognition system (OpenCV/Python + KNN) optimized to run on a Raspberry Pi with camera integration. Also designs hands-on deep learning coursework, incorporating recent transformer-based vision research (Vision Transformers) into practical labs on real datasets.”