Pre-screened and vetted.
Mid-level Data Scientist specializing in Generative AI and NLP for financial risk
“Built and shipped production generative AI/RAG assistants in regulated financial contexts (S&P Global), automating compliance-oriented Q&A over earnings reports/filings with grounded answers and citations. Experienced across the full stack—AWS-based ingestion (PySpark/Glue), vector retrieval + LangChain agents, GPT-4/Claude model selection, and production reliability (monitoring, caching, retries) plus rigorous evaluation and regression testing.”
Mid-level Data Engineer specializing in cloud data platforms and real-time analytics
“Customer-facing data engineering professional who builds and deploys real-time reporting/dashboard solutions, gathering reporting and compliance requirements through direct stakeholder engagement. Experienced with Google Cloud IAM governance, secure integrations (encryption, audit logging), and fast production troubleshooting of ETL/pipeline failures with follow-on monitoring and automated recovery improvements; motivated by hands-on, travel-oriented customer work.”
Senior Business Analytics Analyst specializing in product and customer analytics
“Darwinbox team member who supported talent/recruiting operations while also driving product improvements across HR modules (recruitment, onboarding, payroll, performance). Led a small team (5–6 people) and implemented discovery-driven configuration and BI reporting (Power BI/Tableau/Confluence), including a reported 30% reduction in recruitment configuration issues and real-time funnel reporting to support fast hiring.”
Mid-level Supply Chain Analyst specializing in strategic sourcing and supply chain analytics
“Sourcing/procurement professional focused on electronic components, leading end-to-end RFQ and supplier selection through negotiation and delivery. Demonstrates strong data-driven cost management (Power BI modeling, benchmark pricing) and measurable results including ~18% cost reduction, avoiding production delays during shortages, and automating RFQ comparisons to cut cycle time ~40%.”
Mid-level Data Scientist specializing in Generative AI, NLP, and MLOps
“Built and deployed an LLM-powered claims-document summarization system (insurance domain) that cut agent review time from 4–5 minutes to under 2 minutes and saved 1,200+ hours per quarter. Hands-on across orchestration and production infrastructure (Airflow retraining DAGs, Kubernetes, SageMaker endpoints, FastAPI) and recent RAG workflows using n8n + Pinecone, with a strong focus on reliability, cost, and explainability for non-technical stakeholders.”
Mid-level Data Analyst/Data Engineer specializing in BI, ETL pipelines, and cloud analytics
“Data engineer focused on marketing/web analytics and external API pipelines, handling ~10M records/week. Built Azure-based ingestion and PySpark transformations with rigorous data quality checks, then served curated datasets into Synapse/Redshift for Power BI. Also designed an Airflow-orchestrated crypto REST API pipeline with monitoring, retries/exponential backoff, schema-change detection, and backfill-friendly reprocessing.”
Mid-level Data Analyst specializing in BI, analytics, and healthcare data
“Analytics professional at Optum with hands-on experience turning messy healthcare claims data from SQL databases, Excel, and CRM systems into validated reporting datasets and Power BI dashboards. They also built reproducible Python workflows for claims analysis and owned an end-to-end project focused on improving claims processing efficiency through metric design, segmentation, and stakeholder-driven operational improvements.”
Senior IT Business Systems Analyst specializing in UAT, project delivery, and regulated platforms
“Worked on a P&C insurance integration project at Accenture, using SQL to unify policy, billing, and claims data from APIs and ETL pipelines into clean reporting tables. Demonstrated hands-on experience with data quality validation, window-function-based transformations, and query performance tuning, helping business teams get a single reliable view for faster claims processing and management reporting.”
Senior Analytics and Business Intelligence professional specializing in e-commerce and digital analytics
“Analytics professional with hands-on experience unifying marketing-platform data through Fivetran and Snowflake, building reporting views, and catching source-to-report issues like timezone-driven spend discrepancies. They also owned subscription LTV/cohort analysis and engagement tracking initiatives, partnering with e-commerce, product, and senior leadership to turn behavioral and demographic data into dashboards, lead-qualification metrics, and lifecycle marketing insights.”
Mid-level Data Analyst specializing in financial risk and data automation
“Analytics professional from Capital One with strong experience automating risk, reconciliation, and regulatory reporting workflows in financial services. They combine deep SQL/Python pipeline skills with stakeholder-facing dashboard and KPI design, delivering measurable impact like 30% performance gains, sub-24-hour anomaly detection, and 100% data integrity for regulatory filings.”
Entry-level Software Engineer specializing in FinTech distributed systems
“Game developer with early-stage startup experience who worked directly with a CEO to integrate an AI-based API into The Elder Scrolls V: Skyrim, helping showcase the product and win Riot Games as a client. Currently owns multiple financial reporting ingestion workflows and has driven meaningful time savings through cross-functional execution, combining gaming/AI experience with operational impact in fintech.”
Mid-level Business Analyst specializing in supply chain and logistics
“Analytics professional with hands-on experience in supply chain and logistics transformation, including enterprise data preparation in SQL, Python automation, and Power BI reporting. They highlight ownership of end-to-end digitization work at Blue Dart, where they defined operational metrics, aligned cross-functional stakeholders, and delivered measurable gains in transparency, reporting efficiency, and implementation quality.”
Junior Machine Learning Engineer specializing in Generative AI and analytics automation
“AI/LLM engineer who built a production intelligent support system using RAG over a vectorized documentation library, addressing real-world issues like lost-in-the-middle context failures and doc freshness via automated GitHub-driven re-embedding pipelines. Emphasizes rigorous agent evaluation (component/E2E/ops) and prefers lightweight, decoupled workflow automation using message brokers (Redis/RabbitMQ) over heavyweight orchestration frameworks.”
Intern Data Scientist specializing in AI, analytics, and cloud data engineering
“Built a production multimodal LLM-based vendor risk assessment platform that ingests SOC reports and other documents, uses a strict RAG pipeline with grounded evidence (page/paragraph citations), and dramatically reduces analyst review time. Experienced with LangGraph/LangChain/AutoGen for stateful, fault-tolerant agent workflows, and emphasizes reliability (schema validation, guardrails) plus low-latency delivery (~1–2s) through hybrid retrieval, reranking, caching, and model tiering.”
Mid-level Data Engineer specializing in cloud data platforms and scalable ETL pipelines
“Data engineer (~4 years) with full-stack delivery experience (Next.js App Router/TypeScript + React) building a real-time operations monitoring dashboard backed by Kafka and orchestrated data pipelines. Strong production focus: Airflow + CloudWatch monitoring, automated Python/SQL validation (99.5% accuracy), and CI/CD with Jenkins/Docker; has delivered measurable improvements in latency, pipeline reliability, and query performance (Postgres/Redshift).”
Mid-level Data Engineer specializing in cloud ETL/ELT and lakehouse architecture
“Data engineer focused on sales/marketing analytics pipelines, owning ingestion from CRMs/ad platforms through warehouse serving and dashboards at ~hundreds of thousands of records/day. Built reliability-focused systems including dbt/SQL/Python data quality gates with alerting, a resilient web-scraping pipeline (retries/backoff, anti-bot tactics, schema-change detection, backfills), and a versioned internal REST API with caching and strong developer usability.”
Mid-level Data Engineer specializing in real-time streaming and cloud data platforms
“Data engineer with Wells Fargo experience owning an end-to-end lakehouse ETL pipeline on Databricks/Azure Data Factory, processing ~480GB daily and implementing robust data quality/reconciliation across 40+ tables to reach ~99.3% reliability. Strong in performance optimization (cut runtime 5.5h→3.8h), CI/CD and monitoring, and resilient external/API ingestion with retries, schema validation, and backfills.”
Mid-level Regulatory Affairs Manager specializing in cosmetics, OTC, and global compliance
“Sourcing/product professional with hands-on experience managing end-to-end contract manufacturing for consumer products (including a pressed eyeshadow palette) from vendor selection and GMP-focused audits through formula approval, pilot batches, claims testing, and launch. Demonstrates strong commercial negotiation (MOQs, tiered pricing, component cost-downs tied to service metrics) and pragmatic risk mitigation, including a COVID-era nutrition drink mix launch pivot to a flavor extension leveraging existing inventory.”
Junior Product Manager / APM specializing in data tools, CMS platforms, and AI-enabled products
“Data Software Tools Analyst at Q.ai who, through the company's rapid growth and a $2B Apple acquisition, led an internal CMS for participant/PII workflows using Next.js (App Router) + FastAPI/Postgres with strong security controls (JWT + Postgres RLS). Also drove a major frontend architecture shift toward React Server Components, reporting ~4x faster page loads, and has experience building durable realtime collaboration systems with Supabase/SvelteKit and server-centric state management.”
Senior Data Engineer specializing in Spark, Kafka, and Databricks Lakehouse platforms
“Data engineer at Fidelity who built and operated a real-time financial transactions lakehouse on AWS/Databricks, processing millions of records daily with Kafka streaming. Demonstrated strong reliability and data quality practices (watermarking, idempotent Delta writes, validation/reconciliation, observability) and delivered measurable improvements (~30% faster jobs and ~30% fewer data issues) while enabling trusted gold-layer analytics for downstream teams.”
Mid-level Data Analyst specializing in financial and healthcare analytics
“Analytics professional with experience at JPMorgan and Deloitte, focused on financial and risk data. They stand out for building scalable SQL/Python data pipelines, KPI and forecasting dashboards, and retention/cohort metrics that improved reporting reliability, forecast accuracy, and planning speed.”
Mid-level Data Analyst specializing in business intelligence and cloud data platforms
“Healthcare analytics professional with TCS/Humana experience turning messy claims and eligibility data into reliable reporting assets using SQL and Python. They combine strong data engineering and analytics execution with stakeholder management, including automating monthly claims reporting to cut it from half a day to under 5 minutes and driving a provider outreach effort that reduced claim rejection rates by about 20%.”
Mid-level Business Analyst specializing in healthcare and enterprise technology
“Analytics professional with healthcare experience at United Health Group, focused on turning messy claims and transaction data into reliable reporting assets. They combine SQL, Python, and Power BI to automate analysis, define operational KPIs, and build dashboards that improved stakeholder visibility and helped reduce processing time by about 22%.”
Mid-level Software Engineer specializing in backend systems and workflow automation
“Early-career AI engineer currently pursuing a Master's, with hands-on experience building and improving RAG pipelines using LangChain. They stand out for moving beyond naive retrieval into multi-step retrieval and feedback-loop designs to reduce hallucinations, and are now exploring multi-agent systems with distinct retrieval, coding, and validation roles.”