“ML/NLP engineer with recent Scotiabank experience building production-grade indexing automation over large-scale email corpora and customer databases, combining LLM fine-tuning (Mistral, XLM-R) with fuzzy matching to exceed 95% accuracy under strict banking constraints. Also built a RAG-based chat agent using Gecko embeddings, Vertex AI Search, Gemini, and cross-encoder reranking, and delivered a text-to-SQL chatbot at SOTI through iterative fine-tuning and benchmark-driven experimentation.”
Director-level Data Science & Analytics Leader specializing in cloud data platforms and AI/ML
“Candidate describes deep familiarity with the venture capital/studio/accelerator landscape and a strong willingness to pursue entrepreneurship ‘at all costs,’ but provided no details on a current startup, business plan, fundraising, or prior accelerator/VC involvement during the interview.”
Mid-level Data Engineer specializing in cloud data pipelines for healthcare and financial services
“Data engineer with ~4 years of experience (Cigna) building and operating Azure Data Factory pipelines for healthcare claims/member/provider data at 2–3M records/day. Emphasizes reliability and downstream safety via schema/data-quality validation, quarantine workflows, idempotent processing, and backfills; also improved runtime ~20% through SQL optimization and served curated datasets through versioned views and well-documented, analyst-friendly interfaces.”
Mid-level Data Engineer specializing in cloud-native healthcare and enterprise data platforms
“Data Engineer (TCS) who owned an end-to-end CRM analytics pipeline for Bayer’s eSalesWeb integration, ingesting from Salesforce APIs/databases/S3 and serving analytics-ready datasets via PostgreSQL/S3 for Tableau. Drove measurable outcomes: ~60% reduction in manual data-quality effort, ~30% lower latency through SQL optimization, and ~35% improved stability via monitoring, retries, and idempotent processing.”
Mid-level Data Analyst specializing in financial and customer analytics
“Analytics professional with experience at KPMG and Robosoft Technologies, working across financial and customer engagement data. They combine SQL, Python, experimentation, and BI dashboards to turn messy multi-source data into decision-ready insights, including a pricing test that improved conversion rates by 9%.”
Mid-level Performance Marketing & Analytics professional specializing in PPC lead generation
“Performance marketer centered on Google Ads lead generation, with hands-on experience across Google, Meta, and LSA. They stand out for a disciplined testing approach, practical troubleshooting across tracking and landing pages, and strong operational rigor through MCC dashboards, custom alerts, and bidding-strategy adjustments when campaign performance stalls. Contract/freelance work is strongly preferred.”
Mid-level Data Scientist specializing in MLOps and Generative AI
“Robotics software/ML engineer who built perception- and navigation-related ML systems for autonomous supermarket carts, including object detection, shelf recognition, and obstacle avoidance. Strong ROS/ROS2 practitioner who optimized real-time performance (reported 50% latency reduction) and deployed containerized ROS/ML pipelines at scale using Docker, Kubernetes, and CI/CD.”
Mid-level Data Scientist specializing in ML, NLP, and Generative AI
“GenAI/ML engineer with production experience at Cognizant and Ally Financial, building end-to-end LLM/RAG systems and ML pipelines. Delivered a domain chatbot trained from 90k tickets and 45k docs, improving intent accuracy (65%→83%), scaling to 800+ concurrent users with 99.2% uptime and sub-150ms latency, and driving +14% customer satisfaction. Strong in Azure ML + DevOps CI/CD, Dockerized deployments, and explainable/PII-safe modeling using SHAP/LIME to build stakeholder trust and meet GDPR requirements.”
Mid-level Business Analyst specializing in Healthcare IT and Banking operations
“Cross-functional operator who regularly leads globally distributed work and acts as a bridge between product, UX, and analytics. Has driven reporting/dashboard and workflow automation initiatives with senior leadership, using data-backed communication and quick wins to improve adoption and efficiency.”
Mid-level Data Scientist specializing in machine learning and analytics
“Data scientist with hands-on experience building an XGBoost-based customer segmentation/churn risk scoring model used by sales and marketing teams. Emphasizes production-grade practices—efficient SQL for large-scale data pulls, rigorous data validation/testing, and scalable, modular Python code designed to support multiple customer types.”
Mid-level Data Scientist specializing in real-time fraud detection and MLOps
“ML/NLP engineer with experience at Charles Schwab building an NLP + graph (Neo4j) entity-resolution system to unify fragmented user/device/transaction data and improve downstream model quality and analyst querying. Has applied embeddings (SentenceTransformers + FAISS) with domain fine-tuning to boost hard-case matching recall by ~12% while maintaining precision, and has a track record of hardening scalable Python/Spark pipelines and productionizing fraud models via A/B tests and shadow-mode monitoring.”
Mid-level Data Engineer specializing in multi-cloud data platforms for healthcare and finance
“Data engineer with Cigna experience building and operating an end-to-end AWS-based healthcare claims pipeline processing ~2TB/day, using Glue, Kafka, PySpark, and SQL, landing in Redshift. Strong focus on data quality and reliability (schema validation, monitoring/alerting, retries/checkpointing/backfills), reporting improved accuracy (~99%) and reduced latency, plus experience serving real-time Kafka/Spark data to downstream analytics with documented data contracts.”
Mid-level Data Engineer specializing in cloud ETL and real-time streaming
“Data engineer focused on AWS + Spark/Databricks pipelines, including an end-to-end nightly loan-data ingestion flow (~2.2M records) from Postgres/S3 through Glue and Databricks into a DWH with layered validation and alerting. Also built real-time streaming with Kafka + Spark Structured Streaming and a master’s project streaming Reddit data for sentiment analysis under ambiguous requirements and tight budget constraints.”
Mid-level Data Scientist & Generative AI Engineer specializing in LLMs and RAG
“Built production LLM + hybrid RAG and multi-agent orchestration systems at Wells Fargo to automate complaint document/audio transcript understanding and categorization, addressing vocabulary drift via embedding + vector index updates instead of frequent retraining. Strong in LLM workflow reliability (testing/benchmarks/observability) and stakeholder-facing delivery with explainability (citations/SHAP-style justifications) and Tableau dashboards.”
Senior Data Engineer specializing in scalable data pipelines and API-driven data services
“Data engineer focused on building scalable, reliable end-to-end data pipelines and backend REST data services, spanning API ingestion plus batch/stream processing with Airflow, Kafka, Spark/PySpark, and SQL. Emphasizes strong data quality validation, monitoring/fault tolerance, and performance tuning for large datasets, with experience deploying in cloud environments using containerization and CI/CD.”
Entry-level Data Engineer specializing in ETL, analytics, and anomaly detection
“Worked on industrial pump analytics at SitePro, where they built an anomaly detector using messy sensor and pump data and used historical failure and maintenance cost analysis to make the business case to stakeholders. They combine SQL/Python data preparation with practical stakeholder communication around metrics like churn and operational impact.”
Mid-level Data Analyst specializing in financial services and fraud analytics
“Analytics candidate currently at Facteus with hands-on experience turning messy transactional data into trusted reporting layers in Snowflake and Power BI. They combine SQL and Python automation with strong validation, performance tuning, and stakeholder-facing metric design, including cohort-based retention and segmentation work that improved trust and adoption of analytics.”
Mid-level Data Analyst specializing in healthcare and business intelligence
“Healthcare analytics candidate with hands-on experience turning messy EHR, billing, and operational data into validated SQL datasets and automated Python/Airflow pipelines. They appear strongest in hospital KPI reporting—especially length of stay, readmissions, retention, and bed utilization—and have owned projects from metric definition through Power BI delivery and impact measurement.”
Mid-level Business Analyst specializing in healthcare and banking compliance
“Healthcare analytics professional with Cigna experience turning complex claims, eligibility, and provider data into trusted reporting layers using SQL, Python, and Power BI. Stands out for combining deep data-quality rigor with end-to-end ownership of operational analytics projects, including standardized retention/churn metric design and automated reporting workflows.”
Mid-level Data Scientist specializing in machine learning, MLOps, and cloud analytics
“Senior data scientist with ~5 years’ experience building production ML/NLP systems in finance (Wells Fargo) and deep learning for sensor analytics in connected vehicles (Medtronic). Has delivered end-to-end platforms combining time-series forecasting with transformer-based NLP, including automated drift monitoring/retraining (MLflow + Airflow) and standardized Docker/CI/CD deployments; achieved a reported 22% precision improvement after domain fine-tuning.”
Mid-level Full-Stack Software Engineer specializing in microservices and scalable backend systems
“Backend/microservices engineer (Java/Spring Boot, Kafka, Angular microfrontends) with Teradata experience building distributed analytics/query routing platforms and delivering 20–30% latency reductions through event-driven redesign and reliability hardening. Also built and shipped an end-to-end multimodal medical imaging AI feature (LLaVA/Mistral 7B + LoRA) with production guardrails like confidence-based human review, drift monitoring, and audit logs.”
Mid-level Data Engineer specializing in Lakehouse, Streaming, and ML/LLM data systems
“Built and productionized an enterprise retrieval-augmented generation platform for internal knowledge over large unstructured corpora, emphasizing trust via strict citation/grounding and hybrid retrieval (BM25 + FAISS + cross-encoder re-ranking). Demonstrates strong scaling and cost/latency optimization through incremental indexing/embedding and index partitioning, plus disciplined evaluation/observability practices. Has experience operationalizing pipelines with Airflow/Databricks/GitHub Actions and partnering closely with risk & compliance stakeholders on auditability requirements.”
Mid-level AI/ML Engineer specializing in Generative AI and production ML systems
“Built and deployed a production SecureAIChatBot (RAG-based) for secure internal information retrieval, using embeddings/vector search, GPT models, monitoring, and safety filters. Focused on real-world production challenges like latency and output consistency, applying caching, retrieval scoping, smaller models, and controlled prompting, and used LangChain to orchestrate the end-to-end workflow.”