Pre-screened and vetted.
Mid-level Data Engineer specializing in cloud data platforms and FinTech analytics
Senior ETL/Data Engineer specializing in cloud data platforms and AI/ML-ready pipelines
Senior Data Engineer specializing in Azure, Databricks, and BI/ETL platforms
Mid-level Data Scientist / ML Engineer specializing in LLMs and predictive analytics
Senior Data Engineer specializing in cloud data platforms and real-time streaming pipelines
Senior Data Engineer specializing in multi-cloud data platforms and real-time analytics
Senior Data Scientist specializing in healthcare analytics and scalable ML pipelines
Senior Data Engineer specializing in multi-cloud data platforms and generative AI
Mid-level Data Scientist specializing in financial ML, NLP, and MLOps
Senior Hockey Video & Strategy Analyst specializing in player development and scouting
“Hockey analytics and scouting professional who started in college doing video tracking for an analytics vendor contracting with NHL teams, then evaluated amateur prospects with the Chicago Blackhawks. Currently with Prodigy, delivering data-plus-video player development support (Zoom reviews, clips highlighting strengths and areas to improve) and driving measurable improvement for a Chicago Mission U14 client, which led to a renewed engagement.”
Mid-level Data Scientist specializing in AI/ML, MLOps, and LLM-powered analytics
“Built and deployed a production LLM-powered document Q&A system enabling natural-language querying of large PDFs, focusing on retrieval quality (overlapped chunking) and low-latency performance (optimized embeddings + vector search). Experienced with scaling ML/LLM workflows using async/batch processing, caching, cloud storage, and orchestration via Apache Airflow with robust testing, monitoring, and failure handling.”
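The overlapped-chunking approach mentioned above can be sketched as follows — a minimal illustration, not the candidate's actual implementation; function name and default sizes are invented:

```python
def chunk_with_overlap(tokens, size=256, overlap=64):
    """Split a token list into fixed-size chunks that share `overlap`
    tokens with their neighbors, so a fact spanning a chunk boundary
    still appears intact in at least one chunk."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    step = size - overlap
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + size])
        if start + size >= len(tokens):  # final chunk reached the end
            break
    return chunks
```

Each chunk shares its last `overlap` tokens with the next chunk, which is what keeps boundary-spanning answers retrievable.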
Mid-level Data Scientist specializing in ML, MLOps, and Generative AI
“ML/NLP engineer who built a RAG-based technical assistant for Caterpillar field engineers, transforming PDF keyword search into intent-based semantic retrieval across manuals, logs, sensor reports, and technician notes. Strong in productionizing data/ML systems (Airflow, PySpark) with rigorous preprocessing, entity resolution, and evaluation—delivering measurable gains in accuracy, relevance, and duplicate reduction.”
Senior Data Scientist specializing in data engineering and analytics
“Data/NLP practitioner with experience in both financial services (Truist) and government (USDA), including an NLP-driven analysis of EU regulations to anticipate US regulatory focus and a major redesign and cleanup of complex public pathogen lab-test datasets. Built production data-quality pipelines with Dagster, Pandera, and Azure Synapse, and is comfortable validating hypotheses with historical backtesting and SME-driven quality controls.”
Director of Revenue Analytics specializing in forecasting, deal desk, and GTM strategy
“Operated as a data-driven cross-functional leader during a major company transition, owning initiatives across sales forecasting, pipeline analytics, and GTM process changes. Built an executive operating cadence with dashboards and weekly briefing packets that reduced information-chasing and improved decision-making, and successfully mediated Sales/Finance forecast assumption disputes using a single-page, fact-based recommendation.”
Senior Basketball Coach and Video Analyst specializing in player development and performance analytics
“Former Division I scholarship athlete and professional player who transitioned into professional coaching in Europe, including coaching at Triglav across U12–U18 and the pro team, plus the U20 Women’s Slovenian National Team. Experienced in identifying and recruiting talent via film, in-person scouting, and social media, and leverages connections with major university coaching staffs to track the best recruiting tournaments/camps.”
Mid-level Data Scientist specializing in fraud detection and healthcare ML
“Applied NLP/ML in healthcare and financial services, including fine-tuning BERT on unstructured EHR text and building embedding-based similarity search for clinical concepts. Also redesigned a Wells Fargo fraud detection data pipeline using modular Python + AWS Glue/Step Functions, cutting runtime by ~40% with improved monitoring and reliability.”
Software Engineering Intern specializing in backend systems and data engineering
“Backend/AI engineer who has built and shipped two products: Know Founder (Python/SQL/AWS) scaling to 2,000+ users in the first month, and Unifr (unifr.online), an AI search visibility engine that queries multiple LLMs and turns responses into structured brand insights. Strong in production reliability/performance (Redis caching, indexing, precomputation) and in designing agentic workflows with guardrails, validation, retries, and human escalation.”
Mid-level Data Engineer specializing in real-time pipelines and cloud data platforms
“Backend engineer with hands-on experience building secure Python/Flask services (sessions, JWT, RBAC) and optimizing PostgreSQL/SQLAlchemy performance, including custom SQL using CTEs/window functions profiled via EXPLAIN ANALYZE. Also integrates LLM features via OpenAI/Azure into backend systems and improves scalability with RabbitMQ-driven async processing, caching, and multi-tenant data isolation patterns.”
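The CTE-plus-window-function SQL described above can be illustrated with a minimal sketch against an in-memory SQLite database (requires SQLite ≥ 3.25; table and column names are invented, not taken from the candidate's work):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (tenant TEXT, ts INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO payments VALUES (?, ?, ?)",
    [("acme", 1, 10.0), ("acme", 2, 20.0), ("globex", 1, 5.0)],
)

# The CTE computes a per-tenant running total; the window function
# partitions by tenant so each tenant's balance accumulates independently.
rows = conn.execute("""
    WITH running AS (
        SELECT tenant, ts, amount,
               SUM(amount) OVER (
                   PARTITION BY tenant ORDER BY ts
               ) AS running_total
        FROM payments
    )
    SELECT tenant, ts, running_total FROM running ORDER BY tenant, ts
""").fetchall()
```

In production, plans for queries like this would be checked with `EXPLAIN ANALYZE` (PostgreSQL) to confirm the partitioning is served by an index rather than a full sort.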
Mid-level Data Analyst specializing in financial risk and healthcare analytics
“AI/ML engineer focused on real-time, production-grade LLM systems, bringing a robotics-informed focus on latency/accuracy tradeoffs and modular pipelines. Built a scalable RAG-based assistant orchestrated as microservices on Kubernetes with Kafka async messaging, ONNX/quantization optimizations, and monitoring (Prometheus/Grafana), citing a ~35% hallucination reduction; has also experimented with ROS Noetic/Gazebo to learn core ROS concepts.”
Mid-level Data Analyst specializing in healthcare and finance analytics
“Built an end-to-end Alexa smart-home IoT application controlling a Wi-Fi bulb, including ESP32 firmware (MQTT) and an AWS serverless backend (IoT Core/Device Shadow, Lambda, DynamoDB) with a REST API. Demonstrates strong real-time scalability patterns (streaming ingestion, stateless processing, partition-key design) and full-stack delivery with Spring Boot + React (JWT auth, CORS, data-heavy dashboards).”
Mid-level Data Analyst specializing in AI/ML and advanced analytics
“Accenture data/ML practitioner who deployed a retail churn prediction and BERT-based sentiment analysis system to production, integrating behavioral and feedback data and operationalizing it with ETL automation, orchestration, and CI/CD. Experienced in managing 2TB+ of multi-source data, monitoring drift in Databricks, and translating results into Power BI dashboards for marketing teams (including K-means customer segmentation).”
Mid-level Data Scientist specializing in MLOps, LLM/RAG applications, and deep learning
“Built and deployed a production compliance automation RAG system (at Citi) that generates citation-backed, schema-validated risk summaries for regulatory document review. Emphasizes regulated-environment reliability with retrieval-only grounding, abstention, confidence thresholds, and immutable audit logging, plus orchestration using LangChain/LangGraph and Airflow. Reported ~60% reduction in compliance review effort while maintaining high precision and traceability.”
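The abstention-with-confidence-threshold pattern described above can be sketched as follows — a simplified stand-in for the candidate's system; the function, score scale, and threshold are all illustrative:

```python
def answer_or_abstain(retrieved, threshold=0.75):
    """Retrieval-only grounding with abstention: answer only from
    retrieved passages, and abstain when the best retrieval score
    falls below the confidence threshold.

    `retrieved` is a list of (passage, score) pairs with scores in [0, 1].
    """
    if not retrieved:
        return {"answer": None, "abstained": True, "citations": []}
    best_passage, best_score = max(retrieved, key=lambda pair: pair[1])
    if best_score < threshold:
        return {"answer": None, "abstained": True, "citations": []}
    # A real system would pass the grounding passage(s) to an LLM here;
    # this sketch just returns the passage itself as a cited "answer".
    return {"answer": best_passage, "abstained": False,
            "citations": [best_passage]}
```

Abstaining (rather than answering with low-confidence retrievals) is what keeps precision high in a regulated review workflow.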
Junior Data Scientist specializing in healthcare ML and clinical NLP/LLMs
“Healthcare-focused LLM engineer who has built two production clinical applications: an automated structured clinical report generator from physician-patient conversations and a RAG-based chatbot for retrieving patient history (procedures, allergies, etc.). Demonstrates strong applied RAG expertise (overlapping chunking, entity dependency graphs, temporal filtering, graph RAG) to reduce hallucinations/omissions and partners closely with clinicians to automate hospital workflows.”
Junior Data Scientist specializing in fraud analytics and cloud data platforms
“Built and deployed production LLM-powered document summarization/classification systems using embeddings, vector databases (RAG-style retrieval), and automated evaluation (BERTScore/ROUGE), with a focus on monitoring and scalable cloud pipelines. Also partnered with a fraud analytics team to deliver a transaction anomaly detection solution, translating model outputs into Power BI dashboards and actionable KPIs while iterating on thresholds and alerts based on stakeholder feedback.”