Pre-screened and vetted.
Senior Data Engineer specializing in cloud lakehouse and streaming data platforms
“Data platform engineer with cross-industry experience in banking and healthcare, building cloud-native lakehouse architectures across AWS/Azure/GCP. Has owned high-volume (millions of records; TB/day) pipelines with strong data quality automation (dbt/Great Expectations), observability (Grafana/Prometheus), and real-time streaming (Kafka/Spark) for fraud monitoring; also delivered an early-stage migration from SQL Server to BigQuery that cut batch latency by 40%.”
Junior Data Analyst specializing in sports analytics and business intelligence
“Analytics professional in the sports industry who has owned high-impact revenue and compliance data projects for the Colts, turning fragmented Ticketmaster and Salesforce data into trusted real-time reporting. Stands out for combining strong SQL/Snowflake engineering, rigorous validation practices, and stakeholder-facing metric design that drove a record 98% compliance rate and meaningful revenue recovery.”
Senior AI/ML Engineer specializing in Generative AI, LLMs, and RAG systems
“AI/ML engineer with hands-on experience shipping production systems across fintech, travel, and legal use cases. They’ve built end-to-end chatbot, generative content, and RAG solutions on AWS with CI/CD, monitoring, and guardrails, including a loan application platform that generated $3,000 in sales in its first month.”
Junior Data Scientist and Full-Stack Developer specializing in healthcare and matching systems
“Solo builder developing their own app and using generative AI tools like Codex to both troubleshoot and write code. They taught themselves React and React Native over the past year through hands-on AI-assisted development and handle all technical work independently.”
Intern AI/ML Software Engineer specializing in RAG and medical AI
“ML/LLM engineer with production experience building medical RAG systems to automate chart review, including retrieval + re-ranking and rigorous evaluation. Notably uncovered errors/bias in physician-curated ground truth by tracing answers back to source note chunks and presented evidence to an academic partner, accelerating deployment. Also built a RAG-based FAQ chatbot for a health insurance company and delivered it to non-technical stakeholders via demos.”
Intern AI Researcher specializing in NLP, LLMs, and knowledge graphs
“Built and shipped ‘LabMate,’ a production AI assistant specialized in laboratory hardware, using a weighted multi-source RAG pipeline with reranking and reasoning-focused query decomposition to handle complex user questions. Deployed it on a local GPU cluster with vLLM and NVIDIA MPS (plus OCR/VLM components), and established evaluation using synthetic and public reasoning datasets while collaborating weekly with non-technical admins to align on requirements and resource constraints.”
Mid-Level Full-Stack Software Engineer specializing in cloud-native data platforms and AI apps
“Software engineer who has owned customer-facing/internal platforms end-to-end, emphasizing fast iteration through small releases backed by monitoring and rollback safety. Built SurveyAI with reusable React/TypeScript components and a stateless Node.js REST backend with clear API contracts/validation, and created an internal Airflow + AWS Lambda automation tool integrated with Slack alerts to reduce manual work and improve response time.”
Entry-level Software Engineer specializing in full-stack web development and applied systems work
“Full-stack developer with hands-on experience building an end-to-end automated trading platform that combines web scraping, relational data storage, Flask/React architecture, and LLM-based decisioning via Google Vertex AI. Also brings production experience at CALEC, where they contributed frontend improvements including welcome-page redesigns, multilingual support, and accessibility-related fixes.”
Junior Software Engineer specializing in AI/ML, data pipelines, and cloud APIs
“Hands-on AI/LLM practitioner who built a RAG-based customer support chatbot and tackled production issues like data chunking complexity and response-time lag. Uses techniques such as overlapping chunks, semantic search, context engineering, and query routing, and has experience presenting technical demos/workshops to developer audiences.”
Junior Data Scientist and Robotics Perception Engineer specializing in GenAI and autonomous systems
“Robotics software architect who built an automated pick-and-place palletizing prototype at BLACK-I-ROBOTICS, spanning perception (multi-RealSense fusion, segmentation, 6D pose, ICP), GPU-accelerated motion planning (MoveIt 2 + NVIDIA CuRobo), grasp generation, and safety (human detection + safe mode). Also brings cloud/CI/CD depth from VERIDIX AI (AWS Cognito/Lambda/ECS and CodePipeline stack) and demonstrated strong debugging chops by reducing outdoor rover EKF drift to ~5 cm via Allan variance-based IMU tuning.”
Junior Full-Stack Software Developer specializing in AI-powered web and health applications
“Built and launched ‘Language AI,’ a Next.js/TypeScript app that clones a user’s voice (via ElevenLabs) to deliver language lessons in their own voice, using Supabase for auth/Postgres/storage and hosting the backend on Render. After onboarding initial users, identified ElevenLabs voice-clone limits and reworked the pipeline to store audio assets and delete clones, supporting more concurrent users; also added Google auth to improve adoption. Previously worked in a high-growth startup environment (Study Park), taking concepts from ideation to production.”
Intern AI/ML Engineer specializing in computer vision and time-series forecasting
“Undergrad who built a production RAG chatbot for a messy college website using OpenAI embeddings + FAISS, overcoming hard-to-crawl/non-selectable site content and strict API budget limits. Applies information-retrieval best practices (section-based chunking with overlap, precision/recall evaluation) and reliability techniques (edge-case testing, similarity thresholds, fallback responses), and has experience scaling similar indexing work to ~300,000 Wikipedia pages.”
Mid-level Full-Stack AI Engineer specializing in agentic systems and security-hardened pipelines
“Founding/early engineer experience across Asante and a Series A startup (Adgency), shifting from data science/ML into owning production full-stack systems end-to-end. Built core product flows (registration, business profiles, map service), AWS-deployed gRPC microservices with CI/CD, and operated low-latency agent/video ad generation workflows with retries/fallbacks and PostHog-based observability.”
Mid-level Backend Engineer specializing in APIs, microservices, and data platforms
“Software engineer who built JobIntel, an end-to-end Python ETL pipeline integrating ATS data from platforms like Greenhouse using Scrapy and FastAPI. Stands out for production reliability work: designing async fault-tolerant architecture, optimizing PostgreSQL write-heavy upserts, and building a Prometheus/Splunk observability stack that cut debugging from hours to minutes.”
Junior AI & Data Engineer specializing in LLM systems and analytics platforms
“Backend/ML engineer who built a job-search automation SaaS using a modular Selenium ETL pipeline, rigorous testing/observability, and a cost-optimized two-pass LLM ranking approach. Has led data extraction from messy multi-city PDF records at 95% integrity and managed modular production rollouts for a 20+ engineer team, with a strong security focus (deny-by-default, row-level access control) in an AI-assisted grading platform.”
Mid-level Full-Stack Developer specializing in React, monorepos, and AWS
“Frontend/product engineer who has led end-to-end builds across automotive and healthcare: created a multi-tenant, high-performance Next.js luxury inventory platform and a secure, Stripe-powered sick-note workflow integrated with an EMR. Known for data-driven UX decisions (A/B testing) and pragmatic modernization of critical systems (3DS2 upgrade) with measurable conversion and risk improvements.”
Mid-level GenAI/ML Engineer specializing in LLM systems and RAG chatbots
“Built and shipped a production agentic LLM analytics platform that lets non-SQL business users query relational databases in plain English via a RAG + LangChain/LangGraph workflow and FastAPI service. Emphasizes safety and reliability with guardrails (validation/access control), testing/evaluation frameworks, and performance optimization (caching, monitoring, Dockerized scalable deployment), reducing dependency on data teams and speeding analytics turnaround.”
Mid-level Data Scientist / AI-ML Engineer specializing in RAG, MLOps, and real-time analytics
“Software/ML engineer who built a production automated job-finding and cold-email personalization system for Fortune 500 outreach, using JobSpy for dynamic scraping, LangChain orchestration, and LLM+vector DB semantic search with grounding/relevance metrics and guardrails. Also delivered a predictive investment analytics platform for financial advisors, communicating results via Tableau dashboards and portfolio KPIs like Sharpe ratio and drawdowns.”
Mid-level Software Engineer specializing in Data Science and Machine Learning
“Robotics/AV perception engineer who built a semantic-segmentation road detection system and integrated it into a ROS-based real-time pipeline (ROS bag camera feed to live monitor) achieving ~12 FPS. Strong in practical deployment work: solved multi-library versioning issues (ROS/OpenCV/TensorFlow), containerized the stack with Docker, and optimized inference by shifting runtime to C++ for large latency gains on NVIDIA hardware.”
Senior Data Scientist specializing in ML, NLP, and production AI systems
“Machine learning/NLP engineer with deep Azure stack experience (Data Factory, Databricks/Spark, Delta Lake, Azure OpenAI, Azure AI Search) who built end-to-end production systems for semantic clustering, entity resolution, and hybrid search. Demonstrated measurable gains from embedding fine-tuning (~15% lift in retrieval precision, ~10–12% in nDCG@10) and designed scalable, quality-checked pipelines with MLOps best practices.”
Executive Product & Technology Leader specializing in AI, analytics, and regulated industries
“Serial startup product/technology leader who previously exited a company to Green Street and has accelerator experience via Notre Dame’s IDEA Center. Now pursuing a commercial real estate analytics concept focused on deep demand analysis for better capital allocation, with a provisional patent filed and experience supporting VC funds as an operating partner on product vision and strategy.”
Mid-level ML & Data Engineer specializing in GenAI, graph modeling, and fraud/risk analytics
“Built a production AI fraud/risk scoring platform at BlueArc that ingests web business/product/site data, generates text+image embeddings, and connects entities in a graph to detect reuse patterns and links to known bad actors. Optimized for scale with incremental graph re-scoring and delivered investigator-friendly explainability by surfacing the exact signals/relationships behind each score; orchestrated workflows with Airflow and GCP event-driven components (Pub/Sub, Dataflow, Cloud Run) and has recent LLM workflow orchestration experience (retrieval, prompting, scoring).”
Mid-level Data Engineer specializing in cloud big data and streaming pipelines
“Data engineer focused on large-scale financial data platforms, with hands-on ownership of an AWS + Databricks + Snowflake pipeline processing ~2TB/day. Strong in data quality (Great Expectations), schema-drift automation, and production reliability (99.9%), plus measurable performance/cost wins (batch runtime cut from 4h to 1.2h; ~25% cost reduction). Also built an async Python crawling/ingestion framework with anti-bot mitigation, retries, and Airflow-driven backfills.”
Junior Data Scientist specializing in agentic AI and RAG pipelines
“LLM/agentic systems builder who shipped production workflows at Angel Flight West and Eureka AI, combining LangGraph + RAG (Postgres/pgvector) with strong observability (LangSmith/Langfuse). Delivered large operational gains (address lookup cut from 10 minutes to 60 seconds; accuracy to 92%) and has a track record of quickly stabilizing customer-critical pipelines (Pydantic-enforced JSON for ETL) while partnering with sales/ops to drive adoption.”