Pre-screened and vetted.
Executive AI & Technology Leader specializing in enterprise data, GenAI, and digital transformation
Mid-level AI/ML Engineer specializing in GenAI, LLMs, and RAG pipelines
Mid-level Machine Learning Engineer specializing in LLMs, RAG, and MLOps
Mid-level Data Scientist specializing in NLP, deep learning, and big data analytics
Mid-level Software Engineer specializing in fraud analytics and cloud-native microservices
Executive CTO and engineering leader specializing in cloud, AI, and FinTech
Mid-level Machine Learning Engineer specializing in MLOps and scalable ML pipelines
Finance transformation leader specializing in FP&A, analytics, and shared services
Director of Product specializing in AI-powered data platforms for connected vehicles and MLOps
Mid-level Python Backend Developer specializing in FinTech and ML-driven fraud detection
Senior Data/GenAI Engineer specializing in cloud-native ML, RAG, and real-time data platforms
Executive Chief Data & Analytics Officer specializing in data modernization and AI governance
Mid-level Business Analyst specializing in data analytics and enterprise reporting
Senior Software Engineer specializing in data engineering, BI analytics, and AI/ML
Executive Technology Leader (CTO) specializing in digital products, cloud platforms, and AI/ML
“IoT/connected-device and SaaS product builder with deep connected-health/telehealth experience, including patient care delivery and remote patient monitoring. At their current organization, they expanded a nutrition product business by prototyping and piloting a digital pre-/post-surgical coaching care-delivery service with real providers, clinics, and patients, then built an MVP and took it to market.”
Senior Data Engineer specializing in cloud data platforms and real-time streaming for financial services
“Data engineer with experience at Bloomberg, UBS, and Bank of America building high-volume financial data platforms and services. Owned an end-to-end pipeline processing ~150–200M records/day (Kafka/Cassandra/S3 → Spark/PySpark → Snowflake) with strong data quality controls and Airflow-based reliability practices, reporting ~99% reliability and major performance gains. Also built large-scale external API ingestion with compliance-minded rate limiting, schema versioning, and quarantine/validation layers.”
Senior Full-Stack/Data Engineer specializing in cloud data pipelines for legal and financial platforms
“Data/analytics engineer who built and operated a DocuSign-based real-time analytics platform end-to-end, processing 20–50k webhook events/day with ~99.5% reliability. Strong in idempotent event processing, schema-evolution-safe ingestion (raw JSON + dynamic parsing), and serving data via versioned, low-latency REST APIs with solid CI/CD and observability.”
Senior AI/ML Engineer specializing in GenAI, MLOps, and computer vision
“ML/AI engineer with hands-on ownership of production document intelligence and GenAI systems, spanning model experimentation, AWS deployment, monitoring, and iterative optimization. Stands out for turning document-heavy workflows into reliable, near-real-time products with measurable gains in accuracy, latency, and manual-effort reduction, and for shipping citation-grounded RAG features that drove user trust and adoption.”
Executive Technology Leader specializing in SaaS platforms, data ecosystems, and product engineering
“Technology leader who drove end-to-end modernization at Dogtopia, building a proprietary SaaS POS/CRM and operations platform plus an AI-powered customer app, using OKR-driven roadmaps and Agile/DevOps delivery. Previously at GE, led a cloud-native AWS data fabric re-architecture with strong security and governance (RBAC, classification, encryption, lineage, virtualization), cutting processing time by more than 60% and enabling AI workloads tied to $400M in business value.”
Mid-level AI Engineer specializing in data pipelines and scalable ML systems
“Data engineer/backend developer with experience owning end-to-end, high-volume data pipelines for ML and analytics using Python, Airflow, SQL, and PySpark, reporting a ~30% error reduction through improved reliability and data quality checks. Has also built Django-based REST APIs with caching, pagination, and strong versioning practices, and operated external data collection/web scraping pipelines with anti-bot measures, monitoring, retries, and idempotent backfills.”