Pre-screened and vetted.
Executive AI Platform & Product Leader specializing in commercialization and multimodal AI
“Entrepreneur building an applied-AI tool for geological resource exploration (critical minerals, oil & gas) that overlays proprietary and public data from reports/logs/maps to generate evidence-based greenfield profiling insights. Has spent ~2 years on industry research, built a POC, validated demand with purchasing signals, and developed partnerships/network including USGS, DARPA, and ESRI.”
Mid-level Applied AI & Data Engineer specializing in automation and enterprise analytics
“Backend engineer with experience evolving a high-volume agricultural loan processing platform (APMS) at HDFC Bank, emphasizing transactional integrity, auditability, and modularity while integrating with credit bureaus, document management, and risk engines. Also improved automation/reporting robustness at Trend Micro by catching duplicate-event retry edge cases and adding idempotency safeguards.”
Mid-level Generative AI Engineer specializing in LLMs, RAG, and multimodal generation
“Open-source JavaScript contributor focused on performance and maintainability in data visualization libraries—refactored legacy ES5 into modular ES6, added tests/docs, and delivered ~30% faster load times with positive community adoption. Also optimized a React dashboard (~40% load-time reduction) and took ownership in an ambiguous AI product initiative by setting milestones, standing up an initial ML pipeline, and shipping a prototype in ~6 weeks that became the basis for production.”
Junior Full-Stack Machine Learning Engineer specializing in production ML systems
“Software engineer who owned end-to-end delivery of customer-facing agricultural forecast reporting (crop yield/health) and iterated quickly via rigorous edge-case testing and customer feedback. Also built an internal ML training platform (TypeScript/React + Flask/Python + MongoDB) used by every developer, with architecture designed to stay responsive under heavy compute load.”
Junior Software Engineer specializing in full-stack and cloud infrastructure
“Software engineer with hands-on AWS operations experience who owned an end-to-end manufacturing image ingestion pipeline (on-prem to AWS S3) integrated with MES/WMS. In an early-stage SaaS internship, diagnosed a load bottleneck using K6/New Relic and shipped an NGINX least-connection load-balancing solution that scaled to ~4000 RPS while reducing latency. Also improved maintainability and performance in a React/Node e-commerce codebase, cutting page load time from ~10s to 2.8s.”
Senior Data Analyst specializing in data pipelines, web scraping, and legal data enrichment
“Data engineer focused on reliable, scalable analytics pipelines and external data collection. Has owned end-to-end pipelines processing 5–10M records/day serving Snowflake data marts to Power BI/Tableau, and reports ~99% reliability through strong validation/monitoring. Also shipped versioned REST APIs for curated data with query optimization and caching.”
Intern Software Engineer specializing in cloud, big data, and test automation
“Internship experience at Qualitest building and deploying an LLM-powered test automation system that reduced manual test creation and improved efficiency by ~40%. Demonstrates strong production engineering for LLM systems (timeouts/retries/monitoring/caching, prompt optimization, batching) and has scaled workflows to 100+ concurrent jobs; also has orchestration experience with AWS Step Functions and Kubernetes.”
Mid-Level Software Engineer specializing in cloud-native microservices and FinTech platforms
“Backend/platform engineer who led an end-to-end Python (FastAPI) transaction analytics microservice for real-time financial monitoring, including SQS ingestion, scoring/aggregation, and low-latency APIs. Strong AWS + Kubernetes/GitOps background (EKS, ArgoCD, Jenkins, ECS/ECR, CloudWatch) with hands-on experience scaling event-driven systems and executing phased on-prem to AWS migrations.”
Mid-level Data Engineer specializing in big data pipelines and real-time streaming
“Data engineer who has owned end-to-end production pipelines processing a few million records/day, using Python/Airflow/SQL/PySpark with Snowflake serving to BI (Power BI). Built resilient external web data collection systems (anti-bot, schema-change detection, backfills) and shipped versioned REST APIs for internal consumers, improving pipeline success rates to 99% through monitoring, retries, and idempotent design.”
Mid-level Cloud Data Engineer specializing in Azure/AWS pipelines and medallion architecture
“Data engineer focused on reliability and data quality, owning end-to-end pipelines processing ~100k–300k records/day. Implemented robust validation and monitoring that cut reporting issues by ~30%, and built stable external data collection with anti-bot measures, backfills, and schema-change detection while maintaining backward-compatible internal data services.”
Mid-level Software Engineer specializing in cloud infrastructure and data platforms
“Infrastructure/data platform engineer with hands-on GCP production experience, especially Bigtable, who led a migration from Azure Cosmos DB Cassandra API to Bigtable that removed throttling and cut costs by 50%+. Stands out for combining distributed data architecture, zero-downtime Kafka migration strategy, and Terraform/Python automation for deterministic multi-region GKE operations.”
Junior Full-Stack Software Engineer specializing in AI, FinTech, and e-commerce
“Built both traditional internal tooling and LLM-powered systems during an internship, including a React/Python/AWS calculator onboarding platform and a production-style ROS2 RAG assistant over 10K+ documents. Stands out for combining full-stack delivery, stakeholder coordination, and practical AI reliability work like retrieval tuning, source-grounded answers, and low-confidence fallbacks.”
Mid-level Full-Stack Software Engineer specializing in FinTech and backend platforms
“Built an AI-native legal research platform that automated analysis across 100,000+ dense legal documents, combining LLM workflows, async backend architecture, and conversational retrieval in production. Also brings cross-domain experience in investment-analysis agents and healthcare claims/billing systems, with a strong emphasis on reliability, deterministic orchestration, and safe handling of messy operational data.”
Junior Data Scientist / Big Data Engineer specializing in ML, LLMs, and analytics platforms
“Backend/data platform engineer who led a major redesign of a hybrid streaming+batch analytics platform processing 10+ TB/day (Airflow/Hive/BigQuery) with strong data-quality automation. Also built a production RAG PDF assistant with concrete mitigations for hallucinations and prompt injection (re-ranking, grounding, verifier step) and has deep experience executing low-risk migrations (dual-write, blue-green, rapid rollback) and implementing JWT-based row-level security.”
Mid-level Marketing Analytics & Performance Marketing Analyst specializing in paid media and attribution
“Performance creative/growth marketer with hands-on experience running full-funnel paid social for e-commerce and other brands, focused on combating creative fatigue and scaling efficiently. Uses structured A/B testing and modular creative systems across Meta, TikTok, and YouTube; recently delivered a 22% CPA reduction and 28% ROAS lift by shifting to problem-solution and social-proof storytelling.”
Junior Full-Stack Software Engineer specializing in backend APIs and data systems
“Backend engineer who built an async FastAPI data pipeline at GHN Career Academy to replace a manual Excel-based workflow, migrating 30k+ contact records into Airtable with validation/deduplication and best-effort GPT-based enrichment. Emphasizes reliability under messy real-world data and partial failures via structured logging, retries, and resumable processing, unlocking downstream automations (e.g., Zapier and chatbots).”
Mid-level Data & AI Engineer specializing in healthcare data pipelines and MLOps
“Built and deployed a production LLM-powered clinical note summarization system used by care managers to speed review of 5–20 page unstructured medical records. Implemented safety-focused validation (prompt constraints, rule-based and section-level checks, human-in-the-loop) to reduce hallucinations while maintaining low latency and meeting privacy/regulatory constraints, integrating via APIs into existing clinical tools.”
Mid-level Data Engineer specializing in scalable ETL, streaming analytics, and cloud data platforms
“At Dreamline AI, built and productionized an AWS-based incentive intelligence platform that uses Llama-2/GPT-4 to extract eligibility rules from unstructured state policy documents into structured JSON, then processes them with Glue/PySpark and serves results via Lambda/SageMaker/API Gateway. Designed state-specific ingestion connectors plus schema validation and automated checks/alerts to handle frequent policy/format changes without breaking the pipeline, and partnered with business/analytics stakeholders to deliver interpretable eligibility decisions via explanations and dashboards.”
Mid-level AI/ML Engineer specializing in NLP, Generative AI, and MLOps in Financial Services
“ML/LLM engineer at Charles Schwab who built a production loan-advisor chatbot integrated with internal knowledge and loan-calculator APIs, adding strict numeric validation to prevent rate hallucinations and optimizing context to control costs. Also runs ~40 Airflow DAGs orchestrating retraining/ETL/drift monitoring with an automated Snowflake→SageMaker→auto-deploy pipeline, and uses rigorous testing plus canary rollouts tied to business metrics and compliance constraints.”
Executive Enterprise Architect & CTO specializing in cloud, digital transformation, and AI/ML
“Senior enterprise architecture and engineering leader (Sr. Director / Principal Architect) who has owned enterprise IT strategy and governance for a $100M budget and partnered directly with C-suite stakeholders. Led a cruise-industry employee/crew digital transformation, scaling to 10 agile teams (~70 people) using SAFe/TOGAF and making architecture decisions optimized for low-connectivity environments (e.g., a local database so authentication does not depend on internet access).”
Mid-level AI/ML Engineer specializing in GenAI agents, RAG pipelines, and MLOps
“AI/ML engineer who built a production RAG-based internal document intelligence assistant (LangChain + Pinecone) to let employees query enterprise reports in natural language. Demonstrated hands-on pipeline orchestration with Apache Airflow and tackled real production issues like retrieval grounding and latency using tuning, caching, and token optimization, while partnering closely with non-technical business stakeholders through iterative demos.”
Senior Data Engineer specializing in cloud-native data platforms for finance and healthcare
“Data engineer/backend data services practitioner with Bank of America experience building real-time and batch transaction-monitoring pipelines and APIs (Kafka + databases, REST/GraphQL). Highlights include a reported 45% response-time improvement through performance optimizations and use of Delta Lake schema evolution plus CI/CD (GitHub Actions/Jenkins) and operational reliability patterns like CloudWatch monitoring and dead-letter queues.”
Junior Machine Learning Engineer specializing in LLM evaluation and GenAI pipelines
“LLM/agent engineer who built a production LangGraph multi-agent orchestrator connecting GitHub and APM/observability signals with a chain-of-verification loop for root-cause analysis. Emphasizes pragmatic architecture (start simple with state summaries), performance tuning (async LLM calls, Docker), and rigorous evaluation (LLM-as-judge, adversarial testing, hallucination/instruction adherence metrics, tool-call tracing) while iterating with non-technical stakeholders via A/B testing.”