Pre-screened and vetted.
Mid-level Business Analyst specializing in financial systems and reconciliation
Mid-level Data Analyst / Business Analyst specializing in healthcare and operations analytics
Senior Software QA & Data Systems Specialist in nuclear and regulated environments
Mid-level SQL Developer specializing in MySQL, ETL, and cloud data pipelines
Mid-level Data Scientist specializing in GenAI, RAG, and predictive modeling
“Backend engineer who built and evolved Python/FastAPI services (including AWS-deployed ML prediction APIs) for real-time profitability and risk insights at TenXengage. Emphasizes pragmatic architecture, strong validation/observability, and secure access controls (RBAC plus row-level filtering). Has led safe migrations via parallel runs and incremental rollouts, and reports a ~20% improvement in forecasting accuracy.”
Junior Data Engineer specializing in Azure, CRM data pipelines, and marketing personalization
“LLM/AI engineer who has deployed production RAG conversational analytics and Text-to-SQL systems over Snowflake and curated data marts, emphasizing enterprise-grade guardrails for accuracy, security, and cost. Notable for a structured approach to reducing hallucinations (curated metric/table registry, SQL validation, RBAC, and citation-backed responses) and for building resilient, observable multi-step agent workflows using LangChain/LlamaIndex and Airflow.”
Junior Data & AI Engineer specializing in cloud AI and analytics
“Built production AI backend systems in healthcare and e-commerce, including a healthcare agent that automated clinical workflows like medication refills, immunizations, and scheduling using FHIR APIs and cloud-native infrastructure. Strong in end-to-end backend ownership, LLM orchestration, and adding guardrails/validation for high-stakes and customer-facing AI workflows.”
Mid-level Data Scientist specializing in Generative AI and Healthcare Analytics
“Built a LangGraph-based, tool-routing LLM chatbot delivering fast, trustworthy stock-investment insights (including tariff impacts), initially developed in Azure with AI Search and the Microsoft Agent Framework and then deployed to production on Snowflake. Improved routing robustness by replacing LLM-based routing decisions with a deterministic router backed by schema-relationship graphs and YAML metadata, and ran the project iteratively with non-technical stakeholders over an 8-month engagement.”
Mid-level Data Engineer and Analytics Analyst specializing in business growth and marketing insights
“Analytics professional with operations-grounded experience at WWEX Group who built a Snowflake/dbt fleet-efficiency data model combining telematics, ERP, and driver logs into near real-time executive reporting. They pair strong SQL/Python workflow automation with practical stakeholder enablement, and cite measurable impact including cutting reporting time from 72 hours to 15 minutes and helping drive $450K in quarterly fuel savings.”
Mid-level AI Data Engineer specializing in GenAI, RAG, and cloud data pipelines
“LLM/agentic AI builder who deployed a production ITSM automation agent on Google ADK integrating ServiceNow and FreshService, with strong safety guardrails (human-approval gating and runbook-only command execution) and rigorous evaluation (500 synthetic tickets; 80%+ false-positive reduction). Also partnered with finance to deliver an AI agent that automated invoice/SOW retrieval and monthly reporting to account managers, reducing manual back-and-forth.”
Mid-level Data Engineer specializing in cloud-native batch and streaming pipelines
“Data/ML platform engineer with ~6 years in financial services and enterprise data platforms, building regulated fraud/credit-risk pipelines on AWS (Airflow, EMR/Spark, MLflow) and an Azure lakehouse ingesting 50+ sources and serving ~100M records/day. Also led an early-stage deployment of a RAG-based internal AI search tool using AWS Bedrock and LangChain with automated evaluation to validate LLM accuracy.”
Junior AI Data Engineer specializing in Azure Databricks lakehouse and GenAI RAG systems
“Backend/applied AI engineer from Cloud Rack Systems who built production GenAI/RAG and data platforms on Azure/Databricks at enterprise scale (2.5M records/day). Known for making LLM systems behave like deterministic services via strict retrieval contracts, citation-based validation, and strong observability—shipping a knowledge assistant used daily by 50+ users while driving hallucinations near zero and materially improving latency and cost.”
Junior Data Analyst specializing in BI, SQL, and business analytics
“Analytics professional with experience across Dreamline AI, Ultron Technologies, and Infolabz, building SQL/Python data pipelines and BI dashboards for incentive, FMCG, and retail use cases. Stands out for turning messy multi-source data into trusted reporting, automating recurring analytics, and tying dashboard adoption to measurable business outcomes like 50% faster reporting and 30% ROI improvement.”
Mid-level Data Analyst specializing in analytics engineering and financial services
“Data-driven growth and partnerships professional who led an analytics/reporting vendor rollout end-to-end: vendor selection via stakeholder interviews and a PoC, negotiation of scope/pricing/support, and tracking of adoption, efficiency, and accuracy KPIs. At PC Financial, built regression and segmentation models to optimize multi-channel targeting (in-app/email/push), driving +15% campaign engagement and +10% PC Optimum offer loads, and ran behavior-triggered lifecycle experiments that lifted upsell conversion by 20%.”
Director-level Applied AI & Data Analytics Engineer specializing in real-time decisioning systems
“Built and shipped a production AI/LLM agent-based, event-driven credit underwriting/decisioning workflow that automated document understanding, retrieval, risk scoring, and compliance checks—cutting turnaround from ~90 days to ~5 minutes while boosting throughput 200x+ and approvals ~50%. Experienced with Airflow/Prefect orchestration, Redis/RabbitMQ queues, rigorous eval/monitoring, and close collaboration with non-technical underwriting teams.”
Senior Data Scientist / AI Engineer specializing in LLMs, RAG, and production ML
“Data science professional who has built a production RAG-based LLM question-answering system (‘Flash Query’) to deliver fast, accurate answers over large document collections, focusing on retrieval quality and grounded responses. Also collaborates with non-technical retail/jewelry stakeholders to turn business questions into predictive models and dashboards for decision-making.”
Junior Business & Data Analyst specializing in automation, BI, and implementation
“Operations- and growth-oriented candidate who improves external partner workflows through standardization and measurement (cut turnaround time ~40% while maintaining 99% accuracy). Also launched and scaled a university Excel/data-analysis workshop using ICP-driven GTM and a tracked acquisition loop, increasing attendance 15%, with 95% of attendees expressing repeat-attendance intent.”
Intern Data Scientist specializing in GenAI agents, RAG, and ML platforms
“LLM/agent systems builder who deployed a production hybrid router for immerso.ai that dynamically selects retrieval vs reasoning vs generative pathways, achieving an 82% factual-accuracy lift. Deep hands-on experience optimizing local Mistral 7B inference (4–5 bit GGUF quantization, KV-cache reuse) and building reliable RAG/agent workflows with LangChain/LangGraph/AutoGen across GCP Cloud Run and AWS (ECS/Lambda).”
Mid-level Data Analyst/Data Engineer specializing in SQL, ETL pipelines, and BI dashboards
“Built and supported a production analytics backend (Python, PostgreSQL/Teradata, Airflow) powering KPI/reporting dashboards, and resolved peak-time latency/timeouts through systematic SQL tuning (EXPLAIN ANALYZE, indexing, query rewrites, pre-aggregations). Also shipped an applied AI-style feature that generates plain-language report summaries from pre-computed metrics with validation, monitoring, and fallback to manual review.”
Junior Software Developer specializing in AI data labeling and full-stack web development
“Frontend-focused builder who has led multiple projects end-to-end, including a React/Vite/TypeScript weather app and an internal analytics dashboard optimized for large, time-based datasets. Also created and shipped AetherGrid, a full-stack Windows desktop app, iterating with 5–10 testers and implementing pixel-perfect native UI details plus installer/uninstaller packaging; mentions starting a full-time role at Meta.”
Mid-level IT Business Analyst specializing in QA, UAT, and ERP/CRM systems
“Manual QA tester with experience in black-box/white-box and positive/negative testing, using meeting minutes to translate requirements into executable test scenarios. Comfortable working in a structured workflow using Trello for task tracking and prioritizing urgent issues first; open to learning new standards and workflows.”
Senior IT Support & Data Operations Specialist specializing in AI/LLM data labeling
“Console game testing experience focused on controller/input responsiveness (e.g., passing/shooting delays) with bug documentation shared to developers via notes/Excel. Also worked as an AI data tagging agent for an e-commerce LLM and built a Python + OpenAI API micro-tool to pre-label queries, improving labeling throughput by ~30% with a human-in-the-loop QA process.”
Junior Data Analyst specializing in analytics, BI, and machine learning
“Analytics professional with experience spanning infrastructure, energy, and digital engagement data. They have built SQL and Python workflows to turn messy operational data into trusted reporting assets, and led a wind turbine SCADA analysis that quantified roughly $1M in cumulative performance loss and translated findings into actionable Power BI dashboards.”
Intern Data Scientist specializing in analytics, BI, and machine learning
“Marketing and product-focused analytics candidate with hands-on experience turning messy large-scale data from Hadoop/HDFS, Azure Data Lake, and transaction systems into validated reporting tables. They combine SQL and Python automation with strong metric design, cohort/retention analysis, and stakeholder-friendly dashboards, including a reported 30% query performance improvement and weekly reporting automation.”