Pre-screened and vetted on Remote.
Senior Data Engineer specializing in cloud data platforms and real-time analytics
Mid-level Data Engineer specializing in cloud lakehouse and streaming analytics
Senior Data Engineer specializing in cloud lakehouse platforms and healthcare data
Mid-level Software Engineer specializing in data infrastructure and LLM applications
Mid-level Data Engineer specializing in cloud-native analytics and enterprise integrations
“Built and productionized an LLM-powered clinical assistant at a healthcare startup, re-architecting a prototype into a robust RAG system on AWS with guardrails, citations, monitoring, and automated tests for clinical reliability. Works closely with clinicians to convert workflow feedback into evaluation criteria and iterative system improvements, and has hands-on experience debugging agentic systems in real time (including during live client demos).”
Mid-level Data Engineer specializing in cloud lakehouse, ETL, and streaming pipelines
Mid-level Data Engineer specializing in cloud-native ETL and data warehousing
Mid-level Data Engineer specializing in cloud-native ETL and data warehousing
Principal Cloud & Infrastructure Engineer specializing in reliability and regulated data platforms
“Founder/CTO-type startup leader who has built cloud-native data and AI platforms from scratch while owning both technical vision and product direction. Brings rare end-to-end startup experience spanning zero-to-one building, growth-stage execution, and fundraising from early stage through exit, with a strong ability to translate technical complexity into clear investor narratives.”
Mid-level AI/ML Engineer specializing in GenAI, RAG pipelines, and cloud MLOps
“Built and deployed a production LLM + vector search clinical decision support system at UnitedHealth Group, retrieving medical evidence and patient context in real time for prior authorization and risk scoring. Strong in end-to-end RAG architecture (Hugging Face embeddings, Pinecone/FAISS, SageMaker, Redis) plus orchestration (Airflow/Kubeflow) and rigorous evaluation/monitoring, with demonstrated ability to align solutions with clinical operations stakeholders.”
Mid-level Software Engineer specializing in AI/ML and data platforms
“AI/ML engineer who built a production agentic system to automate computational research experiments (simulation execution, parameter exploration, and numerical analysis) and mitigated context-window failures using constrained tool-calling/prompt-chaining patterns in LangChain with OpenAI tool-enabled models. Also has adtech/big-data pipeline experience at InMobi, orchestrating Spark jobs in Airflow to filter bot-like user IDs and publish clean IDs to an online NoSQL store for live serving, plus Apache open-source collaboration experience.”
Mid-level AI/MLOps Engineer specializing in GenAI, RAG, and production ML platforms
Mid-level Data Engineer specializing in healthcare data platforms and streaming CDC pipelines
Mid-level AI/ML Engineer specializing in healthcare NLP and MLOps
Mid-level Data Engineer specializing in AWS, real-time pipelines, and ML/GenAI data platforms
Mid-level Data Engineer specializing in streaming pipelines and cloud data platforms
Mid-level Data Engineer specializing in real-time pipelines and cloud data platforms
“Backend engineer with hands-on experience building secure Python/Flask services (sessions, JWT, RBAC) and optimizing PostgreSQL/SQLAlchemy performance, including custom SQL using CTEs/window functions profiled via EXPLAIN ANALYZE. Also integrates LLM features via OpenAI/Azure into backend systems and improves scalability with RabbitMQ-driven async processing, caching, and multi-tenant data isolation patterns.”
Mid-level Data Engineer specializing in scalable ETL, streaming analytics, and cloud data platforms
“At Dreamline AI, built and productionized an AWS-based incentive intelligence platform that uses Llama-2/GPT-4 to extract eligibility rules from unstructured state policy documents into structured JSON, then processes them with Glue/PySpark and serves results via Lambda/SageMaker/API Gateway. Designed state-specific ingestion connectors plus schema validation and automated checks/alerts to handle frequent policy/format changes without breaking the pipeline, and partnered with business/analytics stakeholders to deliver interpretable eligibility decisions via explanations and dashboards.”