Mid-level sales and data professional specializing in FinTech, telecom, and insurance
Principal Cloud & Data Architect specializing in AI-enabled AWS platforms
Mid-level AI/ML Engineer specializing in financial risk, NLP, and MLOps
Mid-level Data Scientist specializing in industrial IoT, predictive analytics, and generative AI
“ML/NLP engineer with industrial IoT experience who built an end-to-end anomaly detection and GenAI explanation system: AWS pipelines (S3, PySpark, EC2/Lambda) feeding dashboards, plus transformer-embedding vector search connecting anomalies to noisy maintenance notes and past events. Demonstrated measurable impact (15% lift in defect detection, ~35% reduction in manual review, 35% fewer preprocessing errors) and strong productionization practices (orchestration, monitoring, rollback, data-quality controls).”
Senior Python Developer specializing in AWS, microservices, and data pipelines
“Backend/data engineer with strong AWS production experience spanning serverless APIs and containerized workers (Lambda, API Gateway, ECS) plus data pipelines (Glue, S3, Athena/Redshift). Has modernized legacy SAS/cron batch systems into Python/AWS with parallel-run parity validation and low-risk cutovers, and has owned ETL incidents end-to-end (CloudWatch detection, backfills, and preventative controls). Targeting $130k–$150k base and strongly prefers remote, with occasional Bethesda onsite acceptable.”
Senior Backend Software Engineer specializing in AWS cloud-native data platforms
“AWS-focused Python backend/data engineer who builds production analytics APIs and ETL pipelines using API Gateway, Lambda, Step Functions, ECS, Glue, S3, and RDS. Strong in operational reliability and performance tuning (including SQL indexing/partitioning) and has modernized legacy SAS statistical processing into validated Python services with phased rollouts and stakeholder sign-off.”
Mid-level Data Scientist specializing in GenAI, RAG, and forecasting
“ML/NLP engineer focused on large-scale data linking for e-commerce-style catalogs and customer records, combining transformer embeddings (BERT/Sentence-BERT), NER, and FAISS-based vector search. Has delivered measurable lifts (e.g., +30% matching accuracy, Precision@10 62%→84%) and built production-grade, scalable pipelines in Airflow/PySpark with strong data quality and schema-drift handling.”
Mid-level Data Scientist specializing in credit risk, fraud detection, and ESG analytics
“AI/LLM practitioner who has deployed production chatbots across e-commerce, HRMS, and real estate, focusing on retrieval-first workflows for factual tasks like product and property search. Improved intent understanding and significantly reduced latency by using lightweight embeddings and tuning the inference pipeline on Groq (Llama 3.3), while applying modular orchestration and measurable production evaluation.”
Junior Data Engineer specializing in data pipelines and streaming ingestion
“Backend/data platform engineer who owned a near-real-time patient feedback ingestion system, building a FastAPI + Kafka service with Snowflake/Airflow orchestration. Demonstrates strong production Kubernetes/GitOps practices on AWS EKS (Helm, Argo CD, Sealed Secrets) and solved real-time data integrity issues via idempotent processing with Redis.”
Mid-level Data Engineer specializing in cloud data pipelines and big data platforms
“Data engineer with ~4 years of experience building Python-based data ingestion/processing services and real-time streaming pipelines (Kafka/PubSub + Spark Structured Streaming). Has deployed containerized data applications on Kubernetes with GitLab CI/Jenkins pipelines and applied GitOps to cut deployment time by ~40% while reducing config drift. Also supported a legacy on-prem data warehouse/backend migration to GCP, using phased migration and parallel validation to meet strict reliability/SLA requirements.”
Mid-level Data Engineer specializing in cloud data pipelines and analytics engineering
“Built and deployed a production LLM-powered demand and churn forecasting system for an e-commerce client, combining open-source LLMs (LLaMA/Mistral) with Sentence-BERT embeddings to generate business-friendly explanations of forecast drivers. Strong focus on data quality and model trust (validation, baselines, segmented monitoring) and on production reliability via Airflow-orchestrated pipelines with readiness checks, retries, and ongoing drift monitoring and A/B testing.”
Mid-level Software Engineer specializing in cloud, DevOps, and distributed systems
Mid-level MLOps/ML Engineer specializing in LLMs and financial risk modeling
Mid-level Data Scientist specializing in Generative AI, RAG systems, and MLOps
Senior Cloud Software Engineer specializing in AWS microservices and DevOps
Mid-level AI/ML Data Engineer specializing in secure ML pipelines and AI governance
Mid-level Software Engineer specializing in AI and cloud-native data platforms
Senior Full-Stack Software Engineer specializing in AWS, .NET, and data/telemetry platforms
Mid-level Data Engineer specializing in FinTech and AI-ready data platforms
Mid-level Machine Learning Engineer specializing in LLMs, Generative AI, and MLOps