Pre-screened and vetted.
Mid-level Robotics Researcher specializing in motion planning and vehicle routing
“CMU Robotics PhD researcher and former CMU Robotics Club project lead who built a novel Bayes-filter-based system to localize within a piece of music so robotic instruments can follow a human’s tempo in real time. Also works on simulation-heavy multi-agent vehicle routing with traffic-signal scheduling, optimizing for real-time performance via profiling, multithreading, and neural-network surrogates for signal control.”
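To illustrate the score-following idea in that blurb: a Bayes filter over score positions alternates a tempo-driven predict step with an onset-likelihood update. The sketch below is a toy discrete version; all functions, grid sizes, and the Gaussian observation model are hypothetical, not the candidate's actual system.

```python
import math

def predict(belief, tempo_steps, spread=1):
    """Motion model: shift belief forward by the tempo estimate,
    with a little diffusion to absorb timing jitter."""
    n = len(belief)
    out = [0.0] * n
    for i, p in enumerate(belief):
        for d in range(-spread, spread + 1):
            out[(i + tempo_steps + d) % n] += p / (2 * spread + 1)
    return out

def update(belief, likelihood):
    """Measurement model: reweight each score position by how well it
    explains the detected onset, then renormalize."""
    post = [p * l for p, l in zip(belief, likelihood)]
    z = sum(post) or 1.0
    return [p / z for p in post]

# 16-step score with a uniform prior; a strong onset observed near step 5
belief = [1.0 / 16] * 16
likelihood = [math.exp(-((i - 5) ** 2) / 2.0) for i in range(16)]
belief = update(predict(belief, tempo_steps=1), likelihood)
```

After one predict/update cycle the belief peaks at the observed onset position, which is the estimate a robotic instrument would follow.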
Mid-level Robotics Engineer specializing in autonomy, computer vision, and reinforcement learning
Mid-level Software Engineer specializing in AI applications and distributed backend systems
Intern Robotics Engineer specializing in autonomy, SLAM, and multi-sensor perception
Mid-level Robotics & AI Researcher specializing in embodied AI and human-robot interaction
Junior Robotics Engineer specializing in ROS2 autonomy and embedded systems
Mid-level Robotics Software Engineer specializing in SLAM and 3D computer vision
“Robotics software engineer focused on outdoor mobile robot localization and navigation, building ROS1/ROS2 systems with NavSat+EKF sensor fusion and custom Nav2/Costmap2D extensions for 3D obstacle clearance. Demonstrates strong real-world troubleshooting by tracing localization drift to a failing IMU connector, repairing it, and then creating sensor-health monitoring tooling; experienced taking features from Gazebo simulation through field testing to Docker/Kubernetes deployment with CI via GitHub Actions.”
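The NavSat+EKF fusion mentioned above is, at its core, predict-then-correct state estimation. Below is a deliberately stripped-down 1D linear Kalman filter with made-up noise values — a sketch of the principle, not the candidate's actual robot_localization-style configuration.

```python
# Toy 1D constant-velocity Kalman filter fusing noisy position fixes,
# a minimal analogue of NavSat+EKF sensor fusion (all numbers hypothetical).
def kf_step(x, v, P, z, dt=0.1, q=0.01, r=0.5):
    # Predict: propagate state with a constant-velocity motion model
    x_pred = x + v * dt
    P_pred = P + q                      # process noise inflates uncertainty
    # Update: fuse the position measurement z
    K = P_pred / (P_pred + r)           # Kalman gain
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

x, v, P = 0.0, 1.0, 1.0                 # initial state and covariance
for k in range(50):
    z = (k + 1) * 0.1 * v               # simulated fix at the true position
    x, P = kf_step(x, v, P, z)
```

With consistent measurements the estimate tracks the true position while the covariance shrinks toward a steady state; in a real stack the same loop runs in higher dimensions with GPS, IMU, and odometry corrections.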
Junior Robotics & Controls Engineer specializing in autonomous navigation and UAV systems
“Robotics software engineer with extensive ROS2 experience building and integrating full autonomous navigation stacks (sensor fusion, ICP/occupancy-grid mapping, planning). Hands-on computer vision work for drones, including monocular depth-from-RGB pipelines (Open3D/OpenCV) and model swaps to improve low-light performance. Also experienced in multi-drone coordination with Crazyswarm2 and Vicon-based state estimation, plus Gazebo simulation for TurtleBot, Franka Emika, and a Lunabotics-style excavation rover.”
Mid-level Marketing Analyst specializing in digital marketing analytics and omnichannel attribution
Mid-level Robotics Engineer specializing in SLAM, perception, and state estimation
“Robotics software lead with 4+ years of ROS/ROS2 experience spanning a startup (Inductive Robotics) and General Motors, building autonomous mobile manipulation and AMR material-handling stacks. Has hands-on depth in SLAM/navigation (Cartographer/Nav2), perception, and simulation, and has directly modified Cartographer to handle real-world sensor dropouts. Currently working on fleet-scale mapping capabilities (map merging/editing, trajectory pruning) for multi-robot deployments.”
Mid-level Software & Robotics Engineer specializing in autonomous systems and ROS 2
“Robotics software engineer focused on production-grade autonomy in GPS-denied environments, building full navigation stacks (perception, EKF/UKF sensor fusion, planning, control) in ROS2. Integrated YOLOv8/semantic segmentation/RL policies into real-time Nav2 pipelines via a custom perception-aware costmap layer, with emphasis on deterministic control loops, embedded GPU performance, and robust system observability/fault tolerance.”
Senior Robotics Software Engineer specializing in ROS, CI/CD, and autonomy tooling
“Robotics software engineer with hands-on experience migrating a robotics project from ARM to AMD by building a Dockerized environment with PyTorch/CUDA dependencies, improving data processing and battery efficiency. Has integrated ROS 2 nodes for a Time-of-Flight camera and debugged motion-planning issues (tight-turn stopping) using data collection and iterative tuning; also built custom robots in Webots for sensor/actuator-driven behaviors.”
Junior Robotics & Computer Vision Engineer specializing in perception and autonomy
“Robotics engineer with capstone experience building an autonomous food-assembly robot arm, owning perception/deep learning (SAM2-based segmentation) and a model-based RL manipulation policy for deformable food items while also serving as project manager. As a robotics engineering intern at Salin247, optimized an autonomous farm vehicle perception stack to hit 20 FPS by cutting latency from 200ms+ to ~40ms using GPU acceleration (CUDA OpenCV, CuPy) and multiprocessing, and built ROS 2 nodes for real-time perception and streaming.”
Junior Robotics Engineer specializing in UAV autonomy, SLAM, and motion planning
“Robotics software engineer who led localization/SLAM work on an autonomous indoor security drone operating in a pre-mapped environment. Implemented a robust localization strategy combining visual PnP loop closures with point-cloud ICP to mitigate issues like visual map aging, and uses ROS tooling (rosbag/TF/RViz) plus Gazebo and Docker for repeatable debugging, simulation, and development.”
Mid-level Software Engineer specializing in robotics and low-latency trading systems
Mid-level Robotics Engineer specializing in autonomous aerial systems and perception
Mid-level Robotics Engineer specializing in autonomous systems and satellite operations
Junior Motion Designer specializing in motion graphics and visual storytelling
Junior Robotics Engineer specializing in computer vision and autonomous systems
Mid-level Mechatronics & Robotics Engineer specializing in controls, ROS2, and autonomous systems
Mid-level Robotics Software Engineer specializing in ROS2 autonomy and manipulation
“Robotics software engineer with hands-on experience building ROS2-based distributed multi-robot systems, including task allocation (distance cost matrix) and navigation using Nav2/DWA. Previously on Vicarious’s grasping team, implementing real-world box-picking with vacuum suction and force/torque sensing plus recovery behaviors. Also brings strong engineering hygiene with Gazebo simulation, Dockerized deployments across offices, and GitHub Actions CI/CD with unit/integration testing.”
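The distance-cost-matrix task allocation named in that blurb can be sketched with a Euclidean cost matrix plus a simple assignment rule. The greedy solver below is a baseline illustration (the blurb doesn't say which solver was used; an optimal method such as the Hungarian algorithm could replace it), and all coordinates are hypothetical.

```python
import math

def build_cost_matrix(robots, tasks):
    """Euclidean distance from each robot to each task goal."""
    return [[math.dist(r, t) for t in tasks] for r in robots]

def greedy_assign(cost):
    """Repeatedly pick the globally cheapest remaining (robot, task) pair."""
    n_r, n_t = len(cost), len(cost[0])
    free_r, free_t = set(range(n_r)), set(range(n_t))
    assignment = {}
    while free_r and free_t:
        r, t = min(((r, t) for r in free_r for t in free_t),
                   key=lambda rt: cost[rt[0]][rt[1]])
        assignment[r] = t
        free_r.discard(r)
        free_t.discard(t)
    return assignment

robots = [(0.0, 0.0), (6.0, 0.0)]
tasks = [(4.0, 0.0), (1.0, 0.0)]
assignment = greedy_assign(build_cost_matrix(robots, tasks))
```

Here the nearby robot takes the nearby task and the far robot takes the remaining one; each assigned goal would then be dispatched to that robot's Nav2 action server.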
Intern Electrical Engineer and Robotics Researcher specializing in automation and embedded systems
“Robotics researcher/team lead from ASU’s Robotics and Intelligent Systems Lab who rebuilt a failing soft-robotics test bench into a modular, eight-Arduino data-collection system controlled via ROS 2/Python, doubling sampling performance and dramatically reducing downtime. Currently doing thesis work on physics-informed recurrent neural network (PIRNN) modeling of a pneumatically actuated soft robotic arm using experimentally collected trajectory/pressure data.”
Mid-level Robotics Software Engineer specializing in real-time control and perception
“Robotics software engineer focused on controls and motion planning for autonomous flight systems using ROS 2 (rclcpp), Gazebo/RViz, and BehaviorTree.CPP. Has hands-on real-time control experience (1ms loop rate) and has improved system performance by tracing latency issues and refactoring vision components (singleton camera init). Also built low-latency Ethernet/TCP comms on top of the IgH Ethernet stack and uses digital-twin simulation (Gazebo, MuJoCo; beginner Isaac Sim) to validate algorithms.”
Mid-level Robotics Engineer specializing in ROS, motion planning, and robotic manipulation