Pre-screened and vetted.
Mid-level Robotics Software Engineer specializing in embedded Linux and ROS2
“Robotics/embedded platform engineer who led compute provisioning and ROS2 platform development for a smart electrical panel, building a full ROS2 stack that bridges mobile app requests (via AWS IoT/MQTT) to signal-level CAN commands and returns telemetry/logs to the cloud. Demonstrated strong full-stack debugging by resolving flaky CAN traffic down to a custom PCB transceiver standby-pin grounding issue, and improved real-time behavior by refactoring middleware into C++ with timer-scheduled publishes to meet firmware watchdog expectations.”
Junior Robotics & Mechatronics Engineer specializing in embedded automation and robotic systems
Mid-level Robotics/Mechatronics Engineer specializing in ROS 2, SLAM, and sim-to-real autonomy
“Robotics software engineer focused on sim-to-real deployment: built an Isaac Sim/Isaac Lab PPO training pipeline with domain randomization for vision-conditioned quadruped locomotion and integrated a RealSense D435i into a ROS2 stack on hardware. Also worked on an autonomous surface vessel, standardizing ROS2 interfaces across Jetson, microcontroller, GPS/IMU and motor controllers, using structured logging/replay to debug real-time oscillations and improve path tracking.”
Mid-level Automation & Robotics Engineer specializing in industrial controls and computer vision
“Robotics software engineer with hands-on experience building an AGV for warehouse autonomy at Kick Robotics, working across SLAM, waypoint navigation, computer vision, and ROS2/RViz simulation. Demonstrated strong on-site troubleshooting by diagnosing a real-world mapping stall via log analysis (coordinate reset bug) and deploying a fix. Also has industrial automation experience coordinating SCARA robots via EtherNet/IP and multi-robot/swarm simulations using MQTT pub-sub.”
Mid-level AI/ML Engineer specializing in LLMs, NLP, and AWS MLOps
“Recent master’s graduate in robotics with applied experience across reinforcement learning and ROS 2 autonomy stacks. Built an RL-based drone vertiport traffic controller (PPO) focused on reward design and simulation integration, and has hands-on navigation work in ROS 2 including LiDAR preprocessing, SLAM/path planning, and stabilizing TurtleBot3 wall-following. Also brings deployment experience containerizing robotics nodes and scaling them with Kubernetes on AWS.”
Mid-level Full-Stack Software Engineer specializing in cloud microservices and AI search
“Robotics software engineer focused on backend/integration for indoor autonomous mobile robots, with hands-on ROS 2 experience integrating Nav2/AMCL/TF2 and LiDAR/camera pipelines. Emphasizes production readiness—robust failure recovery, QoS-tuned distributed communication, and strong observability (logging/health checks)—validated through Gazebo simulation, sensor-data replay debugging, and Docker-based CI/CD deployment.”
Mid-level Robotics Software & Systems Engineer specializing in ROS2 multi-robot systems
“Robotics software engineer with ROS2 multi-robot experience spanning decentralized signal source localization (LoRa RSSI on TurtleBot3) and a master’s-thesis project on collaborative object transportation with 4 robots. Strong in sim-to-real debugging—implemented noise modeling (RBF) and practical hardware/coordination fixes (CoG tuning, clock sync/flags) to make algorithms work reliably on real robots.”
Intern Robotics Software Engineer specializing in ROS2 autonomy and LiDAR localization
“Robotics software engineer focused on production-grade autonomous mobile robot (AMR) navigation in warehouse-style environments, with deep hands-on ROS2/ROS Noetic experience across SLAM, AMCL/NDT LiDAR localization, and Nav2 integration. Strong in real-time debugging and performance tuning using rosbag-driven regression workflows, plus containerized deployment (Docker/Compose) and distributed robot/edge-device communication via MQTT/REST.”
Mid-level Mechatronics Engineer specializing in robotics, embedded firmware, and autonomous systems
“Robotics/embedded engineer with hands-on firmware ownership for closed-loop motor/vision systems and strong ROS1 navigation experience (move_base, gmapping, robot_localization), including EKF sensor fusion that reduced localization drift to sub-10 cm accuracy. Also brings IoT distributed pub/sub expertise (30+ devices over MQTT, 99% uptime) plus Unity AR/VR simulation and computer-vision test automation that saved 100+ hours.”
Mid-level Robotics & AI Engineer specializing in autonomous systems
“Robotics software engineer with deep ROS2 experience who owned the perception stack for an automated C. elegans manipulation system—building YOLO-based worm segmentation plus OCR label reading and integrating it into a MoveIt2 pipeline with real-time latency constraints. Also deploying ROS2 on an AgileX Tracer with ZED depth camera for vision-based person following and working on SLAM/sensor fusion, with additional production-style ML deployment experience (Dockerized FastAPI + PyTorch on AWS EC2 with CI/CD).”
Intern AI & Robotics Engineer specializing in reinforcement learning and computer vision
“Robotics/AI engineer focused on multi-agent reinforcement learning for Crazyflie drones, enabling coordination via implicit motion-based communication and a stabilizing FSM layer; reported 98.5% sim and 92% real-world behavior-recognition accuracy. Also built a modular ROS 2 wall-following system (custom nodes/services/actions) and a Raspberry Pi + OpenCV stereo-vision walking robot, emphasizing rigorous logging, stress testing, and sim-to-real deployment.”
Junior Robotics Engineer specializing in controls, simulation, and production debugging
“Robotics software engineer who helped build a startup "robo-chef" system end-to-end, including pick-and-place simulation using ArUco-marked stations and smooth motion planning. Hands-on ROS 2 integrator across LiDAR/IMU/camera perception-to-navigation stacks (Nav2, SLAM Toolbox, ros2_control), with demonstrated ability to debug real-time timing drift and improve repeatable placement through calibration and motion blending. Uses Gazebo simulation plus Docker/CI pipelines to validate and deploy robotics software reliably.”
Junior Robotics Engineer specializing in autonomous navigation and computer vision for agriculture
“Robotics software engineer who led an autonomous nursery management robot project at Auburn University, spanning RGB-D/IMU sensor fusion, SLAM navigation, and real-time ML for plant detection/quality assessment. Strong ROS1/ROS2 background (C++/Python) with deployment on NVIDIA Jetson, including profiling-driven optimization of YOLO segmentation for real-time behavior and multi-robot (UGV/UAV) communication using ROS2.”
Mid-level Robotics Software Engineer specializing in ROS, C++ and embedded Linux
“Robotics software lead at Icor who grew from intern to owning the end-to-end software lifecycle for a mobile manipulator platform deployed to 300+ customers globally. Deep hands-on ROS2/MoveIt2 and navigation-stack integration (URDF/TF, sensors, behavior engine) plus production infrastructure (CI/CD, OTA, field OS upgrades) and real-world performance tuning for motion planning in EOD multi-robot environments.”
Mid-level Robotics Engineer specializing in ROS 2, control systems, and manipulation
“Robotics software engineer with hands-on ROS2 experience across manipulation, SLAM/localization, and sensor fusion. Recently built an end-to-end hybrid force-position control system for a Ufactory xArm7 with a 6-axis force/torque sensor to enable compliant, force-guided shaft insertion, including real-time Jacobian computation, TF pipeline, and MoveIt2 trajectory execution validated on hardware.”
Entry-Level Robotics Software Engineer specializing in ROS 2 autonomy and multi-robot systems
“Robotics software engineer focused on ROS 2 multi-robot coordination, having built task allocation and reliable inter-robot communication for simulated TurtleBot3 fleets. Demonstrates strong integration/debugging skills across Nav2 + gmapping (SLAM drift, TF tree, odometry/sensor fusion) and pairs it with production-minded tooling—Docker/Kubernetes deployments and CI/CD simulation testing via GitHub Actions.”
Junior Machine Learning Engineer specializing in computer vision and robotics
“Research assistant who single-handedly built and integrated an indoor autonomous wheelchair system using NVIDIA Jetson Nano, LiDAR, and a stereo camera. Implemented a multi-sensor perception pipeline (OpenCV/PCL) with ROS-based modular nodes, TF frame management, and robust debugging via RViz/rosbag, plus simulation testing in Gazebo and Dockerized environments for portability.”
Entry-level Robotics Research Assistant specializing in multi-agent autonomy and reinforcement learning
“ROS2/Python robotics engineer who led a 4-person team building a simulated multi-robot warehouse system (SLAM + Nav2 + centralized task allocation) in Gazebo Ignition, including a distance/priority-based controller that reduced task completion time by ~30%. Also has hands-on real-time debugging/tuning experience for both mobile robots and a MyCobot 600 Pro manipulator, plus simulation work in CARLA using RL (TD3) and Social-LSTM for pedestrian behavior modeling.”
Junior Robotics/Mechatronics Engineer specializing in SLAM, motion planning, and autonomy
“Robotics software engineer focused on autonomy stacks for high-payload AMRs using ROS2/Nav2, with hands-on expertise in SLAM/localization and sensor fusion (RTK GPS, IMU, wheel odometry, ZED2) to eliminate drift and stabilize real-time behavior on deployed hardware. Also built multi-robot coordination in ROS2/Gazebo and uses Docker + Git/CI-style testing to create reproducible simulation-to-hardware pipelines.”
Intern Software Engineer specializing in full-stack development, cloud, and automation
“Robotics software engineer who built an autonomous debris-clearing rover software stack end-to-end using ROS 2, Python/OpenCV, and YOLOv3, with strong emphasis on real-time reliability (latency instrumentation, stale-data handling, watchdog fail-safes). Also implemented a Docker CI/CD deployment system for remote Raspberry Pi timelapse devices, distributing updates via AWS S3 to handle intermittent connectivity.”
Junior Robotics Engineer specializing in ROS, perception, and robotic manipulation
“Robotics software engineer focused on ROS2 autonomy stacks, with hands-on work spanning semantic 3D SLAM, sensor fusion, and controller customization. Built an indoor GPS-denied semantic SLAM system (>95% accuracy) and extended Nav2’s MPPI controller with a custom C++ critic to keep an agricultural rover centered in crop rows, boosting CO2 laser weeding effectiveness by 40%. Strong in simulation-to-real workflows (Isaac Sim, Gazebo Ignition) and deployment automation (Docker on Jetson Orin NX, GitHub Actions CI/CD).”
Junior Robotics Engineer specializing in ROS2 perception and multi-sensor calibration
“Entry-level robotics software engineer/team lead with hands-on experience spanning multi-robot UAV simulation (Gazebo + PX4 SITL) and autonomous vehicle stack integration (ROS2 Humble + Autoware Universe). Has tackled real-time perception optimization (OpenCV + custom deep learning) and built robust cross-protocol communication interfaces to connect ROS2 systems with embedded ESP32 devices.”
Mid-level Robotics Software Engineer specializing in ROS, motion planning, and perception
“Robotics software engineer who built a ROS/C++ workcell stack to automate coating wooden panels with a 6-DOF arm, including trajectory generation, MoveIt/OMPL planning, and a single launch/config setup that runs in both Gazebo and on real hardware. Strong in debugging real-world planning failures (e.g., intermittent aborted/no-plan regions) through logging, planner swaps, and collision/kinematics tuning, and in designing modular ROS/ROS2 systems with versioned interfaces and translation layers for heterogeneous robots.”
Mid-level Robotics Engineer specializing in ROS2 autonomy, perception, and manipulation
“Deployment engineer at a robotics startup who owned end-to-end field deployments in greenhouse environments, including integrating robot arms (xArm 6), tuning perception stacks for real-world lighting shifts, and coordinating rapid fixes with hardware/software teams. Experienced debugging complex robotics integrations (LiDAR + NVIDIA Jetson + ROS2 + networking) and hardening solutions by automating configuration at boot, while also working directly with customers and training operators for ongoing support.”