
Deployment of AMR with Jetson (AVerMedia D115W) + Isaac ROS

This tutorial walks you through setting up an Autonomous Mobile Robot (AMR) using Jetson with the AVerMedia D115W platform and NVIDIA Isaac ROS. We cover sensor integration, depth estimation (ESS), Visual SLAM, costmap generation, and autonomous navigation with Nav2.

Prerequisites

An AMR powered by a Jetson Orin NX (AVerMedia D115W) with a stereo camera and actuators.

  • Jetson Orin NX (AVerMedia D115W)

  • Isaac ROS v3.2 for JetPack 6.1 (ROS 2 Humble)

  • ZED X Stereo Camera (or other stereo source)

  • 2D LiDAR (optional, e.g. RPLiDAR A1)
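As a reference point, Isaac ROS is typically set up through the Docker-based developer environment in NVIDIA's `isaac_ros_common` repository. The following is a minimal sketch of that path; the workspace location and the branch names are illustrative, so verify them against the Isaac ROS v3.2 release notes for JetPack 6.1:

```shell
# Illustrative setup sketch -- assumes the Docker-based workflow from
# NVIDIA's isaac_ros_common repository. Paths and branch names are
# assumptions; adjust them for your system and release.
mkdir -p ~/workspaces/isaac_ros-dev/src
cd ~/workspaces/isaac_ros-dev/src

# Clone the common utilities plus the packages used in this tutorial
git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common.git
git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_dnn_stereo_depth.git
git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_visual_slam.git

# Enter the Isaac ROS dev container (builds or pulls the image on first run)
cd ~/workspaces/isaac_ros-dev/src/isaac_ros_common
./scripts/run_dev.sh
```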

Why D115W for AMR?

Figure: Overall Isaac ROS-powered AMR architecture with depth, SLAM, and navigation pipeline.

Building an AMR system often involves fragmented software stacks, sensor synchronization issues, and time-consuming deployment. The AVerMedia D115W, built on Jetson Orin NX, offers a compact and powerful platform that simplifies this process. With full JetPack 6 support, industrial I/O, and excellent GPU acceleration, it's well-suited for running Isaac ROS modules like stereo depth, vSLAM, and Nav2. In our experience, it has proven to be a reliable and efficient platform for real-world AMR deployment.

Autonomous Mobile Robots (AMRs) are becoming increasingly essential in modern robotics applications, including warehouse automation, indoor navigation, and smart factories. These systems require compact yet powerful compute platforms capable of handling complex tasks such as sensor fusion, visual SLAM, 3D reconstruction, and real-time path planning.

The AVerMedia D115W, powered by NVIDIA Jetson Orin NX, provides an ideal embedded compute solution for AMR applications. Its small form factor, industrial I/O interfaces, and full support for JetPack 6 and GPU acceleration make it suitable for real-time workloads in constrained environments. Combined with NVIDIA Isaac ROS, developers can accelerate the deployment of AMR systems with pre-integrated packages for perception, localization, and navigation.

This guide demonstrates how to build a complete AMR system with stereo camera, depth estimation, Visual SLAM, and waypoint navigation using D115W and Isaac ROS.
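The stages of that pipeline are each started from their own launch file. A hedged sketch of the bringup sequence follows; exact launch file names and arguments vary by Isaac ROS release and camera, so treat these as assumptions to check against your installed packages:

```shell
# Illustrative launch sequence -- launch file names and arguments are
# assumptions based on typical Isaac ROS releases; verify against your version.

# 1. Stereo depth (ESS) -- engine_file_path points at your built ESS engine
ros2 launch isaac_ros_ess isaac_ros_ess.launch.py \
    engine_file_path:=/path/to/ess.engine

# 2. Visual SLAM on the stereo stream
ros2 launch isaac_ros_visual_slam isaac_ros_visual_slam.launch.py

# 3. Nav2 bringup for costmap generation and path planning
ros2 launch nav2_bringup navigation_launch.py
```

Running each stage in its own terminal (inside the dev container) makes it easier to inspect per-node output while bringing the stack up incrementally.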

Demo Showcase

This demo showcases live trajectory planning, costmap generation, and environment reconstruction using Isaac ROS on D115W.

Real-World AMR Deployment Experience

Deploying an Autonomous Mobile Robot (AMR) used to be a complex process, requiring deep expertise in robotics, software integration, and hardware configuration. With Isaac ROS and the AVerMedia D115W platform, this process has become remarkably easy and accessible—even for those without extensive robotics backgrounds.

This solution emphasizes straightforward setup and integration.

  • Modular architecture: Each function—stereo depth, SLAM, navigation—is provided as a ready-to-use module. There's no need to develop or integrate low-level drivers yourself.

  • Plug-and-play experience: Simply connect your sensors, install the Isaac ROS packages, and launch the provided examples to see immediate results.

  • Minimal configuration: Most parameters come with sensible defaults. Only minor adjustments are needed to match your specific hardware, with no need to dive deep into ROS internals.

  • Comprehensive documentation and community support: Isaac ROS offers detailed guides and examples, making troubleshooting and learning straightforward.

  • Versatile for different scenarios: Whether for logistics, education, or research, anyone with a Jetson Orin NX and D115W can quickly build a professional-grade AMR.
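The plug-and-play claim is straightforward to verify from the ROS 2 command line once the stack is up, for example by inspecting topics and sending a single navigation goal. Topic names and pose values below are placeholders and may differ by release:

```shell
# Confirm depth and odometry topics are being published
# (topic names are assumptions; check with `ros2 topic list`)
ros2 topic list | grep -E "depth|odom"
ros2 topic hz /visual_slam/tracking/odometry

# Send one NavigateToPose goal through Nav2 (coordinates are placeholders)
ros2 action send_goal /navigate_to_pose nav2_msgs/action/NavigateToPose \
  "{pose: {header: {frame_id: map}, pose: {position: {x: 1.0, y: 0.5}, orientation: {w: 1.0}}}}"
```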

In this project, deployment was significantly faster and more straightforward compared to traditional AMR setups. While some familiarity with ROS is still necessary, the modular design and comprehensive documentation of Isaac ROS and the D115W platform greatly reduce integration time and complexity. This streamlined process enables experienced developers to bring up a fully functional AMR system in a much shorter timeframe, making advanced autonomy more accessible than ever before.

Conclusion

With the D115W and Isaac ROS, you can rapidly deploy a fully autonomous mobile robot capable of stereo depth perception, Visual SLAM, and robust navigation via Nav2. This modular pipeline enables flexibility across various sensors and robot platforms.