Amazon Devices & Services Unveils Zero-Touch Manufacturing Breakthrough with NVIDIA AI and Digital Twins
Amazon Devices & Services has reached a major milestone in zero-touch manufacturing using NVIDIA digital twin technologies and AI. The new system, now deployed at an Amazon facility, enables robotic arms to inspect a wide range of devices for quality and to integrate new products into the production line entirely through synthetic data and simulation, without requiring physical hardware changes.

At the core of this innovation is a simulation-first approach that combines Amazon's custom software with NVIDIA-powered digital twins. These digital twins replicate real-world factory environments and products with high fidelity, allowing robotic systems to be trained in virtual settings before operating on actual assembly lines. This reduces reliance on costly, time-consuming physical prototyping and accelerates product development and deployment.

The solution leverages photorealistic, physics-enabled simulations of Amazon devices and factory stations to generate synthetic data, which is used to train AI models for tasks such as object detection, defect identification, and robotic motion planning. Because the robots are trained in simulation, Amazon Devices & Services can adapt its manufacturing lines to new products simply by updating software, with no retooling or physical reconfiguration required.

NVIDIA Isaac Sim, built on the Omniverse platform, serves as the foundation for creating these digital twins. It processes CAD models of new devices and generates more than 50,000 diverse synthetic images per product, which are essential for training robust AI models. Isaac Sim also works with NVIDIA Isaac ROS to compute precise robotic arm trajectories for handling products of varying shapes and sizes. For motion planning, the system uses NVIDIA cuMotion, a CUDA-accelerated motion-planning library, to calculate collision-free paths in real time on NVIDIA Jetson AGX Orin modules.
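Distance fields pair naturally with this kind of collision-free planning: each point in the field stores the distance to the nearest obstacle, so a planner can cheaply query whether a candidate waypoint keeps enough clearance. The following minimal 2D sketch in plain Python illustrates only the concept; the function names are hypothetical, and the production pipeline uses GPU-accelerated 3D fields built from sensor data, not a brute-force loop like this.

```python
import math

def distance_field(width, height, obstacles):
    """Brute-force Euclidean distance field over a 2D grid.

    Each cell maps to the distance to the nearest obstacle cell.
    (Illustrative only: the real system builds 3D fields on the GPU.)
    """
    return {
        (x, y): min(math.hypot(x - ox, y - oy) for ox, oy in obstacles)
        for x in range(width)
        for y in range(height)
    }

def path_is_collision_free(path, field, clearance=1.0):
    """Accept a path only if every waypoint keeps at least `clearance`
    cells of distance from the nearest obstacle, which is how a motion
    planner typically consumes a distance field."""
    return all(field[p] >= clearance for p in path)

# Hypothetical 10x10 workcell grid with a single obstacle at (5, 5).
field = distance_field(10, 10, obstacles=[(5, 5)])
safe = path_is_collision_free([(0, 0), (1, 0), (2, 0)], field)    # stays clear
unsafe = path_is_collision_free([(4, 5), (5, 5), (6, 5)], field)  # crosses the obstacle
```

In practice the field is updated continuously from depth sensors, so the same clearance query also handles obstacles that were not in the original scene model.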
The nvblox library, part of Isaac ROS, generates the distance fields that support cuMotion's planning. Together, they enable fast, safe, and accurate robotic movements.

A key enabler is FoundationPose, an NVIDIA foundation model trained on 5 million synthetic images for object pose estimation. It allows robots to identify and track new devices without prior exposure, enabling seamless transitions between products and reducing the need for retraining.

Amazon Bedrock plays a vital role in high-level task planning. It analyzes product specifications, such as 3D designs and surface properties, and uses generative AI to create audit test cases and automate workflow planning across multiple factory stations. Amazon Bedrock AgentCore will support autonomous coordination of tasks across the production line.

The entire development process was accelerated by AWS, which provided distributed AI training on Amazon EC2 G6 instances via AWS Batch. NVIDIA Isaac Sim and synthetic data generation ran on the same G6 instance family, enabling efficient, scalable processing.

This breakthrough marks a major step toward generalized manufacturing, in which automated systems can handle diverse products and processes with minimal human intervention. By combining AI, digital twins, and real-time simulation, Amazon Devices & Services is building more flexible, efficient, and scalable production pipelines.

For more insights into how simulation and AI are transforming industrial operations, visit NVIDIA's presentation at SIGGRAPH, running through August 14.