NVIDIA Advances Robotic Assembly with Zero-Shot Sim-to-Real Transfer Using Isaac Lab and Universal Robots Torque Interface
Robotic assembly is crucial in industries such as manufacturing, automotive, aerospace, electronics, and medical devices. Traditional robotic assembly systems, however, rely on fixed automation that requires extensive human engineering and offers little adaptability. To overcome these limitations, NVIDIA and Universal Robots (UR) are collaborating on flexible automation solutions built on simulation, artificial intelligence (AI), and reinforcement learning (RL).

NVIDIA has introduced Isaac Lab, an open-source, modular training framework for robot learning, and Isaac ROS, a collection of accelerated computing packages and AI models built on the open-source ROS 2 software framework. Isaac Lab improves contact-rich simulation, making it feasible to simulate complex interactions accurately and to train robots with large-scale RL across many parallel environments.

The joint demonstration focuses on a gear assembly task performed by a UR10e robot. The task involves three core skills: grasp generation, free-space motion generation, and insertion. Grasp generation uses an off-the-shelf grasp planner to determine feasible grasp poses for the gears. The motion generation and insertion skills, which are more challenging, are learned with RL.

The RL policies are trained in Isaac Lab using the Proximal Policy Optimization (PPO) algorithm from the rl-games library. Training relies on domain randomization: the simulated robot experiences a wide range of initial conditions and environmental variations, which improves its ability to handle real-world unpredictability (a minimal sketch of this kind of per-episode randomization appears below).

The policies are validated in simulation and then transferred to the real world through the UR10e's direct torque control interface, which makes it possible to run an impedance controller on the arm. Unlike rigid position control, impedance control lets the robot interact with objects in a safer, more compliant way. The combination of Isaac Lab's policies and the UR10e's impedance controller results in a highly adaptable and robust system.

The sim-to-real deployment workflow starts with a perception pipeline that processes RGB images to generate segmentation masks. These masks, combined with depth images, are used to estimate the 6D poses of the gears. The pose estimates and the joint positions from the robot's encoders form the observation fed to the RL policy, which predicts joint-position adjustments. These adjustments are converted into absolute target joint positions and passed to the impedance controller to execute the task (a sketch of this control loop appears below as well).

Video demonstrations show the UR10e successfully assembling gears from various starting positions and under different environmental conditions, confirming the effectiveness of the training and deployment process. The policies are robust to the order in which the gears are assembled and to their initial poses, showcasing the potential of this flexible automation approach.

To encourage wider adoption, NVIDIA plans to release the Isaac Lab environments, training code, and a reference workflow. These resources will let developers train and test their own contact-rich manipulation policies for robotic assembly tasks. Enrolling in NVIDIA's free robotics fundamentals courses and engaging with its developer community can further help those interested in exploring these technologies.
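The following is a minimal, hypothetical sketch of what per-episode domain randomization can look like for a task of this kind. The sampled quantities and their ranges are illustrative assumptions and do not reflect the actual Isaac Lab configuration used in the demonstration.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def sample_episode_randomization():
    """Sample one set of initial conditions and physics parameters for an episode.

    All names and ranges here are assumptions for illustration, not the values
    used in the published Isaac Lab environments.
    """
    return {
        # Gear starting pose on the base plate: planar offset (metres) and yaw (radians).
        "gear_offset_xy": rng.uniform(-0.05, 0.05, size=2),
        "gear_yaw": rng.uniform(-np.pi, np.pi),
        # Small perturbation around the arm's nominal starting joint angles (radians).
        "joint_noise": rng.normal(0.0, 0.05, size=6),
        # Contact friction and joint-damping scale, so the policy does not overfit
        # to a single physics configuration.
        "friction": rng.uniform(0.5, 1.2),
        "damping_scale": rng.uniform(0.8, 1.2),
    }
```

In a parallelized trainer, each environment would draw its own sample at every reset, so the policy rarely sees the same initial conditions twice.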
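Below is a minimal sketch of the deployment loop described above, assuming the gear pose estimates and encoder readings are already available as arrays. The function names, observation layout, and gains are hypothetical and stand in for the published workflow's actual interfaces.

```python
import numpy as np

def impedance_torques(q, qd, q_target, kp, kd):
    """Joint-space impedance law: stiffness toward the target plus damping on velocity."""
    return kp * (q_target - q) - kd * qd

def control_step(policy, gear_poses, q, qd, kp, kd, action_scale=0.05):
    """One control cycle: observation -> policy -> absolute joint targets -> torques.

    policy       : callable mapping an observation vector to a 6-dim action
    gear_poses   : (N, 7) array of estimated gear poses (position + quaternion)
    q, qd        : measured joint positions and velocities (rad, rad/s)
    kp, kd       : per-joint stiffness and damping gains
    action_scale : scales the policy output into small joint-position deltas
    """
    # Observation: 6D pose estimates concatenated with the encoder joint positions.
    obs = np.concatenate([np.asarray(gear_poses).ravel(), q])

    # The RL policy predicts joint-position adjustments relative to the current pose.
    delta_q = action_scale * np.asarray(policy(obs))

    # Convert the relative adjustments into absolute joint targets.
    q_target = q + delta_q

    # Compliant torques for the robot's direct torque-control interface.
    return q_target, impedance_torques(q, qd, q_target, kp, kd)
```

In the real system the resulting torques would be streamed to the arm at the controller's update rate; the sketch simply returns them so it stays hardware-agnostic.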
Industry experts view this collaboration as a significant step towards overcoming the reality gap, the differences between simulation and the real world. They predict that this approach could lead to more adaptable and scalable robotic systems, reducing the need for extensive human intervention and making robotic assembly more accessible. The integration of advanced AI and simulation tools with industrial robots such as the UR10e marks a promising advance in robotics, poised to reshape assembly processes across many sectors.

Company Profiles:
- NVIDIA: A leader in AI and GPU technology, NVIDIA develops computing platforms and software tools that enable advanced simulation and robot learning.
- Universal Robots (UR): Known for collaborative robots, UR provides industrial robots that are user-friendly and capable of performing a wide range of tasks, including precision assembly work.

Conclusion: The UR10e robot's successful zero-shot sim-to-real transfer on a gear assembly task underscores the potential of combining Isaac Lab, Isaac ROS, and advanced control interfaces. This breakthrough could drive the future of flexible automation in industrial robotics, making assembly tasks more efficient and less dependent on manual oversight.
