AI-Powered Insect Robot Performs 10 Somersaults in 11 Seconds
In a breakthrough for microrobotics, researchers at MIT have developed an aerial microrobot that performs acrobatic maneuvers with speed and agility rivaling those of real insects. The tiny robot, weighing less than a paperclip and roughly the size of a mini cassette tape, can complete 10 consecutive forward somersaults in just 11 seconds, a leap in performance that brings it closer to the flight capabilities of biological insects.

The robot is part of a long-term effort by MIT's Soft and Micro Robotics Lab, led by Professor Kevin Chen of the Department of Electrical Engineering and Computer Science and the Research Laboratory of Electronics. The new version is a soft-bodied, flapping-wing microrobot driven by artificial muscles that vibrate its four micro-wings at high frequency to generate lift and thrust. The latest design features larger wing surfaces, enabling more dynamic and precise movements.

Despite significant hardware improvements, the robot's flight performance had long been limited by its control system. Previous versions relied on manually tuned control parameters, a time-consuming process that allowed only slow, stable flight. This approach could not support the rapid, complex maneuvers needed to match insect-like agility, especially in unpredictable environments.

To overcome this, the team created a two-stage, AI-powered control architecture that combines high-precision planning with lightweight, real-time execution. The first stage uses a Model Predictive Controller (MPC), an algorithm that simulates the robot's future motion and computes optimal control actions over a short time horizon. The MPC accounts for physical constraints, such as thrust and torque limits, and can plan complex sequences like flips while compensating for errors and disturbances such as wind gusts. However, the MPC is too computationally heavy to run in real time on the robot's tiny onboard system. The solution?
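The planning stage described above can be sketched as a minimal receding-horizon controller. This toy version uses random shooting on a 1-D point mass; the dynamics model, horizon length, cost weights, and thrust limit are illustrative assumptions, not the MIT team's actual robot model:

```python
import numpy as np

# Minimal MPC sketch via random shooting: simulate candidate control
# sequences over a short horizon, keep the cheapest, apply its first
# action, then replan from the new state. All constants are assumptions.

DT = 0.02      # control timestep, seconds
HORIZON = 20   # planning horizon, steps (0.4 s lookahead)
U_MAX = 2.0    # thrust limit: a physical constraint the MPC respects

def step(state, u):
    """Point-mass dynamics: state = (position, velocity)."""
    pos, vel = state
    return np.array([pos + vel * DT, vel + u * DT])

def rollout_cost(state, controls, target):
    """Simulate one candidate control sequence and score its trajectory."""
    cost = 0.0
    for u in controls:
        state = step(state, u)
        pos, vel = state
        # penalize tracking error, residual speed, and control effort
        cost += (pos - target) ** 2 + 0.1 * vel ** 2 + 0.01 * u ** 2
    return cost

def mpc_action(state, target, rng, n_samples=256):
    """Sample control sequences within limits; return the winner's first action."""
    candidates = rng.uniform(-U_MAX, U_MAX, size=(n_samples, HORIZON))
    costs = [rollout_cost(state, c, target) for c in candidates]
    return candidates[int(np.argmin(costs))][0]

# Closed loop: replan at every step (receding horizon).
rng = np.random.default_rng(0)
state, target = np.array([0.0, 0.0]), 1.0
for _ in range(300):
    state = step(state, mpc_action(state, target, rng))
print(round(float(state[0]), 2))  # should settle near the 1.0 target
```

The expense is visible here: every control step simulates thousands of future states, which is why the real MPC cannot run on the robot's tiny onboard processor.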
A second stage: a lightweight deep learning model trained to mimic the MPC's behavior. The team used the MPC as a "teacher" in simulation, generating thousands of optimal action sequences under varied conditions, including wind, manufacturing tolerances, and even cable tangles. The AI model learned to map the robot's current state to the right control commands, compressing the expert's knowledge into a fast, on-board policy.

This hybrid system allows the robot to fly with unprecedented speed and precision. In tests, the new controller increased flight speed by 447% and acceleration by 255% compared with the team's previous best results. In a key demonstration, the robot completed 10 front somersaults in 11 seconds, deviating just 4 to 5 centimeters from the planned trajectory.

The system also enables a natural, insect-like behavior called "sweeping": rapid forward acceleration followed by a sharp reversal in pitch to decelerate and hover. This maneuver, which helps insects reorient and gather visual information, could be critical for future microrobots equipped with cameras, allowing them to navigate and perceive their environment more effectively.

While current experiments are conducted indoors with the help of motion-capture systems, the team is now working to make the robot fully autonomous. Next steps include integrating miniature cameras and inertial measurement units (IMUs) for on-board navigation, developing multi-robot coordination strategies, and testing the system in more turbulent outdoor conditions. The researchers emphasize that their work establishes a new paradigm: it is no longer necessary to choose between high performance and low computational cost; a two-stage AI framework can deliver both.

Sarah Bergbreiter, a professor of mechanical engineering at Carnegie Mellon University who was not involved in the study, praised the work for its robustness.
She noted that the robot maintained high accuracy even under strong wind disturbances, manufacturing errors, and physical tangles, challenges that would typically cause failure in such small systems. She also pointed out a key limitation: the current controller still runs on an external computer. However, the team has shown that a simplified version of the strategy can function on the robot itself, even with limited onboard processing, opening the door to fully autonomous, insect-scale machines.

"This work is a major step toward realizing the dream of tiny, agile robots that can enter spaces too small for larger drones—like collapsed buildings, dense forests, or urban rubble—where they could one day play a vital role in search and rescue," said Chen. "We're not just building faster robots. We're building smarter ones that can think, adapt, and act like nature's own."
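The teacher–student recipe at the heart of the paper, an expensive planner generating training data for a cheap on-board policy, can be sketched in a few lines. In this toy version, a PD controller stands in for the MPC teacher and a least-squares linear map stands in for the deep network student; the gains, noise level, and state ranges are all illustrative assumptions:

```python
import numpy as np

# Toy behavior cloning: distill an expensive "teacher" controller into a
# cheap "student" policy that can run on-board. A PD law stands in for
# the MPC teacher; a linear fit stands in for the deep network.

rng = np.random.default_rng(1)

def teacher(state):
    """Expensive planner stand-in: u = -Kp * position_error - Kd * velocity."""
    pos_err, vel = state
    return -4.0 * pos_err - 2.0 * vel

# 1) Query the teacher offline across varied conditions (the article's
#    analogue: wind, manufacturing tolerances, cable tangles).
states = rng.uniform(-1.0, 1.0, size=(5000, 2))
actions = np.array([teacher(s) for s in states])
actions += rng.normal(0.0, 0.05, size=actions.shape)  # disturbance noise

# 2) Fit the lightweight student by least squares.
W, *_ = np.linalg.lstsq(states, actions, rcond=None)

# 3) On-board inference is now a single matrix-vector product.
student = lambda state: float(state @ W)
s = np.array([0.3, -0.1])
print(round(student(s), 2), round(teacher(s), 2))  # student tracks teacher
```

The point of the two-stage design shows up in step 3: however costly the teacher was to query during training, the deployed policy is one tiny computation per control step, which is what makes on-board execution plausible.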
