
Physics-Based AI Models Accelerate Scientific Discovery

Researchers from the Polymathic AI collaboration, including scientists from the University of Cambridge, have developed two groundbreaking AI models, Walrus and AION-1, trained on real scientific datasets rather than text or images. Unlike mainstream AI models such as ChatGPT, these foundation models are designed to learn the underlying physical principles of diverse systems, enabling them to solve problems across multiple scientific disciplines. This marks a significant shift toward general-purpose AI for physical science.

Walrus is a transformer-based model trained on the Well, a massive dataset spanning 19 different fluid dynamics scenarios across 63 fields. It includes data on density, velocity, and pressure from phenomena ranging from neutron star mergers and atmospheric layers to acoustic waves and bacterial movement. By learning from this broad range of physical systems, Walrus can predict the next step in a sequence of snapshots, even when the system is unfamiliar. Its ability to transfer knowledge across domains means it can help scientists analyze new or complex physical processes without starting from scratch. As lead developer Michael McCabe noted, this allows researchers to bypass time-consuming model building when facing novel physics.

AION-1, trained on over 100 terabytes of astronomical data from major surveys such as the Sloan Digital Sky Survey and Gaia, processes images, spectra, and other measurements from more than 200 million celestial objects. When presented with a low-resolution image of a galaxy, AION-1 can infer detailed physical properties by drawing on patterns learned from millions of other galaxies. This capability is especially valuable in low-data or low-budget settings, where traditional methods struggle.

Both models are foundational: they are trained on vast, diverse datasets to capture universal physical laws rather than narrow, problem-specific patterns.
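To make "predicting the next step in a sequence of snapshots" concrete, here is a minimal, hypothetical sketch of an autoregressive rollout loop. The `predict_next` function is a toy stand-in (a simple diffusion-like smoothing update) for a trained surrogate such as Walrus, not its actual API; the point is only the loop structure, in which the model is repeatedly fed its own predictions to simulate a trajectory.

```python
import numpy as np

def predict_next(history: np.ndarray) -> np.ndarray:
    """Stand-in for a trained surrogate model: maps a short history of
    2D field snapshots to the next snapshot. A real model would be a
    neural network; here a toy diffusion-like smoothing step keeps the
    sketch runnable end to end."""
    latest = history[-1]
    # Average each cell with its four periodic neighbours.
    return 0.5 * latest + 0.125 * (
        np.roll(latest, 1, axis=0) + np.roll(latest, -1, axis=0)
        + np.roll(latest, 1, axis=1) + np.roll(latest, -1, axis=1)
    )

def rollout(initial: np.ndarray, n_steps: int, context: int = 4) -> np.ndarray:
    """Autoregressive rollout: extend a trajectory by feeding the model
    its own most recent predictions."""
    frames = list(initial)
    for _ in range(n_steps):
        history = np.stack(frames[-context:])
        frames.append(predict_next(history))
    return np.stack(frames)

# Usage: start from two snapshots of a 32x32 scalar field and roll forward.
rng = np.random.default_rng(0)
init = rng.standard_normal((2, 32, 32))
traj = rollout(init, n_steps=10)
print(traj.shape)  # (12, 32, 32)
```

The design choice illustrated here is that the surrogate never sees governing equations, only snapshots; it is the breadth of training trajectories, not an explicit solver, that lets a model like Walrus generalize to unfamiliar systems.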
This approach allows them to generalize across fields, much as human senses work together to build a fuller understanding of the world. As the AION-1 team explained, just as we use sight, smell, and taste to infer missing information, these AI models use cross-domain knowledge to fill in gaps in new observations.

The Polymathic AI team emphasizes that these models are not meant to replace scientists but to empower them. By providing a powerful starting point, they reduce the need to build custom AI pipelines for every experiment. As Liam Parker from UC Berkeley said, scientists can now begin with a pre-trained, state-of-the-art foundation and achieve high accuracy without extensive development.

The models have already shown promise in accelerating discovery. Dr. Miles Cranmer from Cambridge’s Department of Applied Mathematics and Theoretical Physics called Walrus “a real step toward general-purpose AI for physical simulation.” Dr. Payel Mukhopadhyay added that open-sourcing the code and data invites the broader scientific community to build on this foundation. Principal investigator Shirley Ho sees this as a way to bring AI intelligence directly into the hands of researchers. “We want to bring all this AI intelligence to the scientists who need it,” she said.

The models are already being used to tackle problems in astronomy, fluid dynamics, and beyond, demonstrating the power of physics-based AI to unify scientific inquiry. Walrus was detailed in a preprint on arXiv, while AION-1 was presented at the NeurIPS conference.

These developments represent a new frontier in AI, one where models don’t just mimic language or images but learn the laws of nature itself. By learning from real physical data, they offer a transformative tool for science, enabling faster, smarter, and more collaborative research across disciplines.
