12 Academicians Gather at CCF HPC China to Discuss New Research Paradigms: Supercomputing-Intelligent Computing Fusion, Computing-Network Fusion, Scientific Intelligent Computing, and More

Turing Award winner Jim Gray once divided scientific research into four major paradigms: experimental science, theoretical science, computational science, and data-intensive science. In recent years, with the explosive development of AI, some in the industry have suggested that "the fifth paradigm of scientific research has arrived."
Throughout these paradigm shifts, the core role of data has never changed. Fortunately, as scientific research shifts from computation to AI, the cost of producing data keeps falling, and databases in fields such as life sciences and materials chemistry are expanding exponentially. Taking life science computing as an example, "the amount of data can double every three years, while the market size of life science computing can only double every six years."
Professor Kong Lei of Peking University suggested that this is because the cost of producing data has fallen faster than chips have improved, so much of the data is never processed in time. He believes that life science research has become increasingly data-driven, and that computing power has become one of the core competitive advantages in scientific research.
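A back-of-the-envelope calculation (our illustration, not from the talk) shows why this mismatch compounds: if data doubles every three years while the market that processes it doubles only every six, the backlog of unprocessed data itself doubles every six years.

```python
# Toy arithmetic for the quote above: data doubles every 3 years,
# processing capacity (market size) only every 6.
def growth(doubling_years, years):
    """Growth factor after `years`, given a doubling period."""
    return 2 ** (years / doubling_years)

years = 12
data = growth(3, years)      # 2^4 = 16x more data
market = growth(6, years)    # 2^2 = 4x more capacity
print(data / market)         # 4.0: the gap itself grew 4x in 12 years
```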

What Professor Kong Lei described is undoubtedly a problem faced across a wide range of research fields: computing power may become the key engine for breaking through technical bottlenecks in scientific research and even engineering applications. In this context, high performance computing (HPC) has drawn broad attention. As the "crown of computing power", it has matured into wide application across many fields thanks to its massive compute and parallel processing capabilities.
At the 20th CCF National High Performance Computing Academic Annual Conference (CCF HPC China 2024), which opened on September 24, top scholars and industry experts from different fields shared in-depth views on the development status and trends of HPC, from invited reports to theme forums, with a focus on rich application scenarios.
Specifically, CCF HPC China 2024 took "Twenty Years of Glory, New Quality Future" as its theme and invited 12 academicians and over 400 top scholars. The conference held 30 theme forums and more than 30 side events, drawing over 4,000 attendees and lively on-site exchange. HyperAI participated in CCF HPC China 2024 as an official partner community and brings you this report.

Integration becomes a major trend
Some industry insiders use the phrase "calculate the sky, calculate the sea, calculate the earth, calculate people" to describe the ubiquity and power of high-performance computing applications. In practice, however, a single tree does not make a forest: not only must supercomputing and intelligent computing be fused within high-performance computing, but in delivering computing power as a service, fusing computing power with networks to activate computing resources has also become a general trend.
Fusion of supercomputing and intelligent computing
Wang Huaimin, an academician of the Chinese Academy of Sciences, a professor at the National University of Defense Technology, and a fellow of the China Computer Federation (CCF), said in his speech: in the era of intelligent computing, AI for Science has not only advanced basic scientific research but also brought new opportunities and challenges to the development of high-performance computing and artificial intelligence. How to combine concrete research scenarios and fully exploit the potential of HPC and AI in big data analysis, simulation, intelligent prediction, and experimental assistance is the current focus of AI for Science applications.
Academician Wang Huaimin believes that AI for Science and Computing for Science are related: both support scientific research through computational methods, but their differences deserve more attention. The differences lie not only in the many ways supercomputing and intelligent computing diverge in processor chips, computer architecture, and even system software, but more importantly in the fact that the two represent completely different ways of modeling the world.
He said that traditional Computing for Science models the world with mathematical equations, while AI for Science models the world with machine learning; the two are complementary. This also means that the fusion of supercomputing and intelligent computing will support future scientific research and has broad prospects.
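The contrast can be made concrete with a toy example of our own (not from the speech): an equation-based approach integrates a known law forward, while a data-driven approach recovers the law's parameter from observations alone.

```python
import math

# Equation-based ("Computing for Science"): model the world with a known
# law, here exponential decay dN/dt = -k*N, and integrate it numerically.
def simulate_decay(n0, k, dt, steps):
    n = n0
    for _ in range(steps):
        n += -k * n * dt              # explicit Euler step
    return n

# Data-driven ("AI for Science", in miniature): assume no law is given and
# learn the rate k from observed samples by least-squares on log(N) vs t.
def fit_decay_rate(times, values):
    logs = [math.log(v) for v in values]
    n = len(times)
    mean_t = sum(times) / n
    mean_y = sum(logs) / n
    slope = sum((t - mean_t) * (y - mean_y) for t, y in zip(times, logs)) \
            / sum((t - mean_t) ** 2 for t in times)
    return -slope                     # slope of log N(t) is -k

# Generate "observations" from the true law, then recover k from data alone.
true_k = 0.5
obs_t = [0.0, 1.0, 2.0, 3.0, 4.0]
obs_v = [100.0 * math.exp(-true_k * t) for t in obs_t]
print(round(fit_decay_rate(obs_t, obs_v), 3))   # recovers 0.5
```

The two routes are complementary exactly as the speech describes: the simulator needs the law but no data, the fit needs data but no law.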

Similarly, in the theme forums on the 25th, many experts gave talks on "supercomputing-intelligent computing fusion". For example, at the "6th Forum on Intelligent-Supercomputing Fusion Technology in Numerical Simulation Engineering Applications", Wang Yishen from China Electric Power Research Institute Co., Ltd. focused on power application scenarios and introduced intelligent computing technology for power science.
He said that the current power system exhibits characteristics such as strong uncertainty, high-dimensional features, non-convex nonlinearity, multiple time scales, complex spatiotemporal characteristics, and multiple objectives and constraints. Power system calculations face major challenges such as massive growth in the scale of system analysis, massive and diversified combinations of methods, difficulty in refined modeling, complex safety mechanisms, and high-dimensional expansion of control objects and variables.
In view of this, intelligent computing for power science has emerged to make up for the shortcomings of both traditional mathematical methods and general-purpose AI. For example, AI techniques depend on training environments and samples, generalize and scale poorly, and are repeatedly criticized for weak interpretability. Intelligent computing for power science that fuses mechanistic models with data, by contrast, offers advantages such as more efficient analytical decision-making, finer-grained model representation, and better algorithm adaptability and generalization.
Integration of computing power and network
Today, Moore's Law is approaching its limits: the headroom for improving single-chip computing power keeps shrinking while the cost keeps rising, so activating existing computing power resources is crucial. This is the advantage of the "computing power network": providing users with the most suitable computing power services, that is, a matching computing power type, an appropriate scale, and optimal cost-performance. In this process, the computing power network connects discrete computing resources, with "computing" producing power and the "network" connecting it.
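The "matching type, appropriate scale, optimal cost-performance" idea can be sketched as a simple matching rule. This is purely our illustrative sketch; the resource names, fields, and greedy scoring below are assumptions, not any actual computing-power-network API.

```python
# Hypothetical resource pool (names and numbers invented for illustration).
resources = [
    {"name": "supercomputing-center-A", "type": "hpc", "capacity": 512, "cost": 3.0},
    {"name": "intelligent-center-B",    "type": "ai",  "capacity": 256, "cost": 5.0},
    {"name": "edge-cluster-C",          "type": "ai",  "capacity": 64,  "cost": 2.0},
]

def match(job, pool):
    """Pick the cheapest resource whose type matches and whose scale suffices:
    matching type, appropriate scale, best cost-performance."""
    candidates = [r for r in pool
                  if r["type"] == job["type"] and r["capacity"] >= job["nodes"]]
    return min(candidates, key=lambda r: r["cost"], default=None)

job = {"type": "ai", "nodes": 128}
chosen = match(job, resources)
print(chosen["name"] if chosen else "no match")  # intelligent-center-B
```

A real computing power network must of course also handle data movement, scheduling latency, and accounting; the sketch only captures the resource-matching intuition described above.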
It can be said that the computing power network should be positioned as the infrastructure of the intelligent era and, like browsers and WeChat, should be used by the entire population. For a new technology to become popular, it needs a "killer" application. Going forward, AI PCs and AI phones may become personal intelligent assistants for everyone, creating real demand for computing power networks. Only by letting computing power serve more people through the network, and letting the majority of users derive real benefits from it, can the computing power network develop rapidly.
Li Guojie, an academician of the Chinese Academy of Engineering and a researcher at the Institute of Computing Technology of the Chinese Academy of Sciences, proposed that different organizations are now pushing computing power networks in different ways: operators focus on cloud-network integration, local governments build computing power hub centers, and the computer industry focuses on basic research in distributed computing. These efforts need to be combined into a joint force.

In his report "Meta-Thinking on the Computing Power Network", Academician Li Guojie said that pre-training large models is currently the main demand for computing power, but wide-area distributed computing is not suited to training them; relying on many small intelligent computing centers to train large models through distributed computing may not be a viable solution. Research on the computing power network needs a core abstraction analogous to the web page, developing "hyperlinks" into "hyper-tasks". Theoretical abstraction is not about incremental performance gains or SOTA rankings, but about first achieving qualitative breakthroughs.
The upgrade of scientific research paradigms is a complement, not a replacement
Feng Dawei, an associate researcher at the National University of Defense Technology, shared in his speech that scientific research has gone through five paradigms. The first was empirical science based on observation and induction, represented by scientists such as Mendel and Lavoisier; the second, theoretical science based on hypothesis and logical deduction, represented by scientists such as Newton and Einstein. By the 1950s, a third research method had emerged that simulates complex phenomena with computers, with molecular dynamics simulation as a typical example.
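The third paradigm's workhorse can be illustrated in miniature (our sketch, not from the speech): a molecular-dynamics-style integrator advancing a particle under a force law. Here a single particle in a harmonic potential is stepped with velocity Verlet, the same scheme real MD codes use.

```python
# Minimal molecular-dynamics-style integrator: one particle in a harmonic
# potential U(x) = 0.5*k*x^2, advanced with velocity Verlet.
def velocity_verlet(x, v, k, m, dt, steps):
    a = -k * x / m                        # force F = -k*x, acceleration a = F/m
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        a_new = -k * x / m                # force at the new position
        v += 0.5 * (a + a_new) * dt       # velocity update with averaged force
        a = a_new
    return x, v

# After one full oscillation period (T = 2*pi for k = m = 1), the particle
# should return close to its starting state.
x, v = velocity_verlet(x=1.0, v=0.0, k=1.0, m=1.0, dt=0.001, steps=6283)
print(round(x, 2), round(v, 2))
```

Real MD codes apply the same update to millions of atoms with far more complex force fields, which is exactly why this paradigm became a driver of high performance computing.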

After 2000, the development of the Internet and cloud computing gave rise to a big data-driven scientific research paradigm that mainly emphasizes data management, sharing, and mining. After 2020, with the development of artificial intelligence technology, especially the AlphaFold series and GPT series of large models, an artificial intelligence-driven scientific research paradigm emerged.
Feng Dawei proposed that these scientific research methods are not substitutes for each other, but rather complement each other and jointly promote the development of scientific research.
About CCF HPC China
CCF HPC China was founded in 2005, and this year marks its 20th edition. Today, CCF HPC China has become one of the three most influential events in high performance computing, alongside the SC conference in the United States and the ISC conference in Germany. Over the past 20 years, the High Performance Computing Technical Committee of the China Computer Federation (hereinafter the "High Performance Computing Committee") has used CCF HPC China as an academic platform to build a professional, high-end, and broad venue for exchange between academia and industry, among HPC users, and with academic peers abroad, effectively advancing the rapid development of China's high performance computing industry.
In 2024, China's high-performance computing industry has an important opportunity to study in depth the close relationship between artificial intelligence, new quality productive forces, and the computing power industry. As a top industry event that keeps moving forward, CCF HPC China is committed to injecting new momentum into the industry's growth through broad exchange and cooperation.
HyperAI is deeply involved in CCF HPC China 2024 as an official partner community. We will continue to share with you the practical speeches and cutting-edge views of top scholars and industry experts. Stay tuned!