Do you ever wonder how your local weather is forecast up to a week ahead of time? Well, it takes a highly tuned theoretical model that starts with the current weather and follows its time evolution on a supercomputer using all the relevant laws of nature. This “simulation” behind your weather forecast is one example of how supercomputers are revolutionising all areas of science and technology, from elementary particles to how the Universe evolved from the Big Bang to the present time.
While the laws of nature, such as E = mc², can appear deceptively simple, applying them on a supercomputer to complex systems, such as the weather or the atomic nucleus, requires careful accounting for a multitude of detailed interactions, and the energy transfers they involve, among the many constituents. Here, a common theme emerges: to make accurate predictions for these complex systems, we need to follow the detailed consequences of the governing laws from the smaller to the larger space-time scales, which can only be accomplished on a supercomputer (see Fig. 1).
To better appreciate the value of supercomputer simulations for basic science, consider the properties of the atomic nucleus. To be more specific, we can look at Carbon-14, which is well known for its usefulness in archaeological dating thanks to its half-life of 5,730 years. Carbon-14 is produced in our atmosphere by cosmic rays, resulting in a steady supply. Once something finishes its growth or is made, its Carbon-14, with eight neutrons and six protons, slowly decays away by converting to Nitrogen-14, with seven neutrons and seven protons (see Fig. 2). Measuring how much of the Carbon-14 in a sample has been depleted tells us how long ago the object was made. How does this transmutation occur, and why does it take so long?
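To make the dating arithmetic concrete, here is a minimal sketch of how the remaining Carbon-14 fraction in a sample translates into an age via the 5,730-year half-life (the 25% figure is purely an illustrative example, not a measurement from the article):

```python
import math

HALF_LIFE_YEARS = 5730.0  # half-life of Carbon-14

def age_from_c14_fraction(remaining_fraction: float) -> float:
    """Estimate sample age from the fraction of Carbon-14 still present.

    Radioactive decay follows N(t) = N0 * (1/2)**(t / half_life),
    so t = half_life * log2(N0 / N(t)).
    """
    return HALF_LIFE_YEARS * math.log2(1.0 / remaining_fraction)

# Example: a sample retaining 25% of its original Carbon-14
# is two half-lives old, i.e. about 11,460 years.
print(f"{age_from_c14_fraction(0.25):.0f} years")
```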
Herein lay a long-standing problem. While we understood the basic laws governing the spontaneous transition of a neutron into a proton plus an electron and an antineutrino (beta decay), and we knew that a neutron outside a nucleus undergoes this transition with a half-life of about 11 minutes (part of the “Data” in Fig. 1), despite many decades of careful research we had not understood why this process takes 270 million times longer inside Carbon-14. It took a team of multi-disciplinary scientists working with the world’s largest supercomputer in 2010, Jaguar at Oak Ridge National Laboratory, to solve this problem. Using more than 30 million processor hours on Jaguar, they made a surprising discovery: the relevant laws of nature required a major improvement. Three particles interacting simultaneously inside Carbon-14 do not behave according to the sum of their pairwise interactions; instead, an additional 3-particle force is needed, and it plays a critical role in greatly slowing down the decay process [1].
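As a quick back-of-the-envelope check (the author's arithmetic, using the round numbers quoted above), the factor of roughly 270 million is simply the ratio of the two half-lives:

```python
# Rough ratio of the Carbon-14 half-life to the free-neutron half-life,
# using the approximate values quoted in the text.
minutes_per_year = 365.25 * 24 * 60
c14_half_life_min = 5730 * minutes_per_year   # ~3.0e9 minutes
free_neutron_half_life_min = 11               # about 11 minutes

ratio = c14_half_life_min / free_neutron_half_life_min
print(f"Carbon-14 decays ~{ratio:.2e} times more slowly")  # ~2.7e8
```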
Let us consider the scientific ecosystem that enabled that discovery, since it serves as an example of similar scientist/supercomputer ecosystems in science and technology today. Due to the complexity of the scientific issues and of state-of-the-art supercomputers, experts from diverse disciplines form teams to achieve a common scientific goal. The particular ecosystem that solved the Carbon-14 half-life puzzle is emblematic of efforts to achieve reliable theoretical physics predictions, where reliability is characterised by uncertainty quantification. Referring to the schematic in Fig. 3, these efforts involve partnerships of diverse theoretical physicists (upper two hexagons) working closely with computer scientists and applied mathematicians (lower two hexagons) in order to simulate the theoretical laws of nature successfully on current supercomputers.
Before we discuss what drives the need for supercomputers for solving these problems, let us consider what it takes to qualify as a supercomputer. This qualification changes frequently as newer, bigger and faster supercomputers are designed, built and delivered to the private and the public sector. Every six months, the world’s “Top 500” supercomputers are ranked by their performance on an array of standardised computational tasks. Today’s number one is Fugaku in Japan and number two is Summit in the United States (both pictured in Fig. 1). The United States also hosts four more of the world’s top 10, while China hosts two and Germany and Italy each host one.
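For a sense of what such a ranking measures: the standard Top 500 test solves a very large dense system of linear equations and reports the sustained rate of floating-point operations. The toy sketch below (an author's illustration, many orders of magnitude below a real benchmark run) does the same on a laptop-sized problem:

```python
import time
import numpy as np

n = 2000                               # a real benchmark run uses n in the millions
A = np.random.rand(n, n)
b = np.random.rand(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)              # dense linear solve, as in the Top 500 benchmark
elapsed = time.perf_counter() - start

flops = (2.0 / 3.0) * n**3             # leading-order operation count for the factorisation
print(f"~{flops / elapsed / 1e9:.1f} GFLOP/s sustained")
```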
Solving for the properties of many-particle quantum systems, such as atomic nuclei, is well recognised as “computationally hard” and is not readily programmed for supercomputers. Valiant efforts by physicists, applied mathematicians and computer scientists (see Fig. 3) have led to new algorithms enabling solutions for the properties of atomic nuclei with up to about 20 particles on today’s supercomputers [5-8]. However, this is only a small portion of the nuclei of interest, and we need a detailed understanding of many important phenomena occurring in heavier systems, such as astrophysical processes and fission. Hence, we will need new theoretical breakthroughs and increased computer capabilities to overcome current limits.
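To see why roughly 20 particles marks today's practical limit, consider a configuration-interaction picture of the kind used in such calculations: the number of ways to distribute the protons and neutrons over a fixed set of single-particle states grows combinatorially with particle number. The sketch below simply counts those configurations; the choice of 80 single-particle states per species is an arbitrary illustration, not a value from the article:

```python
from math import comb

def basis_dimension(n_protons: int, n_neutrons: int, n_states: int) -> int:
    """Count many-body configurations: ways to place the protons and the
    neutrons independently in n_states single-particle states each."""
    return comb(n_states, n_protons) * comb(n_states, n_neutrons)

# Illustrative single-particle space of 80 states per nucleon species.
for A, Z in [(4, 2), (14, 6), (40, 20)]:
    dim = basis_dimension(Z, A - Z, 80)
    print(f"A={A:3d}: ~{dim:.1e} basis states")
```

Even before any physics is computed, simply enumerating and storing these configurations outstrips the memory of any existing machine once the nucleus grows much beyond the lightest systems.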
Owing to this need, to similar pressures from other areas of science and technology to increase computational capacity, and to the need to reduce electrical energy consumption, supercomputer architectures are evolving dramatically. Major changes now involve, for example, the integration of powerful graphics processing units (GPUs) and the adoption of complex memory hierarchies. In turn, these changes produce a need to develop and implement new algorithms that efficiently utilise these advanced hardware capabilities. What was “hard” before now becomes “harder” but, at the same time, more worthwhile, since success in overcoming these challenges will enable new predictions and new discoveries.
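As one illustration of the kind of rework involved (the author's generic example, not code from any nuclear physics project), the sketch below shows cache-friendly blocking: a large matrix product is processed in tiles small enough to stay in fast memory while they are reused, the same idea exploited by GPU codes and deep memory hierarchies:

```python
import numpy as np

def blocked_matmul(a: np.ndarray, b: np.ndarray, tile: int = 128) -> np.ndarray:
    """Multiply a @ b tile by tile.

    Each (tile x tile) block is small enough to remain in fast memory
    (cache or GPU shared memory) while it is reused, which is the key
    to performance on hierarchical-memory machines.
    """
    n, k = a.shape
    k2, m = b.shape
    assert k == k2
    c = np.zeros((n, m), dtype=a.dtype)
    for i in range(0, n, tile):
        for j in range(0, m, tile):
            for p in range(0, k, tile):
                c[i:i+tile, j:j+tile] += a[i:i+tile, p:p+tile] @ b[p:p+tile, j:j+tile]
    return c

# Small check against the library routine.
rng = np.random.default_rng(0)
x, y = rng.random((300, 200)), rng.random((200, 250))
assert np.allclose(blocked_matmul(x, y), x @ y)
```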
What else does the future of supercomputer simulations hold? Two major paradigm shifts are already underway, though they will take from a few years to decades to reach full potential. The first is Machine Learning, a branch of Artificial Intelligence that is already proving to be a very powerful research tool: it can effectively extend the capacity of supercomputer simulations by “learning” how their predictions depend on the size of the simulation and then predicting results for simulations that would require the larger supercomputers of the future. The second major paradigm shift is the rapid development of quantum computing technology. Preliminary studies suggest that quantum computers will be capable of solving forefront unsolved problems in quantum mechanics, such as the structure of atomic nuclei well beyond the light nuclei addressable with today’s supercomputers.
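As a toy illustration of that first idea (the author's construction, with an assumed convergence model and synthetic data rather than results from the article), the sketch below fits how a computed quantity depends on the size of the calculation and then extrapolates to a size beyond what was actually run:

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed convergence model: an observable computed in ever larger
# basis sizes N approaches its true value E_inf exponentially.
def model(N, E_inf, a, b):
    return E_inf + a * np.exp(-b * N)

N_run = np.array([4, 6, 8, 10, 12], dtype=float)   # sizes we could afford to run
E_run = model(N_run, -105.0, 30.0, 0.35)           # synthetic stand-in for supercomputer output

# "Learn" the dependence on basis size from the affordable runs...
params, _ = curve_fit(model, N_run, E_run, p0=(-100.0, 10.0, 0.1))

# ...and predict what a future, larger calculation would give.
print(f"Extrapolated converged value: {params[0]:.2f}")      # close to -105
print(f"Predicted result at N=20:     {model(20.0, *params):.2f}")
```

In practice, far more flexible models, including artificial neural networks, are trained on many such runs rather than a single hand-picked formula.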
The author acknowledges support from US Department of Energy grants DE-FG02-87ER40371 and DE-SC0018223.
[1] Pieter Maris, James P. Vary, Petr Navratil, W. Erich Ormand, Hai Ah Nam, David J. Dean, “Origin of the anomalous long lifetime of ¹⁴C,” Phys. Rev. Lett. 106, 202502 (2011).
[2] Robert Chase Cockrell, “Ab Initio Nuclear Structure Calculations for Light Nuclei,” Graduate Theses and Dissertations 12654 (2012).