Los Alamos National Laboratory conducts research in a wide variety of areas. From stockpile stewardship science to deflecting asteroids to designing new materials for nuclear reactors, Los Alamos uses its world-class high-performance computing (HPC) capabilities to find solutions to some of the world's most complex problems.
Combining HPC with scientific and experimental capabilities to understand the nation’s nuclear deterrent
While the role and prominence of nuclear weapons in U.S. security policy diminished with the end of the Cold War, nuclear weapons continue to provide an essential component of national security. The U.S. stopped designing new weapons in 1989 and ended nuclear testing in 1992. Shortly thereafter, the science-based Stockpile Stewardship Program was established, which combines advanced scientific and experimental capabilities with high-performance supercomputing to help scientists and engineers understand and resolve issues in the nation's nuclear deterrent.
Now a crucial part of how Los Alamos fulfills its mission, computer simulations allow Laboratory scientists to virtually detonate weapons systems and monitor what is happening inside the nation's aging deterrent: most nuclear weapons in the U.S. stockpile were produced during the 1970s and 1980s and were not designed or intended to last indefinitely.
Baseline simulations of historical nuclear tests are compared against the recorded test data to verify the correctness of the simulation tools. These verified tools, including applications and physics codes, are then used to explore weapons virtually under circumstances different from the original tests, revealing aspects of a weapon that testing never measured, such as the effects of aging.
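The verification step can be pictured as a simple tolerance check: a simulation tool is trusted only if its output tracks the recorded test data closely enough. The sketch below is purely illustrative; the function, metric, and numbers are hypothetical stand-ins, not the Laboratory's actual verification suite.

```python
# Hypothetical sketch of baseline verification: compare simulated quantities
# against recorded historical test data and accept only if every point falls
# within a relative tolerance. All values below are made-up, for illustration.
def verify_baseline(simulated, observed, tolerance=0.05):
    """Return True if the worst relative error across all points is within tolerance."""
    worst = max(abs(s - o) / abs(o) for s, o in zip(simulated, observed))
    return worst <= tolerance

if __name__ == "__main__":
    observed = [10.0, 12.5, 9.8]    # recorded test quantities (illustrative)
    simulated = [10.2, 12.3, 9.9]   # simulation output (illustrative)
    print(verify_baseline(simulated, observed))  # True: within 5% everywhere
```

Only after a tool passes checks like this against known answers is it extended to the unseen cases, such as aging effects, that the paragraph above describes.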
High resolution, high fidelity
The extensive use of simulations made it necessary to rapidly develop supercomputers powerful enough to replace real-world nuclear testing with virtual testing. Increasing computer speed was important, but having 3D simulations with high resolution and accuracy was even more important. To achieve high-fidelity 3D simulations, computing would need to make incredible technological leaps: gigaflops to teraflops (trillions of calculations per second, which happened in 1999), teraflops to petaflops (quadrillions of calculations per second, which happened in 2008), and petaflops to exaflops (quintillions of calculations per second, coming soon). The Laboratory's HPC capabilities are centered on the Stockpile Stewardship Program, but the benefits extend to almost every research area at Los Alamos.
Using HPC to help improve medicine and human health
High-performance computing has played an important role in health science research over the years. From essential work on the Human Genome Project, the world's largest collaborative biological project, to helping understand the evolution and origins of HIV, HPC has played a pivotal role in advancing health science.
In addition to dedicating important HPC resources to the fight against COVID, the Laboratory is using HPC to better understand DNA and the human body at fundamental levels. Researchers at Los Alamos created the largest simulation to date of an entire gene of DNA, a feat that required modeling one billion atoms and will help researchers better understand and develop cures for diseases like cancer.
HPC is helping researchers to understand and combat climate change
HPC is an essential tool helping researchers understand the impacts of climate change. The Department of Energy's Energy Exascale Earth System Model (E3SM) is a state-of-the-science Earth system modeling, simulation, and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of the Department of Energy. A key piece of E3SM is a Los Alamos National Laboratory software package known as CICE that calculates the complex physics of sea ice, such as how it freezes, melts, and moves across the ocean's surface, and how it is influenced by external forces such as the ocean's currents and winds.
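One of the simplest pieces of sea-ice physics that such a package must capture is thermodynamic growth: how thick ice gets as cold accumulates. The sketch below implements Stefan's classic textbook formula for that one effect. It is an illustration of the kind of calculation involved, not code from CICE, which also models melting, snow cover, brine, ocean heat flux, and the motion of the ice.

```python
import math

# Stefan's law: ice thickness grows with the square root of accumulated
# freezing degree-days (FDD). Textbook constants; a toy model, not CICE.
K_ICE = 2.2        # thermal conductivity of ice, W/(m*K)
RHO_ICE = 917.0    # density of ice, kg/m^3
L_FUSION = 3.34e5  # latent heat of fusion, J/kg

def stefan_thickness(freezing_degree_days: float) -> float:
    """Ice thickness in meters after the given freezing degree-days (degC*day)."""
    fdd_seconds = freezing_degree_days * 86400.0  # degC*day -> degC*s
    return math.sqrt(2.0 * K_ICE * fdd_seconds / (RHO_ICE * L_FUSION))
```

For 100 freezing degree-days this gives roughly a third of a meter of ice, in line with textbook values, and the square-root form explains why ice growth slows as the ice thickens: the ice itself insulates the water below.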
Using HPC to make better materials for energy production
As carbon emissions continue to be a leading cause of global warming, researchers are looking to clean energy technologies to reduce the carbon footprint. Nuclear energy will be a key technology for a greener future.
At Los Alamos, scientists are using HPC to help design and test materials for use inside a nuclear reactor. They are simulating molecular dynamics: essentially, the movement of every single atom in a material. This provides fundamental insight into materials down to the atomic scale and shows where each atom is at every point in time. Ultimately, the goal of computational materials science is to use such simulations to shorten the time it takes to develop new materials by developing them from scratch on the computer. That way, researchers can get a sense of how a material would perform from computer simulations alone, without having to go to the lab and try different possibilities.
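As a heavily simplified illustration of what a molecular-dynamics step looks like, here is a toy two-atom Lennard-Jones system advanced with the standard velocity Verlet integrator. This is a sketch in reduced units, not one of the Laboratory's production codes; the potential parameters, time step, and starting separation are all illustrative.

```python
# Toy molecular dynamics (illustrative, not a production code): two atoms
# interact through a Lennard-Jones potential; we track their separation r
# (unit reduced mass) and integrate the motion with velocity Verlet.
EPS, SIG = 1.0, 1.0  # well depth and length scale (assumed reduced units)
DT = 1e-3            # time step

def lj_energy(r: float) -> float:
    """Lennard-Jones potential energy at separation r."""
    return 4.0 * EPS * ((SIG / r) ** 12 - (SIG / r) ** 6)

def lj_force(r: float) -> float:
    """Force along the separation axis, F = -dV/dr."""
    return 24.0 * EPS * (2.0 * (SIG / r) ** 12 - (SIG / r) ** 6) / r

def simulate(r0: float = 1.2, v0: float = 0.0, n_steps: int = 5000):
    """Advance the pair n_steps with velocity Verlet; return final (r, v)."""
    r, v, f = r0, v0, lj_force(r0)
    for _ in range(n_steps):
        r += v * DT + 0.5 * f * DT * DT  # position update
        f_new = lj_force(r)              # force at the new position
        v += 0.5 * (f + f_new) * DT      # velocity update with averaged force
        f = f_new
    return r, v
```

A standard sanity check is that the total energy, 0.5 * v**2 + lj_energy(r), stays nearly constant over the run. Production codes apply the same basic idea to billions of atoms with far more sophisticated interatomic potentials, which is why these simulations demand supercomputers.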
HPC is used to understand how to deflect potential incoming asteroids or comets
What if an asteroid is on a collision course with Earth? One possible way to save the planet would be to knock the asteroid off its path with a kinetic impactor—essentially a massive cannonball—or a nuclear explosive device.
But knowing how the asteroid would react to an impact is critical to making that decision. Depending on the composition of the asteroid, an impact could shatter it rather than alter its path. Then, instead of one large asteroid, thousands of smaller fragments could be hurtling toward Earth.
This is where supercomputers can help. Los Alamos researchers use HPC to run high-fidelity simulations to accurately model the physics of an impact. These simulations are constantly updated with cutting-edge data from NASA missions and Earth-bound laboratory experiments.
HPC plays an important role in science and can help change the future. As the computing world enters the exascale era—that is, computers capable of making one quintillion calculations per second—researchers can solve even bigger and more complex problems.