People are often surprised to learn that the Laboratory has been studying climate for 40-plus years as part of our national security mission. Over the decades, we have applied deep scientific understanding to developing supercomputing models for our weapons research. In turn, we have extended weapons science and the associated modeling expertise across seemingly unrelated fields, from nuclear physics to the threat of a global nuclear winter to forecasting and potentially mitigating sea-level rise that could affect U.S. military bases.
At first glance, computer modeling for climate science seems light-years away from computer modeling for ensuring the continued safety, reliability, and effectiveness of the nation’s nuclear stockpile. But both are based on mathematically representing physical processes, and they overlap and inform each other in surprising ways. In each area, we need to deepen our scientific understanding to navigate unprecedented but consequential futures.
In both fields, we can partially validate those models by comparing their results with historical data and observations, which helps us understand the models’ limitations. And while climate and weapons modeling are both relatively mature, they remain computationally demanding, and the growing need for insights across scales, from the very small to the global, is still a challenge.
Fortunately, help is on the way. Los Alamos scientists are applying advances in computing and innovative models to both sides of the challenge.
Nano, local, and global
International climate researchers were quick to see the potential for improving climate forecasts by building models on supercomputers developed by the Lab and its industry partners. In the 1980s, Lab researchers used a full 3D atmospheric model to study how smoke from a nuclear war might affect the atmosphere, the scenario known as nuclear winter. Building on that foundation and widening our scope, Lab researchers began modeling the world’s oceans, sea ice, and land ice.
When the United States discontinued testing nuclear weapons in the early 1990s, the Lab accelerated its program to model weapons physics with the highest possible fidelity. That stockpile-stewardship work requires special-purpose supercomputers and expertise in innovative algorithm development.
Those capabilities, and the related challenges, carry over into climate research. Both involve modeling multiple physical and chemical processes at atomic, nano, local, and global scales. Climate modeling looks at global effects and local effects, such as the impact of drought on a particular city. Weapons modeling looks at “local” microscale nuclear processes and macroscale effects on the surrounding hardware. In one concrete example of shared interests and shared equations, fluid turbulence at various scales matters to both weapons modeling and climate modeling; in climate, it appears especially in the oceans and atmosphere.
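To give a flavor of that shared mathematics, the incompressible Navier–Stokes momentum equation below is a common starting point for fluid turbulence in both regimes (a simplified form; production weapons and climate codes add compressibility, rotation, stratification, and other physics):

\[ \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\mathbf{u} = -\frac{1}{\rho}\nabla p + \nu \nabla^{2}\mathbf{u} + \mathbf{f} \]

where \(\mathbf{u}\) is the fluid velocity, \(p\) the pressure, \(\rho\) the density, \(\nu\) the kinematic viscosity, and \(\mathbf{f}\) body forces such as gravity.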
To fully understand what’s going on in both regimes, our model simulations must move smoothly across scales, like a telephoto lens zooming in and out. That’s not so easy. It requires continued innovation in supercomputing and unique modeling techniques. In one ongoing Lab project, for instance, researchers in weapons and climate are collaborating to adapt aerospace modeling techniques to ocean models so simulations can bridge scales seamlessly.
Here’s how that works. Models slice up the world into 3D grids, and then compute what’s happening in each grid cell. The Lab’s ocean model, part of the Department of Energy’s (DOE’s) broader Energy Exascale Earth System Model (E3SM), must capture a zoo of physical phenomena, such as instabilities in waves, that are hard to examine in sufficient detail globally or even regionally. Scale-resolving techniques can break this impasse by increasing the resolution only in the grid cells containing these wave instabilities, rather than requiring costly high resolution across the entire globe, as sketched below.
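As a toy illustration of the idea, here is a minimal sketch in Python. It is not E3SM’s actual mesh machinery: the field, the instability indicator, the threshold, and the refinement factor are all hypothetical stand-ins chosen to show the principle of refining only where the physics demands it.

```python
import numpy as np

# Hypothetical sketch of scale-resolving refinement: flag coarse grid
# cells where a wave-instability proxy -- here, simply the local gradient
# magnitude -- is large, and subdivide only those cells.

n = 32
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x)
sst = np.sin(2 * np.pi * Y) + 5.0 * (X > 0.5)   # smooth field plus a sharp front

gy, gx = np.gradient(sst)                        # finite-difference slopes
indicator = np.hypot(gx, gy)                     # crude instability proxy
mask = indicator > 1.0                           # cells needing refinement

# Each flagged coarse cell becomes a 4x4 patch of fine cells; a real model
# would interpolate initial values and couple patches to their neighbors.
patches = [(i, j, np.full((4, 4), sst[i, j]))
           for i, j in zip(*np.nonzero(mask))]
print(f"refining {mask.sum()} of {mask.size} cells along the front")
```

Only the handful of cells straddling the front get the expensive fine grid; the smooth open ocean stays coarse, which is the whole point of the approach.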
Unscrambling chaos
Robust, reliable modeling benefits from running multiple simulations with slight changes to the initial conditions. In the butterfly effect of chaos theory, a tiny tweak at the beginning can produce startlingly different results. Statistically analyzing a large number of outcomes therefore helps researchers understand the uncertainty in model-based projections and make better predictions.
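To make that concrete, here is a minimal sketch in Python using the textbook Lorenz-63 system (the classic butterfly-effect model, with Lorenz’s standard parameters) rather than a real climate model; the member count, step size, and perturbation size are illustrative choices, not anything E3SM uses.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the chaotic Lorenz-63 system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

rng = np.random.default_rng(42)
n_members, n_steps = 50, 2000
base = np.array([1.0, 1.0, 1.0])

# Each ensemble member starts from the same state, perturbed by
# roughly one part in a million -- the "tiny tweak."
ensemble = base + 1e-6 * rng.normal(size=(n_members, 3))
for _ in range(n_steps):
    ensemble = np.array([lorenz_step(m) for m in ensemble])

# The spread across members quantifies the uncertainty that grew
# from near-identical starting points.
print("mean final state:", ensemble.mean(axis=0))
print("ensemble spread: ", ensemble.std(axis=0))
```

Despite starting almost identically, the members end up scattered across the attractor; the ensemble mean and spread, not any single run, are what carry the predictive information.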
Los Alamos developed and continues to refine the ocean, sea-ice, and land-ice models within E3SM. To sharpen resolution and run more simulations, achieving greater fidelity in less time, Los Alamos and other DOE labs are migrating the E3SM model to the next generation of supercomputers, dubbed exascale, capable of a quintillion (10^18) calculations per second. Exascale machines will also support extremely large ensembles of simulations with slightly different initial conditions in a move to unscramble the effects of chaos in climate. Coupled with higher-fidelity physics, exascale computing is poised to revolutionize climate modeling.
Conducting model-based experiments on nuclear weapons instead of underground tests is a key piece of national security. So is understanding climate-change impacts. In the weapons regime, modeling and simulation have given the nation confidence in the safety, security, and effectiveness of the nuclear stockpile. In climate, modeling and simulation help us understand where we’re headed, and how we might steer a sustainable path into an uncertain future.