By Bob Webster and Nancy Jo Nicholas
Thirty years ago, on September 23, 1992, the United States conducted its 1,054th nuclear weapons test. When that test, named Divider, was detonated underground in the Nevada desert that morning, no one knew it would be the last U.S. test for at least the next three decades. By then, the Soviet Union had formally dissolved, and the United States government issued what was then seen as a short-term moratorium on testing, one that continues today.
This moratorium came with an unexpected benefit: the end of nuclear testing ushered in a revolution in high-performance computing, with wide-ranging but little-known impacts on national and global security. The need to maintain our nuclear weapons in the absence of testing drove an unprecedented requirement for increased scientific computing power.
At Los Alamos National Laboratory in New Mexico, where the first atomic bomb was built, our primary mission is to maintain and verify the safety and reliability of the nuclear stockpile. We do this using nonnuclear and subcritical experiments coupled with advanced computer modeling and simulations to evaluate the health and extend the lifetimes of America’s nuclear weapons.
Read the full story as it appeared in Federal Times.
LA-UR-22-29580