Staff at Lawrence Livermore say it takes their best computers six weeks to simulate what happens inside a warhead when it is going off. Such detailed modelling has only recently become possible. The supercomputers used in the early 1990s, when nuclear testing stopped, would have taken 60,000 years to process the same data...

Comment: These physical tests must also involve very complex ICT applications for data collection and analysis.
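To put those figures in perspective, here is a rough back-of-the-envelope calculation (in Python) of the speedup they imply; the six-week and 60,000-year numbers come from the passage above, and the result is only an order-of-magnitude estimate.

# Rough check of the implied speedup, assuming the figures quoted above:
# six weeks on today's machines vs. 60,000 years on early-1990s machines.
weeks_today = 6
years_1990s = 60_000
weeks_1990s = years_1990s * 52  # approximate weeks per year

speedup = weeks_1990s / weeks_today
print(f"Implied speedup: roughly {speedup:,.0f}x")  # about 520,000x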
The models involved in the winning Livermore/Sandia bid are certainly good enough to recreate the results of earlier tests (a trick known as “hindcasting”). Whether they can accurately forecast things, no one knows for sure. But so-called subcritical tests are allowed by the test-ban treaty, and that may add confidence to the process.
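As a purely illustrative sketch of what hindcasting means in practice, the following Python snippet compares hypothetical model output against hypothetical archived test results and scores the relative error; the test names, numbers and 5% tolerance are invented for illustration and do not come from the article or the actual Livermore/Sandia codes.

# Toy "hindcasting" check: score a model against archived results
# rather than future events. All values below are hypothetical.
recorded = {"test_A": 10.2, "test_B": 4.7, "test_C": 23.1}   # archived observations (invented)
simulated = {"test_A": 10.0, "test_B": 4.9, "test_C": 22.4}  # model output (invented)

tolerance = 0.05  # accept 5% relative error; an arbitrary threshold

for name, observed in recorded.items():
    predicted = simulated[name]
    rel_error = abs(predicted - observed) / observed
    verdict = "within tolerance" if rel_error <= tolerance else "outside tolerance"
    print(f"{name}: relative error {rel_error:.1%} ({verdict})")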
Some of these tests involve smashing or shooting at small shards of plutonium. Blowing up little bits of the metal this way, without compressing them in a symmetrical manner, is allowed because it does not result in a chain reaction. And the chemical-explosive detonator can also be tested using “simulants” that are not fissile but mimic the behaviour of the plutonium pit in other ways. Scientists can thus find out whether the charge would have detonated, had it been made of plutonium.
The fusion stage can also be examined within the rules. An enormous—and enormously expensive—system of lasers called the National Ignition Facility is being built at Livermore. It is designed to cause thermonuclear fusion in tiny pellets of deuterium (so small that they would not be covered by the test-ban treaty) and is expected to be completed in 2009.
The fundamental point, however, is that this work is very computer intensive. JAD