SIMULATION


Supercomputer aids SMR simulation


Understanding physical behaviour inside an operating nuclear reactor can be enhanced with simulations on a supercomputer, says Jared Sagoff


Scientists hoping to build new generations of small modular reactors (SMRs) need to be able to design and understand the behaviour of these reactors in simulated environments before they can be constructed. Large-scale, high-resolution models yield information that can drive down the costs of building a new, intrinsically safe nuclear reactor. Scientists at the US Department of Energy's (DOE) Argonne National Laboratory (ANL) have collaborated to develop a new computer model that allows for the visualisation of a full reactor core at unprecedented resolution. The aim of the project, which is conducted under the auspices of the DOE's Exascale Computing Project (ExaSMR), is to carry out full-core multi-physics simulations on upcoming cutting-edge exascale supercomputers. This includes Aurora, which is scheduled to arrive at Argonne in 2022.


Jared Sagoff is coordinating writer/editor at Argonne National Laboratory


An update on the progress achieved was published in April in the journal Nuclear Engineering and Design, which the team hopes will inspire researchers to further integrate high-fidelity numerical simulations into actual engineering designs.


Modelling in more detail

In a nuclear reactor, the whirls and eddies of coolant that flow around the fuel pins play a critical role in determining the reactor's thermal-hydraulic performance. They also give nuclear engineers much-needed information about how best to design future nuclear reactor systems, both for normal operation and for stress tolerance. A typical light water reactor core is made up of nuclear fuel assemblies, each containing several hundred individual fuel pins, which in turn are made up of fuel pellets. Until now, limitations in raw computing power have constrained models to particular regions of the core. Now, however, the model can resolve all the individual pins in one of the first ever full-core nuclear reactor simulations.
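A rough count gives a sense of why pin-resolved full-core simulation has been out of reach. The numbers below are illustrative assumptions, not figures from the study: a standard 17x17 lattice carries 264 fuel pins per assembly, while the assembly count and the per-pin mesh size are hypothetical placeholders for an SMR-class core.

```python
# Back-of-envelope scale of a pin-resolved full-core CFD model.
# All counts are illustrative assumptions, not the ExaSMR figures.

ASSEMBLIES = 37          # assumed core size for an SMR-class design
PINS_PER_ASSEMBLY = 264  # 17x17 lattice: 289 positions minus 25 guide/instrument tubes
CELLS_PER_PIN = 50_000   # assumed mesh cells needed to resolve flow around one pin

total_pins = ASSEMBLIES * PINS_PER_ASSEMBLY
total_cells = total_pins * CELLS_PER_PIN

print(f"{total_pins:,} pins -> ~{total_cells:,} mesh cells")
```

Even with these modest placeholder numbers the cell count lands near half a billion, which is why earlier studies modelled only sub-regions of the core.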


“As we advance towards exascale computing, we will see more opportunities to reveal large-scale dynamics of these complex structures in regimes that were previously inaccessible, giving us real information that can reshape how we approach the challenges in reactor designs,” said Argonne nuclear engineer Jun Fang, an author of the study, which was conducted by the ExaSMR teams at Argonne and Professor Elia Merzari's group at Pennsylvania State University.

A key aspect of SMR fuel assembly modelling is the presence of spacer grids. These grids play an important role in pressurised water reactors, such as the SMR under consideration, as they create turbulence structures and enhance the ability of the flow to remove heat from the fuel rods.
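The idea of representing spacer grids by their overall effect on the flow, rather than meshing their geometry, can be illustrated with a much simpler one-dimensional analogue. The sketch below is a hypothetical toy, not the ExaSMR method: it treats each grid as a concentrated pressure loss in a channel momentum balance, with made-up coefficients throughout.

```python
# Toy 1D analogue of modelling spacer grids as momentum sinks rather
# than resolved geometry. All coefficients are illustrative guesses.

def pressure_profile(length_m, n_cells, rho, u, d_h, f_darcy,
                     grid_positions, k_grid):
    """Axial pressure along a coolant channel, in Pa relative to inlet.

    Distributed wall friction uses the Darcy-Weisbach relation;
    each spacer grid adds a concentrated loss K * rho * u^2 / 2
    in the cell containing it, standing in for its real geometry.
    """
    dx = length_m / n_cells
    dyn = 0.5 * rho * u * u  # dynamic pressure, Pa
    p = [0.0]
    for i in range(n_cells):
        x0, x1 = i * dx, (i + 1) * dx
        dp = f_darcy * dx / d_h * dyn  # distributed wall friction
        # concentrated loss for any spacer grid located in this cell
        dp += sum(k_grid * dyn for g in grid_positions if x0 <= g < x1)
        p.append(p[-1] - dp)
    return p

profile = pressure_profile(
    length_m=2.0, n_cells=200, rho=740.0, u=3.0, d_h=0.012,
    f_darcy=0.02, grid_positions=[0.5, 1.0, 1.5], k_grid=1.0,
)
total_drop = -profile[-1]  # total pressure drop over the channel, Pa
```

The real model works in three-dimensional turbulent flow, but the design choice is the same: the source term reproduces the grids' net effect on the momentum balance, so the mesh no longer has to resolve their fine structure.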


Instead of creating a computational grid resolving all the local geometric details, the researchers developed a mathematical mechanism to reproduce the overall impact of these structures on the coolant flow without sacrificing accuracy. In doing so, the researchers could successfully scale up the related computational fluid dynamics (CFD) simulations to an entire SMR core for the first time.

“The mechanisms by which the coolant mixes throughout the core remain regular and relatively consistent. This enables us to leverage high-fidelity simulations of the turbulent flows in a section of the core to enhance the accuracy of our core-wide computational approach,” said Argonne principal nuclear engineer Dillon Shaver.

The technical expertise exhibited by the ExaSMR teams is built upon Argonne's history of breakthroughs in related research fields such as nuclear engineering and computational sciences. Several decades ago, a group of Argonne scientists, led by Paul Fischer, pioneered a CFD flow solver software package called Nek5000, which was transformative because it allowed users to simulate engineering fluid problems with up to one million parallel threads. Recently, Nek5000 has been re-engineered into a new solver called NekRS that uses the power of graphics processing units (GPUs) to increase the computational speed of the model. “Having codes designed for this particular purpose gives us the ability to take full advantage of the raw computing power the supercomputer offers us,” Fang said.

The team's computations were carried out on supercomputers at the Argonne Leadership Computing Facility (ALCF), Oak Ridge Leadership Computing Facility (OLCF), and Argonne's Laboratory Computing Resource Center (LCRC). The ALCF and OLCF are DOE Office of Science User Facilities. The research is supported by the Exascale Computing Project, a collaborative effort of DOE's Office of Science and the National Nuclear Security Administration. ■

Above: A supercomputer at the ALCF. Photo credit: Argonne National Laboratory

32 | November 2021 | www.neimagazine.com

