Nuclear plants have high up-front costs, complex processes occurring all the way down to the molecular level throughout their decades-long lifetimes, and strict safety criteria. Modelling all the parameters and predicting the outcomes has traditionally begun with theory and observation, followed by simulations whose results are fed back into the next round of theories, a cycle repeated until the results are validated. The quality of those results, applied to plant operation and design, affects costs, lifetimes and safety. Dan Yurman looks at two new approaches using artificial intelligence that can significantly improve predictive power. The first is VERA (Virtual Environment for Reactor Applications), which has just been licensed for commercial use. Its use could improve performance and extend the lifetimes of the current reactor fleet. Modelling includes nucleate boiling, corrosion deposits on fuel rods, pellet expansion, and the performance of reactor parts exposed to high temperatures and radiation. Meanwhile, Argonne National Laboratory is using AI to create fast-running models of various nuclear thermal-hydraulic processes. Traditional methods need hundreds to thousands of repeated analyses, which carry a high computational burden. Yurman explains that AI should allow analysts not only to get to the answer faster but to dig deeper into the data and improve results for existing and new plants.
A software package, 10 years in the making, that can predict the behaviour of nuclear reactors’ cores with stunning accuracy has been licensed commercially for the first time.
The non-profit Electric Power Research Institute (EPRI) is the first organisation to hold a commercial licence for the Virtual Environment for Reactor Applications, or VERA, a set of tools developed by the U.S. Department of Energy’s Consortium for the Advanced Simulation of Light Water Reactors (CASL).
“EPRI, one of our core CASL industry partners, now has the right to use VERA to perform services for its member utilities,” said Dave Kropaczek, CASL director.
CASL is a partnership of the DOE national laboratories, universities and nuclear industry companies working together to find solutions to specific challenges of efficiently operating nuclear reactors. Based at Oak Ridge National Laboratory and established in 2010, CASL was the first DOE Energy Innovation Hub.
The VERA software suite is a collection of interfacing codes that can simulate reactor core behaviour from the large scale down to the molecular scale.
“By licensing VERA to EPRI, CASL is delivering a first step in handing its work off to industry,” Kropaczek said.
“EPRI’s mission is to advance safe, reliable, affordable and environmentally responsible electricity,” said Erik Mader, Technical Executive with EPRI Nuclear Fuels and Executive Director of the CASL Industry Council.
Operating performance, safety margins, transient behaviour
“VERA’s coupled multiphysics modelling and simulation tools can be used to better inform operating performance, safety margins and transient behaviour in nuclear power plants. This could improve plant operator decision-making, reduce uncertainty and accelerate innovation in nuclear energy.”
As the 10-year CASL project winds down this spring, the program has established the VERA Users Group, which provides training, ongoing support and access to DOE’s high-performance computing resources to perform large-scale simulations.
Improved performance, longer lifetimes for the reactor fleet
VERA provides advanced modelling and simulation capabilities to help address several challenges, leading to improved performance and longer lifetimes for the current reactor fleet. These include predictions of departure from nucleate boiling; growth of corrosion deposits on fuel rods; stress caused by pellet expansion; and performance of reactor parts when exposed to high temperatures and radiation.
Last year, CASL brought the VERA software suite up to Nuclear Quality Assurance-1 level in preparation for widespread industry use. The NQA-1 rating, the gold standard for the nuclear industry, signifies extensive efforts in the areas of procedures, training and software control.
Argonne: AI to improve safety, design of Advanced Nuclear Reactors
Argonne National Laboratory is integrating decades of knowledge with the latest artificial intelligence (AI) methods and tools. Doing so can help researchers better understand the mechanics that govern nuclear reactors, which reactor designers and analysts can use to improve their design, operation and safety.
Machine learning enables systems to learn automatically from patterns in data and to make better searches, decisions or predictions. Nuclear engineer Acacia Brunett and other researchers in Argonne’s Nuclear Science and Engineering division are using machine learning methods to generate fast-running models of various nuclear thermal-hydraulic processes.
They are exploring behaviour that includes the mixing and flow of coolants as well as thermal stratification, which describes the changes in temperature that emerge within liquids held in large vessels generally under low-flow conditions.
These processes can be difficult to accurately predict without significant computational burden. But they can heavily affect reactor safety and performance.
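The idea of a "fast-running model" can be illustrated with a toy sketch (not Argonne's actual tooling): run an expensive simulator only a handful of times, fit a cheap surrogate to those results, then use the surrogate for the many repeated evaluations. The `expensive_simulation` function and its values here are purely illustrative stand-ins.

```python
import numpy as np

def expensive_simulation(flow_rate):
    """Stand-in for a costly thermal-hydraulic code: returns a
    notional outlet temperature (deg C) for a coolant flow rate."""
    return 300.0 + 50.0 * np.exp(-flow_rate) + 5.0 * np.sin(flow_rate)

# Run the "high-fidelity" code only a few times to generate training data.
train_x = np.linspace(0.5, 4.0, 8)
train_y = expensive_simulation(train_x)

# Fit a cubic polynomial surrogate -- a fast-running model that can then be
# evaluated thousands of times at negligible cost.
coeffs = np.polyfit(train_x, train_y, deg=3)
surrogate = np.poly1d(coeffs)

# Check the surrogate against the expensive code inside the trained range.
test_x = np.linspace(0.6, 3.9, 50)
max_err = np.max(np.abs(surrogate(test_x) - expensive_simulation(test_x)))
print(f"max surrogate error: {max_err:.3f} C")
```

Real surrogates for thermal-hydraulics use neural networks or Gaussian processes over many inputs rather than a one-dimensional polynomial, but the trade-off is the same: a few expensive runs buy a model cheap enough for repeated analysis.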
For example, when temperatures vary across layers of liquid within a pool, that condition can lead to thermal fatigue, a process that can degrade components in a reactor. This shortens the overall lifetime of the component or reactor as a whole. It could also weaken the safety features of certain kinds of advanced reactors. With methods to explore these phenomena, researchers can create a framework for more rapid and comprehensive design and analysis of these issues.
Argonne researchers are investigating ways of using machine learning to more quickly measure uncertainty, which reveals how confident they can be in their predictions.
“Predictive simulations all have some amount of uncertainty, which are features or characteristics that we don’t know exactly,” Brunett said.
“Examples could include the material properties of manufactured components, such as thickness, emissivity (how much heat surfaces emit), or some other physical phenomena. It’s our responsibility to understand what those uncertainties are, which is typically a very arduous process.”
The process takes time because it typically requires hundreds to thousands of repeated analyses, and in some cases, several high-fidelity simulations, which carry a high computational burden. Brunett and others are exploring ways to create and use machine learning models to make this analysis more efficient and reduce the total time required to quantify uncertainty and optimise design.
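The "hundreds to thousands of repeated analyses" come from propagating uncertain inputs through a model many times, as in Monte Carlo sampling. A minimal sketch, using a toy response function and illustrative values (not a real reactor model), shows the shape of the calculation:

```python
import random
import statistics

def peak_clad_temperature(thickness_mm, emissivity):
    """Toy response model (illustrative only): peak temperature rises
    with component thickness and falls as surfaces radiate more heat."""
    return 600.0 + 40.0 * thickness_mm - 150.0 * emissivity

random.seed(42)

# Uncertain manufacturing inputs -- e.g. thickness and emissivity --
# represented as distributions rather than single fixed values.
samples = []
for _ in range(10_000):
    thickness = random.gauss(2.0, 0.05)    # mm
    emissivity = random.uniform(0.7, 0.9)
    samples.append(peak_clad_temperature(thickness, emissivity))

mean_t = statistics.mean(samples)
std_t = statistics.stdev(samples)
print(f"mean peak temperature: {mean_t:.1f} C, spread: {std_t:.1f} C")
```

In practice each evaluation is a high-fidelity simulation rather than a one-line formula, which is exactly what makes thousands of samples expensive and a machine-learned surrogate attractive.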
With machine learning, scientists are analysing large volumes of computational data and identifying the key components which describe the fundamental behaviour of a system.
For example, the behaviour of an advanced reactor was characterised using millions of data points. But with this new method, the system can instead be represented by a few thousand data points. Characterising the system’s response with these methods can reduce the total analysis time while still directly quantifying uncertainties.
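One standard way to identify the "key components" describing a system's fundamental behaviour is proper orthogonal decomposition, essentially principal component analysis of simulation snapshots. The article does not name Argonne's specific method, so the sketch below uses synthetic data that is secretly built from only three underlying modes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "simulation snapshots": 2000 sensor readings per snapshot,
# 200 snapshots, generated from only 3 underlying modes plus small noise.
n_sensors, n_snapshots, n_modes = 2000, 200, 3
modes = rng.standard_normal((n_sensors, n_modes))
weights = rng.standard_normal((n_modes, n_snapshots))
data = modes @ weights + 0.01 * rng.standard_normal((n_sensors, n_snapshots))

# Singular value decomposition of the centred data reveals how many
# modes actually matter: the cumulative "energy" of the singular values.
centred = data - data.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
print(f"variance captured by 3 modes: {energy[2]:.4f}")
```

Keeping only the leading modes replaces the 400,000-entry data matrix with a few thousand numbers while preserving almost all of the system's variability, mirroring the reduction from millions of data points to a few thousand described above.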
Traditional vs. AI-integrated approach
Nuclear experts have traditionally used theory and observation to create models of nuclear processes and run high fidelity simulations with them. They would then compare simulation results against real-world observations, adapt their model accordingly and run simulations all over again, until their model could accurately predict real-world behaviour.
Using machine learning instead, researchers can create comparably accurate models much faster. Unlike the traditional approach, machine learning tools can also, with relatively high accuracy, predict the behaviour of safety-critical features, phenomena or trends that an analyst might otherwise have overlooked.
“High fidelity simulations help us to calculate the micro-details of different nuclear phenomena and then generate the training data to develop machine learning models,” Brunett said.
“Those models can then accurately estimate parameters that define these micro-details, such as mass and energy transport.”
Integrating with System Level Codes
After developing these models, Brunett and others will integrate them directly into Argonne-developed advanced reactor safety analysis tools for further testing. Machine learning models may replace existing models built into today’s system codes, which would improve the predictive capabilities of the software and address its known limitations. With these tools, researchers can continue to improve the design and safety of next-generation technologies, advancing nuclear energy in the U.S.
Dan Yurman is the author of Neutron Bytes and writes on nuclear matters
This article is published with permission