Engineers Develop More Accurate AI Algorithm for Improving Nuclear Reactor Performance

Purdue University research reactor serves as test bed for optimizing performance of small modular reactors.

Rep. Greg Pence, a member of the U.S. House Energy and Commerce Committee (second from right), toured Purdue University Reactor Number One, Indiana’s first and only nuclear reactor, alongside Purdue engineering faculty and research leaders in August 2023. (Image: Purdue University)

To expand the availability of electricity generated from nuclear power, several countries have started developing designs for small modular reactors (SMRs), which could take less time and money to construct compared to existing reactors.

Toward this effort, a study conducted at Purdue University has made progress in enabling artificial intelligence to improve monitoring and control of SMRs, possibly offering a way to further cut costs of their operation and maintenance so that they can be more economically viable.

The study, published in Nature’s Scientific Reports, showed how a machine learning algorithm could rapidly learn about the physics behind a measurement of how steadily a reactor is producing power, and predict changes in this indicator over time with 99 percent accuracy.

The researchers believe that an algorithm like this one, which doesn’t require as much training as other AI methods that have been proposed for predicting a reactor’s performance, might help engineers achieve efficient reactor monitoring and control. It could also allow future operators to monitor and improve a reactor’s performance more effectively over its whole lifespan.

Researchers from Purdue and the U.S. Department of Energy’s (DOE) Argonne National Laboratory conducted this study using measurements from the Purdue University Reactor Number One (PUR-1), the first and only reactor in the U.S. licensed with a fully digital instrumentation and control system.

Most existing reactors have analog systems, limiting how much AI can benefit their operation. SMRs, on the other hand, will have digital gauges and sensors, opening a door for AI to collect real-time data and inform their design and performance more comprehensively.

PUR-1’s digital instrumentation and controls make it an ideal test bed for developing and testing AI algorithms.

“We can harmoniously couple the subatomic world with AI systems that can dig into these numbers and extract all kinds of nuggets of information and knowledge about maintaining and improving the machine,” said Konstantinos Prantikos, the first author of this paper and a graduate research assistant in Purdue’s School of Nuclear Engineering.

Used only for educational and research purposes, PUR-1 is also the site of the first “digital twin” nuclear reactor control system on a university campus, which Purdue and Argonne researchers used to conduct this study. The digital twin is a virtual clone of PUR-1 that lets researchers collect real-time data, realistically simulate the reactor on a computer, and run experiments without affecting the reactor’s operation. The data streams in real time to a virtual program that researchers use to view and analyze it.

“With a digital twin, it’s possible to develop the capability to monitor a reactor remotely. In the future, SMRs could use digital twins to have algorithms running in the background that can predict what’s going to happen in the next minute, in the next hour, and then provide information to the operator to make adjustments,” said Stylianos Chatzidakis, an assistant professor of nuclear engineering at Purdue and associate director of PUR-1, whose research group created the reactor’s digital twin.

The Purdue-Argonne team tested the machine learning algorithm’s ability to monitor and predict fluctuations in the number of neutrons released from the reactor core. These neutrons sustain fission, the nuclear reaction that allows a reactor to produce power. The reaction relies on neutrons to continuously split uranium-235 atoms, releasing a large amount of energy. Nuclear power plants use this energy to generate electricity.
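The neutron-population behavior described here is commonly modeled with the point kinetics equations. The sketch below integrates a one-delayed-group version with forward Euler to show how the neutron level responds to a reactivity change; all parameter values are generic illustrative choices, not PUR-1 data, and this is not the model used in the study.

```python
# Minimal point kinetics sketch: one delayed-neutron group, forward Euler.
# Parameter values are generic illustrative choices, not PUR-1 data.

BETA = 0.0065       # delayed-neutron fraction
LAMBDA_GEN = 1e-4   # neutron generation time (s)
DECAY = 0.08        # delayed-neutron precursor decay constant (1/s)

def simulate(rho, n0=1.0, dt=1e-4, t_end=1.0):
    """Integrate neutron level n and precursor level c for constant reactivity rho."""
    n = n0
    c = BETA * n0 / (LAMBDA_GEN * DECAY)  # precursor equilibrium for rho = 0
    for _ in range(int(t_end / dt)):
        dn = ((rho - BETA) / LAMBDA_GEN) * n + DECAY * c
        dc = (BETA / LAMBDA_GEN) * n - DECAY * c
        n += dt * dn
        c += dt * dc
    return n

# A small positive reactivity insertion makes the neutron population grow;
# zero reactivity leaves it essentially at its steady-state value.
print(simulate(0.001), simulate(0.0))
```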

The algorithm can predict neutron flux for PUR-1 with an error rate below 1 percent on average, which is sufficient accuracy for monitoring a reactor, the researchers report.
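An averaged relative-error figure like the one quoted above is commonly computed as a mean absolute percentage error over a test window. The sketch below shows that generic metric with made-up flux values; the exact metric and data used in the paper may differ.

```python
# Mean absolute percentage error (MAPE) sketch: the kind of averaged
# relative-error figure behind a statement like "error below 1 percent".
# Flux values here are hypothetical, not PUR-1 measurements.

def mape(predicted, measured):
    """Average of |predicted - measured| / |measured|, expressed in percent."""
    terms = [abs(p - m) / abs(m) for p, m in zip(predicted, measured)]
    return 100.0 * sum(terms) / len(terms)

measured_flux = [100.0, 102.0, 105.0, 103.0]   # hypothetical readings
predicted_flux = [100.5, 101.5, 105.5, 102.0]  # hypothetical predictions

print(mape(predicted_flux, measured_flux))  # sub-1-percent average error
```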

This accuracy is due to how comprehensively the algorithm learned about the reactor’s neutron flux levels. The researchers designed the algorithm to learn from a physics model trained on neutron flux level measurements provided via the digital twin. The algorithm then made predictions on how these levels could change.

The team found that this transfer of learning between the physics model and machine learning algorithm not only improves accuracy, but also happens in just a few seconds, significantly cutting down the training time needed to develop an algorithm that can monitor and improve the performance of a reactor.
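The savings from transferring learned weights can be illustrated with a toy example: fit a model to one task, then reuse its fitted weights as the starting point for a similar task and count the gradient-descent iterations saved. This is a generic sketch of warm-starting, not the authors' TL-PINN implementation.

```python
# Toy transfer-learning sketch: gradient descent on a 1-D linear fit.
# Warm-starting from weights fitted to a similar "source" task converges in
# fewer iterations than starting from scratch. Generic illustration only.

XS = [i / 19 for i in range(20)]  # shared input grid on [0, 1]

def targets(slope, intercept):
    return [slope * x + intercept for x in XS]

def fit(ys, w=0.0, b=0.0, lr=0.5, tol=1e-4, max_iters=100_000):
    """Run gradient descent on MSE until loss < tol; return (w, b, iterations)."""
    for it in range(max_iters):
        preds = [w * x + b for x in XS]
        errs = [p - y for p, y in zip(preds, ys)]
        loss = sum(e * e for e in errs) / len(errs)
        if loss < tol:
            return w, b, it
        gw = 2 * sum(e * x for e, x in zip(errs, XS)) / len(errs)
        gb = 2 * sum(errs) / len(errs)
        w -= lr * gw
        b -= lr * gb
    return w, b, max_iters

# "Source" task: train from scratch.
w0, b0, iters_scratch = fit(targets(2.0, 1.0))
# Similar "target" task: warm-start from the pre-trained weights.
_, _, iters_transfer = fit(targets(2.2, 1.1), w=w0, b=b0)
print(iters_scratch, iters_transfer)  # warm start needs fewer iterations
```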

This work builds on previous collaborations between Argonne and Purdue. The two entered into an agreement last year that allows certain researchers to share affiliations to both institutions. Under this joint appointment master agreement, designated researchers will have access to facilities and expertise at both the national laboratory and Purdue. Alexander Heifetz, a principal nuclear engineer at Argonne who collaborated with Purdue on this study, is a visiting scholar in Purdue’s School of Nuclear Engineering through this agreement.

This research is funded by DOE’s Advanced Research Projects Agency-Energy and a donation from Goldman Sachs Gives to the AI Systems Lab at Purdue.

This work was performed by researchers from Purdue University and Argonne National Laboratory. For more information, download the Technical Support Package (free white paper), reference ADTTSP-09241, below.



This Brief includes a Technical Support Package (TSP). The TSP, “Physics-informed neural network with transfer learning (TL-PINN)” (reference ADTTSP-09241), is currently available for download from the TSP library.




Aerospace & Defense Technology Magazine

This article first appeared in the September 2024 issue of Aerospace & Defense Technology Magazine.



Overview

The document presents a research study published in Scientific Reports that focuses on improving the prediction of nuclear reactor transients using a novel approach called Physics-informed Neural Network with Transfer Learning (TL-PINN). The study is authored by Konstantinos Prantikos and colleagues, and it aims to enhance the safety and efficiency of nuclear reactors by utilizing advanced machine learning techniques.

The research addresses the challenges associated with predicting transient behaviors in nuclear reactors, which are critical for maintaining operational safety and efficiency. Traditional methods can be time-consuming and may not always provide accurate predictions. To overcome these limitations, the authors propose a TL-PINN framework that integrates physics-based modeling with machine learning, allowing for more accurate and faster predictions of reactor dynamics.

The TL-PINN approach leverages domain similarity measures, enabling the model to be pre-trained on data from one type of transient event. This pre-training significantly reduces the number of training iterations required for new transient predictions, leading to improved performance and efficiency. The study demonstrates that by using transfer learning, the model can effectively adapt to new scenarios with minimal additional training, thus enhancing its predictive capabilities.

The authors provide a detailed methodology, including the formulation of the physics-informed neural network and the implementation of transfer learning techniques. They validate their approach through various experiments, showcasing its effectiveness in predicting neutron density and other key parameters during reactor transients.
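The core idea of a physics-informed loss can be sketched in a few lines: in addition to matching measured data, a candidate solution is penalized for violating the governing equation. The sketch below uses a simple exponential-growth equation as a stand-in for the reactor kinetics and finite differences in place of the automatic differentiation a real PINN would use; it illustrates the loss construction only, not the paper's formulation.

```python
# Physics-informed loss sketch. A candidate trajectory n(t) is scored on
# (a) mismatch against sparse synthetic "measurements" and (b) the residual
# of the governing ODE dn/dt = OMEGA * n, checked by central differences.
# Exponential growth stands in for full reactor kinetics.

import math

OMEGA = 0.5                            # growth rate of the stand-in model
DT = 0.01
TIMES = [i * DT for i in range(101)]   # t in [0, 1]

def pinn_loss(trajectory, data_weight=1.0, physics_weight=1.0):
    # Data term: mismatch at a few sample times against synthetic truth.
    sample_idx = [0, 50, 100]
    data = [math.exp(OMEGA * TIMES[i]) for i in sample_idx]
    data_term = sum((trajectory[i] - d) ** 2
                    for i, d in zip(sample_idx, data)) / len(sample_idx)
    # Physics term: ODE residual at interior points via central differences.
    residuals = [
        (trajectory[i + 1] - trajectory[i - 1]) / (2 * DT) - OMEGA * trajectory[i]
        for i in range(1, len(trajectory) - 1)
    ]
    physics_term = sum(r * r for r in residuals) / len(residuals)
    return data_weight * data_term + physics_weight * physics_term

exact = [math.exp(OMEGA * t) for t in TIMES]   # satisfies the ODE
linear = [1.0 + OMEGA * t for t in TIMES]      # matches data only near t = 0
print(pinn_loss(exact), pinn_loss(linear))     # exact solution scores far lower
```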

Additionally, the document discusses the implications of this research for the nuclear industry, emphasizing the potential for improved monitoring and control of reactor operations. The findings suggest that the TL-PINN framework could serve as a valuable tool for reactor operators, enabling them to respond more effectively to transient events and enhance overall safety.

The study was supported by the U.S. Department of Energy and highlights the collaborative efforts of the research team. The authors declare no competing interests and provide information on code availability for further research and application.

In summary, this research contributes to the field of nuclear engineering by introducing an innovative machine learning approach that combines physics-based insights with advanced neural network techniques, paving the way for more reliable and efficient nuclear reactor monitoring and management.