Machine learning of LWR spent nuclear fuel assembly decay heat measurements

Cited by: 12
Authors
Ebiwonjumi, Bamidele [1 ]
Cherezov, Alexey [1 ]
Dzianisau, Siarhei [1 ]
Lee, Deokjung [1 ]
Affiliations
[1] Ulsan Natl Inst Sci & Technol, Dept Nucl Engn, 50 UNIST Gil, Ulsan 44919, South Korea
Keywords
Decay heat; Spent nuclear fuel; Machine learning; Light water reactor; Synthetic data; Uncertainty analysis; POWER PEAKING FACTOR; NEURAL-NETWORKS; NOISE; PWR;
DOI
10.1016/j.net.2021.05.037
CLC classification
TL [Atomic Energy Technology]; O571 [Nuclear Physics];
Discipline codes
0827 ; 082701 ;
Abstract
Measured decay heat data of light water reactor (LWR) spent nuclear fuel (SNF) assemblies are adopted to train machine learning (ML) models. The measured data are available for fuel assemblies irradiated in commercial reactors operated in the United States and Sweden, and come from calorimetric measurements of discharged pressurized water reactor (PWR) and boiling water reactor (BWR) fuel assemblies. 91 PWR and 171 BWR assembly decay heat measurements are used. Because the measurement dataset is small, we propose (i) to use the method of multiple runs and (ii) to generate and use synthetic data, i.e., a larger dataset with statistical characteristics similar to those of the original dataset. Three ML models are developed, based on Gaussian processes (GP), support vector machines (SVM), and neural networks (NN), with four inputs: fuel assembly averaged enrichment, assembly averaged burnup, initial heavy metal mass, and cooling time after discharge. The outcomes of this work are (i) ML models that predict LWR fuel assembly decay heat from the four inputs; (ii) generation and application of synthetic data, which improves the performance of the ML models; and (iii) uncertainty analysis of the ML models and their predictions. (c) 2021 Korean Nuclear Society. Published by Elsevier Korea LLC. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
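The synthetic-data idea in the abstract — enlarging a small measurement set while preserving its statistical characteristics — can be sketched with noise-injection resampling. This is a minimal illustration, not the authors' exact method: the variable names, feature ranges, noise model, and the toy decay-heat proxy below are all assumptions made for the example.

```python
# Sketch: noise-injection augmentation of a small (inputs -> decay heat)
# dataset. Each measured sample of the four inputs (enrichment, burnup,
# heavy-metal mass, cooling time) is replicated with small Gaussian
# perturbations scaled to each feature's spread, so the augmented set
# keeps means/variances close to the original. Toy data, not real SNF data.
import numpy as np

rng = np.random.default_rng(0)

# Toy "measured" PWR-like set: 91 samples, 4 inputs and one output.
n_meas = 91
X = np.column_stack([
    rng.uniform(2.0, 4.5, n_meas),      # enrichment [wt% U-235]
    rng.uniform(10.0, 50.0, n_meas),    # burnup [GWd/tHM]
    rng.uniform(400.0, 470.0, n_meas),  # initial heavy metal mass [kg]
    rng.uniform(5.0, 30.0, n_meas),     # cooling time [years]
])
y = 0.05 * X[:, 1] * X[:, 2] / X[:, 3]  # crude decay-heat proxy [W]

def augment(X, y, copies=10, noise_frac=0.01, rng=rng):
    """Replicate each sample `copies` times, adding zero-mean Gaussian
    noise whose std is `noise_frac` of each feature's std."""
    X_syn = np.repeat(X, copies, axis=0)
    y_syn = np.repeat(y, copies)
    X_syn = X_syn + rng.normal(0.0, noise_frac * X.std(axis=0), X_syn.shape)
    y_syn = y_syn + rng.normal(0.0, noise_frac * y.std(), y_syn.shape)
    return X_syn, y_syn

X_syn, y_syn = augment(X, y)
print(X_syn.shape)                       # (910, 4)
# The synthetic set's per-feature means stay close to the measured set's:
print(np.allclose(X_syn.mean(axis=0), X.mean(axis=0), rtol=0.05))
```

The augmented `(X_syn, y_syn)` pairs could then be fed to any of the regressors named in the abstract (GP, SVM, or NN); training with this kind of injected noise also acts as a regularizer, as discussed in the Bishop (1995) reference cited by the paper.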
Pages: 3563-3579
Page count: 17