Prediction of burn-up nucleus density based on machine learning

Cited by: 16
Authors
Lei, Ji-Chong [1 ,2 ]
Zhou, Jian-Dong [3 ]
Zhao, Ya-Nan [1 ,2 ]
Chen, Zhen-Ping [1 ,2 ]
Zhao, Peng-Cheng [1 ,2 ]
Xie, Chao [1 ,2 ]
Ni, Zi-Ning [1 ,2 ]
Yu, Tao [1 ,2 ]
Xie, Jin-Sen [1 ,2 ]
Affiliations
[1] Univ South China, Sch Nucl Sci & Technol, Hengyang 421001, Hunan, Peoples R China
[2] Univ South China, Virtual Simulat Expt Teaching Ctr Nucl Energy & T, Hengyang 421001, Hunan, Peoples R China
[3] Shanghai Nucl Engn Res & Design Inst Co Ltd, Shanghai, Peoples R China
Keywords
burn-up; DRAGON; machine learning; multi-layer perceptron; nuclide density; MULTILAYER PERCEPTRON; REGRESSION
DOI
10.1002/er.6660
Chinese Library Classification
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering]
Discipline codes
0807; 0820
Abstract
Machine learning models were built with four algorithms, Linear Regression, Regression Tree, Multi-Layer Perceptron, and Random Forest, each trained on the training set with 10-fold cross-validation. The validity of the four algorithms was verified by predicting the nuclide densities of U-235, U-238, Pu-239, Pu-241, Cs-137, Cm-244, and Nd-254 at different burn-up depths from enrichment and burn-up depth. The results show that the Pearson correlation coefficients of the training sets for all four algorithms under 10-fold cross-validation exceed 0.72, and the evaluation coefficients of the Regression Tree and Random Forest models are better than those of the Multi-Layer Perceptron and Linear Regression. On the test set, however, the Multi-Layer Perceptron model predicts better than the other three, with an average deviation of less than 1%, while the Regression Tree and Random Forest models have average deviations of less than 3%.
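A minimal Python sketch of the workflow the abstract describes: four scikit-learn regressors compared with 10-fold cross-validation on a training set and by average relative deviation on a held-out test set. The synthetic data, feature names (enrichment, burn-up depth), hidden-layer sizes, and the R^2 cross-validation score are illustrative assumptions, not the authors' DRAGON-generated data or reported settings (the paper reports Pearson correlation coefficients).

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score, train_test_split

# Hypothetical stand-in data: inputs are enrichment (wt% U-235) and burn-up
# depth (MWd/kgU); the target mimics the density of one nuclide. In the paper
# these targets come from DRAGON lattice calculations.
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(2.0, 5.0, 500),     # enrichment
                     rng.uniform(0.0, 60.0, 500)])   # burn-up depth
y = X[:, 0] * np.exp(-0.03 * X[:, 1])                # stand-in nuclide density

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

models = {
    "Linear Regression": LinearRegression(),
    "Regression Tree": DecisionTreeRegressor(random_state=0),
    "Multi-Layer Perceptron": MLPRegressor(hidden_layer_sizes=(64, 64),
                                           max_iter=5000, random_state=0),
    "Random Forest": RandomForestRegressor(n_estimators=200, random_state=0),
}

cv = KFold(n_splits=10, shuffle=True, random_state=0)
for name, model in models.items():
    # 10-fold cross-validation on the training set (default R^2 score here).
    scores = cross_val_score(model, X_train, y_train, cv=cv)
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    # Average relative deviation on the test set, as in the abstract.
    avg_dev = np.mean(np.abs(pred - y_test) / np.abs(y_test)) * 100.0
    print(f"{name}: CV score = {scores.mean():.3f}, "
          f"avg. test deviation = {avg_dev:.2f}%")
```

Under this setup, each candidate model is judged twice: once by cross-validation on the training set and once by its deviation on unseen test points, which mirrors the two-stage comparison reported in the abstract.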
Pages: 14052-14061
Number of pages: 10