Machine learning and Bayesian inference in nuclear fusion research: an overview

Cited: 25
Authors
Pavone, A. [1 ]
Merlo, A. [1 ]
Kwak, S. [1 ]
Svensson, J. [1 ]
Affiliations
[1] Max Planck Inst Plasma Phys, Wendelsteinstr 1, D-17491 Greifswald, Germany
Keywords
machine learning; Bayesian inference; neural networks; nuclear fusion; deep learning; data analysis; DISRUPTION PREDICTOR; MAGNETIC CONTROL; NEURAL-NETWORKS; TOKAMAK; PHYSICS; JET; MITIGATION; FRAMEWORK; PLASMAS
DOI
10.1088/1361-6587/acc60f
Chinese Library Classification
O35 [Fluid mechanics]; O53 [Plasma physics]
Discipline codes
070204; 080103; 080704
Abstract
This article reviews applications of Bayesian inference and machine learning (ML) in nuclear fusion research. Current and next-generation nuclear fusion experiments require analysis and modelling efforts that integrate different models consistently and exploit information from heterogeneous data sources efficiently. Model-based Bayesian inference provides a framework well suited to interpreting observed data given physics and probabilistic assumptions, even for very complex systems, thanks to its rigorous and straightforward treatment of uncertainties and modelling hypotheses. ML, in particular neural networks and deep learning models, is based on black-box statistical models and handles large volumes of data and computation very efficiently. For this reason, approaches that use ML and Bayesian inference both separately and in conjunction are of particular interest for today's experiments and are the main topic of this review. The article also presents an approach in which physics-based Bayesian inference and black-box ML work together, mitigating each other's drawbacks: the former is made more efficient, the latter more interpretable.
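As a minimal illustration of the model-based Bayesian inference the abstract describes, the sketch below compares a noisy measurement against a parametrised forward model and evaluates the posterior over one parameter on a grid. This is a hypothetical toy example, not from the article: the model, parameter names, and values are all invented for illustration.

```python
import numpy as np

# Toy stand-in for an expensive physics forward model, e.g. the prediction
# of a diagnostic signal given one plasma parameter theta (hypothetical).
def forward_model(theta, x):
    return theta * np.exp(-x**2)

rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 20)
theta_true, sigma = 3.0, 0.1
data = forward_model(theta_true, x) + rng.normal(0.0, sigma, x.size)

# Grid-based Bayesian inference: p(theta | d) is proportional to
# p(d | theta) p(theta), with a flat prior on [0, 6] and Gaussian noise.
thetas = np.linspace(0.0, 6.0, 601)
log_like = np.array([-0.5 * np.sum((data - forward_model(t, x)) ** 2) / sigma**2
                     for t in thetas])
post = np.exp(log_like - log_like.max())      # flat prior: posterior tracks likelihood
post /= post.sum() * (thetas[1] - thetas[0])  # normalise on the grid

theta_map = thetas[np.argmax(post)]           # maximum a posteriori estimate
```

In the combined scheme the review discusses, an expensive physics `forward_model` would be replaced by a neural-network surrogate trained on its simulations, so that the many likelihood evaluations become cheap while the probabilistic treatment of uncertainties is retained.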
Pages: 32
References
187 in total
[91] Lister, J. B.; Schnurrenberger, H. Fast nonlinear extraction of plasma equilibrium parameters using a neural network mapping. Nuclear Fusion, 1991, 31(7): 1291-1300.
[92] Louppe, G., 2017. Advances in Neural Information Processing Systems, Vol. 30.
[93] Lu, Y.; Jiang, K.; Levine, J. A.; Berger, M. Compressive neural representations of volumetric scalar fields. Computer Graphics Forum, 2021, 40(3): 135-146.
[94] Lundberg, S. M., 2017. Advances in Neural Information Processing Systems, Vol. 30.
[95] MacKay, D. J. C. Information-based objective functions for active data selection. Neural Computation, 1992, 4(4): 590-604.
[96] MacKay, D. J. C. A practical Bayesian framework for backpropagation networks. Neural Computation, 1992, 4(3): 448-472.
[97] Mallat, S. Understanding deep convolutional networks. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2016, 374(2065).
[98] Manduchi, G.; Luchetta, A.; Taliercio, C.; Rigoni, A. Big data requirements in current and next fusion research experiments. 2018 IEEE International Symposium on Circuits and Systems (ISCAS), 2018.
[99] McConkey, R.; Yee, E.; Lien, F.-S. A curated dataset for data-driven turbulence modelling. Scientific Data, 2021, 8(1).
[100] Mehta, V., 2022, ICML2022 WORKSHOP AD.