Tree boosting for learning EFT parameters

Cited by: 7
Authors
Chatterjee, Suman [1 ]
Frohner, Nikolaus [2 ]
Lechner, Lukas [1 ]
Schoefbeck, Robert [1 ]
Schwarz, Dennis [1 ]
Institutions
[1] Inst High Energy Phys HEPHY, Austrian Acad Sci OAW, Nikolsdorfer Gasse 18, A-1050 Vienna, Austria
[2] TU Wien, Karlsplatz 13, A-1040 Vienna, Austria
Funding
Austrian Science Fund (FWF);
Keywords
LHC; Physics beyond the standard model; Machine learning; Effective field theory; Boosted decision trees; Fisher information;
DOI
10.1016/j.cpc.2022.108385
Chinese Library Classification
TP39 [Applications of computers];
Discipline classification codes
081203; 0835;
Abstract
We present a new tree boosting algorithm designed for the measurement of parameters in the context of effective field theory (EFT). To construct the algorithm, we interpret the optimized loss function of a traditional decision tree as the maximal Fisher information in Poisson counting experiments. We promote this interpretation to general EFT predictions and develop a suitable boosting method. The resulting "Boosted Information Tree" algorithm approximates the score, the derivative of the log-likelihood function with respect to the parameter. It thus provides a sufficient statistic in the vicinity of a reference point in parameter space where the estimator is trained. The training exploits per-event likelihood-ratio information for different theory parameter values available in simulated EFT data sets.

Program summary
Program title: BIT (Boosted Information Trees)
CPC Library link to program files: https://doi.org/10.17632/9fjyb5hyxt.1
Developer's repository link: https://github.com/HephyAnalysisSW/BIT
Licensing provisions: GPLv3
Programming language: Python 2 and Python 3
Nature of problem: Providing a discriminator for parameter estimation in the context of the standard model effective field theory.
Solution method: A tree-based algorithm exploits "augmented" information in the simulated training data set to regress the score function, thereby providing a sufficient test statistic for an EFT parameter.
(c) 2022 Elsevier B.V. All rights reserved.
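The abstract describes regressing the score, i.e. the derivative of the per-event log-likelihood with respect to the theory parameter, with boosted trees, using per-event targets available from the simulator. The sketch below illustrates that idea on a one-dimensional toy model (an exponential density with rate parameter theta, where the score at theta = 1 is known to be 1 - x). It uses generic least-squares boosting of decision stumps, not the paper's Fisher-information-based node splitting; the toy setup and all names are illustrative assumptions, not the BIT implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: p(x|theta) = theta * exp(-theta * x), reference point theta0 = 1.
# Per-event score: t(x) = d/dtheta log p(x|theta) = 1/theta - x, i.e. 1 - x at theta0.
x = rng.exponential(scale=1.0, size=5000)
score = 1.0 - x  # "augmented" per-event target provided by the simulator

def fit_stump(x, r):
    """Depth-1 regression tree: pick the quantile threshold minimizing SSE."""
    best = None
    for thr in np.quantile(x, np.linspace(0.05, 0.95, 19)):
        left = x <= thr
        vl, vr = r[left].mean(), r[~left].mean()
        sse = ((r[left] - vl) ** 2).sum() + ((r[~left] - vr) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, thr, vl, vr)
    return best[1:]  # (threshold, left value, right value)

def boost(x, y, n_rounds=100, lr=0.1):
    """Least-squares gradient boosting: each stump fits the current residuals."""
    stumps, pred = [], np.zeros_like(y)
    for _ in range(n_rounds):
        thr, vl, vr = fit_stump(x, y - pred)
        stumps.append((thr, vl, vr))
        pred += lr * np.where(x <= thr, vl, vr)
    return stumps

def predict(stumps, x, lr=0.1):
    out = np.zeros_like(x)
    for thr, vl, vr in stumps:
        out += lr * np.where(x <= thr, vl, vr)
    return out

stumps = boost(x, score)
pred = predict(stumps, x)
mse = float(np.mean((pred - score) ** 2))
```

The boosted ensemble approximates the score function as a sum of piecewise-constant stumps, so in this toy the prediction tracks 1 - x closely in the bulk of the distribution; the actual BIT algorithm replaces the least-squares split criterion with one derived from the Fisher information of Poisson counting experiments.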
Pages: 9