Robust PACm: Training Ensemble Models Under Misspecification and Outliers

Cited by: 3
Authors
Zecchin, Matteo [1 ]
Park, Sangwoo [1 ]
Simeone, Osvaldo [1 ]
Kountouris, Marios [2 ]
Gesbert, David [2 ]
Affiliations
[1] King's College London, Department of Engineering, King's Communications, Learning & Information Processing (KCLIP) Lab, London WC2R 2LS, England
[2] EURECOM, Commun Syst Dept, F-06410 Sophia Antipolis, France
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Bayes methods; Pollution measurement; Standards; Europe; Training; Robustness; Predictive models; Bayesian learning; ensemble models; machine learning; misspecification; outliers; robustness; BAYESIAN-INFERENCE;
DOI
10.1109/TNNLS.2023.3295168
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Standard Bayesian learning is known to have suboptimal generalization capabilities under misspecification and in the presence of outliers. Probably approximately correct (PAC)-Bayes theory demonstrates that the free energy criterion minimized by Bayesian learning is a bound on the generalization error of Gibbs predictors (i.e., of single models drawn at random from the posterior) under the assumption that the sampling distribution is uncontaminated by outliers. This viewpoint explains the limitations of Bayesian learning when the model is misspecified, in which case ensembling is required, and when the data are affected by outliers. In recent work, PAC-Bayes bounds, referred to as PACm, were derived to introduce free energy metrics that account for the performance of ensemble predictors, yielding enhanced performance under misspecification. This work presents a novel robust free energy criterion that combines the generalized logarithm score function with the PACm ensemble bounds. The proposed free energy training criterion produces predictive distributions that concurrently counteract the detrimental effects of misspecification (with respect to both the likelihood and the prior distribution) and of outliers.
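The two ingredients named in the abstract can be illustrated with a minimal sketch: the generalized (t-)logarithm, whose boundedness for t < 1 tempers the penalty that low-likelihood outlier samples exert on training, and an m-sample loss that averages the likelihoods of m posterior samples inside the logarithm, scoring the ensemble rather than a single Gibbs predictor. This is only a sketch under stated assumptions, not the paper's implementation; the function names and the default t are illustrative, and the full criterion in the paper also includes a KL-type regularization term omitted here.

```python
import numpy as np

def log_t(x, t):
    """Generalized (t-)logarithm: (x^(1-t) - 1) / (1 - t).
    Recovers log(x) as t -> 1; for t < 1 it is bounded below,
    so outliers with vanishing likelihood incur a finite penalty."""
    if t == 1.0:
        return np.log(x)
    return (x ** (1.0 - t) - 1.0) / (1.0 - t)

def robust_m_sample_loss(likelihoods, t=0.9):
    """PACm-style m-sample loss with the generalized logarithm.
    `likelihoods` holds p(y | x, theta_j) for m models theta_j drawn
    from the posterior; averaging inside the (t-)logarithm scores the
    ensemble predictive distribution, not a single sampled model."""
    return -log_t(np.mean(likelihoods), t)
```

Note how the t-logarithm bounds the outlier penalty: log_t(1e-6, 0.5) is roughly -2, whereas log(1e-6) is roughly -13.8, so a single contaminated sample cannot dominate the training criterion.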
Pages: 16518-16532
Page count: 15