A hybrid ensemble method with negative correlation learning for regression

Times Cited: 4
Authors
Bai, Yun [1 ,2 ]
Tian, Ganglin [3 ]
Kang, Yanfei [1 ]
Jia, Suling [1 ]
Affiliations
[1] Beihang Univ, Sch Econ & Management, Beijing 100191, Peoples R China
[2] MINES Paris PSL Univ, Ctr Proc Renewable Energies & Energy Syst PERSEE, Sophia Antipolis, France
[3] IMT Atlantique, Fac Microwave Observat Perspect Environm, F-29280 Plouzane, France
Funding
National Natural Science Foundation of China;
Keywords
Hybrid ensemble; Diversity; Negative correlation learning; Optimization; CLASSIFIERS; ALGORITHM;
DOI
10.1007/s10994-023-06364-3
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Hybrid ensembles, an essential branch of ensemble learning, have flourished in the regression field, and studies confirm the importance of diversity. However, previous ensembles introduce diversity only in the sub-model training stage, yielding limited improvement over single models. In contrast, this study automatically selects and weights sub-models from a heterogeneous model pool by solving an optimization problem with an interior-point filter line-search algorithm. The objective function innovatively incorporates negative correlation learning (NCL) as a penalty term, with which a diverse subset of models can be selected. The best sub-models from each model class are selected to build the NCL ensemble, whose performance is better than the simple average and other state-of-the-art weighting methods. The NCL ensemble can be further improved by adding a regularization term to the objective function. In practice, model uncertainty makes it difficult to determine in advance the optimal sub-model for a given dataset; regardless, our method achieves accuracy comparable to that of the potentially optimal sub-models. In conclusion, the value of this study lies in its ease of use and effectiveness, allowing the hybrid ensemble to embrace both diversity and accuracy.
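The weighting scheme described in the abstract can be sketched as a small constrained optimization: minimize the ensemble's squared error while an NCL-style penalty rewards sub-models that disagree around the ensemble output. This is a minimal illustrative sketch, not the paper's implementation: the sub-model predictions are synthetic, the penalty strength `lam` is an arbitrary assumed value, and SciPy's SLSQP solver stands in for the interior-point filter line-search algorithm the paper uses.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy setup: 4 hypothetical sub-models predicting 200 targets
n_models, n_samples = 4, 200
y = rng.normal(size=n_samples)
preds = y + 0.3 * rng.normal(size=(n_models, n_samples))  # noisy sub-model outputs

def ncl_objective(w, preds, y, lam=0.2):
    """Ensemble MSE minus an NCL diversity reward (assumed form)."""
    fbar = w @ preds                      # weighted ensemble prediction
    mse = np.mean((fbar - y) ** 2)        # accuracy term
    # NCL-style penalty: weighted spread of sub-model deviations around
    # the ensemble output; subtracting it rewards diverse sub-models.
    diversity = np.mean(np.sum(w[:, None] * (preds - fbar) ** 2, axis=0))
    return mse - lam * diversity

w0 = np.full(n_models, 1.0 / n_models)    # start from the simple average
res = minimize(
    ncl_objective, w0, args=(preds, y),
    method="SLSQP",                       # stand-in for interior-point filter line-search
    bounds=[(0.0, 1.0)] * n_models,
    constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0},
)
weights = res.x                           # selected sub-model weights on the simplex
```

Sub-models whose weight is driven to zero are effectively deselected, which mirrors the paper's idea of picking a diverse subset from a heterogeneous pool rather than averaging everything.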
Pages: 3881-3916 (36 pages)