ToPs: Ensemble Learning With Trees of Predictors

Cited by: 9
Authors
Yoon, Jinsung [1 ]
Zame, William R. [2 ,3 ]
van der Schaar, Mihaela [4 ]
Affiliations
[1] Univ Calif Los Angeles, Dept Elect Engn, Los Angeles, CA 90095 USA
[2] Univ Calif Los Angeles, Dept Math, Los Angeles, CA 90095 USA
[3] Univ Calif Los Angeles, Dept Econ, Los Angeles, CA 90095 USA
[4] Univ Oxford, Dept Engn Sci, Oxford OX1 3PJ, England
Keywords
Ensemble learning; model tree; personalized predictive models; regression
DOI
10.1109/TSP.2018.2807402
Chinese Library Classification (CLC)
TM [Electrical technology]; TN [Electronic technology, communication technology];
Discipline code
0808; 0809;
Abstract
We present a new approach to ensemble learning. Our approach differs from previous approaches in that it constructs and applies different predictive models to different subsets of the feature space. It does this by constructing a tree of subsets of the feature space and associating a predictor (predictive model) to each node of the tree; we call the resulting object a tree of predictors. The (locally) optimal tree of predictors is derived recursively; each step involves jointly optimizing the split of the terminal nodes of the previous tree and the choice of learner (from among a given set of base learners) and training set, hence predictor, for each set in the split. The features of a new instance determine a unique path through the optimal tree of predictors; the final prediction aggregates the predictions of the predictors along this path. Thus, our approach uses base learners to create complex learners that are matched to the characteristics of the data set while avoiding overfitting. We establish loss bounds for the final predictor in terms of the Rademacher complexity of the base learners. We report the results of a number of experiments on a variety of datasets, showing that our approach provides statistically significant improvements over a wide variety of state-of-the-art machine learning algorithms, including various ensemble learning methods.
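The construction described in the abstract can be sketched in a few lines of Python. This is an illustrative simplification, not the authors' implementation: the paper jointly optimizes splits together with training sets and learns validation-based aggregation weights along the path, whereas this sketch splits at feature medians and averages the path predictions uniformly. All class and function names (`MeanLearner`, `LinearLearner`, `fit_node`, `grow`, `predict_one`) are hypothetical.

```python
import numpy as np

class MeanLearner:
    """Constant base learner: predicts the training mean."""
    def fit(self, X, y):
        self.mu = y.mean()
        return self
    def predict(self, X):
        return np.full(len(X), self.mu)

class LinearLearner:
    """Ordinary least-squares base learner with an intercept."""
    def fit(self, X, y):
        A = np.c_[X, np.ones(len(X))]          # append intercept column
        self.w, *_ = np.linalg.lstsq(A, y, rcond=None)
        return self
    def predict(self, X):
        return np.c_[X, np.ones(len(X))] @ self.w

def fit_node(X, y, learners):
    """Pick the base learner with the lowest squared loss on this subset."""
    best, best_loss = None, np.inf
    for make in learners:
        m = make().fit(X, y)
        loss = np.mean((m.predict(X) - y) ** 2)
        if loss < best_loss:
            best, best_loss = m, loss
    return best, best_loss

def grow(X, y, learners, depth=0, max_depth=2, min_size=10):
    """Recursively split the feature space, keeping a split only when the
    children's chosen predictors jointly beat the parent's predictor."""
    model, loss = fit_node(X, y, learners)
    node = {"model": model, "split": None, "children": None}
    if depth == max_depth or len(y) < 2 * min_size:
        return node
    best = None
    for j in range(X.shape[1]):                # candidate split: median of feature j
        thr = np.median(X[:, j])
        left, right = X[:, j] <= thr, X[:, j] > thr
        if left.sum() < min_size or right.sum() < min_size:
            continue
        _, ll = fit_node(X[left], y[left], learners)
        _, rl = fit_node(X[right], y[right], learners)
        total = (left.sum() * ll + right.sum() * rl) / len(y)
        if total < loss and (best is None or total < best[0]):
            best = (total, j, thr, left, right)
    if best is not None:
        _, j, thr, left, right = best
        node["split"] = (j, thr)
        node["children"] = (
            grow(X[left], y[left], learners, depth + 1, max_depth, min_size),
            grow(X[right], y[right], learners, depth + 1, max_depth, min_size),
        )
    return node

def predict_one(node, x):
    """Follow x's unique root-to-leaf path; average every predictor on it."""
    preds = []
    while node is not None:
        preds.append(node["model"].predict(x[None, :])[0])
        if node["split"] is None:
            node = None
        else:
            j, thr = node["split"]
            node = node["children"][0] if x[j] <= thr else node["children"][1]
    return float(np.mean(preds))
```

On a piecewise-constant target (e.g. y near 1 for x <= 0.5 and near 3 otherwise), the root settles on a global linear fit while the children specialize to each region, and the path average recovers predictions close to each plateau.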
Pages: 2141-2152
Page count: 12