ToPs: Ensemble Learning With Trees of Predictors

Cited by: 9
Authors
Yoon, Jinsung [1 ]
Zame, William R. [2 ,3 ]
van der Schaar, Mihaela [4 ]
Affiliations
[1] Univ Calif Los Angeles, Dept Elect Engn, Los Angeles, CA 90095 USA
[2] Univ Calif Los Angeles, Dept Math, Los Angeles, CA 90095 USA
[3] Univ Calif Los Angeles, Dept Econ, Los Angeles, CA 90095 USA
[4] Univ Oxford, Dept Engn Sci, Oxford OX1 3PJ, England
Keywords
Ensemble learning; model tree; personalized predictive models; regression
DOI
10.1109/TSP.2018.2807402
Chinese Library Classification: TM (Electrical engineering); TN (Electronics and communication technology)
Discipline codes: 0808; 0809
Abstract
We present a new approach to ensemble learning. Our approach differs from previous approaches in that it constructs and applies different predictive models to different subsets of the feature space. It does this by constructing a tree of subsets of the feature space and associating a predictor (predictive model) to each node of the tree; we call the resulting object a tree of predictors. The (locally) optimal tree of predictors is derived recursively; each step involves jointly optimizing the split of the terminal nodes of the previous tree and the choice of learner (from among a given set of base learners) and training set (hence predictor) for each set in the split. The features of a new instance determine a unique path through the optimal tree of predictors; the final prediction aggregates the predictions of the predictors along this path. Thus, our approach uses base learners to create complex learners that are matched to the characteristics of the data set while avoiding overfitting. We establish loss bounds for the final predictor in terms of the Rademacher complexity of the base learners. We report the results of a number of experiments on a variety of datasets, showing that our approach provides statistically significant improvements over a wide variety of state-of-the-art machine learning algorithms, including various ensemble learning methods.
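The construction described in the abstract can be sketched in code. The sketch below is a deliberately simplified illustration, not the authors' implementation: it uses a fixed median split on the first feature (where the paper jointly optimizes the split and the learners), selects each node's predictor by in-sample MSE (where the paper uses a validation-based criterion), and aggregates by an unweighted average along the root-to-leaf path. All names (`Node`, `fit_mean`, `fit_linear`, `min_size`) are illustrative.

```python
import numpy as np

def fit_mean(X, y):
    # baseline learner: predict the training-set mean everywhere
    m = float(np.mean(y))
    return lambda X_: np.full(len(X_), m)

def fit_linear(X, y):
    # least-squares linear model with an intercept term
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return lambda X_: np.hstack([X_, np.ones((len(X_), 1))]) @ w

BASE_LEARNERS = [fit_mean, fit_linear]

class Node:
    """One node of the tree; holds a predictor for its subset of feature space."""
    def __init__(self, X, y, depth=0, max_depth=2, min_size=20):
        self.model = self._fit_best(X, y)
        self.split = None      # (feature index, threshold) once the node splits
        self.children = None
        if depth < max_depth and len(y) >= 2 * min_size:
            self._try_split(X, y, depth, max_depth, min_size)

    def _fit_best(self, X, y):
        # pick the base learner with the lowest in-sample MSE on this subset
        # (a stand-in for the paper's validation-based choice)
        best, best_err = None, np.inf
        for fit in BASE_LEARNERS:
            model = fit(X, y)
            err = float(np.mean((model(X) - y) ** 2))
            if err < best_err:
                best, best_err = model, err
        return best

    def _try_split(self, X, y, depth, max_depth, min_size):
        # placeholder split rule: median of the first feature (the paper
        # jointly optimizes the split together with the learners on each side)
        j = 0
        thr = float(np.median(X[:, j]))
        left = X[:, j] <= thr
        if min_size <= left.sum() <= len(y) - min_size:
            self.split = (j, thr)
            self.children = (
                Node(X[left], y[left], depth + 1, max_depth, min_size),
                Node(X[~left], y[~left], depth + 1, max_depth, min_size),
            )

    def predict_one(self, x):
        # the features of x determine a unique root-to-leaf path; aggregate
        # the predictions of the predictors along that path
        preds, node = [], self
        while node is not None:
            preds.append(float(node.model(x.reshape(1, -1))[0]))
            if node.split is None:
                node = None
            else:
                j, thr = node.split
                node = node.children[0] if x[j] <= thr else node.children[1]
        return float(np.mean(preds))
```

Each node thus carries its own predictor fitted to its subset of the feature space, and a test point is scored by every predictor on its path rather than by a single leaf model, which is the key difference from a standard model tree.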
Pages: 2141-2152
Page count: 12