Incorporation of a Regularization Term to Control Negative Correlation in Mixture of Experts

Cited by: 0
Authors
Saeed Masoudnia
Reza Ebrahimpour
Seyed Ali Asghar Abbaszadeh Arani
Affiliations
[1] University of Tehran, School of Mathematics, Statistics and Computer Science
[2] Shahid Rajaee Teacher Training University, Brain & Intelligent Systems Research Laboratory, Department of Electrical and Computer Engineering
[3] Institute for Research in Fundamental Sciences (IPM), School of Cognitive Sciences (SCS)
Source
Neural Processing Letters | 2012, Vol. 36
Keywords
Neural networks ensemble; Hybrid ensemble method; Mixture of experts; Negative correlation learning; Mixture of negatively correlated experts;
DOI
Not available
Abstract
Combining accurate neural networks (NNs) whose errors are negatively correlated in an ensemble greatly improves generalization ability. Mixture of experts (ME) is a popular combining method that employs a specialized error function to train NN experts simultaneously and thereby produce negatively correlated experts. Although ME produces such experts, it lacks the explicit control parameter that the negative correlation learning (NCL) method provides for adjusting the degree of negative correlation. In this study, an approach is proposed to introduce this advantage of NCL into the training algorithm of ME, yielding the mixture of negatively correlated experts (MNCE). In the proposed method, NCL's control parameter is incorporated into the error function of ME, which enables the training algorithm to establish a better balance in the bias-variance-covariance trade-off and thus improves generalization ability. The proposed hybrid ensemble method, MNCE, is compared with its constituent methods, ME and NCL, on several benchmark problems. The experimental results show that the proposed ensemble method significantly outperforms the original ensemble methods.
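The abstract does not reproduce the error function itself. As a point of reference, below is a minimal NumPy sketch of the standard NCL penalty term that MNCE builds on, with the explicit control parameter lambda; all names here (ncl_expert_losses, lam) are illustrative, and the paper's actual MNCE formulation embeds this penalty in ME's gating-weighted error function rather than in a plain ensemble average.

    import numpy as np

    def ncl_expert_losses(expert_outputs, target, lam):
        """Per-expert NCL loss for one sample: squared error plus a
        lambda-weighted penalty that rewards deviation from the ensemble
        mean, which is what induces negative error correlation.

        expert_outputs : (M,) array, one prediction per expert
        target         : scalar desired output
        lam            : NCL control parameter (lam = 0 -> independent training)
        """
        f_bar = expert_outputs.mean()                    # simple-average ensemble output
        sq_error = 0.5 * (expert_outputs - target) ** 2  # accuracy term
        # For expert i the NCL penalty is
        #   p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar) = -(f_i - f_bar)^2,
        # so moving away from the ensemble mean lowers the loss, pushing the
        # experts' errors toward negative correlation.
        penalty = -(expert_outputs - f_bar) ** 2
        return sq_error + lam * penalty

    # Illustrative use: three experts predicting a single target
    losses = ncl_expert_losses(np.array([0.9, 1.1, 1.4]), target=1.0, lam=0.5)

Setting lam = 0 recovers independently trained experts; increasing lam trades individual accuracy for ensemble diversity, which is the bias-variance-covariance balance the abstract refers to.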
Pages: 31 - 47
Page count: 16
Related Papers
50 items in total
  • [1] Incorporation of a Regularization Term to Control Negative Correlation in Mixture of Experts
    Masoudnia, Saeed
    Ebrahimpour, Reza
    Arani, Seyed Ali Asghar Abbaszadeh
    NEURAL PROCESSING LETTERS, 2012, 36 (01) : 31 - 47
  • [2] Dropout regularization in hierarchical mixture of experts
    Irsoy, Ozan
    Alpaydin, Ethem
    NEUROCOMPUTING, 2021, 419 : 148 - 156
  • [3] A Proposal for Mixture of Experts with Entropic Regularization
    Peralta, Billy
    Saavedra, Ariel
    Caro, Luis
    2017 XLIII LATIN AMERICAN COMPUTER CONFERENCE (CLEI), 2017
  • [4] Regularization and error bars for the mixture of experts network
    Ramamurti, V
    Ghosh, J
    1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, 1997 : 221 - 225
  • [5] Mixture of Experts with Entropic Regularization for Data Classification
    Peralta, Billy
    Saavedra, Ariel
    Caro, Luis
    Soto, Alvaro
    ENTROPY, 2019, 21 (02)
  • [6] Combining features of negative correlation learning with mixture of experts in proposed ensemble methods
    Masoudnia, Saeed
    Ebrahimpour, Reza
    Arani, Seyed Ali Asghar Abbaszadeh
    APPLIED SOFT COMPUTING, 2012, 12 (11) : 3539 - 3551
  • [7] Team Deep Mixture of Experts for Distributed Power Control
    Zecchin, Matteo
    Gesbert, David
    Kountouris, Marios
    PROCEEDINGS OF THE 21ST IEEE INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (IEEE SPAWC2020), 2020
  • [8] Control of Correlation in Negative Correlation Learning
    Liu, Yong
    Zhao, Qiangfu
    Pei, Yan
    2014 10TH INTERNATIONAL CONFERENCE ON NATURAL COMPUTATION (ICNC), 2014 : 7 - 11
  • [9] Discontinuity-Sensitive Optimal Control Learning by Mixture of Experts
    Tang, Gao
    Hauser, Kris
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019 : 7892 - 7898
  • [10] A mixture of experts network structure construction algorithm for modelling and control
    Hong, X
    Harris, CJ
    APPLIED INTELLIGENCE, 2001, 16 (01) : 59 - 69