Progressive Ensemble Kernel-Based Broad Learning System for Noisy Data Classification

Cited by: 24
Authors
Yu, Zhiwen [1 ]
Lan, Kankan [1 ]
Liu, Zhulin [1 ]
Han, Guoqiang [1 ]
Institutions
[1] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510640, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Kernel; Learning systems; Noise measurement; Feature extraction; Training; Biological neural networks; Uncertainty; Broad learning system (BLS); ensemble learning; kernel learning; noisy data; RIDGE-REGRESSION; NEURAL-NETWORK; MACHINE; MODEL; REPRESENTATIONS; APPROXIMATION; RECOGNITION; CLASSIFIERS; SELECTION;
DOI
10.1109/TCYB.2021.3064821
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
The broad learning system (BLS) is an algorithm that facilitates feature representation learning and data classification. Although the weights of BLS are obtained by analytical computation, which brings better generalization and higher efficiency, BLS suffers from two drawbacks: 1) its performance depends on the number of hidden nodes, which requires manual tuning, and 2) the double random mappings introduce uncertainty, which leads to poor resistance to noisy data as well as unpredictable effects on performance. To address these issues, a kernel-based BLS (KBLS) method is proposed that projects the feature nodes obtained from the first random mapping into a kernel space. This projection reduces the uncertainty, which improves performance with a fixed number of hidden nodes and removes the need for manual tuning. Moreover, to further improve the stability and noise resistance of KBLS, a progressive ensemble framework is proposed in which the residual of the previous base classifiers is used to train the following base classifier. We conduct comparative experiments against existing state-of-the-art hierarchical learning methods on multiple noisy real-world datasets. The experimental results indicate that our approaches achieve the best, or at least comparable, performance in terms of accuracy.
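The progressive ensemble idea described in the abstract, where each new base classifier is trained on the residual left by its predecessors, can be sketched as follows. This is a minimal illustration only, not the paper's implementation: it assumes RBF kernel ridge regression base learners and one-hot class targets, and the names `rbf_kernel`, `KernelRidgeStage`, `gamma`, and `lam` are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

class KernelRidgeStage:
    """One base learner: kernel ridge regression fit on the current residual."""
    def __init__(self, gamma=1.0, lam=1e-2):
        self.gamma, self.lam = gamma, lam

    def fit(self, X, residual):
        # Closed-form ridge solution in kernel space: (K + lam*I) alpha = r
        self.X_train = X
        K = rbf_kernel(X, X, self.gamma)
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), residual)
        return self

    def predict(self, X):
        return rbf_kernel(X, self.X_train, self.gamma) @ self.alpha

def progressive_ensemble_fit(X, Y, n_stages=3, gamma=1.0, lam=1e-2):
    """Train stages sequentially; each fits what the previous stages missed."""
    stages, residual = [], Y.astype(float).copy()
    for _ in range(n_stages):
        stage = KernelRidgeStage(gamma, lam).fit(X, residual)
        residual = residual - stage.predict(X)  # pass residual forward
        stages.append(stage)
    return stages

def progressive_ensemble_predict(stages, X):
    """Ensemble output is the sum of all stage predictions."""
    return sum(stage.predict(X) for stage in stages)
```

Predicted class labels are then obtained by taking the argmax over the summed one-hot scores; the analytic (ridge) solve per stage mirrors the abstract's point that BLS-style weights come from closed-form computation rather than iterative training.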
Pages: 9656-9669
Page count: 14