Fault and Noise Tolerance in the Incremental Extreme Learning Machine

Citations: 16
Authors
Leung, Ho Chun [1 ]
Leung, Chi Sing [1 ]
Wong, Eric Wing Ming [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Elect Engn, Hong Kong, Peoples R China
Source
IEEE ACCESS, 2019, Vol. 7
Keywords
Single hidden layer network; incremental learning; extreme learning machine; multiplicative noise; open fault; NEURAL-NETWORKS; FEEDFORWARD NETWORKS; ERROR ANALYSIS; DESIGN
DOI
10.1109/ACCESS.2019.2948059
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The extreme learning machine (ELM) is an efficient way to build single-hidden-layer feedforward networks (SLFNs). However, its fault tolerance is weak: when node noise or node failures exist in a network trained under the ELM concept, the network's performance degrades greatly unless a countermeasure is taken, yet countermeasures of this kind for the ELM or for incremental learning are seldom reported. This paper considers the situation in which a trained SLFN suffers from the coexistence of node faults and node noise. We develop two fault-tolerant incremental ELM algorithms for the regression problem, namely the node fault tolerant incremental ELM (NFTI-ELM) and the node fault tolerant convex incremental ELM (NFTCI-ELM). The NFTI-ELM determines the output weight of the newly inserted node only. We prove that the NFTI-ELM converges in terms of the training-set mean squared error (MSE) of faulty SLFNs. Our numerical results show that the NFTI-ELM is superior to the conventional ELM and incremental ELM algorithms under faulty conditions. To further improve performance, we propose the NFTCI-ELM, which not only determines the output weight of the newly inserted node but also updates all previously trained output weights. The NFTCI-ELM likewise converges in terms of the training-set MSE of faulty SLFNs, and it is superior to the NFTI-ELM.
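To make the setting concrete, the following Python sketch illustrates the incremental scheme the abstract describes: hidden nodes are generated at random one at a time, only the newly inserted node's output weight is computed from the current residual, and the training-set MSE of the faulty network is then estimated by Monte-Carlo simulation of open faults and multiplicative node noise. This is a minimal sketch under stated assumptions, not the authors' method: the function nfti_elm_sketch, the fault probability p, the noise variance sigma2, and the 1/(1 + sigma2) shrinkage are illustrative stand-ins, not the NFTI-ELM update rule derived in the paper.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nfti_elm_sketch(X, y, n_nodes=100, sigma2=0.01, seed=0):
    """Grow an SLFN one hidden node at a time, tuning only the new weight.

    Hypothetical illustration: sigma2 is an assumed multiplicative-noise
    variance. The shrinkage factor below follows from a simplified
    single-node analysis and is NOT the update rule derived in the paper.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    residual = y.astype(float).copy()      # current training residual e
    Ws, bs, betas = [], [], []
    for _ in range(n_nodes):
        w = rng.standard_normal(d)         # random input weights (ELM step)
        b = rng.standard_normal()          # random bias
        h = sigmoid(X @ w + b)             # activation vector of the new node
        # Plain I-ELM sets beta = <e, h> / <h, h>. If the node's output is
        # scaled by a Bernoulli(1 - p) open fault and by (1 + noise) with
        # noise variance sigma2, minimizing the expected squared error of
        # this single node shrinks beta by 1 / (1 + sigma2); the open-fault
        # probability p cancels in this simplified analysis.
        beta = (residual @ h) / ((1.0 + sigma2) * (h @ h))
        residual -= beta * h               # only the new weight is trained
        Ws.append(w); bs.append(b); betas.append(beta)
    return np.array(Ws), np.array(bs), np.array(betas)

# Usage: a small noisy regression problem, then a Monte-Carlo estimate of
# the training-set MSE of the *faulty* network (open faults with an assumed
# probability p = 0.05, multiplicative noise variance sigma2 = 0.01).
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.standard_normal(200)

Ws, bs, betas = nfti_elm_sketch(X, y, n_nodes=100, sigma2=0.01)
H = sigmoid(X @ Ws.T + bs)                 # (N, n_nodes) hidden activations
print("fault-free training MSE:", np.mean((y - H @ betas) ** 2))

p, sigma2, trials, mse = 0.05, 0.01, 500, 0.0
for _ in range(trials):
    survive = rng.random(betas.size) >= p              # open-fault mask
    noise = 1.0 + np.sqrt(sigma2) * rng.standard_normal(betas.size)
    mse += np.mean((y - H @ (betas * survive * noise)) ** 2)
print("faulty-network training MSE (Monte Carlo):", mse / trials)

The NFTCI-ELM would additionally adjust all previously trained output weights at each insertion; that convex update step is omitted from this sketch.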
Pages: 155171-155183 (13 pages)
Related Papers
50 records in total
  • [1] Noise/fault aware regularization for incremental learning in extreme learning machines
    Wong, Hiu-Tung
    Leung, Ho-Chun
    Leung, Chi-Sing
    Wong, Eric
    NEUROCOMPUTING, 2022, 486 : 200 - 214
  • [2] Incremental regularized extreme learning machine and it's enhancement
    Xu, Zhixin
    Yao, Min
    Wu, Zhaohui
    Dai, Weihui
    NEUROCOMPUTING, 2016, 174 : 134 - 142
  • [3] Error Minimized Extreme Learning Machine With Growth of Hidden Nodes and Incremental Learning
    Feng, Guorui
    Huang, Guang-Bin
    Lin, Qingping
    Gay, Robert
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, 20 (08): 1352 - 1357
  • [4] Incremental constructive extreme learning machine
    Li, Fan-Jun
    Qiao, Jun-Fei
    Han, Hong-Gui
    Kongzhi Lilun Yu Yingyong/Control Theory and Applications, 2014, 31 (05): 638 - 643
  • [5] Length-Changeable Incremental Extreme Learning Machine
    Wu, You-Xi
    Liu, Dong
    Jiang, He
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2017, 32 (03) : 630 - 643
  • [6] A Class Incremental Extreme Learning Machine for Activity Recognition
    Zhao, Zhongtang
    Chen, Zhenyu
    Chen, Yiqiang
    Wang, Shuangquan
    Wang, Hongan
    COGNITIVE COMPUTATION, 2014, 6 (03) : 423 - 431
  • [7] Incremental Learning for Classification of Unstructured Data Using Extreme Learning Machine
    Madhusudhanan, Sathya
    Jaganathan, Suresh
    Jayashree, L. S.
    ALGORITHMS, 2018, 11 (10)
  • [8] Incremental and Decremental Extreme Learning Machine Based on Generalized Inverse
    Jin, Bo
    Jing, Zhongliang
    Zhao, Haitao
    IEEE ACCESS, 2017, 5 : 20852 - 20865
  • [9] Parallel Chaos Search Based Incremental Extreme Learning Machine
    Yang, Yimin
    Wang, Yaonan
    Yuan, Xiaofang
    NEURAL PROCESSING LETTERS, 2013, 37 (03) : 277 - 301