A novel learning algorithm of single-hidden-layer feedforward neural networks

Cited by: 1
Authors
Pu, Dong-Mei [1 ]
Gao, Da-Qi [1 ]
Ruan, Tong [1 ]
Yuan, Yu-Bo [1 ,2 ]
Affiliations
[1] East China Univ Sci & Technol, Sch Informat Sci & Engn, Dept Comp Sci & Engn, Shanghai 200237, Peoples R China
[2] Zhejiang Ocean Univ, Key Lab Oceanog Big Data Min & Applicat Zhejiang, Zhoushan 316022, Zhejiang, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2017, Vol. 28
Funding
National High Technology Research and Development Program of China (863 Program);
Keywords
Neural networks; Iteration methods; Data classification; Data regression; Optimization; Algorithms; MACHINE;
DOI
10.1007/s00521-016-2372-y
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The single-hidden-layer feedforward neural network (SLFN) is an effective model for data classification and regression. However, it has a significant drawback: training an SLFN is rather time-consuming. To shorten the learning time, a non-iterative learning algorithm, the extreme learning machine (ELM), was proposed. Its main idea is that the input weights and biases are chosen randomly, while the output weights are computed with a pseudo-inverse matrix. However, ELM also has an important drawback: because of this randomness, it does not yield a stable solution across different runs. In this paper, we propose a stabilized learning algorithm based on iteration correction. The convergence analysis shows that the proposed algorithm finishes the learning process in fewer steps than the number of neurons. Three theorems and their proofs establish that the proposed algorithm is stable. Several data sets are selected from the UCI repository, and the experimental results demonstrate that the proposed algorithm is effective.
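The ELM idea summarized in the abstract (random input weights and biases, output weights from a pseudo-inverse) can be illustrated with a minimal NumPy sketch. This is only an illustration under assumed details (sigmoid activation; the function names elm_train and elm_predict are invented here), not the authors' stabilized, iteration-corrected algorithm.

import numpy as np

def elm_train(X, T, n_hidden, seed=0):
    # X: (n_samples, n_features) inputs, T: (n_samples, n_outputs) targets.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # randomly chosen input weights
    b = rng.standard_normal(n_hidden)                  # randomly chosen biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))             # hidden-layer output matrix (sigmoid)
    beta = np.linalg.pinv(H) @ T                       # output weights via Moore-Penrose pseudo-inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

Because W and b are drawn at random, repeated calls with different seeds generally give different solutions, which is the instability the paper's iteration-corrected algorithm is meant to address.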
Pages: S719 - S726
Number of pages: 8
Related papers
50 items in total
  • [31] On Theoretical Analysis of Single Hidden Layer Feedforward Neural Networks with Relu Activations
    Shen, Guorui
    Yuan, Ye
    2019 34TH YOUTH ACADEMIC ANNUAL CONFERENCE OF CHINESE ASSOCIATION OF AUTOMATION (YAC), 2019, : 706 - 709
  • [32] Training Single Hidden Layer Feedforward Neural Networks by Singular Value Decomposition
    Huynh, Hieu Trung
    Won, Yonggwan
    ICCIT: 2009 FOURTH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCES AND CONVERGENCE INFORMATION TECHNOLOGY, VOLS 1 AND 2, 2009, : 1300 - 1304
  • [33] The Spectrum of the Fisher Information Matrix of a Single-Hidden-Layer Neural Network
    Pennington, Jeffrey
    Worah, Pratik
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [34] Single-hidden-layer feed-forward quantum neural network based on Grover learning
    Liu, Cheng-Yi
    Chen, Chein
    Chang, Ching-Ter
    Shih, Lun-Min
    NEURAL NETWORKS, 2013, 45 : 144 - 150
  • [35] Feedforward Neural Networks with a Hidden Layer Regularization Method
    Alemu, Habtamu Zegeye
    Wu, Wei
    Zhao, Junhong
    SYMMETRY-BASEL, 2018, 10 (10):
  • [37] Adaptive output feedback control of uncertain nonlinear systems using single-hidden-layer neural networks
    Hovakimyan, N
    Nardi, F
    Calise, A
    Kim, N
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2002, 13 (06): 1420 - 1431
  • [38] Fading channel modelling using single-hidden layer feedforward neural networks
    Liu, Junbiao
    Jin, Xinyu
    Dong, Fang
    He, Liang
    Liu, Hong
    Multidimensional Systems and Signal Processing, 2017, 28 : 885 - 903
  • [39] DNA microarray classification with compact single hidden-layer feedforward neural networks
    Huynh, Hieu Trung
    Kim, Jung-Ja
    Won, Yonggwan
    PROCEEDINGS OF THE FRONTIERS IN THE CONVERGENCE OF BIOSCIENCE AND INFORMATION TECHNOLOGIES, 2007, : 193 - +
  • [40] On Sharpness of Error Bounds for Univariate Approximation by Single Hidden Layer Feedforward Neural Networks
    Goebbels, Steffen
    RESULTS IN MATHEMATICS, 2020, 75 (03)