A weight initialization method for improving training speed in feedforward neural network

Cited by: 125
Authors
Yam, JYF [1]
Chow, TWS [1]
Affiliation
[1] City Univ Hong Kong, Dept Elect Engn, Tat Chee Ave, Kowloon, Peoples R China
Keywords
initial weights determination; feedforward neural networks; backpropagation; linear least squares; Cauchy inequality;
DOI
10.1016/S0925-2312(99)00127-7
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
An algorithm for determining the optimal initial weights of feedforward neural networks, based on Cauchy's inequality and a linear algebraic method, is developed. The algorithm is computationally efficient. The proposed method ensures that the outputs of the neurons are in the active region and increases the rate of convergence. With the optimal initial weights determined, the initial error is substantially smaller and the number of iterations required to achieve the error criterion is significantly reduced. Extensive tests were performed to compare the proposed algorithm with other algorithms. In the case of sunspot prediction, the number of iterations required for the network initialized with the proposed method was only 3.03% of that required by the next best weight initialization algorithm. (C) 2000 Elsevier Science B.V. All rights reserved.
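The abstract's core idea can be sketched in code: use the Cauchy-Schwarz inequality to bound each neuron's worst-case pre-activation, then rescale the random initial weights so that bound stays inside the sigmoid's active region. This is a hedged illustration only, not the paper's exact algorithm; the active-region limit `s = 4.59` and the uniform draw in `[-1, 1]` are illustrative assumptions.

```python
import numpy as np

def init_weights_active_region(n_in, n_out, x_max=1.0, s=4.59, rng=None):
    """Sketch of active-region weight initialization (assumed details).

    Draw uniform weights, then rescale each neuron's incoming weight
    vector so that the Cauchy-Schwarz bound on its pre-activation,
    |w . x| <= ||w|| * ||x||, never exceeds s, a point where the
    logistic sigmoid's derivative is still non-negligible.
    """
    rng = np.random.default_rng() if rng is None else rng
    W = rng.uniform(-1.0, 1.0, size=(n_out, n_in))
    # With each input component bounded by x_max, ||x|| <= x_max * sqrt(n_in).
    bound = x_max * np.sqrt(n_in)
    # Rescale rows so ||w|| * bound == s, keeping |net| within the active region.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W * (s / (norms * bound))
```

After this rescaling the initial pre-activations cannot saturate the sigmoid, which is the mechanism the abstract credits for the smaller initial error and faster convergence.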
Pages: 219-232
Page count: 14
Related Papers
50 records in total
  • [31] Batch gradient training method with smoothing regularization for l0 feedforward neural networks
    Zhang, Huisheng
    Tang, Yanli
    Liu, Xiaodong
    NEURAL COMPUTING & APPLICATIONS, 2015, 26 (02) : 383 - 390
  • [33] Deterministic convergence of chaos injection-based gradient method for training feedforward neural networks
    Zhang, Huisheng
    Zhang, Ying
    Xu, Dongpo
    Liu, Xiaodong
    COGNITIVE NEURODYNAMICS, 2015, 9 (03) : 331 - 340
  • [34] On the solutions of one-class-one-outputed feedforward neural network classifiers using backpropagation training
    Huang, DS
    ICSP '98: 1998 FOURTH INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING, PROCEEDINGS, VOLS I AND II, 1998, : 1301 - 1305
  • [35] Dynamic node creation and fast learning algorithm for a hybrid feedforward neural network
    Xia, HF
    Dai, LK
    2002 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-4, PROCEEDINGS, 2002, : 202 - 205
  • [36] A simple feedforward convolutional conceptor neural network for classification
    Qian, Guangwu
    Zhang, Lei
    APPLIED SOFT COMPUTING, 2018, 70 : 1034 - 1041
  • [37] COMPARISON OF FEEDFORWARD AND RECURRENT NEURAL NETWORK LANGUAGE MODELS
    Sundermeyer, M.
    Oparin, I.
    Gauvain, J. -L.
    Freiberg, B.
    Schlueter, R.
    Ney, H.
    2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 8430 - 8434
  • [38] Batch gradient method with smoothing L1/2 regularization for training of feedforward neural networks
    Wu, Wei
    Fan, Qinwei
    Zurada, Jacek M.
    Wang, Jian
    Yang, Dakun
    Liu, Yan
    NEURAL NETWORKS, 2014, 50 : 72 - 78
  • [39] Solving the linear interval tolerance problem for weight initialization of neural networks
    Adam, S. P.
    Karras, D. A.
    Magoulas, G. D.
    Vrahatis, M. N.
    NEURAL NETWORKS, 2014, 54 : 17 - 37
  • [40] A Novel Evolutionary Feedforward Neural Network with Artificial Immunology
    Gong, Xinbao
    Zang, Xiaogang
    Zhou, Xilang
    Journal of Shanghai Jiaotong University, 2003, (01) : 40 - 42