Analyzing weight distribution of feedforward neural networks and efficient weight initialization

Cited by: 0
Authors
Go, J [1 ]
Baek, B [1 ]
Lee, C [1 ]
Affiliations
[1] Yonsei Univ, BERC, Dept Elect & Elect Engn, Seoul 120749, South Korea
Source
STRUCTURAL, SYNTACTIC, AND STATISTICAL PATTERN RECOGNITION, PROCEEDINGS | 2004, Vol. 3138
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In this paper, we investigate and analyze the weight distribution of feedforward two-layer neural networks in order to understand and improve the time-consuming training process. Training a neural network generally takes a long time, and when a new problem is presented the network must be trained again from scratch, gaining no benefit from previous training. To address this problem, we view the training process as a search for a solution weight point in a weight space and analyze how the solution weight points are distributed in that space. We then propose a weight initialization method that exploits the information on the distribution of the solution weight points. Experimental results show that, in terms of convergence speed, the proposed weight initialization method outperforms the conventional method of initializing weights with a random generator.
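The idea in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' exact procedure: it assumes that solution weight vectors from previously trained networks are available, fits a simple per-weight Gaussian to them, and samples a new initial weight vector from that fitted distribution instead of from a plain uniform random generator.

```python
# Hedged sketch of distribution-based weight initialization (assumed procedure,
# not the paper's exact algorithm): sample new initial weights from the
# empirical distribution of previously found "solution weight points".
import numpy as np

def fit_solution_distribution(solution_weights):
    """Estimate per-weight mean and spread from prior solution weight vectors.

    solution_weights: shape (n_solutions, n_weights); one row per previously
    trained network, flattened into a single weight vector.
    """
    w = np.asarray(solution_weights, dtype=float)
    return w.mean(axis=0), w.std(axis=0) + 1e-8  # epsilon avoids zero spread

def initialize_from_distribution(mean, std, rng=None):
    """Draw one new initial weight vector from the fitted distribution."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.normal(mean, std)

# Usage with three hypothetical solution weight vectors from earlier trainings.
solutions = np.array([[0.8, -1.1, 0.3],
                      [0.9, -0.9, 0.2],
                      [0.7, -1.0, 0.4]])
mean, std = fit_solution_distribution(solutions)
w0 = initialize_from_distribution(mean, std, rng=np.random.default_rng(0))
# w0 starts near the cluster of prior solutions rather than at an arbitrary
# point of the weight space, which is the intuition behind the faster
# convergence reported in the abstract.
```

The function names and the Gaussian model are assumptions made for illustration; the paper may characterize the solution-weight distribution differently.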
Pages: 840-849
Page count: 10
Related papers
50 in total
  • [41] Novel recurrent neural network weight initialization strategy
    Marichal, R.
    Pineiro, J. D.
    Gonzalez, E.
    Torres, J.
    [J]. WCECS 2007: WORLD CONGRESS ON ENGINEERING AND COMPUTER SCIENCE, 2007: 903-908
  • [42] A novel weight initialization method for the random neural network
    Timotheou, Stelios
    [J]. NEUROCOMPUTING, 2009, 73 (1-3): 160-168
  • [43] An initialization method for feedforward artificial neural networks using polynomial bases
    Varnava, Thanasis M.
    Meade, Andrew J., Jr.
    [J]. ADVANCES IN DATA SCIENCE AND ADAPTIVE ANALYSIS, 2011, 3 (03): 385-400
  • [44] ODE System Identification of a Dynamic Weight Acquisition Process Using Feedforward Neural Networks
    Profe, Felix
    Ament, Christoph
    [J]. IFAC PAPERSONLINE, 2022, 55 (20): 31-36
  • [45] Regularization effect of weight initialization in back propagation networks
    Cherkassky, V
    Shepherd, R
    [J]. IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE, 1998: 2258-2261
  • [46] Adaptive Weight Compression for Memory-Efficient Neural Networks
    Ko, Jong Hwan
    Kim, Duckhwan
    Na, Taesik
    Kung, Jaeha
    Mukhopadhyay, Saibal
    [J]. PROCEEDINGS OF THE 2017 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION (DATE), 2017: 199-204
  • [47] Hardware Efficient Weight-Binarized Spiking Neural Networks
    Tang, Chengcheng
    Han, Jie
    [J]. 2023 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION, DATE, 2023
  • [48] Quantitative Measures to Evaluate Neural Network Weight Initialization Strategies
    Ramos, Ernesto Zamora
    Nakakuni, Masanori
    Yfantis, Evangelos
    [J]. 2017 IEEE 7TH ANNUAL COMPUTING AND COMMUNICATION WORKSHOP AND CONFERENCE IEEE CCWC-2017, 2017
  • [49] Weight Agnostic Neural Networks
    Gaier, Adam
    Ha, David
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [50] An Improved Algorithm Using B-Spline Weight Functions for Training Feedforward Neural Networks
    Zhang, Daiyuan
    [J]. ADVANCES IN MECHATRONICS AND CONTROL ENGINEERING, PTS 1-3, 2013, 278-280: 1301-1304