Solving the linear interval tolerance problem for weight initialization of neural networks

Cited by: 29
Authors
Adam, S. P. [1 ,2 ]
Karras, D. A. [3 ]
Magoulas, G. D. [4 ]
Vrahatis, M. N. [1 ]
Affiliations
[1] Univ Patras, Dept Math, Computat Intelligence Lab, GR-26110 Patras, Greece
[2] Technol Educ Inst Epirus, Dept Comp Engn, Arta 47100, Greece
[3] Technol Educ Inst Sterea Hellas, Dept Automat, Psahna 34400, Evia, Greece
[4] Univ London, Birkbeck Coll, Dept Comp Sci & Informat Syst, London WC1E 7HX, England
Keywords
Neural networks; Weight initialization; Interval analysis; Linear interval tolerance problem; FEEDFORWARD NETWORKS; STATISTICAL TESTS; TRAINING SPEED; HIGH-DIMENSION; BACKPROPAGATION; ALGORITHM; INTELLIGENCE;
DOI
10.1016/j.neunet.2014.02.006
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Determining good initial conditions for an algorithm used to train a neural network is considered a parameter estimation problem dealing with uncertainty about the initial weights. Interval analysis approaches model this uncertainty by representing the parameters as intervals and formulating tolerance problems. Solving a tolerance problem means defining lower and upper bounds of the intervals so that the functionality of the system is guaranteed within predefined limits. The aim of this paper is to show how the problem of determining the initial weight intervals of a neural network can be defined in terms of solving a linear interval tolerance problem. The proposed linear interval tolerance approach copes with uncertainty about the initial weights without any prior knowledge of, or specific assumptions about, the input data, as required by approaches such as fuzzy sets or rough sets. The proposed method is tested on a number of well-known benchmarks for neural networks trained with the back-propagation family of algorithms. Its efficiency is evaluated with regard to standard performance measures, and the results obtained are compared against those of a number of well-known and established initialization methods. These results provide credible evidence that the proposed method outperforms classical weight initialization methods. (C) 2014 Elsevier Ltd. All rights reserved.
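For readers unfamiliar with the tolerance formulation, the following is a minimal sketch (not the authors' algorithm) of how the abstract's idea can be read in code: take the componentwise interval hull of the training inputs as an interval "matrix", fix a desired range [-Δ, Δ] for each hidden neuron's net input (roughly the active region of a sigmoid), and accept a weight vector only if it satisfies the standard componentwise tolerance condition |x_mid · w + b| + x_rad · |w| ≤ Δ. The bound Δ = 4, the rescaling of random candidates, and all function names below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hedged sketch of a linear interval tolerance view of weight initialization.
# Tolerance condition for an interval system [A] w ⊆ [-delta, delta]:
#     |A_mid @ w + bias| + A_rad @ |w| <= delta   (componentwise)
# Here [A] is the interval hull of the input patterns (one interval per
# feature) and [-delta, delta] is the desired range of the neuron's net input.
# DELTA, the rescaling step, and all helper names are illustrative only.

def interval_hull(X):
    """Midpoint and radius of the componentwise interval hull of the data."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    return (lo + hi) / 2.0, (hi - lo) / 2.0

def scale_into_tolerance_set(w, a_mid, a_rad, delta, bias=0.0):
    """Shrink w (and bias) so that w @ x + bias stays in [-delta, delta]
    for every x inside the input box."""
    # Worst-case magnitude of the net input over the input box:
    worst = abs(np.dot(a_mid, w) + bias) + np.dot(a_rad, np.abs(w))
    if worst <= delta:
        return w, bias              # already inside the tolerance set
    s = delta / worst               # positive scaling preserves membership
    return s * w, s * bias

def init_hidden_weights(X, n_hidden, delta=4.0, rng=None):
    """Random candidate weights, each projected into the tolerance set."""
    rng = np.random.default_rng(rng)
    a_mid, a_rad = interval_hull(X)
    W, b = [], []
    for _ in range(n_hidden):
        w = rng.uniform(-1.0, 1.0, size=X.shape[1])
        bias = rng.uniform(-1.0, 1.0)
        w, bias = scale_into_tolerance_set(w, a_mid, a_rad, delta, bias)
        W.append(w)
        b.append(bias)
    return np.array(W), np.array(b)

if __name__ == "__main__":
    X = np.random.default_rng(0).normal(size=(100, 5))
    W, b = init_hidden_weights(X, n_hidden=8, delta=4.0)
    net = X @ W.T + b               # every entry lies within [-4, 4]
    print(net.min(), net.max())
```

The scaling step works because the tolerance condition above is positively homogeneous in (w, bias), so shrinking a candidate by delta/worst places it exactly on the boundary of the tolerance set; any solver of the linear interval tolerance problem could be substituted at that point.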
Pages: 17 - 37
Number of pages: 21
Related papers
50 records in total
  • [21] Solving the Forward Kinematics of Cable-Driven Parallel Robots with Neural Networks and Interval Arithmetic
    Schmidt, Valentin
    Mueller, Bertram
    Pott, Andreas
    COMPUTATIONAL KINEMATICS (CK2013), 2014, 15 : 103 - 110
  • [22] An analysis of weight initialization methods in connection with different activation functions for feedforward neural networks
    Wong, Kit
    Dornberger, Rolf
    Hanne, Thomas
    EVOLUTIONARY INTELLIGENCE, 2024, 17 (03) : 2081 - 2089
  • [23] An application of interval-valued neural networks to a regression problem
    Chetwynd, D.
    Worden, K.
    Manson, G.
    PROCEEDINGS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES, 2006, 462 (2074): : 3097 - 3114
  • [24] An analysis of a class of neural networks for solving linear programming problems
    Chong, EKP
    Hui, S
    Zak, SH
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 1999, 44 (11) : 1995 - 2006
  • [25] An Improved Algorithm for TSP Problem Solving with Hopfield Neural Networks
    An Jinliang
    Gao Jia
    Lei Jinhui
    Gao Guohong
    SMART MATERIALS AND INTELLIGENT SYSTEMS, PTS 1 AND 2, 2011, 143-144 : 538 - +
  • [26] RBF neural networks for solving the inverse problem of backscattering spectra
    Li, Michael M.
    Verma, Brijesh
    Fan, Xiaolong
    Tickle, Kevin
    NEURAL COMPUTING & APPLICATIONS, 2008, 17 (04) : 391 - 397
  • [27] RBF neural networks for solving the inverse problem of backscattering spectra
    Michael M. Li
    Brijesh Verma
    Xiaolong Fan
    Kevin Tickle
    Neural Computing and Applications, 2008, 17 : 391 - 397
  • [28] A weight initialization method based on neural network with asymmetric activation function
    Liu, Jingjing
    Liu, Yefeng
    Zhang, Qichun
    NEUROCOMPUTING, 2022, 483 : 171 - 182
  • [29] An effective SteinGLM initialization scheme for training multi-layer feedforward sigmoidal neural networks
    Yang, Zebin
    Zhang, Hengtao
    Sudjianto, Agus
    Zhang, Aijun
    NEURAL NETWORKS, 2021, 139 : 149 - 157
  • [30] Initialization of neural networks by means of decision trees
    Ivanova, I
    Kubat, M
    KNOWLEDGE-BASED SYSTEMS, 1995, 8 (06) : 333 - 344