Bound the Parameters of Neural Networks Using Particle Swarm Optimization

Times Cited: 1
Authors
Tsoulos, Ioannis G. [1 ]
Tzallas, Alexandros [1 ]
Karvounis, Evangelos [1 ]
Tsalikakis, Dimitrios [2 ]
Affiliations
[1] Univ Ioannina, Dept Informat & Telecommun, Ioannina 45110, Greece
[2] Univ Western Macedonia, Dept Engn Informat & Telecommun, Kozani 50100, Greece
Keywords
global optimization; local optimization; stochastic methods; evolutionary techniques; termination rules; WEIGHT INITIALIZATION METHOD; TRAINING ALGORITHM; NUMERICAL-SOLUTION; PREDICTION; RECOGNITION; CLASSIFICATION; CONSTRUCTION; EXTRACTION; DISCOVERY; DIAGNOSIS;
DOI
10.3390/computers12040082
CLC Number
TP39 [Computer Applications]
Discipline Code
081203; 0835
Abstract
Artificial neural networks are machine learning models widely used in many sciences as well as in practical applications. The basic element of these models is a vector of parameters whose values must be estimated by some computational method; this process is called training. Effective training of the network often relies on computational methods from the field of global minimization. However, for global minimization techniques to be effective, the bounds within which the objective function is minimized should be clearly defined. In this paper, a two-stage global optimization technique is presented for the efficient training of artificial neural networks. In the first stage, bounds for the neural network parameters are estimated using Particle Swarm Optimization; in the second stage, the parameters of the network are optimized within the bounds of the first stage using global optimization techniques. The suggested method was applied to a series of well-known problems from the literature, and the experimental results were more than encouraging.
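The abstract outlines a two-stage scheme: PSO first localizes a promising box for the network parameters, and a second global optimizer then searches only inside that box. The sketch below is not taken from the paper; it illustrates the idea with a deliberately small numpy network, and the PSO settings, the safety margin on the bounds, and the multistart hill climber used as the second stage are all illustrative assumptions rather than the authors' actual choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic regression task (assumption; the paper uses benchmark datasets).
X = rng.uniform(-1.0, 1.0, size=(64, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

H = 5                                  # hidden nodes (illustrative)
DIM = H * (X.shape[1] + 2)             # weights, biases and output weights

def mse(w):
    """Training error of the one-hidden-layer network encoded by vector w."""
    W1 = w[:2 * H].reshape(H, 2)
    b1 = w[2 * H:3 * H]
    W2 = w[3 * H:]
    hidden = np.tanh(X @ W1.T + b1)
    return float(np.mean((hidden @ W2 - y) ** 2))

# Phase 1: standard PSO inside a deliberately wide box; the spread of the
# particles' personal bests is then turned into per-parameter bounds.
def pso_bounds(n_particles=30, iters=200, box=10.0, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-box, box, size=(n_particles, DIM))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([mse(p) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, DIM))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -box, box)
        f = np.array([mse(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    margin = 0.5                       # small safety margin (assumption)
    return pbest.min(axis=0) - margin, pbest.max(axis=0) + margin

lo, hi = pso_bounds()

# Phase 2: multistart stochastic hill climbing restricted to the PSO bounds;
# this is only a stand-in for the global optimization techniques mentioned
# in the abstract, whose exact form is not specified here.
best_f = np.inf
for _ in range(20):
    w_cur = rng.uniform(lo, hi)
    f_cur = mse(w_cur)
    for _ in range(500):
        cand = np.clip(w_cur + rng.normal(0.0, 0.05, DIM), lo, hi)
        f_cand = mse(cand)
        if f_cand < f_cur:
            w_cur, f_cur = cand, f_cand
    best_f = min(best_f, f_cur)

print("training MSE after phase 2:", best_f)
```

The point of the first phase is simply that the second-stage optimizer searches a much smaller box than the naive wide one, which is where the claimed efficiency gain of the two-stage approach comes from.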
Pages: 17