Fractional-order global optimal backpropagation machine trained by an improved fractional-order steepest descent method

Cited by: 14
Authors
Pu, Yi-fei [1 ]
Wang, Jian [2 ]
Affiliations
[1] Sichuan Univ, Coll Comp Sci, Chengdu 610065, Peoples R China
[2] China Univ Petr, Coll Sci, Qingdao 266580, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Fractional calculus; Fractional-order backpropagation algorithm; Fractional-order steepest descent method; Mean square error; Fractional-order multi-scale global optimization; O235; N93; Back-propagation; Feedforward networks; Neural networks; Optimization; Convergence
DOI
10.1631/FITEE.1900593
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
We introduce the fractional-order global optimal backpropagation machine, a fractional-order backpropagation neural network (FBPNN) trained by an improved fractional-order steepest descent method (FSDM). The FBPNN is a state-of-the-art fractional-order member of the family of backpropagation neural networks (BPNNs); unlike most classic first-order BPNNs, which are trained by the traditional first-order steepest descent method, the reverse incremental search of the proposed FBPNN proceeds in the negative directions of the approximate fractional-order partial derivatives of the square error. First, the theoretical concept of an FBPNN trained by an improved FSDM is described mathematically. Then, the mathematical proof of fractional-order global optimal convergence, an assumption about the network structure, and the fractional-order multi-scale global optimization of the FBPNN are analyzed in detail. Finally, we perform three types of experiments to compare the performance of an FBPNN with that of a classic first-order BPNN: example function approximation, fractional-order multi-scale global optimization, and a comparison of global search and error-fitting abilities on real data. The stronger ability of an FBPNN to locate the global optimal solution is the major advantage that makes it superior to a classic first-order BPNN.
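The abstract does not give the FSDM update rule explicitly, so the following is only a minimal illustrative sketch in Python of a generic fractional-order steepest descent step, assuming a one-term Caputo-style power-law approximation of the fractional-order partial derivative. The function frac_grad, the toy objective, and all parameter values are hypothetical stand-ins, not the authors' algorithm.

import numpy as np
from scipy.special import gamma

def frac_grad(grad, w, w_prev, alpha=0.9, eps=1e-8):
    # One-term Caputo-style approximation (hypothetical, for illustration):
    #   D^alpha E  ~  (dE/dw) * |w - w_prev|^(1 - alpha) / Gamma(2 - alpha),
    # with the previous iterate w_prev acting as the lower terminal of the
    # fractional derivative and eps guarding against a zero step size.
    return grad * (np.abs(w - w_prev) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)

# Fractional-order steepest descent on the toy objective E(w) = 0.5*(w - 3)^2:
# the reverse incremental search moves in the negative direction of the
# approximate fractional-order partial derivative of the square error.
alpha, lr = 0.9, 0.1
w, w_prev = 0.0, -1.0          # hypothetical start and initial lower terminal
for _ in range(200):
    grad = w - 3.0             # first-order dE/dw of the toy objective
    step = frac_grad(grad, w, w_prev, alpha)
    w_prev, w = w, w - lr * step
print(w)                       # converges toward the minimizer w* = 3

For alpha = 1 the power-law factor reduces to 1/Gamma(1) = 1 and the update collapses to classic first-order steepest descent, which is why the fractional order alpha can be read as an extra scale parameter in the search.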
Pages: 809-833
Page count: 25