Local coupled feedforward neural network

Cited by: 8
Author
Sun, Jianye [1 ]
Affiliation
[1] Harbin Univ Sci & Technol, Computat Ctr, Harbin, Peoples R China
Keywords
MLP; BP; Feedforward; Neural networks; LCFNN; 2ND-ORDER LEARNING ALGORITHM; MULTILAYER PERCEPTRONS; BACKPROPAGATION; LAYER;
DOI
10.1016/j.neunet.2009.06.016
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, the local coupled feedforward neural network is presented. Its connection structure is the same as that of a multilayer perceptron with one hidden layer. In the local coupled feedforward neural network, each hidden node is assigned an address in the input space, and each input activates only the hidden nodes near it. For each input, only the activated hidden nodes take part in the forward and backward propagation processes. Theoretical analysis and simulation results show that this neural network has the "universal approximation" property and can solve the learning problem of feedforward neural networks. In addition, its characteristic of local coupling makes knowledge accumulation possible. (C) 2009 Elsevier Ltd. All rights reserved.
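The following is a minimal, illustrative Python sketch of the local-coupling idea described in the abstract, not the author's reference implementation: every hidden node carries an address in the input space, and for a given input only the hidden nodes whose addresses lie closest to that input are evaluated in the forward pass. The Euclidean distance metric, the fixed number k of active nodes, the tanh hidden activation, and the layer sizes are all assumptions chosen for illustration.

# Sketch of a locally coupled forward pass (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out, k = 2, 50, 1, 5                  # k = number of hidden nodes activated per input (assumed)
addresses = rng.uniform(-1.0, 1.0, (n_hidden, n_in))    # each hidden node's address in the input space
W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))             # input-to-hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))            # hidden-to-output weights
b2 = np.zeros(n_out)

def forward(x):
    """Evaluate only the k hidden nodes whose addresses are nearest to x."""
    dists = np.linalg.norm(addresses - x, axis=1)        # distance from x to every hidden-node address
    active = np.argsort(dists)[:k]                       # indices of the activated (locally coupled) nodes
    h = np.tanh(W1[active] @ x + b1[active])             # forward pass through activated nodes only
    y = W2[:, active] @ h + b2                           # output combines only the activated nodes
    return y, active

x = np.array([0.3, -0.7])
y, active = forward(x)
print("active hidden nodes:", active, "output:", y)

During training, only these activated nodes would receive weight updates for a given input; that locality is what the abstract credits with making knowledge accumulation possible.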
Pages: 108-113
Number of pages: 6