A note on learning automata-based schemes for adaptation of BP parameters

Cited by: 34
Authors
Meybodi, M. R. [1]
Beigy, H. [1]
Affiliation
[1] Amirkabir Univ Technol, Comp Engn Dept, Soft Comp Lab, Tehran, Iran
Keywords
neural network; backpropagation; learning automata; learning rate; steepness parameter; momentum factor;
DOI
10.1016/S0925-2312(01)00686-5
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we study the ability of learning automata-based schemes to escape from local minima when standard backpropagation (BP) fails to find the global minimum. It is demonstrated through simulation that learning automata-based schemes have a higher ability to escape from local minima than other schemes such as SAB, SuperSAB, Fuzzy BP, the adaptive steepness method, and the variable learning rate method. (C) 2002 Elsevier Science B.V. All rights reserved.
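The abstract does not spell out the update rule, but the general idea of a learning automata-based scheme is that an automaton selects a BP parameter value from a finite action set each epoch and reinforces actions that reduce the training error. The following is a minimal sketch, assuming a linear reward-inaction (L_RI) automaton that adapts only the learning rate of a small XOR network; the candidate rates, network size, reward criterion, and reward step are illustrative assumptions, not the authors' exact scheme.

```python
# Sketch: an L_RI learning automaton choosing the BP learning rate each epoch.
# Assumed details (action set, 2-2-1 XOR network, "error decreased" reward)
# are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

actions = np.array([0.05, 0.1, 0.5, 1.0])          # candidate learning rates
probs = np.full(len(actions), 1.0 / len(actions))   # automaton action probabilities
reward_step = 0.1                                    # L_RI reward parameter

# XOR data and a 2-2-1 sigmoid network trained with plain batch backpropagation.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, b1 = rng.normal(size=(2, 2)), np.zeros(2)
W2, b2 = rng.normal(size=(2, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def epoch(lr):
    """One batch BP step with learning rate lr; returns MSE after the update."""
    global W1, b1, W2, b2
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((out - y) ** 2))

prev_err = np.inf
for t in range(2000):
    i = rng.choice(len(actions), p=probs)    # automaton picks a learning rate
    err = epoch(actions[i])
    if err < prev_err:                       # favourable response: reward action i
        probs += reward_step * (np.eye(len(actions))[i] - probs)
    # L_RI: probabilities are left unchanged on an unfavourable response
    prev_err = err

print("final MSE:", prev_err, "action probabilities:", probs)
```

The same pattern could, in principle, be applied to the steepness parameter or the momentum factor listed in the keywords by giving each its own automaton; the paper itself should be consulted for the actual reinforcement schemes compared in the simulations.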
Pages: 957-974 (18 pages)