30 YEARS OF ADAPTIVE NEURAL NETWORKS - PERCEPTRON, MADALINE, AND BACKPROPAGATION

Cited by: 1263
Authors
WIDROW, B
LEHR, MA
Affiliation
[1] Information Systems Laboratory, Department of Electrical Engineering, Stanford University, Stanford
Funding
U.S. National Aeronautics and Space Administration (NASA);
Keywords
Neural Networks;
DOI
10.1109/5.58323
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808 ; 0809 ;
Abstract
Fundamental developments in feedforward artificial neural networks from the past thirty years are reviewed. The central theme of this paper is a description of the history, origination, operating characteristics, and basic theory of several supervised neural network training algorithms including the Perceptron rule, the LMS algorithm, three Madaline rules, and the backpropagation technique. These methods were developed independently, but with the perspective of history they can all be related to each other. The concept underlying these algorithms is the “minimal disturbance principle,” which suggests that during training it is advisable to inject new information into a network in a manner that disturbs stored information to the smallest extent possible. © 1990 IEEE
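To make the "minimal disturbance principle" concrete, the following is a minimal sketch (not taken from the paper's text) of the LMS (Widrow-Hoff) rule, one of the algorithms the abstract surveys. The update w += 2*mu*err*x changes the weights just enough to reduce the error on the current pattern; the training pairs, learning rate `mu`, and the toy linear target below are illustrative assumptions.

```python
def lms_train(samples, mu=0.01, epochs=200):
    """Train a linear unit y = w . x with the LMS (Widrow-Hoff) rule."""
    n = len(samples[0][0])
    w = [0.0] * n
    for _ in range(epochs):
        for x, d in samples:
            y = sum(wi * xi for wi, xi in zip(w, x))
            err = d - y                       # instantaneous error on this pattern
            for i in range(n):
                w[i] += 2 * mu * err * x[i]   # minimal-disturbance weight update
    return w

# Toy realizable target: d = 2*x0 - x1, with a bias input fixed at 1.0
data = [([1.0, 1.0, 1.0], 1.0),
        ([2.0, 1.0, 1.0], 3.0),
        ([1.0, 2.0, 1.0], 0.0),
        ([3.0, 2.0, 1.0], 4.0)]
weights = lms_train(data)
```

Because the error is measured before the unit's threshold, each update nudges the weights along the current input vector only, leaving responses to other stored patterns as undisturbed as possible; the stability of this rule depends on `mu` being small relative to the input power, which is why a small step size is used here.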
Pages: 1415 - 1442
Page count: 28
Related papers
131 in total