NEURAL SUBNET DESIGN BY DIRECT POLYNOMIAL MAPPING

Cited by: 14
Authors
ROHANI, K [1 ]
CHEN, MS [1 ]
MANRY, MT [1 ]
Affiliation
[1] University of Texas at Arlington, Department of Electrical Engineering, Arlington, TX 76019
Source
IEEE Transactions on Neural Networks | 1992, Vol. 3, No. 6
DOI
10.1109/72.165606
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In a recent paper by two of the authors, it was shown that multilayer perceptron neural networks can be used to form products of any number of inputs, thereby constructively proving universal approximation. In this letter we extend this result, describing a new method for the analysis and synthesis of single-input, single-output neural subnetworks. Given training samples of a function to be approximated, a feedforward neural network is designed that implements a polynomial approximation of the function with arbitrary accuracy. For comparison, example subnets are designed both by classical back-propagation training and by mapping. The examples illustrate (1) that the mapped subnets avoid local minima in which back-propagation-trained subnets become trapped, and (2) that the mapping approach is much faster.
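
As an illustration of the two-stage idea described in the abstract, the sketch below fits a least-squares polynomial to training samples of a single-input, single-output function. The target function, polynomial degree, and sample counts are assumptions chosen for illustration, not values from the paper, and the paper's actual contribution, mapping the fitted polynomial directly onto feedforward-network weights, is only indicated by a comment.

# Minimal sketch of the polynomial-approximation stage, assuming an
# illustrative target function and degree (not taken from the paper).
import numpy as np

# Training samples of the single-input, single-output function to approximate.
f = lambda x: np.sin(np.pi * x)           # hypothetical target function
x_train = np.linspace(-1.0, 1.0, 50)
y_train = f(x_train)

# Stage 1: least-squares polynomial approximation of the sampled function.
degree = 7                                # assumed; raise for higher accuracy
coeffs = np.polyfit(x_train, y_train, degree)

# Stage 2 (the paper's mapping step, not reproduced here): the polynomial
# terms would be realized by small product-forming subnets and summed at the
# output unit, yielding the subnet's weights directly, without back-propagation.

# Check the polynomial approximation error on held-out points.
x_test = np.linspace(-1.0, 1.0, 501)
err = np.max(np.abs(np.polyval(coeffs, x_test) - f(x_test)))
print(f"max polynomial approximation error: {err:.2e}")

Because the weights come from a closed-form fit rather than iterative gradient descent, this stage involves no local minima, which is the behavior the paper's comparison with back-propagation-trained subnets highlights.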
Pages: 1024-1026
Number of pages: 3
References
12 in total
  • [1] CHEN M, 1990, JUN INT JOINT C NEUR, V1, P643
  • [2] CHEN MS, 1990, SEP C REC MIDC DALL, P414
  • [3] CHEN MS, 1991, 34TH P MIDW S CIRC S
  • [4] CHEN MS, 1991, P IJCNN 91 SEATTLE, pI295
  • [5] Cybenko G., 1989, Mathematics of Control, Signals, and Systems, V2, P303, DOI 10.1007/BF02551274
  • [6] Hecht-Nielsen R., 1989, IJCNN: International Joint Conference on Neural Networks (Cat. No.89CH2765-6), P593, DOI 10.1109/IJCNN.1989.118638
  • [7] ROHANI K, 1991, JUL INT JOINT C NEUR, V2, P497
  • [8] Rumelhart DE, 1986, ENCY DATABASE SYST, P45
  • [9] SANDBERG IW, 1991, IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS, V38, P564, Approximation theorems for discrete-time systems
  • [10] WAIBEL A, 1990, MAY P INT C AC SPEEC, P112