Learning capability and storage capacity of two-hidden-layer feedforward networks

Cited by: 641
Author
Huang, G.-B. [1]
机构
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2003, Vol. 14, No. 2
Keywords
learning capability; neural-network modularity; storage capacity; two-hidden-layer feedforward networks (TLFNs)
DOI
10.1109/TNN.2003.809401
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The problem of the necessary complexity of neural networks is of interest in applications. In this paper, the learning capability and storage capacity of feedforward neural networks are considered. We markedly improve recent results by introducing neural-network modularity logically. This paper rigorously proves, by a constructive method, that two-hidden-layer feedforward networks (TLFNs) with 2√((m+2)N) (≪ N) hidden neurons can learn any N distinct samples (x_i, t_i) with arbitrarily small error, where m is the required number of output neurons. This implies that the number of hidden neurons needed in feedforward networks can be decreased significantly compared with previous results. Conversely, a TLFN with Q hidden neurons can store at least Q²/(4(m+2)) arbitrary distinct samples (x_i, t_i) with any desired precision.
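As a rough numerical illustration of the two bounds stated in the abstract, here is a minimal Python sketch; the function names are ours, not from the paper, and only the formulas 2√((m+2)N) and Q²/(4(m+2)) are taken from the abstract itself.

```python
import math

def hidden_neurons_needed(n_samples: int, n_outputs: int) -> int:
    """Sufficient TLFN hidden-neuron count to learn N distinct
    samples with m output neurons: 2*sqrt((m+2)*N)."""
    return math.ceil(2 * math.sqrt((n_outputs + 2) * n_samples))

def storage_capacity(n_hidden: int, n_outputs: int) -> int:
    """Lower bound on the number of distinct samples a TLFN with
    Q hidden neurons can store: Q^2 / (4*(m+2))."""
    return n_hidden ** 2 // (4 * (n_outputs + 2))

# Example: N = 10000 samples, m = 10 output neurons.
q = hidden_neurons_needed(10000, 10)  # ceil(2*sqrt(12*10000)) = 693
print(q)                              # 693, far fewer than N = 10000
print(storage_capacity(q, 10))        # 10005 >= N
```

Plugging the sufficient count Q = 693 back into the capacity bound recovers roughly the original N = 10,000 samples, reflecting that the two results are inverses of each other.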
Pages: 274-281
Number of pages: 8
References
29 in total
[11] Gallant, A., 1992, ARTIFICIAL NEURAL NE, p. 5.
[12] Hornik, K. "Approximation capabilities of multilayer feedforward networks." Neural Networks, 1991, 4(2): 251-257.
[13] Huang, G.-B., 1998, IEEE Transactions on Neural Networks, 9, p. 714. DOI: 10.1109/72.701184.
[14] Huang, G.-B.; Babri, H.A. "Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions." IEEE Transactions on Neural Networks, 1998, 9(1): 224-229.
[15] Ito, Y. "Approximation of continuous functions on R^d by linear combinations of shifted rotations of a sigmoid function with and without scaling." Neural Networks, 1992, 5(1): 105-115.
[16] Karpinski, M., 1995, Proc. 27th ACM Symposium on Theory of Computing, p. 200.
[17] Koiran, P.; Sontag, E.D. "Neural networks with quadratic VC dimension." Journal of Computer and System Sciences, 1997, 54(1): 190-198.
[18] Kreinovich, V.Y. "Arbitrary nonlinearity is sufficient to represent all functions by neural networks: a theorem." Neural Networks, 1991, 4(3): 381-383.
[19] Kurkova, V. "Kolmogorov theorem and multilayer neural networks." Neural Networks, 1992, 5(3): 501-506.
[20] Leshno, M.; Lin, V.Y.; Pinkus, A.; Schocken, S. "Multilayer feedforward networks with a nonpolynomial activation function can approximate any function." Neural Networks, 1993, 6(6): 861-867.