Analog versus discrete neural networks

Cited by: 8
Authors
DasGupta, B [1 ]
Schnitger, G [1 ]
Affiliation
[1] Universität Frankfurt, Fachbereich Informatik 20, D-60054 Frankfurt, Germany
DOI
10.1162/neco.1996.8.4.805
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Classification Codes: 081104; 0812; 0835; 1405
Abstract
We show that neural networks with three-times continuously differentiable activation functions are capable of computing a certain family of n-bit Boolean functions with two gates, whereas networks composed of binary threshold functions require at least Ω(log n) gates. Thus, for a large class of activation functions, analog neural networks can be more powerful than discrete neural networks, even when computing Boolean functions.
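The flavor of this separation can be illustrated with a well-known toy example (this is not the paper's construction, and the activation and weights below are chosen purely for illustration): with the smooth activation σ(s) = s², a single analog gate computes XOR on {0,1} inputs exactly, whereas no single binary threshold gate can, since XOR is not linearly separable.

```python
# Toy illustration of "analog beats threshold" (not the paper's construction):
# one gate with the smooth activation sigma(s) = s^2 computes XOR exactly.

def analog_gate(x1, x2):
    # One gate: weighted sum s = x1 - x2, then the smooth activation s^2.
    return (x1 - x2) ** 2

def threshold_gate(w1, w2, theta):
    # One binary threshold gate: output 1 iff w1*x1 + w2*x2 >= theta.
    return lambda x1, x2: 1 if w1 * x1 + w2 * x2 >= theta else 0

# The analog gate matches XOR on all four Boolean inputs.
for x1 in (0, 1):
    for x2 in (0, 1):
        assert analog_gate(x1, x2) == (x1 ^ x2)

# Spot-check over a grid of integer parameters (a full proof uses the
# non-linear-separability of XOR): no single threshold gate matches XOR.
found = False
for w1 in range(-3, 4):
    for w2 in range(-3, 4):
        for theta in range(-3, 4):
            g = threshold_gate(w1, w2, theta)
            if all(g(a, b) == (a ^ b) for a in (0, 1) for b in (0, 1)):
                found = True
assert not found
```

Note that σ(s) = s² is infinitely differentiable, so it falls within the class of three-times continuously differentiable activations considered in the abstract; the paper's actual result concerns an n-bit family and a Ω(log n) threshold-gate lower bound, not XOR.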
Pages: 805-818
Page count: 14
References (14 items)
[1] Boppana, R. (1990). Handbook of Theoretical Computer Science.
[2] DasGupta, B. (1993). Advances in Neural Information Processing Systems, 5, 615.
[3] Goldmann, M. (1991). Computational Complexity, 1, 113.
[4] Hajnal, A. (1987). 28th Annual Symposium on Foundations of Computer Science, 99. DOI 10.1109/SFCS.1987.59.
[5] Höffgen, K.-U. (1993). Computational limitations on training sigmoid neural networks. Information Processing Letters, 46(6), 269-274.
[6] Maass, W. (1993). Proceedings of the Twenty-Fifth Annual ACM Symposium on Theory of Computing, 335. DOI 10.1145/167088.167193.
[7] Maass, W. (1991). 32nd IEEE Symposium on Foundations of Computer Science, 767.
[8] Macintyre, A. (1993). Proceedings of the Twenty-Fifth Annual ACM Symposium on Theory of Computing, 325. DOI 10.1145/167088.167192.
[10] Reif, J. (1987). 2nd Structure in Complexity Theory Conference, 118.