Deep Belief Networks Are Compact Universal Approximators

Cited by: 119
Authors
Le Roux, Nicolas [1 ]
Bengio, Yoshua [2 ]
Affiliations
[1] Microsoft Res Cambridge, Cambridge CB3 0FB, England
[2] Univ Montreal, Dept Informat & Rech Operat, Montreal, PQ H3C 3J7, Canada
Keywords
DOI
10.1162/neco.2010.08-09-1081
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Deep belief networks (DBNs) are generative models with many layers of hidden causal variables, recently introduced by Hinton, Osindero, and Teh (2006), along with a greedy layer-wise unsupervised learning algorithm. Building on Le Roux and Bengio (2008) and Sutskever and Hinton (2008), we show that deep but narrow generative networks do not require more parameters than shallow ones to achieve universal approximation. Exploiting the same proof technique, we prove that deep but narrow feedforward neural networks with sigmoidal units can represent any Boolean expression.
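The Boolean-expressiveness claim can be illustrated (though not proved) with a toy sketch: a sigmoid unit driven into saturation by large weights behaves like a threshold gate, and such gates compose layer by layer into any Boolean formula. The sketch below is a generic illustration rather than the construction given in the paper; the gain value k and the specific gate thresholds are assumptions chosen for clarity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# With a large gain k, a sigmoid unit on {0, 1} inputs approximates a
# hard threshold gate; k = 20 is an arbitrary illustrative choice.
def gate_and(x, y, k=20.0):
    # output near 1 only when both inputs are 1 (threshold between 1 and 2)
    return sigmoid(k * (x + y - 1.5))

def gate_or(x, y, k=20.0):
    # output near 1 when at least one input is 1 (threshold between 0 and 1)
    return sigmoid(k * (x + y - 0.5))

def gate_not(x, k=20.0):
    # inverts a single {0, 1} input
    return sigmoid(k * (0.5 - x))

# Compose the gates into a narrow two-hidden-layer network computing XOR,
# a Boolean expression that no single sigmoid unit can represent.
def xor(x, y):
    h1 = gate_or(x, y)                  # hidden layer 1, unit 1
    h2 = gate_and(x, y)                 # hidden layer 1, unit 2
    return gate_and(h1, gate_not(h2))   # hidden layer 2

for x in (0, 1):
    for y in (0, 1):
        print(x, y, round(float(xor(x, y))))
```

Because each gate is a single sigmoid unit, any Boolean formula built from AND, OR, and NOT maps directly onto a narrow network whose depth grows with the formula's nesting depth, which is the intuition behind the depth-for-width trade-off discussed in the paper.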
Pages: 2192-2207
Number of pages: 16
References (15 in total)
[1] [Anonymous], 2008, Proceedings of the 25th International Conference on Machine Learning.
[2] [Anonymous], 1991, Computational Complexity.
[3] Bengio, Y., 2006, Advances in Neural Information Processing Systems, Vol. 19. DOI: 10.7551/MITPRESS/7503.003.0024.
[4] Bengio, Y. Learning Deep Architectures for AI. Foundations and Trends in Machine Learning, 2009, 2(1):1-127.
[5] Freund, Y., 1991, Advances in Neural Information Processing Systems, Vol. 4.
[6] Gray, F., 1953, United States Patent.
[7] Hinton, G. E. Training products of experts by minimizing contrastive divergence. Neural Computation, 2002, 14(8):1771-1800.
[8] Hinton, G. E.; Osindero, S.; Teh, Y.-W. A fast learning algorithm for deep belief nets. Neural Computation, 2006, 18(7):1527-1554.
[9] Le Roux, N.; Bengio, Y. Representational power of restricted Boltzmann machines and deep belief networks. Neural Computation, 2008, 20(6):1631-1649.
[10] Neal, R. M. Connectionist learning of belief networks. Artificial Intelligence, 1992, 56(1):71-113.