Theory of deep convolutional neural networks: Downsampling

Cited by: 149
Authors
Zhou, Ding-Xuan [1 ,2 ]
Affiliations
[1] City Univ Hong Kong, Sch Data Sci, Kowloon, Hong Kong, Peoples R China
[2] City Univ Hong Kong, Dept Math, Kowloon, Hong Kong, Peoples R China
Keywords
Deep learning; Convolutional neural networks; Approximation theory; Downsampling; Filter masks; MULTILAYER FEEDFORWARD NETWORKS; OPTIMAL APPROXIMATION; REGRESSION; ALGORITHM; BOUNDS;
DOI
10.1016/j.neunet.2020.01.018
CLC number
TP18 [Theory of artificial intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Establishing a solid theoretical foundation for structured deep neural networks is greatly desired, given the successful applications of deep learning in various practical domains. This paper develops an approximation theory for deep convolutional neural networks whose structures are induced by convolutions. To overcome the difficulty in the theoretical analysis of networks whose widths increase linearly due to convolutions, we introduce a downsampling operator to reduce the widths. We prove that the downsampled deep convolutional neural networks approximate ridge functions well, which suggests advantages of these structured networks in terms of approximation and modeling. We also prove that the output of any multi-layer fully-connected neural network can be realized by a downsampled deep convolutional neural network with free parameters of the same order, which shows that, in general, the approximation ability of deep convolutional neural networks is at least as good as that of fully-connected networks. Finally, a theorem on approximating functions on Riemannian manifolds is presented, demonstrating that deep convolutional neural networks can be used to learn manifold features of data. (C) 2020 Elsevier Ltd. All rights reserved.
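The width growth and the downsampling operator described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's construction: it shows only the linear (convolution) part of a layer, with the function names `conv_layer` and `downsample` and the specific sizes chosen for illustration; bias terms, activations, and the paper's filter masks are omitted.

```python
import numpy as np

def conv_layer(x, w):
    # Linear part of one convolutional layer: 'full' 1-D convolution,
    # so the output width is len(x) + len(w) - 1. Stacking such layers
    # makes widths grow linearly with depth, as noted in the abstract.
    return np.convolve(x, w, mode="full")

def downsample(x, m):
    # Downsampling operator: keep every m-th entry to curb width growth.
    return x[::m]

x = np.ones(4)          # input of width 4
w = np.ones(3)          # filter of length 3
y = conv_layer(x, w)    # width grows to 4 + 3 - 1 = 6
z = downsample(y, 2)    # width reduced back to 3
```

With filter length s, each layer adds s - 1 to the width, so without downsampling a depth-J network has width on the order of d + J(s - 1); interleaving the downsampling operator keeps the widths bounded while preserving the convolutional structure.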
Pages: 319-327
Page count: 9