Smooth Function Approximation by Deep Neural Networks with General Activation Functions

Cited by: 58
Authors
Ohn, Ilsang [1 ]
Kim, Yongdai [1 ]
Affiliation
[1] Seoul Natl Univ, Dept Stat, Seoul 08826, South Korea
Funding
National Research Foundation of Singapore;
Keywords
function approximation; deep neural networks; activation functions; Hölder continuity; convergence rates; MULTILAYER FEEDFORWARD NETWORKS;
DOI
10.3390/e21070627
Chinese Library Classification (CLC) number
O4 [Physics];
Discipline classification code
0702;
Abstract
There has been growing interest in the expressivity of deep neural networks. However, most existing work on this topic focuses on specific activation functions such as ReLU or sigmoid. In this paper, we investigate the approximation ability of deep neural networks with a broad class of activation functions, a class that includes most of the frequently used ones. We derive the depth, width, and sparsity required for a deep neural network to approximate any Hölder smooth function up to a given approximation error, for this large class of activation functions. Based on this approximation error analysis, we derive the minimax optimality of deep neural network estimators with general activation functions in both regression and classification problems.
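As background for the abstract's claims, the following is a minimal sketch of the standard objects involved, assuming the usual Hölder-ball definition on $[0,1]^d$ and the classical nonparametric regression benchmark; the smoothness $\beta$, radius $K$, and sample size $n$ are illustrative symbols, not the paper's exact theorem statements:
\[
\mathcal{H}^{\beta,K}([0,1]^d) = \Big\{ f : \sum_{|\alpha| \le \lfloor \beta \rfloor} \|\partial^{\alpha} f\|_{\infty}
+ \sum_{|\alpha| = \lfloor \beta \rfloor} \sup_{x \neq y} \frac{|\partial^{\alpha} f(x) - \partial^{\alpha} f(y)|}{\|x - y\|_{\infty}^{\beta - \lfloor \beta \rfloor}} \le K \Big\},
\]
\[
\inf_{\hat{f}} \sup_{f \in \mathcal{H}^{\beta,K}([0,1]^d)} \mathbb{E}\, \|\hat{f} - f\|_{2}^{2} \;\asymp\; n^{-2\beta/(2\beta + d)}.
\]
The second display is the classical minimax rate (up to logarithmic factors) that serves as the benchmark against which the convergence rates of the deep neural network estimators are assessed.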
Pages: 21