Dropout Rademacher complexity of deep neural networks

Cited by: 1
Authors
Wei GAO [1 ,2 ]
Zhi-Hua ZHOU [1 ,2 ]
Affiliations
[1] National Key Laboratory for Novel Software Technology, Nanjing University
[2] Collaborative Innovation Center of Novel Software Technology and Industrialization, Nanjing University
Funding
National Natural Science Foundation of China
Keywords
artificial intelligence; machine learning; deep learning; dropout; Rademacher complexity;
DOI
Not available
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Great successes of deep neural networks have been witnessed in various real applications. Many algorithmic and implementation techniques have been developed; however, the theoretical understanding of many aspects of deep neural networks is far from clear. A particularly interesting issue is the usefulness of dropout, which was motivated by the intuition of preventing complex co-adaptation of feature detectors. In this paper, we study the Rademacher complexity of different types of dropout, and our theoretical results disclose that for shallow neural networks (with one or no hidden layer) dropout reduces the Rademacher complexity polynomially, whereas for deep neural networks it can, remarkably, lead to an exponential reduction.
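To make the central quantity concrete, below is a minimal Monte Carlo sketch (in Python, assuming NumPy) of the empirical Rademacher complexity R_S(F) = E_sigma[ sup_{f in F} (1/n) sum_i sigma_i f(x_i) ] for a one-hidden-layer ReLU network with Bernoulli dropout on the hidden units. The supremum over the hypothesis class is crudely approximated by a random search over unit-norm weights; the network width, sample, and retain probability are illustrative choices, not the paper's construction or proof technique.

import numpy as np

rng = np.random.default_rng(0)

def dropout_net(x, W1, W2, p):
    # One-hidden-layer ReLU network; each hidden unit is kept
    # independently with probability p (Bernoulli dropout).
    h = np.maximum(W1 @ x, 0.0)
    mask = (rng.random(h.shape) < p).astype(float)
    return float((W2 @ (h * mask))[0])

def empirical_rademacher(S, p, n_signs=50, n_trials=50):
    # Monte Carlo estimate of
    #     R_S = E_sigma[ sup_f (1/n) * sum_i sigma_i f(x_i) ],
    # where the sup over the class is approximated by a random
    # search over unit-norm weight matrices (a crude proxy only).
    n, d = S.shape
    vals = []
    for _ in range(n_signs):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher variables
        best = -np.inf
        for _ in range(n_trials):
            W1 = rng.normal(size=(8, d))
            W1 /= np.linalg.norm(W1)              # keep the class norm-bounded
            W2 = rng.normal(size=(1, 8))
            W2 /= np.linalg.norm(W2)
            # Dropout masks are resampled per example, as in an
            # expectation over the dropout distribution.
            outs = np.array([dropout_net(x, W1, W2, p) for x in S])
            best = max(best, float(sigma @ outs) / n)
        vals.append(best)
    return float(np.mean(vals))

S = rng.normal(size=(20, 5))                      # toy sample: 20 points in R^5
print("no dropout  :", empirical_rademacher(S, p=1.0))
print("dropout 0.5 :", empirical_rademacher(S, p=0.5))

Running the sketch with p = 1.0 (no dropout) versus p = 0.5 typically yields a smaller estimate in the dropout case, loosely echoing the paper's theme; the precise polynomial and exponential reduction rates are formal results of the paper and are not reproduced by this toy search.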
Pages: 173-184
Number of pages: 12
Related papers
50 items in total (items [21]-[30] shown)
  • [21] Augmenting Recurrent Neural Networks Resilience by Dropout
    Bacciu, Davide
    Crecchi, Francesco
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (01) : 345 - 351
  • [22] Relating Information Complexity and Training in Deep Neural Networks
    Gain, Alex
    Siegelmann, Hava
    MICRO- AND NANOTECHNOLOGY SENSORS, SYSTEMS, AND APPLICATIONS XI, 2019, 10982
  • [23] On the Reduction of Computational Complexity of Deep Convolutional Neural Networks
    Maji, Partha
    Mullins, Robert
    ENTROPY, 2018, 20 (04)
  • [24] CamDrop: A New Explanation of Dropout and A Guided Regularization Method for Deep Neural Networks
    Wang, Hongjun
    Wang, Guangrun
    Li, Guanbin
    Lin, Liang
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 1141 - 1149
  • [25] Deep Dropout Artificial Neural Networks for Recognising Digits and Characters in Natural Images
    Barrow, Erik
    Jayne, Chrisina
    Eastwood, Mark
    NEURAL INFORMATION PROCESSING, ICONIP 2015, PT IV, 2015, 9492 : 29 - 37
  • [26] A Review on Dropout Regularization Approaches for Deep Neural Networks within the Scholarly Domain
    Salehin, Imrus
    Kang, Dae-Ki
    ELECTRONICS, 2023, 12 (14)
  • [27] Dropout in Neural Networks Simulates the Paradoxical Effects of Deep Brain Stimulation on Memory
    Tan, Shawn Zheng Kai
    Du, Richard
    Perucho, Jose Angelo Udal
    Chopra, Shauhrat S.
    Vardhanabhuti, Varut
    Lim, Lee Wei
    FRONTIERS IN AGING NEUROSCIENCE, 2020, 12
  • [28] ISING-DROPOUT: A REGULARIZATION METHOD FOR TRAINING AND COMPRESSION OF DEEP NEURAL NETWORKS
    Salehinejad, Hojjat
    Valaee, Shahrokh
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 3602 - 3606
  • [29] Online Arabic Handwriting Recognition with Dropout applied in Deep Recurrent Neural Networks
    Maalej, Rania
    Tagougui, Najiba
    Kherallah, Monji
    PROCEEDINGS OF 12TH IAPR WORKSHOP ON DOCUMENT ANALYSIS SYSTEMS, (DAS 2016), 2016, : 417 - 421
  • [30] Threshout Regularization for Deep Neural Networks
    Williams, Travis
    Li, Robert
    SOUTHEASTCON 2021, 2021, : 728 - 735