Local and Global Sparsity for Deep Learning Networks

Cited by: 0
Authors
Zhang, Long [1 ]
Zhao, Jieyu [1 ]
Shi, Xiangfu [1 ]
Ye, Xulun [1 ]
Affiliations
[1] Ningbo Univ, Dept Comp Sci, 818 Fenghua Rd, Ningbo 315211, Peoples R China
Source
IMAGE AND GRAPHICS (ICIG 2017), PT II | 2017 / Vol. 10667
Funding
National Natural Science Foundation of China
关键词
Sparsity; Regularization; Deep learning; GAN;
DOI
10.1007/978-3-319-71589-6_7
Chinese Library Classification
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
Applying sparsity regularization in deep learning networks has proven to be an effective approach. Researchers have developed several algorithms to control the sparseness of the activation probabilities of hidden units, but each has inherent limitations. In this paper, we first analyze the strengths and weaknesses of popular sparsity algorithms and categorize them into two groups: local and global sparsity. L1/2 regularization is introduced for the first time as a global sparsity method for deep learning networks. Second, a combined solution is proposed that integrates local and global sparsity methods. Third, we adapt the proposed solution to two deep learning networks, the deep belief network (DBN) and the generative adversarial network (GAN), and evaluate it on the benchmark datasets MNIST and CelebA. Experimental results show that our method outperforms existing sparsity algorithms on digit recognition and achieves better performance on human face generation. In addition, the proposed method stabilizes GAN loss changes and eliminates noise.
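To make the two penalty families concrete, the sketch below (not the authors' implementation; only the abstract is available here) shows one common reading of the distinction in PyTorch: a "global" L1/2 penalty on weights and a "local" KL-divergence penalty pushing each hidden unit's mean activation toward a target rate. The names l_half_penalty, kl_sparsity_penalty, rho, lambda_g, and lambda_l are illustrative assumptions, not terms from the paper.

```python
import torch

def l_half_penalty(weights, eps=1e-8):
    # Global sparsity (assumed form): L1/2 regularizer, sum_i |w_i|^(1/2).
    # eps smooths the non-differentiable point at w = 0.
    return (weights.abs() + eps).sqrt().sum()

def kl_sparsity_penalty(hidden_probs, rho=0.05, eps=1e-8):
    # Local sparsity (assumed form): KL divergence between a target
    # activation probability rho and each unit's mean activation rho_hat,
    # as commonly used for sparse autoencoders/DBNs.
    rho_hat = hidden_probs.mean(dim=0).clamp(eps, 1.0 - eps)
    return (rho * torch.log(rho / rho_hat)
            + (1.0 - rho) * torch.log((1.0 - rho) / (1.0 - rho_hat))).sum()

# Hypothetical usage inside a training step, with illustrative
# penalty weights lambda_g and lambda_l:
#   loss = task_loss \
#          + lambda_g * l_half_penalty(model.fc.weight) \
#          + lambda_l * kl_sparsity_penalty(hidden_activations)
```

Under this reading, the local term constrains how often individual hidden units fire, while the global term drives network weights toward zero; the paper's combined solution applies both kinds of pressure at once.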
Pages: 74-85
Number of pages: 12