Effective sparsity control in deep belief networks using normal regularization term

Cited by: 0
|
Authors
Mohammad Ali Keyvanrad
Mohammad Mehdi Homayounpour
Affiliation
[1] Amirkabir University of Technology, Laboratory for Intelligent Multimedia Processing (LIMP)
Source
Knowledge and Information Systems | 2017 / Volume 53
Keywords
Deep belief network; Restricted Boltzmann machine; Normal sparse RBM; Quadratic sparse RBM; Rate distortion sparse RBM;
DOI
Not available
Abstract
Deep network architectures are now widely used in machine learning. Deep belief networks (DBNs) use such architectures to build a powerful generative model from training data and can be applied to classification and feature learning. A DBN can be trained without supervision, and the learned features then serve a simple classifier (such as a linear classifier) well even when only a few labeled examples are available. Moreover, research shows that introducing sparsity into DBNs allows useful low-level feature representations to be learned from unlabeled data. Sparse representations have the property that the learned features are interpretable, i.e., they correspond to meaningful aspects of the input and capture factors of variation in the data. Various methods have been proposed to build sparse DBNs. In this paper, we propose a new method whose behavior changes according to the deviation of the hidden-unit activations from a (low) fixed value. In addition, the proposed regularization term has a variance parameter that controls how strongly sparseness is enforced. According to the results, the new method achieves the best recognition accuracy on the test sets of several datasets from different applications (image, speech, and text), and it yields remarkably strong results across different numbers of training samples, especially when only a few training samples are available.
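To make the idea concrete, the display below is an illustrative sketch (not the exact formulation from the paper) of how a normal-distribution-shaped sparsity term can be attached to the RBM training objective. Here q_j denotes the mean activation of hidden unit j over a mini-batch, p is the low target activation, \sigma^2 is the variance parameter controlling how strongly sparseness is enforced, \lambda is the regularization weight, and H is the number of hidden units; all of these symbols are assumptions introduced only for illustration.

% Illustrative sketch: RBM log-likelihood plus a Gaussian-shaped sparsity term
\[
  \max_{\theta} \;\; \log P(\mathbf{v};\theta)
  \;+\; \lambda \sum_{j=1}^{H} \exp\!\left( -\frac{(q_j - p)^2}{2\sigma^2} \right)
\]

Under this sketch, the pull of a hidden unit toward p is weak when q_j is already close to p, strongest when the deviation is on the order of \sigma, and fades again for units far from the target, which is consistent with the abstract's description of behavior that depends on the deviation from a (low) fixed value; the variance \sigma^2 therefore sets the range of deviations over which sparseness is actively enforced.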
Pages: 533 - 550
Number of pages: 17
Related papers
50 records
  • [21] Project performance evaluation using deep belief networks
    Nguvulu, Alick
    Yamato, Shoso
    Honma, Toshihisa
    IEEJ Transactions on Electronics, Information and Systems, 2012, 132 (02): 306 - 312
  • [22] Bearing fault diagnosis using deep belief networks
    Xiao, Xiang Ping
    Lin, Tian Ran
    Yu, Kun
    International Journal of COMADEM, 2018, 21 (02): 23 - 27
  • [23] Fusion of medical images using deep belief networks
    Kaur, Manjit
    Singh, Dilbag
    CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2020, 23 (02): 1439 - 1453
  • [24] Fusion of medical images using deep belief networks
    Manjit Kaur
    Dilbag Singh
    Cluster Computing, 2020, 23 : 1439 - 1453
  • [25] Human Activity Recognition Using Deep Belief Networks
    Yalcin, Hulya
    2016 24TH SIGNAL PROCESSING AND COMMUNICATION APPLICATION CONFERENCE (SIU), 2016: 1649 - 1652
  • [26] Breast cancer classification using deep belief networks
    Abdel-Zaher, Ahmed M.
    Eldeib, Ayman M.
    EXPERT SYSTEMS WITH APPLICATIONS, 2016, 46 : 139 - 144
  • [27] Deep Brain Stimulation Signal Classification using Deep Belief Networks
    Guillen-Rondon, Pablo
    Robinson, Melvin D.
    2016 INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE & COMPUTATIONAL INTELLIGENCE (CSCI), 2016: 155 - 158
  • [28] Norm Loss: An efficient yet effective regularization method for deep neural networks
    Georgiou, Theodoros
    Schmitt, Sebastian
    Back, Thomas
    Chen, Wei
    Lew, Michael
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021: 8812 - 8818
  • [29] Modeling of landslides susceptibility prediction using deep belief networks with optimized learning rate control
    Liu, Qiang
    Norbu, Namkha
    GEOCARTO INTERNATIONAL, 2024, 39 (01)
  • [30] Short-Term Traffic Flow Forecasting Using Ensemble Approach Based on Deep Belief Networks
    Liu, Jin
    Wu, NaiQi
    Qiao, Yan
    Li, ZhiWu
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (01): 404 - 417