Biologically motivated learning method for deep neural networks using hierarchical competitive learning

Cited by: 6
Authors
Shinozaki, Takashi [1,2]
Affiliations
[1] Natl Inst Informat & Commun Technol NICT, Ctr Informat & Neural Networks CiNet, 1-4 Yamadaoka, Suita, Osaka 5650871, Japan
[2] Osaka Univ, Grad Sch Informat Sci & Technol, 1-5 Yamadaoka, Suita, Osaka 5650871, Japan
Keywords
Semisupervised learning; Unsupervised learning; Deep neural network; Deep learning; Feature extraction; MODEL; CODE;
DOI
10.1016/j.neunet.2021.08.027
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
This study proposes a novel biologically motivated learning method for deep convolutional neural networks (CNNs). The combination of CNNs and backpropagation learning is currently the most powerful approach in machine learning. However, it requires a large amount of labeled data for training, and this requirement can become a barrier in real-world applications. To address this problem and make use of unlabeled data, we introduce unsupervised competitive learning, which requires only forward-propagating signals in CNNs. The method was evaluated on image discrimination tasks using the MNIST, CIFAR-10, and ImageNet datasets, and it achieved state-of-the-art performance among biologically motivated methods on the ImageNet benchmark. The results suggest that the method enables the learning of higher-level representations based solely on forward-propagating signals, without the need for a backward error signal to train the convolutional layers. The proposed method could be useful for a variety of poorly labeled data, such as time-series or medical data. (C) 2021 The Author. Published by Elsevier Ltd.
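As a rough illustration of the forward-only competitive learning idea described in the abstract, the following NumPy sketch shows a single winner-take-all update step. It is not the paper's hierarchical implementation; the function name competitive_step, the learning rate, the filter count, and the patch size are all illustrative assumptions.

import numpy as np

# Minimal winner-take-all competitive learning sketch (illustrative only,
# not the paper's method). Filters are trained from forward activations
# alone: the best-matching filter is pulled toward the input, and no
# error signal is propagated backward.

def competitive_step(weights, x, lr=0.05):
    """One competitive-learning update on a flattened input patch x."""
    activations = weights @ x                      # forward pass only
    winner = int(np.argmax(activations))           # winner-take-all
    weights[winner] += lr * (x - weights[winner])  # pull winner toward the input
    weights[winner] /= np.linalg.norm(weights[winner]) + 1e-8  # keep weight norm bounded
    return winner

# Toy usage: learn 16 filters from random 5x5 patches.
rng = np.random.default_rng(0)
W = rng.normal(size=(16, 25))
W /= np.linalg.norm(W, axis=1, keepdims=True)
for _ in range(1000):
    patch = rng.standard_normal(25)
    competitive_step(W, patch)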
Pages: 271-278
Page count: 8