Fractional-order convolutional neural networks with population extremal optimization

Cited by: 18
Authors
Chen, Bi-Peng [1 ]
Chen, Yun [1 ]
Zeng, Guo-Qiang [2 ]
She, Qingshan [1 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Peoples R China
[2] Wenzhou Univ, Natl Local Joint Engn Lab Digitalize Elect Design, Wenzhou 325035, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Caputo fractional-order gradient method; Population extremal optimization; Initial bias and weight; MNIST dataset; Fractional-order convolutional neural networks; Particle swarm optimization; Quantitative analysis; Stability;
DOI
10.1016/j.neucom.2022.01.006
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
This article addresses the intelligent optimization of fractional-order convolutional neural networks (FOCNNs) by means of population extremal optimization (PEO), yielding PEO-FOCNN. The Caputo fractional-order gradient method (CFOGM) is adopted to improve the dynamic updating of the biases and weights of convolutional neural networks (CNNs). Moreover, because the initial biases and weights and their updating mechanism strongly affect the optimization performance of an FOCNN, the PEO algorithm is used to select an optimal set of initial biases and weights from a large pool of candidates. The benefit of PEO for FOCNN is demonstrated by comparing the training and testing accuracies of PEO-FOCNN with those of the standard FOCNN. Furthermore, experiments on the MNIST dataset, evaluated with three types of statistical tests, verify the superiority of the proposed PEO-FOCNN over FOCNNs driven by other popular optimization algorithms: the genetic algorithm (GA-FOCNN), differential evolution (DE-FOCNN), and particle swarm optimization (PSO-FOCNN). (c) 2022 Elsevier B.V. All rights reserved.
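The Caputo fractional-order gradient update mentioned in the abstract can be sketched in a few lines. The sketch below uses a common first-order truncation of the Caputo derivative, with the lower integral limit taken as the previous iterate; the toy quadratic objective, the function names, and the hyperparameter values are illustrative assumptions, not the authors' actual implementation.

```python
import math

def caputo_fgd(grad, w0, alpha=0.9, lr=0.1, steps=200):
    """Caputo fractional-order gradient descent (first-order truncation):
        w <- w - lr * grad(w) * |w - c|^(1 - alpha) / Gamma(2 - alpha),
    where c is the lower limit of the Caputo derivative, here taken as
    the previous iterate. alpha = 1 recovers ordinary gradient descent."""
    w = w0
    c = w0 - 1e-3  # offset the lower limit slightly so the first step is nonzero
    coeff = 1.0 / math.gamma(2.0 - alpha)
    for _ in range(steps):
        step = lr * grad(w) * abs(w - c) ** (1.0 - alpha) * coeff
        w, c = w - step, w
    return w

# Toy objective f(w) = (w - 3)^2 with gradient 2*(w - 3);
# the iterates converge toward the minimizer w = 3.
w_star = caputo_fgd(lambda w: 2.0 * (w - 3.0), w0=0.0)
print(w_star)
```

In an FOCNN this update would be applied element-wise to every bias and weight tensor during backpropagation; the fractional factor |w - c|^(1-alpha) modulates the effective learning rate per parameter.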
Pages: 36-45
Page count: 10