Fractional-order convolutional neural networks with population extremal optimization

Cited by: 17
Authors
Chen, Bi-Peng [1 ]
Chen, Yun [1 ]
Zeng, Guo-Qiang [2 ]
She, Qingshan [1 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Peoples R China
[2] Wenzhou Univ, Natl Local Joint Engn Lab Digitalize Elect Design, Wenzhou 325035, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Caputo fractional-order gradient method; Population extremal optimization; Initial bias and weight; MNIST dataset; Fractional-order convolutional neural networks; PARTICLE SWARM OPTIMIZATION; QUANTITATIVE-ANALYSIS; STABILITY;
DOI
10.1016/j.neucom.2022.01.006
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
This article addresses an intelligent optimization problem via PEO-FOCNN, i.e., fractional-order convolutional neural networks (FOCNNs) combined with population extremal optimization (PEO). The Caputo fractional-order gradient method (CFOGM) is adopted to improve the dynamic updating of the biases and weights of convolutional neural networks (CNNs). Moreover, since the initial biases and weights and their updating mechanism strongly affect the optimization performance of an FOCNN, the PEO algorithm is used to select optimal initial biases and weights from a large candidate pool. The benefit of PEO for FOCNN is demonstrated by comparing the training and testing accuracies of PEO-FOCNN with those of the standard FOCNN. Furthermore, the superiority of the proposed PEO-FOCNN over FOCNNs based on other popular optimization algorithms, namely the genetic algorithm-based FOCNN (GA-FOCNN), the differential evolution-based FOCNN (DE-FOCNN), and the particle swarm optimization-based FOCNN (PSO-FOCNN), is verified by experiments on the MNIST dataset using three types of statistical tests. (c) 2022 Elsevier B.V. All rights reserved.
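The abstract's two ingredients can be sketched minimally. The sketch below is an illustrative assumption, not the authors' implementation: the scalar `caputo_step` uses a commonly cited first-order truncation of the Caputo fractional gradient, and `extremal_optimization` is a toy population-based EO loop; all names and parameter values are hypothetical.

```python
import math

def caputo_step(theta, theta_prev, grad, lr=0.01, alpha=0.9, eps=1e-8):
    """One scalar weight update under a first-order truncation of the
    Caputo fractional-order gradient of order 0 < alpha <= 1: the ordinary
    gradient is scaled by |theta - theta_prev|**(1 - alpha) / Gamma(2 - alpha).
    Setting alpha = 1 recovers standard gradient descent."""
    scale = abs(theta - theta_prev) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    return theta - lr * grad * max(scale, eps)  # eps avoids a zero step size

def extremal_optimization(fitness, population, mutate, steps=100):
    """Population extremal optimization sketch: at every step only the
    worst-fitness individual is replaced by a mutation of itself; the
    best individual of the final population is returned."""
    for _ in range(steps):
        worst = min(range(len(population)), key=lambda i: fitness(population[i]))
        population[worst] = mutate(population[worst])
    return max(population, key=fitness)
```

In the paper's setting, `fitness` would score a candidate set of initial biases and weights (e.g., by validation accuracy), and the selected individual would then seed CFOGM-based training of the FOCNN.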
Pages: 36-45 (10 pages)