Fractional-order convolutional neural networks with population extremal optimization

Cited: 17
Authors
Chen, Bi-Peng [1 ]
Chen, Yun [1 ]
Zeng, Guo-Qiang [2 ]
She, Qingshan [1 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Peoples R China
[2] Wenzhou Univ, Natl Local Joint Engn Lab Digitalize Elect Design, Wenzhou 325035, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Caputo fractional-order gradient method; Population extremal optimization; Initial bias and weight; MNIST dataset; Fractional-order convolutional neural networks; PARTICLE SWARM OPTIMIZATION; QUANTITATIVE-ANALYSIS; STABILITY;
DOI
10.1016/j.neucom.2022.01.006
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This article addresses the intelligent optimization of fractional-order convolutional neural networks (FOCNNs) by means of population extremal optimization (PEO), yielding the PEO-FOCNN approach. The Caputo fractional-order gradient method (CFOGM) is adopted to improve the dynamic updating of the biases and weights of convolutional neural networks (CNNs). Moreover, because the initial biases and weights and their updating mechanism are crucial to the optimization performance of an FOCNN, the PEO algorithm is used to select an optimal initialization from a large set of candidate initial biases and weights. The benefit of PEO for FOCNN is demonstrated by comparing the training and testing accuracies of PEO-FOCNN with those of the standard FOCNN. Furthermore, the superiority of the proposed PEO-FOCNN over FOCNNs based on other popular optimization algorithms, namely the genetic algorithm-based FOCNN (GA-FOCNN), differential evolution-based FOCNN (DE-FOCNN), and particle swarm optimization-based FOCNN (PSO-FOCNN), is verified by experiments on the MNIST dataset in terms of three types of statistical tests. (c) 2022 Elsevier B.V. All rights reserved.
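The abstract combines two ingredients: a Caputo fractional-order gradient update of the CNN biases and weights (CFOGM) and a PEO search over candidate initial biases and weights. A minimal NumPy sketch of both ideas follows; the first-term Caputo truncation, the step size, the Cauchy mutation scale, and the names caputo_fractional_step and peo_select_initial are illustrative assumptions, not the exact formulation of the paper.

import numpy as np
from scipy.special import gamma

def caputo_fractional_step(w, w_prev, grad, alpha=0.9, lr=0.01, eps=1e-8):
    # One Caputo-type fractional-order gradient step (first-term truncation):
    # w <- w - lr * grad * |w - w_prev|^(1 - alpha) / Gamma(2 - alpha).
    # alpha, lr, and the reference point w_prev are illustrative choices.
    scale = np.abs(w - w_prev) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return w - lr * grad * (scale + eps)

def peo_select_initial(loss_fn, dim, pop_size=10, iters=50, seed=0):
    # PEO-style selection of initial weights (assumed simple variant):
    # keep a population of candidate initializations, repeatedly mutate the
    # worst candidate with a heavy-tailed (Cauchy) perturbation, and accept
    # the mutant if it lowers that candidate's loss.
    rng = np.random.default_rng(seed)
    pop = rng.normal(scale=0.1, size=(pop_size, dim))
    fitness = np.array([loss_fn(w) for w in pop])
    for _ in range(iters):
        worst = np.argmax(fitness)                  # extremal optimization acts on the worst candidate
        mutant = pop[worst] + 0.05 * rng.standard_cauchy(dim)
        m_loss = loss_fn(mutant)
        if m_loss < fitness[worst]:
            pop[worst], fitness[worst] = mutant, m_loss
    return pop[np.argmin(fitness)]                  # best initial weights found

In a full PEO-FOCNN pipeline, loss_fn would score a candidate initialization by training the CNN for a few epochs with caputo_fractional_step in place of the integer-order gradient step and returning the resulting loss; here it is left abstract.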
Pages: 36-45
Number of pages: 10
Related Papers
50 records in total
  • [41] Order-Dependent Sampling Control of Uncertain Fractional-Order Neural Networks System. Ge, Chao; Zhang, Qi; Zhang, Ruonan; Yang, Li. Neural Processing Letters, 2023, 55: 10773-10787
  • [42] Event-triggered impulsive synchronization of fractional-order coupled neural networks. Tan, Hailian; Wu, Jianwei; Bao, Haibo. Applied Mathematics and Computation, 2022, 429
  • [43] New Approach to Quasi-Synchronization of Fractional-Order Delayed Neural Networks. Zhang, Shilong; Du, Feifei; Chen, Diyi. Fractal and Fractional, 2023, 7 (11)
  • [44] Stability analysis of fractional-order Hopfield neural networks with discontinuous activation functions. Zhang, Shuo; Yu, Yongguang; Wang, Qing. Neurocomputing, 2016, 171: 1075-1084
  • [45] Sliding Mode Matrix-Projective Synchronization for Fractional-Order Neural Networks. He, Jinman; Lei, Tengfei; Jiang, Limin. Journal of Mathematics, 2021, 2021
  • [46] Local Stabilization of Delayed Fractional-Order Neural Networks Subject to Actuator Saturation. Fan, Yingjie; Huang, Xia; Wang, Zhen. Fractal and Fractional, 2022, 6 (08)
  • [47] Synchronization analysis for delayed spatio-temporal neural networks with fractional-order. Zheng, Bibo; Hu, Cheng; Yu, Juan; Jiang, Haijun. Neurocomputing, 2021, 441: 226-236
  • [48] Projective Synchronization Analysis of Fractional-Order Neural Networks With Mixed Time Delays. Liu, Peng; Kong, Minxue; Zeng, Zhigang. IEEE Transactions on Cybernetics, 2022, 52 (07): 6798-6808
  • [49] Different impulsive effects on synchronization of fractional-order memristive BAM neural networks. Zhang, Lingzhong; Yang, Yongqing. Nonlinear Dynamics, 2018, 93 (02): 233-250
  • [50] Synchronization Control of Fractional-Order Neural Networks with Time-Varying Delays. Yin, Ting; Chen, Boshan; Zhong, Jie. 2017 Ninth International Conference on Advanced Computational Intelligence (ICACI), 2017: 79-83