Convolutional neural networks based on fractional-order momentum for parameter training

Cited by: 12
Authors
Kan, Tao [1 ]
Gao, Zhe [1 ,2 ]
Yang, Chuang [1 ]
Jian, Jing [1 ]
Institutions
[1] Liaoning Univ, Sch Math, Shenyang 110036, Peoples R China
[2] Liaoning Univ, Coll Light Ind, Shenyang 110036, Peoples R China
Keywords
Convolutional neural networks; Fractional-order difference; Momentum; MNIST; CIFAR-10; Recognition; Stability; Discrete; Term
DOI
10.1016/j.neucom.2021.03.075
CLC number
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This paper proposes a parameter training method for convolutional neural networks (CNNs) based on fractional-order momentum. To update CNN parameters more smoothly, the training method is built on the Grünwald-Letnikov (G-L) difference operation: the stochastic classical momentum (SCM) algorithm and the adaptive moment estimation (Adam) algorithm are improved by replacing the integer-order difference with a fractional-order difference. Linear and nonlinear schemes for adjusting the fractional order are also discussed, improving the flexibility and adaptive ability of CNN parameter training. The methods are validated on the MNIST and CIFAR-10 datasets, and the experimental results show that, compared with the traditional SCM and Adam methods, the proposed methods improve the recognition accuracy and the learning convergence speed of CNNs. (c) 2021 Elsevier B.V. All rights reserved.
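The abstract's core idea — replacing the integer-order difference in the momentum update with a truncated Grünwald-Letnikov fractional difference over recent history — can be sketched as below. This is an illustrative reconstruction, not the paper's exact formulation: the coefficient recurrence is the standard G-L one, but the names `fractional_momentum_step` and `gl_coefficients`, the placement of the fractional term, and the hyperparameters `lr`, `mu`, `alpha`, and `memory` are assumptions for illustration.

```python
def gl_coefficients(alpha, n):
    """Grünwald-Letnikov coefficients c_j = (-1)^j * C(alpha, j), via the
    standard recurrence c_0 = 1, c_j = c_{j-1} * (1 - (alpha + 1) / j)."""
    c = [1.0]
    for j in range(1, n):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / j))
    return c

def fractional_momentum_step(w, grad, velocity_hist, lr=0.01, mu=0.9,
                             alpha=0.9, memory=5):
    """One SCM-style step in which the integer-order difference
    v_t - v_{t-1} is replaced by a truncated G-L fractional difference.

    velocity_hist holds past parameter increments, newest first; the exact
    placement of the fractional term and all hyperparameter values are
    assumptions made for illustration, not the paper's update rule."""
    c = gl_coefficients(alpha, memory)
    # Truncated G-L sum over past increments (c_0 multiplies v_t itself,
    # so only c_1 .. c_{memory-1} weight the stored history).
    frac = sum(cj * v for cj, v in zip(c[1:], velocity_hist))
    v_t = -mu * frac - lr * grad
    velocity_hist.insert(0, v_t)
    del velocity_hist[memory - 1:]  # keep at most memory-1 past increments
    return w + v_t, velocity_hist
```

With `alpha = 1` the coefficients reduce to c = (1, -1, 0, ...), recovering the classical integer-order momentum update; a fractional order 0 < alpha < 1 spreads slowly decaying weights over older increments, which is the smoothing effect the abstract describes.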
Pages: 85-99
Page count: 15
Related papers
50 in total
  • [1] A fractional-order momentum optimization approach of deep neural networks
    Yu, ZhongLiang
    Sun, Guanghui
    Lv, Jianfeng
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (09) : 7091 - 7111
  • [2] Fractional-order convolutional neural networks with population extremal optimization
    Chen, Bi-Peng
    Chen, Yun
    Zeng, Guo-Qiang
    She, Qingshan
    NEUROCOMPUTING, 2022, 477 : 36 - 45
  • [3] Robust synchronization of memristor-based fractional-order Hopfield neural networks with parameter uncertainties
    Liu, Shuxin
    Yu, Yongguang
    Zhang, Shuo
    NEURAL COMPUTING & APPLICATIONS, 2019, 31 (08) : 3533 - 3542
  • [4] Stochastic Gradient Descent Method of Convolutional Neural Network Using Fractional-Order Momentum
    Kan, Tao
    Gao, Zhe
    Yang, Chuang
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2020, 33 (06): : 559 - 567
  • [5] Dynamics of fractional-order neural networks
    Kaslik, Eva
    Sivasundaram, Seenith
    2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011, : 611 - 618
  • [6] Dynamics in fractional-order neural networks
    Song, Chao
    Cao, Jinde
    NEUROCOMPUTING, 2014, 142 : 494 - 498
  • [7] Learning rule with fractional-order average momentum based on Tustin generating function for convolution neural networks
    Jian, Jing
    Gao, Zhe
    Kan, Tao
    PROCEEDINGS OF 2020 IEEE 9TH DATA DRIVEN CONTROL AND LEARNING SYSTEMS CONFERENCE (DDCLS'20), 2020, : 461 - 466
  • [8] LMI Conditions for Global Stability of Fractional-Order Neural Networks
    Zhang, Shuo
    Yu, Yongguang
    Yu, Junzhi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2017, 28 (10) : 2423 - 2433
  • [9] Stability analysis of fractional-order neural networks: An LMI approach
    Yang, Ying
    He, Yong
    Wang, Yong
    Wu, Min
    NEUROCOMPUTING, 2018, 285 : 82 - 93
  • [10] Global attractivity of memristor-based fractional-order neural networks
    Zhang, Shuo
    Yu, Yongguang
    Gu, Yajuan
    NEUROCOMPUTING, 2017, 227 : 64 - 73