Convergence Analysis of Novel Fractional-Order Backpropagation Neural Networks With Regularization Terms

Cited by: 4
Authors
Ma, Mingjie [1]
Yang, Jianhui [1]
Affiliations
[1] South China University of Technology, School of Business Administration, Guangzhou 510640, People's Republic of China
Keywords
Neural networks; Convergence; Cost function; Training; Stability criteria; Fractional calculus; Taylor series; Backpropagation (BP); cross-entropy loss; fractional-order gradient; squared regularization term; gradient method; synchronization
DOI
10.1109/TCYB.2023.3247453
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification
0812
Abstract
Fractional-order derivatives have the potential to improve the performance of backpropagation (BP) neural networks. Several studies have found, however, that fractional-order gradient learning methods may not converge to the real extreme points. Truncation and modification of the fractional-order derivative have been applied to guarantee convergence to the real extreme point; nonetheless, this guarantee rests on the assumption that the algorithm itself is convergent, which limits its practicality. In this article, a novel truncated fractional-order BP neural network (TFO-BPNN) and a novel hybrid TFO-BPNN (HTFO-BPNN) are designed to address this problem. First, to avoid overfitting, a squared regularization term is introduced into the fractional-order BP neural network. Second, a novel dual cross-entropy cost function is proposed and employed as the loss function of the two neural networks; its penalty parameter adjusts the influence of the penalty term and further alleviates the vanishing-gradient problem. Regarding convergence, the convergence of the two proposed neural networks is first proved, and their convergence to the real extreme point is then analyzed theoretically. Finally, simulation results illustrate the feasibility, high accuracy, and good generalization ability of the proposed neural networks, and comparative studies with related methods further substantiate the superiority of the TFO-BPNN and the HTFO-BPNN.
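The abstract names the technique but not its update rule, so the following is only a plausible sketch of a truncated fractional-order gradient step with a squared (L2) regularization term: the integer-order gradient of the regularized loss is scaled by |w_k - w_{k-1}|^(1-alpha) / Gamma(2-alpha), the leading term of a Caputo-type derivative taken with the previous iterate as the lower terminal. The function name and all hyperparameters below are illustrative assumptions, not the paper's definitions.

```python
# Sketch only: truncated fractional-order gradient descent with an L2 penalty,
# assuming the common truncation |w_k - w_{k-1}|^(1-alpha) / Gamma(2-alpha).
import math

import numpy as np


def truncated_fo_step(w, w_prev, grad, alpha=0.9, lr=0.05, lam=1e-4, eps=1e-8):
    """One truncated fractional-order update of order alpha in (0, 1).

    Higher-order Taylor terms of the Caputo derivative are truncated, leaving
    the true gradient scaled by a nonnegative factor; preserving the true
    gradient direction is what permits convergence to the real extreme point.
    """
    grad_reg = grad + 2.0 * lam * w                       # adds d/dw of lam * w^2
    scale = np.abs(w - w_prev) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    return w - lr * grad_reg * np.maximum(scale, eps)     # eps keeps the step alive


# Toy usage: minimize (w - 3)^2, whose real extreme point is w = 3.
w_prev, w = 0.0, 0.5
for _ in range(500):
    g = 2.0 * (w - 3.0)                                   # integer-order gradient
    w, w_prev = truncated_fo_step(w, w_prev, g), w
print(w)  # approaches 3 (pulled slightly toward 0 by the L2 penalty)
```

Because the scaling factor vanishes as successive iterates coincide, the floor eps prevents the update from stalling exactly at a plateau; the paper's own safeguards may differ.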
Pages: 3039-3050
Number of pages: 12