Convergence Analysis of Novel Fractional-Order Backpropagation Neural Networks With Regularization Terms

Cited by: 4
Authors
Ma, Mingjie [1 ]
Yang, Jianhui [1 ]
Affiliations
[1] South China Univ Technol, Sch Business Adm, Guangzhou 510640, Peoples R China
Keywords
Neural networks; Convergence; Cost function; Training; Stability criteria; Fractional calculus; Taylor series; Backpropagation (BP); convergence; cross-entropy loss; fractional-order gradient; squared regularization term; GRADIENT-METHOD; SYNCHRONIZATION;
DOI
10.1109/TCYB.2023.3247453
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Fractional-order derivatives have the potential to improve the performance of backpropagation (BP) neural networks. However, several studies have found that fractional-order gradient learning methods may not converge to the real extreme points. Truncation and modification of the fractional-order derivative have been applied to guarantee convergence to the real extreme point. Nonetheless, this convergence ability rests on the assumption that the algorithm itself is convergent, which limits the practicality of the algorithm. In this article, a novel truncated fractional-order BP neural network (TFO-BPNN) and a novel hybrid TFO-BPNN (HTFO-BPNN) are designed to solve the above problem. First, to avoid overfitting, a squared regularization term is introduced into the fractional-order BP neural network. Second, a novel dual cross-entropy cost function is proposed and employed as the loss function for the two neural networks. The penalty parameter helps to adjust the effect of the penalty term and further alleviates the gradient vanishing problem. In terms of convergence, the convergence of the two proposed neural networks is first proved. Then, their convergence to the real extreme point is further analyzed theoretically. Finally, the simulation results effectively illustrate the feasibility, high accuracy, and good generalization ability of the proposed neural networks. Comparative studies between the proposed neural networks and some related methods further substantiate the superiority of the TFO-BPNN and the HTFO-BPNN.
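The truncation idea the abstract refers to can be illustrated with a minimal sketch. This is not the paper's TFO-BPNN algorithm; it is a generic truncated fractional-order gradient step on a one-dimensional quadratic, where the fractional factor |w - w_prev|^(1-alpha)/Gamma(2-alpha) is kept finite by a small offset `eps` (the truncation), so the iterate can reach the real extreme point. The step rule, `alpha`, `lr`, and `eps` are illustrative assumptions, not values from the paper.

```python
import math

def truncated_frac_grad_step(w, grad, w_prev, alpha=0.9, lr=0.1, eps=1e-8):
    """One truncated fractional-order gradient step (illustrative sketch).

    The factor |w - w_prev|^(1 - alpha) / Gamma(2 - alpha) mimics a Caputo-type
    fractional derivative with the previous iterate as the lower terminal; the
    eps offset truncates the singularity so the factor stays finite as the
    iterates cluster, allowing convergence to the true minimizer.
    """
    factor = (abs(w - w_prev) + eps) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    return w - lr * grad * factor

# Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3); the real extreme
# point is w = 3.
w_prev, w = 0.0, 0.5
for _ in range(200):
    g = 2.0 * (w - 3.0)
    w_prev, w = w, truncated_frac_grad_step(w, g, w_prev)
# w is now close to the true minimizer 3.0
```

Without the truncation (i.e., with `eps = 0` and a fixed lower terminal instead of `w_prev`), the fractional factor vanishes at a point determined by the terminal rather than by the gradient, which is exactly the spurious-convergence problem the paper addresses.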
Pages: 3039-3050 (12 pages)
Related Papers (48 records)
  • [21] Finite-Time Mittag-Leffler Stability of Fractional-Order Quaternion-Valued Memristive Neural Networks with Impulses
    Pratap, A.
    Raja, R.
    Alzabut, J.
    Dianavinnarasi, J.
    Cao, J.
    Rajchakit, G.
    [J]. NEURAL PROCESSING LETTERS, 2020, 51 (02) : 1485 - 1526
  • [22] Mittag-Leffler state estimator design and synchronization analysis for fractional-order BAM neural networks with time delays
    Pratap, A.
    Raja, R.
    Rajchakit, Grienggrai
    Cao, Jinde
    Bagdasar, O.
    [J]. INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, 2019, 33 (05) : 855 - 874
  • [23] Mittag-Leffler stability and adaptive impulsive synchronization of fractional order neural networks in quaternion field
    Pratap, Anbalagan
    Raja, Ramachandran
    Alzabut, Jehad
    Cao, Jinde
    Rajchakit, Grienggrai
    Huang, Chuangxia
    [J]. MATHEMATICAL METHODS IN THE APPLIED SCIENCES, 2020, 43 (10) : 6223 - 6253
  • [24] Global projective lag synchronization of fractional order memristor based BAM neural networks with mixed time varying delays
    Pratap, Anbalagan
    Raja, Ramachandran
    Sowmiya, Chandran
    Bagdasar, Ovidiu
    Cao, Jinde
    Rajchakit, Grienggrai
    [J]. ASIAN JOURNAL OF CONTROL, 2020, 22 (01) : 570 - 583
  • [25] Fractional-order global optimal backpropagation machine trained by an improved fractional-order steepest descent method
    Pu, Yi-fei
    Wang, Jian
    [J]. FRONTIERS OF INFORMATION TECHNOLOGY & ELECTRONIC ENGINEERING, 2020, 21 (06) : 809 - 833
  • [26] Fractional Extreme Value Adaptive Training Method: Fractional Steepest Descent Approach
    Pu, Yi-Fei
    Zhou, Ji-Liu
    Zhang, Yi
    Zhang, Ni
    Huang, Guo
    Siarry, Patrick
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2015, 26 (04) : 653 - 662
  • [27] The Weighted Euler Curve Transform for Shape and Image Analysis
    Jiang, Qitong
    Kurtek, Sebastian
    Needham, Tom
    [J]. 2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020), 2020, : 3676 - 3685
  • [28] Impulsive effects on stability and passivity analysis of memristor-based fractional-order competitive neural networks
    Rajchakit, G.
    Chanthorn, P.
    Niezabitowski, M.
    Raja, R.
    Baleanu, D.
    Pratap, A.
    [J]. NEUROCOMPUTING, 2020, 417 (417) : 290 - 301
  • [29] Global Mittag-Leffler Stability and Stabilization Analysis of Fractional-Order Quaternion-Valued Memristive Neural Networks
    Rajchakit, Grienggrai
    Chanthorn, Pharunyou
    Kaewmesri, Pramet
    Sriraman, Ramalingam
    Lim, Chee Peng
    [J]. MATHEMATICS, 2020, 8 (03)
  • [30] Hybrid Control Scheme for Projective Lag Synchronization of Riemann-Liouville Sense Fractional Order Memristive BAM Neural Networks with Mixed Delays
    Rajchakit, Grienggrai
    Pratap, Anbalagan
    Raja, Ramachandran
    Cao, Jinde
    Alzabut, Jehad
    Huang, Chuangxia
    [J]. MATHEMATICS, 2019, 7 (08)