Improving the Accuracy of Neural Network Pattern Recognition by Fractional Gradient Descent

Cited by: 1
Authors
Abdulkadirov, Ruslan I. [1]
Lyakhov, Pavel A. [1]
Baboshina, Valentina A. [1]
Nagornov, Nikolay N. [2]
Affiliations
[1] North Caucasus Fed Univ, North Caucasus Ctr Math Res, Stavropol 355017, Russia
[2] North Caucasus Fed Univ, Dept Math Modeling, Stavropol 355017, Russia
Source
IEEE ACCESS | 2024, Vol. 12
Funding
Russian Science Foundation
Keywords
Neural networks; Optimization; Pattern recognition; Accuracy; Convergence; Training; Transformers; Heuristic algorithms; Stability analysis; Multilayer perceptrons; Convolutional neural networks; fractional derivatives of Riemann-Liouville; Caputo; Grunwald-Letnikov; multilayer perceptron; optimization algorithms; stochastic gradient descent; OPTIMIZATION;
DOI
10.1109/ACCESS.2024.3491614
CLC Number
TP [Automation technology, computer technology]
Subject Classification Code
0812
Abstract
In this paper, we propose a fractional gradient descent method for improving the training and performance of modern neural networks. The optimizer searches for the global minimum of the loss function along fractional gradient directions obtained from the Riemann-Liouville, Caputo, and Grunwald-Letnikov derivatives. Adjusting the size and direction of the fractional gradient, supported by momentum and the Nesterov condition, allows the proposed optimizer to descend to the global minimum of the loss functions of neural networks. Applying the proposed optimization algorithm to a linear neural network and a visual transformer yields accuracy, precision, recall, and Macro F1 scores that are 1.8-4 percentage points higher than those of state-of-the-art methods on pattern recognition problems from the MNIST and CIFAR10 datasets. Further research on fractional calculus in modern neural network methodology can improve the quality of solutions to various challenges such as pattern recognition, time series forecasting, moving object detection, and data generation.
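As a rough illustration of the idea summarized in the abstract, the sketch below builds a descent direction from a Grunwald-Letnikov-style weighted sum of recent gradients and combines it with momentum and a Nesterov-style lookahead. This is a minimal, hypothetical NumPy example, not the authors' implementation: the names gl_coefficients and fractional_gd, the history truncation depth, and all default parameter values are assumptions made for illustration only.

import numpy as np

def gl_coefficients(alpha, n_terms):
    # Coefficients (-1)^k * C(alpha, k) of a truncated Grunwald-Letnikov
    # series, computed by the standard recurrence (illustrative truncation).
    c = np.empty(n_terms)
    c[0] = 1.0
    for k in range(1, n_terms):
        c[k] = c[k - 1] * (k - 1 - alpha) / k
    return c

def fractional_gd(grad_fn, w0, alpha=0.9, lr=0.01, momentum=0.9,
                  n_terms=5, n_iters=200):
    # Toy fractional gradient descent: the "fractional gradient" is a
    # GL-weighted combination of the most recent gradients, and the step
    # uses a momentum buffer with a Nesterov-style lookahead evaluation.
    w = w0.astype(float)
    v = np.zeros_like(w)                  # momentum buffer
    coeffs = gl_coefficients(alpha, n_terms)
    history = []                          # most recent gradient first
    for _ in range(n_iters):
        g = grad_fn(w + momentum * v)     # lookahead gradient
        history.insert(0, g)
        history = history[:n_terms]
        frac_grad = sum(c * h for c, h in zip(coeffs, history))
        v = momentum * v - lr * frac_grad
        w = w + v
    return w

# Usage: minimize the quadratic f(w) = ||w||^2, whose gradient is 2w.
w_opt = fractional_gd(lambda w: 2.0 * w, w0=np.array([3.0, -2.0]))

With n_terms = 1 the update reduces to ordinary gradient descent with Nesterov momentum, which is a convenient sanity check; for 0 < alpha < 1 the later coefficients are small and negative, so older gradients are subtracted with decaying weight.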
Pages: 168428-168444
Page count: 17
Related Papers
50 records in total
  • [21] Pattern recognition with stochastic resonance in a generic neural network
    Tan, Z
    Ali, MK
    INTERNATIONAL JOURNAL OF MODERN PHYSICS C, 2000, 11 (08) : 1585 - 1593
  • [22] HAVNET: A new neural network architecture for pattern recognition
    Rosandich, RG
    NEURAL NETWORKS, 1997, 10 (01) : 139 - 151
  • [23] Fuzzy neural network for invariant optical pattern recognition
    Wen, ZQ
    Yeh, PC
    Yang, XY
    OPTICAL ENGINEERING, 1996, 35 (08) : 2188 - 2195
  • [24] Pattern recognition in hydraulic backlash using neural network
    Borrás, C
    Stalford, HL
    PROCEEDINGS OF THE 2002 AMERICAN CONTROL CONFERENCE, VOLS 1-6, 2002, 1-6 : 400 - 405
  • [25] STADIA: Photonic Stochastic Gradient Descent for Neural Network Accelerators
    Xia, Chengpeng
    Chen, Yawen
    Zhang, Haibo
    Wu, Jigang
    ACM TRANSACTIONS ON EMBEDDED COMPUTING SYSTEMS, 2023, 22 (05)
  • [26] Independent neural network modeling of class analogy for classification pattern recognition and optimization
    Liu, HL
    Cao, XW
    Xu, RJ
    Chen, NY
    ANALYTICA CHIMICA ACTA, 1997, 342 (2-3) : 223 - 228
  • [27] The Improved Stochastic Fractional Order Gradient Descent Algorithm
    Yang, Yang
    Mo, Lipo
    Hu, Yusen
    Long, Fei
    FRACTAL AND FRACTIONAL, 2023, 7 (08)
  • [28] Stochastic Gradient Descent Combines Second-Order Information for Training Neural Network
    Chen, Minyu
    ICOMS 2018: 2018 INTERNATIONAL CONFERENCE ON MATHEMATICS AND STATISTICS, 2018, : 69 - 73
  • [29] Impact of RTN on Pattern Recognition Accuracy of RRAM-Based Synaptic Neural Network
    Chai, Zheng
    Freitas, Pedro
    Zhang, Weidong
    Hatem, Firas
    Zhang, Jian Fu
    Marsland, John
    Govoreanu, Bogdan
    Goux, Ludovic
    Kar, Gouri Sankar
    IEEE ELECTRON DEVICE LETTERS, 2018, 39 (11) : 1652 - 1655
  • [30] Intelligent Feature Selection Using GA and Neural Network Optimization for Real-Time Driving Pattern Recognition
    Tao, Jili
    Zhang, Ridong
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (08) : 12665 - 12674