The Improved Stochastic Fractional Order Gradient Descent Algorithm

Cited by: 2
Authors
Yang, Yang [1 ]
Mo, Lipo [1 ,2 ]
Hu, Yusen [1 ]
Long, Fei [3 ]
Affiliations
[1] Beijing Technol & Business Univ, Sch Math & Stat, Beijing 100048, Peoples R China
[2] Beijing Technol & Business Univ, Sch Future Technol, Beijing 100048, Peoples R China
[3] Guizhou Inst Technol, Sch Artificial Intelligence & Elect Engn, Special Key Lab Artificial Intelligence & Intellig, Guiyang 550003, Peoples R China
Keywords
machine learning; fractional calculus; stochastic gradient descent; convex optimization; online optimization; neural networks
DOI
10.3390/fractalfract7080631
Chinese Library Classification (CLC)
O1 [Mathematics]
Subject Classification Codes
0701; 070101
Abstract
This paper proposes improved stochastic gradient descent (SGD) algorithms with a fractional-order gradient for the online optimization problem. For three scenarios (standard learning rate, adaptive-gradient learning rate, and momentum learning rate), three new SGD algorithms are designed by combining SGD with a fractional-order gradient, and the corresponding regret functions are shown to converge at a sub-linear rate. We then discuss the impact of the fractional order on convergence and monotonicity, and prove that better performance can be obtained by adjusting the order of the fractional gradient. Finally, several practical examples are given to verify the superiority and validity of the proposed algorithms.
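To make the approach concrete, the following minimal Python sketch illustrates fractional-order SGD under one common first-term Caputo approximation of the fractional gradient, with the lower integration terminal taken as the previous iterate. This is an assumption for illustration only, not the paper's exact update rule, and the names here (fractional_sgd, grad_fn, alpha) are hypothetical.

import math
import numpy as np

def fractional_sgd(grad_fn, theta0, alpha=0.9, lr=0.05, steps=200, eps=1e-8, seed=0):
    # Illustrative fractional-order SGD, not the paper's exact rule.
    # Assumed approximation (first-term Caputo expansion, lower terminal at the
    # previous iterate):
    #   D^alpha f(theta) ~ grad f(theta) * |theta - prev|^(1 - alpha) / Gamma(2 - alpha)
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    prev = theta + eps                              # avoid a zero gap on the first step
    coeff = 1.0 / math.gamma(2.0 - alpha)
    for _ in range(steps):
        g = grad_fn(theta, rng)                     # stochastic gradient sample
        gap = np.abs(theta - prev) + eps            # |theta_t - theta_{t-1}|
        frac_g = coeff * g * gap ** (1.0 - alpha)   # fractional-order gradient, 0 < alpha < 1
        prev, theta = theta, theta - lr * frac_g
    return theta

# Usage: noisy least squares, f(theta) = 0.5 * ||theta - target||^2.
target = np.array([1.0, -2.0])
grad = lambda theta, rng: (theta - target) + 0.1 * rng.standard_normal(theta.shape)
print(fractional_sgd(grad, theta0=np.zeros(2), alpha=0.9))

Note that for alpha = 1 this update reduces to ordinary SGD; the adaptive-gradient and momentum scenarios named in the abstract would replace the fixed lr with an AdaGrad-style accumulator or add a velocity term, respectively.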
Pages: 16