Adaptive Stochastic Gradient Descent (SGD) for erratic datasets

Cited by: 3
Authors
Dagal, Idriss [1 ]
Tanrioven, Kursat [1 ]
Nayir, Ahmet [1 ]
Akin, Burak [2 ]
Affiliations
[1] Istanbul Beykent Univ, Elect Engn, Hadim Koruyolu Caddesi 19, TR-34450 Istanbul, Turkiye
[2] Yildiz Tech Univ, Elect Engn, Davutpasa Caddesi, TR-34220 Istanbul, Turkiye
Source
FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE | 2025, Vol. 166
Keywords
Gradient descent; Stochastic Gradient Descent; Accuracy; Principal Component Analysis; Quasi-Newton method; Neural networks; Algorithm; MLP
DOI
10.1016/j.future.2024.107682
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
Stochastic Gradient Descent (SGD) is a highly efficient optimization algorithm, particularly well suited for large datasets due to its incremental parameter updates. In this study, we apply SGD to a simple linear classifier using logistic regression, a widely used method for binary classification tasks. Unlike traditional batch Gradient Descent (GD), which processes the entire dataset simultaneously, SGD offers enhanced scalability and performance for streaming and large-scale data. Our experiments reveal that SGD outperforms GD across multiple performance metrics, achieving 45.83% accuracy compared to GD's 41.67%, and excelling in precision (60% vs. 45.45%), recall (100% vs. 60%), and F1-score (100% vs. 62%). Additionally, SGD achieves 99.99% Principal Component Analysis (PCA) accuracy, slightly surpassing GD's 99.92%. These results highlight SGD's superior efficiency and flexibility for large-scale data environments, driven by its ability to balance precision and recall effectively. To further enhance SGD's robustness, the proposed method incorporates adaptive learning rates, momentum, and logistic regression, addressing the drawbacks of traditional GD. These modifications improve the algorithm's stability, convergence behavior, and applicability to complex, large-scale optimization tasks where standard GD often struggles, making SGD a highly effective solution for challenging data-driven scenarios.
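To make the training scheme described in the abstract concrete, the following is a minimal sketch, not the authors' implementation, of logistic regression trained by per-sample SGD with momentum and a simple decaying (adaptive) learning rate. All hyperparameter names and values (lr0, decay, beta, epochs) are illustrative assumptions.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sgd_logistic(X, y, lr0=0.1, decay=0.01, beta=0.9, epochs=50, seed=0):
    """Per-sample SGD on the logistic (log-loss) objective, with momentum
    and an inverse-scaling learning-rate decay (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    vw, vb = np.zeros(d), 0.0          # momentum buffers
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):   # one sample per update (pure SGD)
            t += 1
            lr = lr0 / (1.0 + decay * t)   # adaptive (decaying) step size
            p = sigmoid(X[i] @ w + b)      # predicted probability
            g = p - y[i]                   # d(log-loss)/d(logit)
            vw = beta * vw + (1.0 - beta) * g * X[i]
            vb = beta * vb + (1.0 - beta) * g
            w -= lr * vw
            b -= lr * vb
    return w, b

# Toy usage on two Gaussian blobs (synthetic data, not the paper's dataset).
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
w, b = sgd_logistic(X, y)
acc = ((sigmoid(X @ w + b) >= 0.5).astype(int) == y).mean()
print(f"training accuracy: {acc:.2f}")

Under these assumptions, the momentum buffer smooths the noisy per-sample gradients while the decaying step size stabilizes late-stage convergence, which is the behavior the abstract attributes to the adaptive variant.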
Pages: 13