Adaptive Stochastic Gradient Descent (SGD) for erratic datasets

Cited: 3
Authors
Dagal, Idriss [1 ]
Tanrioven, Kursat [1 ]
Nayir, Ahmet [1 ]
Akin, Burak [2 ]
Affiliations
[1] Istanbul Beykent Univ, Elect Engn, Hadim Koruyolu Caddesi 19, TR-34450 Istanbul, Turkiye
[2] Yildiz Tech Univ, Elect Engn, Davutpasa Caddesi, TR-34220 Istanbul, Turkiye
Source
FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE | 2025, Vol. 166
Keywords
Gradient descent; Stochastic Gradient Descent; Accuracy; Principal Component Analysis; QUASI-NEWTON METHOD; NEURAL NETWORKS; ALGORITHM; MLP
DOI
10.1016/j.future.2024.107682
CLC Classification Number
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
Stochastic Gradient Descent (SGD) is a highly efficient optimization algorithm, particularly well suited to large datasets because it updates parameters incrementally. In this study, we apply SGD to a simple linear classifier based on logistic regression, a widely used method for binary classification tasks. Unlike traditional batch Gradient Descent (GD), which processes the entire dataset at once, SGD offers better scalability and performance on streaming and large-scale data. Our experiments show that SGD outperforms GD across multiple performance metrics, achieving 45.83% accuracy compared to GD's 41.67%, and excelling in precision (60% vs. 45.45%), recall (100% vs. 60%), and F1-score (100% vs. 62%). Additionally, SGD achieves 99.99% of Principal Component Analysis (PCA) accuracy, slightly surpassing GD's 99.92%. These results highlight SGD's superior efficiency and flexibility in large-scale data environments, driven by its ability to balance precision and recall effectively. To further enhance SGD's robustness, the proposed method incorporates adaptive learning rates, momentum, and logistic regression, addressing drawbacks of traditional GD. These modifications improve the algorithm's stability, convergence behavior, and applicability to complex, large-scale optimization tasks where standard GD often struggles, making SGD a highly effective solution for challenging data-driven scenarios.
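The abstract describes augmenting plain SGD with an adaptive (decaying) learning rate and momentum for a logistic-regression classifier. A minimal sketch of that combination is shown below; the 1/(1 + decay·t) schedule and the hyperparameter values (`lr0`, `momentum`, `decay`, `epochs`) are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def adaptive_sgd_logreg(X, y, lr0=0.05, momentum=0.9, decay=0.01,
                        epochs=150, seed=0):
    """SGD for logistic regression with heavy-ball momentum and a
    decaying step size (a sketch of the abstract's ingredients;
    schedule and hyperparameters are assumptions)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    vw, vb = np.zeros(d), 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):         # one sample per update
            lr = lr0 / (1.0 + decay * t)     # adaptive (decaying) step size
            p = sigmoid(X[i] @ w + b)
            gw = (p - y[i]) * X[i]           # log-loss gradient for sample i
            gb = p - y[i]
            vw = momentum * vw - lr * gw     # momentum accumulates past steps
            vb = momentum * vb - lr * gb
            w += vw
            b += vb
            t += 1
    return w, b

# Toy usage: linearly separable 1-D binary data
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
w, b = adaptive_sgd_logreg(X, y)
preds = (sigmoid(X @ w + b) >= 0.5).astype(int)
```

Because each update touches a single sample, memory cost is independent of dataset size, which is the scalability argument the abstract makes for streaming and large-scale data.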
Pages: 13