Fast gradient descent algorithm for image classification with neural networks

Cited by: 0
Authors
Abdelkrim El Mouatasim
Affiliation
[1] Ibn Zohr University,Faculty of Polydisciplinary Ouarzazate (FPO)
Source
Keywords
Gradient algorithm; Nesterov algorithm; Learning rate control; Image classification; Neural networks;
DOI: not available
Abstract
Any gradient descent method requires selecting a learning rate. Tuning the learning rate quickly becomes repetitive for deeper image classification models and does not necessarily lead to optimal convergence. In this paper we propose a modification of the gradient descent algorithm in which a Nesterov step is added and the learning rate is updated at each epoch: instead of being fixed, the learning rate is itself learned, either by the Armijo rule or by step-size control. We call the resulting algorithm fast gradient descent (FGD) for solving image classification problems with neural networks, and the quadratic convergence rate $o(k^2)$ of the FGD algorithm is proved. The FGD algorithm is applied to the MNIST dataset; numerical experiments show that our FGD approach is faster than standard gradient descent algorithms.
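The scheme the abstract describes, a Nesterov extrapolation step combined with a per-epoch learning rate chosen by the Armijo rule, can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the momentum schedule (k-1)/(k+2), the Armijo constants, and all function names are assumptions.

```python
import numpy as np

def fgd(grad, f, x0, lr0=1.0, beta=0.5, c=1e-4, epochs=100):
    """Sketch of 'fast gradient descent': Nesterov extrapolation plus
    Armijo backtracking to adapt the learning rate at every epoch.
    Hyperparameters (lr0, beta, c) are illustrative, not the paper's."""
    x = x0.copy()
    x_prev = x0.copy()
    for k in range(1, epochs + 1):
        # Nesterov look-ahead point built from the last two iterates
        y = x + (k - 1) / (k + 2) * (x - x_prev)
        g = grad(y)
        # Armijo rule: shrink the step until sufficient decrease holds
        lr = lr0
        while f(y - lr * g) > f(y) - c * lr * np.dot(g, g):
            lr *= beta
        x_prev, x = x, y - lr * g
    return x
```

On a smooth convex objective this iteration reduces to plain Nesterov acceleration when the initial step is already accepted; the backtracking loop only activates when `lr0` overshoots.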
Pages: 1565-1572 (7 pages)
Related papers (50 items)
  • [21] Satpathi, Siddhartha; Srikant, R. The Dynamics of Gradient Descent for Overparametrized Neural Networks. LEARNING FOR DYNAMICS AND CONTROL, VOL 144, 2021, 144
  • [22] Cui, Nan. Applying Gradient Descent in Convolutional Neural Networks. 2ND INTERNATIONAL CONFERENCE ON MACHINE VISION AND INFORMATION TECHNOLOGY (CMVIT 2018), 2018, 1004
  • [23] Hu Dongting; Shen Wen; Ma Wenchao; Liu Xinyu; Su Zhouping; Zhu Huaxin; Zhang Xiumei; Que Lizhi; Zhu Zhuowei; Zhang Yixin; Chen Guoqing; Hu Lifa. Fast Convergence Stochastic Parallel Gradient Descent Algorithm. LASER & OPTOELECTRONICS PROGRESS, 2019, 56 (12)
  • [24] Halawa, Krzysztof. Optimization of learning process for Fourier series neural networks using gradient descent algorithm. PRZEGLAD ELEKTROTECHNICZNY, 2008, 84 (06): 128-130
  • [25] Zhao, Weijing; Huang, He. Adaptive orthogonal gradient descent algorithm for fully complex-valued neural networks. NEUROCOMPUTING, 2023, 546
  • [26] Baratta, D.; Diotalevi, F.; Valle, M.; Caviglia, D. D. Gradient descent learning algorithm for hierarchical neural networks: A case study in industrial quality. ENGINEERING APPLICATIONS OF BIO-INSPIRED ARTIFICIAL NEURAL NETWORKS, VOL II, 1999, 1607: 578-587
  • [27] Yan, Xiaodan; Zhang, Tianxin; Cui, Baojiang; Deng, Jiangdong. Hinge Classification Algorithm Based on Asynchronous Gradient Descent. ADVANCES ON BROAD-BAND WIRELESS COMPUTING, COMMUNICATION AND APPLICATIONS, BWCCA-2017, 2018, 12: 459-468
  • [28] Jiang, J. Fast competitive learning algorithm for image compression neural networks. ELECTRONICS LETTERS, 1996, 32 (15): 1380-1381
  • [29] Wang, Yong; He, Yuli; Zhu, Zhiguang. Study on fast speed fractional order gradient descent method and its application in neural networks. NEUROCOMPUTING, 2022, 489: 366-376
  • [30] Karakida, Ryo; Osawa, Kazuki. Understanding approximate Fisher information for fast convergence of natural gradient descent in wide neural networks. JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2021, 2021 (12)