Fast gradient descent algorithm for image classification with neural networks

Cited by: 0
Author
Abdelkrim El Mouatasim
Affiliation
[1] Ibn Zohr University, Polydisciplinary Faculty of Ouarzazate (FPO)
Keywords
Gradient algorithm; Nesterov algorithm; Learning rate control; Image classification; Neural networks;
DOI: not available
Abstract
Any optimization by gradient descent methods involves selecting a learning rate. Tuning the learning rate quickly becomes repetitive with deeper image classification models and does not necessarily lead to optimal convergence. In this paper we propose a modification of the gradient descent algorithm in which a Nesterov step is added and the learning rate is updated at each epoch. Instead of fixing it, we adapt the learning rate itself, either by the Armijo rule or by step-size control. We call the resulting algorithm fast gradient descent (FGD) for solving image classification problems with neural networks, and the quadratic convergence rate o(k^2) of the FGD algorithm is proved. The FGD algorithm is applied to the MNIST dataset. Numerical experiments show that the FGD algorithm is faster than standard gradient descent algorithms.
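The paper's exact update rules are not reproduced in this record, but the idea the abstract describes, a Nesterov look-ahead step combined with a per-epoch learning rate chosen by the Armijo rule, can be sketched generically. All function and parameter names below are illustrative, not taken from the paper, and the toy objective stands in for a neural-network loss.

```python
# Hypothetical sketch: Nesterov-style gradient descent with an Armijo
# backtracking rule for the learning rate. Names and constants are
# illustrative assumptions, not the paper's actual implementation.

def armijo_step(f, grad, x, lr0=1.0, beta=0.5, c=1e-4, max_halvings=30):
    """Backtrack from lr0 until the Armijo sufficient-decrease condition holds."""
    g = grad(x)
    g_sq = sum(gi * gi for gi in g)
    fx = f(x)
    lr = lr0
    for _ in range(max_halvings):
        trial = [xi - lr * gi for xi, gi in zip(x, g)]
        if f(trial) <= fx - c * lr * g_sq:  # Armijo condition
            return lr
        lr *= beta                          # shrink the step and retry
    return lr

def fgd(f, grad, x0, epochs=100):
    """Gradient descent with a Nesterov look-ahead and per-epoch Armijo rate."""
    x = list(x0)
    x_prev = list(x0)
    for k in range(1, epochs + 1):
        mom = (k - 1) / (k + 2)             # classical Nesterov momentum weight
        # Look-ahead point: extrapolate along the previous displacement.
        y = [xi + mom * (xi - xp) for xi, xp in zip(x, x_prev)]
        lr = armijo_step(f, grad, y)        # learning rate re-chosen each epoch
        g = grad(y)
        x_prev, x = x, [yi - lr * gi for yi, gi in zip(y, g)]
    return x

# Toy quadratic objective with minimum at (3, -1).
f = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2
grad = lambda x: [2 * (x[0] - 3), 2 * (x[1] + 1)]
x_star = fgd(f, grad, [0.0, 0.0])
```

On this well-conditioned toy problem the Armijo rule backtracks from 1.0 to 0.5, which is the exact minimizing step, so `x_star` lands on (3, -1); on a real loss surface the backtracking loop is what keeps each epoch's step safe without manual tuning.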
Pages: 1565 - 1572 (7 pages)
Related Papers (50 total)
  • [41] Fractional Gradient Descent Method for Spiking Neural Networks
    Yang, Honggang
    Chen, Jiejie
    Jiang, Ping
    Xu, Mengfei
    Zhao, Haiming
    2023 2ND CONFERENCE ON FULLY ACTUATED SYSTEM THEORY AND APPLICATIONS, CFASTA, 2023, : 636 - 641
  • [42] Understanding the Convolutional Neural Networks with Gradient Descent and Backpropagation
    Zhou, XueFei
    2ND INTERNATIONAL CONFERENCE ON MACHINE VISION AND INFORMATION TECHNOLOGY (CMVIT 2018), 2018, 1004
  • [43] Neural Networks can Learn Representations with Gradient Descent
    Damian, Alex
    Lee, Jason D.
    Soltanolkotabi, Mahdi
    CONFERENCE ON LEARNING THEORY, VOL 178, 2022, 178
  • [44] A supervised multi-spike learning algorithm based on gradient descent for spiking neural networks
    Xu, Yan
    Zeng, Xiaoqin
    Han, Lixin
    Yang, Jing
    NEURAL NETWORKS, 2013, 43 : 99 - 113
  • [45] A self-adaptive gradient descent search algorithm for fully-connected neural networks
    Xue, Yu
    Wang, Yankang
    Liang, Jiayu
    NEUROCOMPUTING, 2022, 478 : 70 - 80
  • [46] A data-reusing gradient descent algorithm for complex-valued recurrent neural networks
    Goh, SL
    Mandic, DP
    KNOWLEDGE-BASED INTELLIGENT INFORMATION AND ENGINEERING SYSTEMS, PT 2, PROCEEDINGS, 2003, 2774 : 340 - 350
  • [47] Hardness Prediction of 7003 Aluminum Alloy by Gradient Descent Algorithm in BP Artificial Neural Networks
    Ren, J. P.
    Song, R. G.
    HIGH PERFORMANCE STRUCTURES AND MATERIALS ENGINEERING, PTS 1 AND 2, 2011, 217-218 : 1458 - 1461
  • [48] A NOVEL FAST CLEAN ALGORITHM USING THE GRADIENT DESCENT METHOD
    Choi, Young-Jae
    Choi, In-Sik
    MICROWAVE AND OPTICAL TECHNOLOGY LETTERS, 2017, 59 (05) : 1018 - 1022
  • [49] A fast road image segmentation algorithm based on cellular neural networks
    Xu Guobao
    Yin Yixin
    Yin Lu
    Hao Yanshuang
    Zhou Meijuan
    PROCEEDINGS OF THE 26TH CHINESE CONTROL CONFERENCE, VOL 4, 2007, : 114 - +
  • [50] Classification of Honey as Genuine or Fake via Artificial Neural Network using Gradient Descent Backpropagation Algorithm
    Hortinela, Carlos C.
    Balbin, Jessie R.
    Tibayan, Patrick Jonas A.
    Cabela, John Myrrh D.
    Magwili, Glenn V.
    2020 IEEE 12TH INTERNATIONAL CONFERENCE ON HUMANOID, NANOTECHNOLOGY, INFORMATION TECHNOLOGY, COMMUNICATION AND CONTROL, ENVIRONMENT, AND MANAGEMENT (HNICEM), 2020,