Evaluation of Gradient Descent Optimization: Using Android Applications in Neural Networks

Cited: 2
Authors
Alshahrani, Hani [1 ]
Alzahrani, Abdulrahman [1 ]
Alshehri, Ali [1 ]
Alharthi, Raed [1 ]
Fu, Huirong [1 ]
Affiliations
[1] Oakland Univ, Sch Engn & Comp Sci, Rochester, MI 48309 USA
Source
PROCEEDINGS 2017 INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND COMPUTATIONAL INTELLIGENCE (CSCI), 2017
Funding
U.S. National Science Foundation
Keywords
neural networks; gradient descent optimizers; loss function; Android;
DOI
10.1109/CSCI.2017.257
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Artificial neural networks have gained prominence through applications such as medical diagnosis and malware detection. However, a neural network model's error rate reflects its performance, and optimization algorithms can minimize that error rate by updating the network's parameters toward an optimal solution. This paper uses permissions and underlying Linux system information features on the Android platform to evaluate gradient descent optimization algorithms in neural networks. The optimizers are evaluated by running them on a set of Android applications to identify the best-performing one, and each optimizer is assessed with both its default and adjusted parameter values. The evaluation shows that the best accuracy score, 92.21%, is achieved by the Adam optimizer.
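The abstract compares gradient descent optimizers, with Adam performing best. As a minimal illustrative sketch (not the paper's code — the toy loss function and all hyperparameters here are assumed for demonstration), the update rules of plain SGD and Adam can be compared on a one-dimensional quadratic loss:

```python
import math

# Hedged sketch: compare plain SGD and the Adam update rule on a toy
# 1-D quadratic loss f(w) = (w - 3)^2, whose minimum is at w = 3.
# Hyperparameter values are illustrative, not taken from the paper.

def grad(w):
    # Gradient of f(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

def sgd(w0, lr=0.1, steps=100):
    # Vanilla gradient descent: step against the raw gradient.
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def adam(w0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=100):
    # Adam: adaptive steps using bias-corrected moment estimates.
    w, m, v = w0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g        # first-moment (mean) estimate
        v = beta2 * v + (1 - beta2) * g * g    # second-moment (variance) estimate
        m_hat = m / (1 - beta1 ** t)           # bias correction
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

w_sgd = sgd(10.0)
w_adam = adam(10.0)
print(w_sgd, w_adam)  # both approach the minimum at w = 3
```

The paper's evaluation follows the same pattern at larger scale: each optimizer is run on the same model and feature set (Android permissions and system information), and the resulting accuracy scores are compared.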
Pages: 1471-1476 (6 pages)
Related Papers (50 total)
  • [31] MVDroid: an android malicious VPN detector using neural networks
    Seraj, Saeed
    Khodambashi, Siavash
    Pavlidis, Michalis
    Polatidis, Nikolaos
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (29): 21555-21565
  • [32] Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks
    Hashem, Ibrahim Abaker Targio
    Alaba, Fadele Ayotunde
    Jumare, Muhammad Haruna
    Ibrahim, Ashraf Osman
    Abulfaraj, Anas Waleed
    IEEE ACCESS, 2024, 12: 33757-33768
  • [33] Hopfield neural network learning using direct gradient descent of energy function
    Tang, Z
    Tashima, K
    Hebishima, H
    Ishizuka, O
    Tanno, K
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 1996, E79A (02): 258-261
  • [34] Study on fast speed fractional order gradient descent method and its application in neural networks
    Wang, Yong
    He, Yuli
    Zhu, Zhiguang
    NEUROCOMPUTING, 2022, 489: 366-376
  • [35] A comparison of gradient ascent, gradient descent and genetic-algorithm-based artificial neural networks for the binary classification problem
    Pendharkar, Parag C.
    EXPERT SYSTEMS, 2007, 24 (02): 65-86
  • [36] Gradient-based PIV using neural networks
    Kimura, I
    Susaki, Y
    Kiyohara, R
    Kaga, A
    Kuroe, Y
    JOURNAL OF VISUALIZATION, 2002, 5 (04): 363-370
  • [38] Weights optimization for multi-instance multi-label RBF neural networks using steepest descent method
    Li, Cunhe
    Shi, Guoqiang
    NEURAL COMPUTING & APPLICATIONS, 2013, 22 (7-8): 1563-1569
  • [39] Traditional and Accelerated Gradient Descent for Neural Architecture Search
    Trillos, Nicolas Garcia
    Morales, Felix
    Morales, Javier
    GEOMETRIC SCIENCE OF INFORMATION (GSI 2021), 2021, 12829: 507-514