Robust Neural Pruning with Gradient Sampling Optimization for Residual Neural Networks

Cited by: 0
Authors
Yun, Juyoung [1]
Affiliations
[1] SUNY Stony Brook, Dept Comp Sci, Stony Brook, NY 11794 USA
Source
2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 | 2024
Keywords
Neural Networks; Optimization; Neural Pruning
DOI
10.1109/IJCNN60899.2024.10650301
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This research pioneers the integration of gradient sampling optimization techniques, particularly StochGradAdam, into the pruning process of neural networks. Our main objective is to address the significant challenge of maintaining accuracy in pruned neural models, which is critical in resource-constrained scenarios. Through extensive experimentation, we demonstrate that gradient sampling preserves accuracy during and after pruning significantly better than traditional optimization methods. Our study highlights the pivotal role of gradient sampling in robust learning and in retaining crucial information after substantial model simplification. Results on the CIFAR-10 dataset with residual neural architectures validate the versatility and effectiveness of our approach. This work presents a promising direction for developing efficient neural networks without compromising performance, even in environments with limited computational resources.
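The abstract's core idea, updating parameters with only a randomly sampled subset of gradient entries inside an Adam-style rule, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the function name `stoch_grad_adam_step`, the `sample_rate` parameter, and the choice to zero out unsampled entries are all assumptions made for the sketch.

```python
import numpy as np

def stoch_grad_adam_step(params, grads, state, lr=1e-3, betas=(0.9, 0.999),
                         eps=1e-8, sample_rate=0.8, rng=None):
    """One Adam-style update using only a random subset of gradient entries.

    A hedged sketch of gradient-sampling optimization: a Bernoulli mask keeps
    roughly `sample_rate` of the gradient entries; the rest contribute nothing
    to this step's moment estimates or parameter update.
    """
    rng = rng or np.random.default_rng()
    m, v, t = state["m"], state["v"], state["t"] + 1
    mask = rng.random(grads.shape) < sample_rate   # keep ~sample_rate of entries
    g = np.where(mask, grads, 0.0)                 # unsampled entries are dropped
    m = betas[0] * m + (1 - betas[0]) * g          # first-moment estimate
    v = betas[1] * v + (1 - betas[1]) * g**2       # second-moment estimate
    m_hat = m / (1 - betas[0]**t)                  # bias correction
    v_hat = v / (1 - betas[1]**t)
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, {"m": m, "v": v, "t": t}
```

For example, iterating this step on the quadratic loss f(x) = ||x||^2 (gradient 2x) still drives the parameters toward zero despite each step seeing only a sampled portion of the gradient.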
Pages: 10