Robust Neural Pruning with Gradient Sampling Optimization for Residual Neural Networks

Cited by: 0
Authors
Yun, Juyoung [1 ]
Affiliation
[1] SUNY Stony Brook, Dept Comp Sci, Stony Brook, NY 11794 USA
Source
2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 | 2024
Keywords
Neural Networks; Optimization; Neural Pruning
DOI
10.1109/IJCNN60899.2024.10650301
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This research pioneers the integration of gradient sampling optimization techniques, particularly StochGradAdam, into the pruning process of neural networks. Our main objective is to address the significant challenge of maintaining accuracy in pruned neural models, which is critical in resource-constrained scenarios. Through extensive experimentation, we demonstrate that gradient sampling preserves accuracy during and after pruning significantly better than traditional optimization methods. Our study highlights the pivotal role of gradient sampling in robust learning and in retaining crucial information after substantial model simplification. The results on the CIFAR-10 dataset and residual neural architectures validate the versatility and effectiveness of our approach. This work presents a promising direction for developing efficient neural networks without compromising performance, even in environments with limited computational resources.
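For a concrete picture of the two ingredients the abstract combines, the sketch below pairs an Adam-style optimizer that applies only a randomly sampled subset of gradient coordinates each step with one-shot magnitude pruning and masked fine-tuning. This is a minimal NumPy illustration under assumed details: the class name GradientSamplingAdam, the Bernoulli mask with a fixed sample_rate, and the toy regression problem are placeholders for illustration only, not the paper's actual StochGradAdam implementation or its experimental setup.

```python
import numpy as np

class GradientSamplingAdam:
    """Adam-style update where only a randomly sampled fraction of
    gradient entries feeds the moment estimates each step (assumed
    Bernoulli masking; the paper's sampling scheme may differ)."""
    def __init__(self, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, sample_rate=0.5):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.sample_rate = sample_rate  # fraction of gradient entries kept per step
        self.m = self.v = None
        self.t = 0

    def step(self, params, grads, rng):
        if self.m is None:
            self.m, self.v = np.zeros_like(params), np.zeros_like(params)
        self.t += 1
        # Randomly sample a subset of gradient coordinates; the rest are zeroed.
        mask = rng.random(grads.shape) < self.sample_rate
        g = np.where(mask, grads, 0.0)
        self.m = self.beta1 * self.m + (1 - self.beta1) * g
        self.v = self.beta2 * self.v + (1 - self.beta2) * g * g
        m_hat = self.m / (1 - self.beta1 ** self.t)  # bias-corrected first moment
        v_hat = self.v / (1 - self.beta2 ** self.t)  # bias-corrected second moment
        return params - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

def magnitude_prune(params, sparsity):
    """Zero out the smallest-magnitude weights (unstructured pruning)."""
    threshold = np.quantile(np.abs(params), sparsity)
    return np.where(np.abs(params) >= threshold, params, 0.0)

# Toy usage: fit w on a linear regression task, prune 50% of weights,
# then fine-tune the surviving weights with sampled gradients.
rng = np.random.default_rng(0)
X = rng.standard_normal((256, 32))
w_true = rng.standard_normal(32)
y = X @ w_true
w = np.zeros(32)
opt = GradientSamplingAdam(lr=0.05, sample_rate=0.5)
for step in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
    w = opt.step(w, grad, rng)
w = magnitude_prune(w, sparsity=0.5)
prune_mask = w != 0
opt_ft = GradientSamplingAdam(lr=0.01, sample_rate=0.5)  # fresh moments for fine-tuning
for step in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w = opt_ft.step(w, grad * prune_mask, rng) * prune_mask  # keep pruned weights at zero
print("final MSE:", np.mean((X @ w - y) ** 2), "| nonzero weights:", int(prune_mask.sum()))
```

One design point worth noting in this sketch: pruned coordinates are re-masked after every fine-tuning update, since stale nonzero Adam moments would otherwise push them away from zero.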
Pages: 10
Related Papers
50 records in total
  • [1] Robust design optimization with mathematical programming neural networks
    Gupta, KC
    Li, JM
    COMPUTERS & STRUCTURES, 2000, 76 (04) : 507 - 516
  • [2] Adaptive Stochastic Conjugate Gradient Optimization for Backpropagation Neural Networks
    Hashem, Ibrahim Abaker Targio
    Alaba, Fadele Ayotunde
    Jumare, Muhammad Haruna
    Ibrahim, Ashraf Osman
    Abulfaraj, Anas Waleed
    IEEE ACCESS, 2024, 12 : 33757 - 33768
  • [3] Structure optimization strategy of Neural Networks - Research on pruning algorithm
    Zhang, M
    Xu, YM
    PROCEEDINGS OF THE 3RD WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-5, 2000, : 877 - 881
  • [4] A pruning method for neural networks and its application for optimization in electromagnetics
    Guimaraes, FG
    Ramírez, JA
    IEEE TRANSACTIONS ON MAGNETICS, 2004, 40 (02) : 1160 - 1163
  • [5] Structure Optimization of Artificial Neural Networks Using Pruning Methods
    Ciganek, Jan
    Osusky, Jakub
    2018 CYBERNETICS & INFORMATICS (K&I), 2018
  • [6] Strengthening Gradient Descent by Sequential Motion Optimization for Deep Neural Networks
    Le-Duc, Thang
    Nguyen, Quoc-Hung
    Lee, Jaehong
    Nguyen-Xuan, H.
    IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2023, 27 (03) : 565 - 579
  • [7] Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-based optimization to spiking neural networks
    Neftci, Emre O.
    Mostafa, Hesham
    Zenke, Friedemann
    IEEE SIGNAL PROCESSING MAGAZINE, 2019, 36 (06) : 51 - 63
  • [8] Methods for Pruning Deep Neural Networks
    Vadera, Sunil
    Ameen, Salem
    IEEE ACCESS, 2022, 10 : 63280 - 63300
  • [9] GROWING AND PRUNING NEURAL TREE NETWORKS
    SANKAR, A
    MAMMONE, RJ
    IEEE TRANSACTIONS ON COMPUTERS, 1993, 42 (03) : 291 - 299
  • [10] PRUNING ARTIFICIAL NEURAL NETWORKS USING NEURAL COMPLEXITY MEASURES
    Jorgensen, Thomas D.
    Haynes, Barry P.
    Norlund, Charlotte C. F.
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2008, 18 (05) : 389 - 403