Robust Neural Pruning with Gradient Sampling Optimization for Residual Neural Networks

Cited by: 0
Authors
Yun, Juyoung [1 ]
Affiliation
[1] SUNY Stony Brook, Dept Comp Sci, New York, NY 11794 USA
Source
2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 | 2024
Keywords
Neural Networks; Optimization; Neural Pruning;
DOI
10.1109/IJCNN60899.2024.10650301
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
This research pioneers the integration of gradient sampling optimization techniques, particularly StochGradAdam, into the pruning process of neural networks. Our main objective is to address the significant challenge of maintaining accuracy in pruned neural models, which is critical in resource-constrained scenarios. Through extensive experimentation, we demonstrate that gradient sampling preserves accuracy during and after pruning significantly better than traditional optimization methods. Our study highlights the pivotal role of gradient sampling in robust learning and in retaining crucial information after substantial model simplification. The results on the CIFAR-10 dataset with residual neural architectures validate the versatility and effectiveness of our approach. This work presents a promising direction for developing efficient neural networks without compromising performance, even in environments with limited computational resources.
Pages: 10
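The sketch below illustrates, under stated assumptions, the general idea described in the abstract: an Adam-style update that applies a random sampling mask to the gradients at each step, followed by simple global magnitude pruning. The toy model, the sample_rate and ratio values, and the helper names sampled_adam_step and magnitude_prune are illustrative assumptions for this sketch, not the paper's StochGradAdam implementation or its exact pruning schedule.

```python
# Minimal sketch (assumed PyTorch setup): gradient-sampling Adam-style update
# plus global magnitude pruning. Hyperparameters and helpers are illustrative,
# not the paper's exact method.
import torch
import torch.nn as nn


def sampled_adam_step(params, exp_avgs, exp_avg_sqs, step, lr=1e-3,
                      betas=(0.9, 0.999), eps=1e-8, sample_rate=0.8):
    """One Adam-like update that keeps only a random subset of gradient entries."""
    beta1, beta2 = betas
    for p, m, v in zip(params, exp_avgs, exp_avg_sqs):
        if p.grad is None:
            continue
        # Randomly zero out a fraction of gradient entries (gradient sampling).
        mask = (torch.rand_like(p.grad) < sample_rate).float()
        g = p.grad * mask
        # Standard Adam moment updates computed on the sampled gradient.
        m.mul_(beta1).add_(g, alpha=1 - beta1)
        v.mul_(beta2).addcmul_(g, g, value=1 - beta2)
        m_hat = m / (1 - beta1 ** step)
        v_hat = v / (1 - beta2 ** step)
        p.data.addcdiv_(m_hat, v_hat.sqrt().add_(eps), value=-lr)


def magnitude_prune(model, ratio=0.5):
    """Zero out the smallest-magnitude weights globally and return binary masks."""
    weights = torch.cat([p.detach().abs().flatten()
                         for p in model.parameters() if p.dim() > 1])
    threshold = torch.quantile(weights, ratio)
    masks = {}
    for name, p in model.named_parameters():
        if p.dim() > 1:
            masks[name] = (p.detach().abs() > threshold).float()
            p.data.mul_(masks[name])
    return masks


if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy stand-in for a residual network, used only to exercise the two helpers.
    model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
    params = list(model.parameters())
    exp_avgs = [torch.zeros_like(p) for p in params]
    exp_avg_sqs = [torch.zeros_like(p) for p in params]

    x, y = torch.randn(128, 32), torch.randint(0, 10, (128,))
    loss_fn = nn.CrossEntropyLoss()
    for step in range(1, 51):
        model.zero_grad()
        loss_fn(model(x), y).backward()
        sampled_adam_step(params, exp_avgs, exp_avg_sqs, step)

    masks = magnitude_prune(model, ratio=0.5)
    kept = sum(m.sum().item() for m in masks.values())
    total = sum(m.numel() for m in masks.values())
    print(f"weights kept after pruning: {kept:.0f}/{total}")
```

In this sketch, gradient sampling simply drops a random fraction of gradient entries before the moment updates, and pruning is a one-shot global magnitude cut; the paper may schedule sampling and pruning differently, e.g., with iterative pruning and masked fine-tuning.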