Robust Neural Pruning with Gradient Sampling Optimization for Residual Neural Networks

Cited: 0
Authors
Yun, Juyoung [1 ]
Affiliations
[1] SUNY Stony Brook, Dept Comp Sci, New York, NY 11794 USA
Source
2024 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN 2024 | 2024
Keywords
Neural Networks; Optimization; Neural Pruning
DOI
10.1109/IJCNN60899.2024.10650301
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
This research pioneers the integration of gradient sampling optimization techniques, particularly StochGradAdam, into the neural network pruning process. Our main objective is to address the significant challenge of maintaining accuracy in pruned neural models, which is critical in resource-constrained scenarios. Through extensive experimentation, we demonstrate that gradient sampling preserves accuracy during and after pruning significantly better than traditional optimization methods. Our study highlights the pivotal role of gradient sampling in robust learning and in retaining crucial information after substantial model simplification. Results on the CIFAR-10 dataset with residual neural architectures validate the versatility and effectiveness of our approach. This work presents a promising direction for developing efficient neural networks without compromising performance, even in environments with limited computational resources.
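The abstract combines two ingredients: an Adam-style optimizer that updates using only a sampled subset of gradient entries, and pruning of the trained network. A minimal sketch of that combination is below. This is an illustrative reconstruction, not the paper's exact StochGradAdam algorithm: the per-coordinate `sample_rate` masking, the magnitude-pruning criterion, the toy quadratic objective, and all helper names are assumptions introduced here for demonstration.

```python
import numpy as np

def stochgradadam_step(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999,
                       eps=1e-8, sample_rate=0.5, rng=None):
    """Adam-style update applied to a random subset of gradient entries.
    Illustrative only: the real StochGradAdam may sample differently."""
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(grad.shape) < sample_rate  # coordinates sampled this step
    g = np.where(mask, grad, 0.0)                # unsampled entries contribute nothing
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * g
    state["v"] = beta2 * state["v"] + (1 - beta2) * g * g
    m_hat = state["m"] / (1 - beta1 ** state["t"])  # bias-corrected first moment
    v_hat = state["v"] / (1 - beta2 ** state["t"])  # bias-corrected second moment
    return param - lr * m_hat / (np.sqrt(v_hat) + eps)

def magnitude_prune(param, sparsity=0.5):
    """Standard magnitude pruning: zero out the smallest-magnitude weights."""
    k = int(sparsity * param.size)
    if k == 0:
        return param.copy()
    threshold = np.partition(np.abs(param).ravel(), k - 1)[k - 1]
    return np.where(np.abs(param) <= threshold, 0.0, param)

# Toy usage: minimize ||w||^2 (gradient 2w) with sampled updates, then prune.
rng = np.random.default_rng(0)
w = np.array([0.5, -1.0, 2.0, -0.1])
state = {"t": 0, "m": np.zeros_like(w), "v": np.zeros_like(w)}
for _ in range(200):
    w = stochgradadam_step(w, 2 * w, state, lr=0.05, sample_rate=0.5, rng=rng)
w_pruned = magnitude_prune(w, sparsity=0.5)  # half the weights set to zero
```

In this sketch the sampling mask is redrawn every step, so each coordinate still receives updates on average; the paper's claim is that such sampled updates lose less accuracy when the model is subsequently pruned.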
Pages: 10