CHANNEL PRUNING VIA GRADIENT OF MUTUAL INFORMATION FOR LIGHTWEIGHT CONVOLUTIONAL NEURAL NETWORKS

Times Cited: 0
Authors
Lee, Min Kyu [1 ]
Lee, Seunghyun [1 ]
Lee, Sang Hyuk [1 ]
Song, Byung Cheol [1 ]
Affiliations
[1] Inha Univ, Dept Elect Engn, Incheon, South Korea
Source
2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP) | 2020
Keywords
convolutional neural network; pruning; model compression; mutual information;
DOI
Not available
Chinese Library Classification
TB8 [Photographic Technology];
Discipline Classification Code
0804;
Abstract
Channel pruning is highly effective for light-weighting networks, reducing both memory footprint and computational cost. Many channel pruning methods assume that the magnitude of a particular element corresponding to each channel reflects that channel's importance. Unfortunately, this assumption does not always hold. To solve this problem, this paper proposes a new method that measures channel importance from the gradients of mutual information. The proposed method attaches a module capable of estimating mutual information and measures the gradients of the estimated mutual information during back-propagation. Using these measured statistics as channel importance, less important channels can be removed, and subsequent fine-tuning robustly restores the performance of the pruned model. Experimental results show that the proposed method achieves better performance with fewer parameters and FLOPs than conventional schemes.
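The selection step described in the abstract can be sketched in a few lines. This is a minimal illustration, assuming the per-channel magnitudes of the mutual-information gradient have already been accumulated during back-propagation by the attached MI-estimation module; the array shapes and function names are hypothetical, not taken from the paper:

```python
import numpy as np

def channel_importance(mi_grad_stats):
    """Average absolute MI-gradient per channel.

    mi_grad_stats: array of shape (num_batches, num_channels) holding the
    per-batch magnitudes of d(mutual information)/d(channel activation),
    assumed to have been recorded during back-propagation.
    """
    return np.abs(mi_grad_stats).mean(axis=0)

def prune_mask(importance, prune_ratio):
    """Boolean keep-mask that drops the lowest-importance channels."""
    num_pruned = int(len(importance) * prune_ratio)
    drop = np.argsort(importance)[:num_pruned]  # least important first
    mask = np.ones(len(importance), dtype=bool)
    mask[drop] = False
    return mask

# Toy statistics for a 4-channel layer over 3 mini-batches.
stats = np.array([[0.10, 0.90, 0.20, 0.80],
                  [0.12, 0.85, 0.18, 0.75],
                  [0.08, 0.95, 0.22, 0.85]])
mask = prune_mask(channel_importance(stats), prune_ratio=0.5)
print(mask)  # channels 0 and 2 are pruned; channels 1 and 3 are kept
```

In a full implementation the surviving channels (where the mask is True) would be copied into a slimmer layer, after which the pruned model is fine-tuned as the abstract describes.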
Pages: 1751 - 1755
Number of pages: 5
Related Papers
(50 records total)
  • [21] Recursive least squares method for training and pruning convolutional neural networks
    Yu, Tianzong
    Zhang, Chunyuan
    Ma, Meng
    Wang, Yuan
    APPLIED INTELLIGENCE, 2023, 53 (20) : 24603 - 24618
  • [23] Convolutional Neural Network Compression via Dynamic Parameter Rank Pruning
    Sharma, Manish
    Heard, Jamison
    Saber, Eli
    Markopoulos, Panagiotis
    IEEE ACCESS, 2025, 13 : 18441 - 18456
  • [24] Accelerating Convolutional Neural Network Pruning via Spatial Aura Entropy
    Musat, Bogdan
    Andonie, Razvan
    2023 27TH INTERNATIONAL CONFERENCE INFORMATION VISUALISATION, IV, 2023, : 286 - 291
  • [25] Gradual Channel Pruning While Training Using Feature Relevance Scores for Convolutional Neural Networks
    Aketi, Sai Aparna
    Roy, Sourjya
    Raghunathan, Anand
    Roy, Kaushik
    IEEE ACCESS, 2020, 8 : 171924 - 171932
  • [26] Structured Pruning for Deep Convolutional Neural Networks: A Survey
    He, Yang
    Xiao, Lingao
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (05) : 2900 - 2919
  • [27] Metaheuristics for pruning convolutional neural networks: A comparative study
    Palakonda, Vikas
    Tursunboev, Jamshid
    Kang, Jae-Mo
    Moon, Sunghwan
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 268
  • [28] Review of Lightweight Deep Convolutional Neural Networks
    Chen, Fanghui
    Li, Shouliang
    Han, Jiale
    Ren, Fengyuan
    Yang, Zhen
    ARCHIVES OF COMPUTATIONAL METHODS IN ENGINEERING, 2024, 31 (04) : 1915 - 1937
  • [29] CNNPruner: Pruning Convolutional Neural Networks with Visual Analytics
    Li, Guan
    Wang, Junpeng
    Shen, Han-Wei
    Chen, Kaixin
    Shan, Guihua
    Lu, Zhonghua
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2021, 27 (02) : 1364 - 1373
  • [30] Pruning convolutional neural networks for inductive conformal prediction
    Zhao, Xindi
    Farjudian, Amin
    Bellotti, Anthony
    NEUROCOMPUTING, 2025, 611