Filter Pruning by Switching to Neighboring CNNs With Good Attributes

Cited by: 43
Authors
He, Yang [1 ,2 ]
Liu, Ping [1 ,2 ]
Zhu, Linchao [1 ]
Yang, Yi [3 ]
Affiliations
[1] Univ Technol Sydney, Australian Artificial Intelligence Inst, ReLER Lab, Sydney, NSW 2007, Australia
[2] ASTAR, Ctr Frontier AI Res CFAR, Singapore 138632, Singapore
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310000, Peoples R China
Funding
Australian Research Council;
Keywords
Neural networks; Training; Training data; Neurons; Libraries; Graphics processing units; Electronic mail; Filter pruning; meta-attributes; network compression; neural networks; CLASSIFICATION; ACCURACY;
DOI
10.1109/TNNLS.2022.3149332
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Filter pruning is an effective way to reduce the computational cost of neural networks. Existing methods show that updating previously pruned filters enables larger model capacity and better performance. However, during the iterative pruning process, the pruning criterion stays fixed even though the network weights are updated to new values. In addition, when evaluating filter importance, only the magnitude information of the filters is considered. Yet filters in a neural network do not work in isolation; they affect one another. As a result, the magnitude of each filter, which reflects only that individual filter, is insufficient for judging its importance. To solve these problems, we propose meta-attribute-based filter pruning (MFP). First, to expand the existing magnitude-based pruning criteria, we introduce a new set of criteria that consider the geometric distance between filters. Furthermore, to explicitly assess the current state of the network, we adaptively select the most suitable pruning criterion via a meta-attribute, a property of the neural network at its current state. Experiments on two image classification benchmarks validate our method. For ResNet-50 on ILSVRC-2012, we reduce more than 50% of the FLOPs with only a 0.44% top-5 accuracy loss.
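For illustration, below is a minimal PyTorch sketch (not the authors' released implementation) of the two families of importance criteria the abstract contrasts: a magnitude score (per-filter L2 norm) and a geometric-distance score (sum of pairwise distances from a filter to all other filters in the same layer). The function names, tensor shapes, and pruning ratio are assumptions made for this example.

```python
import torch

def magnitude_scores(conv_weight: torch.Tensor) -> torch.Tensor:
    """Magnitude-based criterion: L2 norm of each filter."""
    # conv_weight: (out_channels, in_channels, kH, kW)
    return conv_weight.flatten(1).norm(p=2, dim=1)

def distance_scores(conv_weight: torch.Tensor) -> torch.Tensor:
    """Geometric-distance criterion: sum of pairwise distances from each
    filter to all other filters in the layer. Filters close to the others
    carry more redundant information and score lower."""
    flat = conv_weight.flatten(1)            # (N, d)
    pairwise = torch.cdist(flat, flat, p=2)  # (N, N) pairwise distances
    return pairwise.sum(dim=1)

def select_filters_to_prune(conv_weight: torch.Tensor,
                            ratio: float = 0.3,
                            use_distance: bool = False) -> torch.Tensor:
    """Return indices of the lowest-scoring filters under the chosen criterion."""
    scores = distance_scores(conv_weight) if use_distance else magnitude_scores(conv_weight)
    num_prune = int(ratio * conv_weight.size(0))
    return torch.argsort(scores)[:num_prune]

# Example: score the filters of a hypothetical 3x3 conv layer with 64 filters.
w = torch.randn(64, 32, 3, 3)
pruned_idx = select_filters_to_prune(w, ratio=0.3, use_distance=True)
```

In MFP, a meta-attribute describing the current state of the network would decide which of these criteria to apply at each pruning step; the sketch above only shows the two per-layer scores themselves.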
Pages: 8044-8056
Page count: 13
Related Papers
50 records in total
  • [1] FPWT: Filter pruning via wavelet transform for CNNs
    Liu, Yajun
    Fan, Kefeng
    Zhou, Wenju
    NEURAL NETWORKS, 2024, 179
  • [2] Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks
    He, Yang
    Dong, Xuanyi
    Kang, Guoliang
    Fu, Yanwei
    Yan, Chenggang
    Yang, Yi
    IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (08) : 3594 - 3604
  • [3] A Simple and Effective Convolutional Filter Pruning based on Filter Dissimilarity Analysis
    Erick, F. X.
    Sawant, Shrutika S.
    Goeb, Stephan
    Holzer, N.
    Lang, E. W.
    Goetz, Th
    ICAART: PROCEEDINGS OF THE 14TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE - VOL 3, 2022, : 139 - 145
  • [4] Cluster Pruning: An Efficient Filter Pruning Method for Edge AI Vision Applications
    Gamanayake, Chinthaka
    Jayasinghe, Lahiru
    Ng, Benny Kai Kiat
    Yuen, Chau
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2020, 14 (04) : 802 - 816
  • [5] Manipulating Identical Filter Redundancy for Efficient Pruning on Deep and Complicated CNN
    Hao, Tianxiang
    Ding, Xiaohan
    Han, Jungong
    Guo, Yuchen
    Ding, Guiguang
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (11) : 16831 - 16844
  • [6] FALF ConvNets: Fatuous auxiliary loss based filter-pruning for efficient deep CNNs
    Singh, Pravendra
    Kadi, Vinay Sameer Raja
    Namboodiri, Vinay P.
    IMAGE AND VISION COMPUTING, 2020, 93
  • [7] Towards efficient filter pruning via topology
    Xu, Xiaozhou
    Chen, Jun
    Su, Hongye
    Xie, Lei
    JOURNAL OF REAL-TIME IMAGE PROCESSING, 2022, 19 (03) : 639 - 649
  • [8] Pruning-and-distillation: One-stage joint compression framework for CNNs via clustering
    Niu, Tao
    Teng, Yinglei
    Jin, Lei
    Zou, Panpan
    Liu, Yiding
    IMAGE AND VISION COMPUTING, 2023, 136
  • [9] Soft and Hard Filter Pruning via Dimension Reduction
    Cai, Linhang
    An, Zhulin
    Yang, Chuanguang
    Xu, Yongjun
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,