Filter Pruning by Switching to Neighboring CNNs With Good Attributes

Cited by: 43
Authors
He, Yang [1 ,2 ]
Liu, Ping [1 ,2 ]
Zhu, Linchao [1 ]
Yang, Yi [3 ]
Affiliations
[1] Univ Technol Sydney, Australian Artificial Intelligence Inst, ReLER Lab, Sydney, NSW 2007, Australia
[2] ASTAR, Ctr Frontier AI Res CFAR, Singapore 138632, Singapore
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310000, Peoples R China
Funding
Australian Research Council;
Keywords
Neural networks; Training; Training data; Neurons; Libraries; Graphics processing units; Electronic mail; Filter pruning; meta-attributes; network compression; neural networks; CLASSIFICATION; ACCURACY;
DOI
10.1109/TNNLS.2022.3149332
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Filter pruning is effective at reducing the computational cost of neural networks. Existing methods show that updating previously pruned filters enables larger model capacity and better performance. However, during the iterative pruning process, the pruning criterion remains fixed even as the network weights are updated to new values. In addition, when evaluating filter importance, only the magnitude information of the filters is considered. Yet in neural networks, filters do not work in isolation; they interact with other filters. As a result, the magnitude of each filter, which reflects only the information of that individual filter, is not enough to judge its importance. To solve these problems, we propose meta-attribute-based filter pruning (MFP). First, to expand the existing magnitude-based pruning criteria, we introduce a new set of criteria that consider the geometric distance between filters. In addition, to explicitly assess the current state of the network, we adaptively select the most suitable pruning criterion via a meta-attribute, a property of the neural network at its current state. Experiments on two image classification benchmarks validate our method. For ResNet-50 on ILSVRC-2012, we reduce more than 50% of FLOPs with only a 0.44% top-5 accuracy loss.
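The two families of criteria named in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: a magnitude criterion scores each filter by its L2 norm, while a geometric-distance criterion scores it by its summed distance to all other filters in the layer (filters close to the rest are redundant). The function names and the 4-D filter-tensor layout are assumptions for the example.

```python
import numpy as np

def norm_importance(filters):
    # Magnitude criterion: L2 norm of each flattened filter.
    # filters has shape (num_filters, in_channels, kH, kW).
    return np.linalg.norm(filters.reshape(filters.shape[0], -1), axis=1)

def distance_importance(filters):
    # Geometric criterion: sum of Euclidean distances from each
    # filter to every other filter; a small sum means the filter
    # lies near the others and contributes little new information.
    flat = filters.reshape(filters.shape[0], -1)
    dists = np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=2)
    return dists.sum(axis=1)

def select_prune(filters, ratio, criterion):
    # Return indices of the least-important filters under the
    # given criterion, pruning a fraction `ratio` of the layer.
    scores = criterion(filters)
    k = int(len(scores) * ratio)
    return np.argsort(scores)[:k]
```

The paper's meta-attribute mechanism would then pick between such criteria adaptively at each pruning step, based on the current state of the network; the sketch above only shows the criteria themselves.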
Pages: 8044-8056 (13 pages)
Related Papers (50 records)
  • [41] MDPruner: Meta-Learning Driven Dynamic Filter Pruning for Efficient Object Detection
    Zhou, Lingyun
    Liu, Xiaoyong
    IEEE ACCESS, 2024, 12 : 136925 - 136935
  • [42] Filter Pruning via Measuring Feature Map Information
    Shao, Linsong
    Zuo, Haorui
    Zhang, Jianlin
    Xu, Zhiyong
    Yao, Jinzhen
    Wang, Zhixing
    Li, Hong
    SENSORS, 2021, 21 (19)
  • [43] Filter Pruning by High-Order Spectral Clustering
    Lin, Hang
    Peng, Yifan
    Zhang, Yubo
    Bie, Lin
    Zhao, Xibin
    Gao, Yue
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2025, 47 (04) : 2402 - 2415
  • [44] Filter pruning - deeper layers need fewer filters
    Wang, Heng
    Ye, Xiang
    Li, Yong
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2022, 42 (03) : 1977 - 1990
  • [45] Filter Pruning Algorithm Based on Deep Reinforcement Learning
    Liu Y.
    Teng Y.
    Niu T.
    Zhi J.
    Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2023, 46 (03): : 31 - 36
  • [46] Gated filter pruning via sample manifold relationships
    Wu, Pingfan
    Huang, Hengyi
    Sun, Han
    Liu, Ningzhong
    APPLIED INTELLIGENCE, 2024, 54 (20) : 9848 - 9863
  • [47] Convolutional Neural Network Pruning Using Filter Attenuation
    Mousa-Pasandi, Morteza
    Hajabdollahi, Mohsen
    Karimi, Nader
    Samavi, Shadrokh
    Shirani, Shahram
    2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020, : 2905 - 2909
  • [48] LDP: A Large Diffuse Filter Pruning to Slim the CNN
    Wei, Wenyue
    Wang, Yizhen
    Li, Yun
    Xia, Yinfeng
    Yin, Baoqun
PROCEEDINGS OF 2022 THE 6TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND SOFT COMPUTING, ICMLSC 2022, 2022, : 26 - 32
  • [49] Progressive Local Filter Pruning for Image Retrieval Acceleration
    Wang, Xiaodong
    Zheng, Zhedong
    He, Yang
    Yan, Fei
    Zeng, Zhiqiang
    Yang, Yi
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 9597 - 9607
  • [50] BFRIFP: Brain Functional Reorganization Inspired Filter Pruning
    Qiu, Shoumeng
    Gu, Yuzhang
    Zhang, Xiaolin
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT IV, 2021, 12894 : 16 - 28