Filter pruning by quantifying feature similarity and entropy of feature maps

Cited by: 12
Authors
Liu, Yajun [1 ]
Fan, Kefeng [2 ]
Wu, Dakui [1 ]
Zhou, Wenju [1 ]
Affiliations
[1] Shanghai Univ, Sch Mechatron Engn & Automat, Shanghai 200444, Peoples R China
[2] China Elect Standardizat Inst, Beijing 100007, Peoples R China
Keywords
Filter pruning; Feature similarity (FSIM); Two-dimensional entropy (2D entropy); Feature maps; MODEL;
DOI
10.1016/j.neucom.2023.126297
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Filter pruning can effectively reduce the time cost and computing resources of convolutional neural networks (CNNs), and is well suited to lightweight edge devices. However, most current pruning methods focus on the inherent properties of the filters themselves and pay less attention to the connection between filters and their feature maps. Feature similarity (FSIM) exploits the fact that the human visual system is more sensitive to the low-level features of an image in order to assess image quality more accurately. We find that FSIM is also suitable for evaluating the feature maps of CNNs. In addition, the information richness of a feature map reflects the importance of the filter that produced it. Based on these observations, we propose to quantify the importance of feature maps with an indicator that combines FSIM and two-dimensional entropy (2D entropy), and to use it to guide filter pruning (FSIM-E). FSIM-E is evaluated on CIFAR-10 and ILSVRC-2012, demonstrating that it can effectively compress and accelerate network models. For example, for ResNet-110 on CIFAR-10, FSIM-E prunes 71.1% of the FLOPs and 66.5% of the parameters while improving accuracy by 0.1%. With ResNet-50 on ILSVRC-2012, FSIM-E achieves a 57.2% pruning rate for FLOPs and a 53.1% pruning rate for parameters with a loss of only 0.42% in Top-5 accuracy. (c) 2023 Elsevier B.V. All rights reserved.
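The record contains only the abstract, not the paper's formulas, but the 2D-entropy half of the criterion is a standard construction: quantize each feature map, pair every pixel's gray level with the mean gray level of its 3x3 neighborhood, and take the entropy of the joint distribution; maps with higher 2D entropy are treated as more informative, and so the filters that produced them as more important. The NumPy sketch below illustrates that scoring step only; the function names, the 64-bin quantization, and the 3x3 neighborhood are assumptions made for illustration, not details taken from the paper, and the FSIM term of the criterion is omitted.

```python
import numpy as np

def two_d_entropy(fm, bins=64):
    """Two-dimensional (pixel level, 3x3 neighborhood mean level) entropy of one
    feature map. Higher values indicate richer spatial information.
    Note: bin count and neighborhood size are illustrative choices, not the paper's."""
    fm = fm.astype(np.float64)
    rng = fm.max() - fm.min()
    if rng == 0:                       # a constant map carries no information
        return 0.0
    gray = ((fm - fm.min()) / rng * (bins - 1)).astype(np.int64)

    # Mean gray level of each 3x3 neighborhood (edge-padded), quantized to the same bins.
    padded = np.pad(gray, 1, mode="edge").astype(np.float64)
    neigh = np.zeros_like(fm)
    for dy in range(3):
        for dx in range(3):
            neigh += padded[dy:dy + fm.shape[0], dx:dx + fm.shape[1]]
    neigh = np.clip((neigh / 9.0).round().astype(np.int64), 0, bins - 1)

    # Joint histogram of (pixel level, neighborhood level) -> probabilities -> entropy.
    hist = np.zeros((bins, bins), dtype=np.float64)
    np.add.at(hist, (gray.ravel(), neigh.ravel()), 1.0)
    p = hist / hist.sum()
    nz = p > 0
    return float(-(p[nz] * np.log2(p[nz])).sum())

def rank_filters_by_entropy(feature_maps):
    """feature_maps: (C, H, W) activations of one layer for one input.
    Returns filter indices ordered from least to most informative, i.e. the
    order in which this criterion alone would prune them, plus the raw scores."""
    scores = np.array([two_d_entropy(fm) for fm in feature_maps])
    return np.argsort(scores), scores

if __name__ == "__main__":
    fmaps = np.random.rand(8, 16, 16)          # stand-in for one layer's feature maps
    order, scores = rank_filters_by_entropy(fmaps)
    print("prune first:", order[:3], "entropy scores:", np.round(scores, 3))
```

In the paper this entropy score is combined with an FSIM-based term before filters are ranked; the sketch keeps only the entropy part to show how feature-map information content can drive the pruning order.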
Pages: 11
Related papers
50 records in total
  • [31] FEATURE MATCHING BASED ON TOP K RANK SIMILARITY
    Jiang, Junjun
    Ma, Qing
    Lu, Tao
    Wang, Zhongyuan
    Ma, Jiayi
    2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2018, : 2316 - 2320
  • [32] Neuromorphic building blocks for adaptable cortical feature maps
    Markan, C. M.
    Gupta, Priti
    VLSI-SOC 2007: PROCEEDINGS OF THE 2007 IFIP WG 10.5 INTERNATIONAL CONFERENCE ON VERY LARGE SCALE INTEGRATION, 2007, : 7 - 12
  • [33] Theory for the alignment of cortical feature maps during development
    Bressloff, Paul C.
    Oster, Andrew M.
    PHYSICAL REVIEW E, 2010, 82 (02):
  • [34] REAF: Remembering Enhancement and Entropy-Based Asymptotic Forgetting for Filter Pruning
    Zhang, Xin
    Xie, Weiying
    Li, Yunsong
    Jiang, Kai
    Fang, Leyuan
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 3912 - 3923
  • [35] A feature-based method for tire pattern similarity detection
    Li Hongling
    Dong Yude
    Ding Heng
    Wang Tao
    Wang Jinbiao
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART D-JOURNAL OF AUTOMOBILE ENGINEERING, 2023, 237 (10-11) : 2539 - 2552
  • [36] Person Identification From Video by Similarity Between Feature Set
    Chithra, M.
    Arunkumar, R.
    2015 INTERNATIONAL CONFERENCE ON INNOVATIONS IN INFORMATION, EMBEDDED AND COMMUNICATION SYSTEMS (ICIIECS), 2015,
  • [37] Development of Parametric Filter Banks for Sound Feature Extraction
    Cai, Xiangyu
    Ko, Sunwoo
    IEEE ACCESS, 2023, 11 : 109856 - 109867
  • [38] Magnitude and Similarity Based Variable Rate Filter Pruning for Efficient Convolution Neural Networks
    Ghimire, Deepak
    Kim, Seong-Heum
    APPLIED SCIENCES-BASEL, 2023, 13 (01):
  • [39] Integrating feature maps and competitive layer architectures for motion segmentation
    Steffen, Jan
    Pardowitz, Michael
    Steil, Jochen J.
    Ritter, Helge
    NEUROCOMPUTING, 2011, 74 (09) : 1372 - 1381
  • [40] Efficient χ2 Kernel Linearization via Random Feature Maps
    Yuan, Xiao-Tong
    Wang, Zhenzhen
    Deng, Jiankang
    Liu, Qingshan
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2016, 27 (11) : 2448 - 2453