Filter pruning by quantifying feature similarity and entropy of feature maps

Times Cited: 12
Authors
Liu, Yajun [1]
Fan, Kefeng [2]
Wu, Dakui [1]
Zhou, Wenju [1]
Affiliations
[1] Shanghai Univ, Sch Mechatron Engn & Automat, Shanghai 200444, Peoples R China
[2] China Elect Standardizat Inst, Beijing 100007, Peoples R China
Keywords
Filter pruning; Feature similarity (FSIM); Two-dimensional entropy (2D entropy); Feature maps; MODEL
DOI
10.1016/j.neucom.2023.126297
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Filter pruning can effectively reduce the time cost and computing resources of convolutional neural networks (CNNs), and is well suited to lightweight edge devices. However, most current pruning methods focus on the inherent properties of the filters themselves and pay less attention to the connection between the filters and the feature maps. The feature similarity index (FSIM) exploits the fact that the human visual system is more sensitive to the low-level features of images, allowing it to assess image quality more accurately. We discover that FSIM is also suitable for evaluating the feature maps of CNNs. In addition, the information richness of the feature maps reflects the degree of importance of the filters. Based on the above observations, we propose to quantify the importance of feature maps with FSIM and a two-dimensional entropy (2D entropy) indicator to further guide filter pruning (FSIM-E). FSIM-E is evaluated on CIFAR-10 and ILSVRC-2012 to demonstrate that it can effectively compress and accelerate network models. For example, for ResNet-110 on CIFAR-10, FSIM-E prunes 71.1% of the FLOPs and 66.5% of the parameters while improving accuracy by 0.1%. With ResNet-50 on ILSVRC-2012, FSIM-E achieves a 57.2% pruning rate of FLOPs and a 53.1% pruning rate of parameters with a loss of only 0.42% in Top-5 accuracy. (c) 2023 Elsevier B.V. All rights reserved.
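The abstract's entropy criterion can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes the classic definition of 2D image entropy (joint histogram of a pixel's level and its 3x3 neighborhood mean) applied per feature map, and ranks channels so that low-entropy (information-poor) ones become pruning candidates; the FSIM term of the paper's combined indicator is omitted here.

```python
import numpy as np

def two_d_entropy(fmap, bins=16):
    """2D entropy of one feature map: Shannon entropy of the joint
    histogram of (pixel level, 3x3 neighborhood mean level)."""
    f = fmap.astype(np.float64)
    rng = f.max() - f.min()
    if rng == 0:
        return 0.0  # constant map carries no information
    # quantize values into integer levels 0..bins-1
    levels = np.floor((f - f.min()) / rng * (bins - 1)).astype(int)
    # 3x3 neighborhood mean via edge padding
    p = np.pad(levels, 1, mode='edge').astype(np.float64)
    h, w = levels.shape
    nbr = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    nbr = np.floor(nbr).astype(int).clip(0, bins - 1)
    # joint histogram -> joint probability -> entropy
    hist = np.zeros((bins, bins))
    np.add.at(hist, (levels.ravel(), nbr.ravel()), 1)
    p_ij = hist / hist.sum()
    nz = p_ij[p_ij > 0]
    return float(-(nz * np.log2(nz)).sum())

def rank_filters(feature_maps):
    """Order channels by ascending 2D entropy: prune from the front."""
    scores = np.array([two_d_entropy(fm) for fm in feature_maps])
    return np.argsort(scores)

# toy example: three random 8x8 maps plus one constant (zero) map
rng = np.random.default_rng(0)
maps = [rng.random((8, 8)) for _ in range(3)] + [np.zeros((8, 8))]
order = rank_filters(maps)
# the zero map has zero entropy and is the first pruning candidate
```

In the paper this entropy score is combined with an FSIM-based term; here only the entropy half is sketched to keep the example self-contained.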
Pages: 11
Related Papers
50 records
  • [21] Model pruning based on filter similarity for edge device deployment
    Wu, Tingting
    Song, Chunhe
    Zeng, Peng
    FRONTIERS IN NEUROROBOTICS, 2023, 17
  • [22] GLEE: A granularity filter for feature selection
    Ba, Jing
    Wang, Pingxin
    Yang, Xibei
    Yu, Hualong
    Yu, Dongjun
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 122
  • [23] A conjunctive feature similarity effect for visual search
    Takeda, Yuji
    Phillips, Steven
    Kumada, Takatsune
    QUARTERLY JOURNAL OF EXPERIMENTAL PSYCHOLOGY, 2007, 60 (02) : 186 - 190
  • [24] VNGEP: Filter pruning based on von Neumann graph entropy
    Shi, Chaokun
    Hao, Yuexing
    Li, Gongyan
    Xu, Shaoyun
    NEUROCOMPUTING, 2023, 528 : 113 - 124
  • [25] FPC: Feature Map Pruning using Channel Attention Mechanism
    Liu, Yang
    Hu, Jianqiang
    Zhou, Xiaobao
    Wu, Jiaxin
    INTERNATIONAL CONFERENCE ON INTELLIGENT TRAFFIC SYSTEMS AND SMART CITY (ITSSC 2021), 2022, 12165
  • [26] Redundant feature pruning for accelerated inference in deep neural networks
    Ayinde, Babajide O.
    Inanc, Tamer
    Zurada, Jacek M.
    NEURAL NETWORKS, 2019, 118 : 148 - 158
  • [27] NEURAL NETWORK FEATURE MAPS FOR CHINESE PHONEMES
    WU, P
    WARWICK, K
    KOSKA, M
    NEUROCOMPUTING, 1992, 4 (1-2) : 109 - 112
  • [28] Slow feature-based feature fusion methodology for machinery similarity-based prognostics
    Xue, Bin
    Xu, Haoyan
    Huang, Xing
    Xu, Zhongbin
    ISA TRANSACTIONS, 2024, 152 : 96 - 112
  • [29] Similarity Based Filter Pruning for Efficient Super-Resolution Models
    Chu, Chu
    Chen, Li
    Gao, Zhiyong
2020 IEEE INTERNATIONAL SYMPOSIUM ON BROADBAND MULTIMEDIA SYSTEMS AND BROADCASTING (BMSB), 2020
  • [30] Feature Selection Using Neighborhood based Entropy
    Farnaghi-Zadeh, Fatemeh
    Rahmani, Mohsen
    Amiri, Maryam
    JOURNAL OF UNIVERSAL COMPUTER SCIENCE, 2022, 28 (11) : 1169 - 1192