Channel Pruning of Transfer Learning Models Using Novel Techniques

Cited: 0
Authors
Thaker, Pragnesh [1 ]
Mohan, Biju R. [1 ]
Affiliations
[1] Natl Inst Technol Karnataka, Srinivasnagar, Surathkal 575025, Karnataka, India
Source
IEEE ACCESS | 2024 / Vol. 12
Keywords
Computational modeling; Convolutional neural networks; Kernel; Clustering algorithms; Accuracy; Transfer learning; Filtering algorithms; Deep compression of CNN; channel pruning; structured pruning; neural network compression; transfer learning
DOI
10.1109/ACCESS.2024.3416997
CLC Classification
TP [Automation and Computer Technology]
Subject Classification
0812
Abstract
This paper addresses the challenges of deploying deep learning models, with a particular focus on transfer learning. Widely used models such as VGGNet, ResNet, and GoogLeNet are effective, but their high memory bandwidth and computational costs impede deployment on resource-constrained devices; the study proposes pruning to overcome these limitations. Because the numerous parameters in fully connected layers contribute little to computational cost, the work concentrates on pruning convolution layers. Three pruning methods are explored and evaluated: the Max3 Saliency pruning method, the K-Means clustering algorithm, and the Singular Value Decomposition (SVD) approach. The Max3 Saliency method introduces a slight variation by computing each kernel's saliency score from its three maximum values instead of all nine. It is the most effective of the three, substantially reducing parameters and Floating Point Operations (FLOPs) for both VGG16 and ResNet56: VGG16 shows a notable 46.19% reduction in parameters and a 61.91% reduction in FLOPs, while ResNet56 shows a 35.15% reduction in both parameters and FLOPs. The K-Means pruning algorithm is also successful, yielding a 40.00% parameter reduction and a 49.20% FLOPs reduction for VGG16, and a 31.01% reduction in both parameters and FLOPs for ResNet56. While the SVD approach provides a new set of values for the condensed channels, its overall pruning ratio is smaller than those of the Max3 Saliency and K-Means methods: it achieves a 20.07% parameter reduction and a 24.64% FLOPs reduction for VGG16, along with a 16.94% reduction in both parameters and FLOPs for ResNet56.
Compared with state-of-the-art methods, the Max3 Saliency and K-Means pruning methods perform better on FLOPs-reduction metrics.
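The Max3 Saliency idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes 3x3 convolution kernels stored as a `(out_channels, in_channels, 3, 3)` array, and the function names (`max3_saliency`, `channels_to_prune`) are hypothetical.

```python
import numpy as np

def max3_saliency(conv_weight):
    """Per-output-channel saliency: for each 3x3 kernel, sum the three
    largest absolute values (instead of all nine), then aggregate over
    input channels. conv_weight has shape (out_ch, in_ch, 3, 3)."""
    out_ch, in_ch = conv_weight.shape[:2]
    flat = np.abs(conv_weight.reshape(out_ch, in_ch, -1))
    top3 = np.sort(flat, axis=-1)[..., -3:]   # three largest per kernel
    return top3.sum(axis=(1, 2))              # one score per output channel

def channels_to_prune(conv_weight, ratio=0.3):
    """Indices of the lowest-saliency output channels at the given ratio."""
    scores = max3_saliency(conv_weight)
    k = int(len(scores) * ratio)
    return np.argsort(scores)[:k]
```

Ranking channels by this score and removing the lowest-scoring fraction is the structured-pruning step; the retained channels would then be fine-tuned to recover accuracy.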
Pages: 94914 - 94925
Page count: 12
Related Papers
50 items
  • [1] Optimizing energy consumption in deep learning models using pruning and quantization techniques
    Al-Alshaikh, Halah A.
    JOURNAL OF INFORMATION & OPTIMIZATION SCIENCES, 2024, 45 (05): : 1453 - 1463
  • [2] Transfer channel pruning for compressing deep domain adaptation models
    Yu, Chaohui
    Wang, Jindong
    Chen, Yiqiang
    Qin, Xin
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2019, 10 (11) : 3129 - 3144
  • [4] Transfer Channel Pruning for Compressing Deep Domain Adaptation Models
    Yu, Chaohui
    Wang, Jindong
    Chen, Yiqiang
    Wu, Zijing
    TRENDS AND APPLICATIONS IN KNOWLEDGE DISCOVERY AND DATA MINING: PAKDD 2019 WORKSHOPS, 2019, 11607 : 257 - 273
  • [5] Eye Disease Detection Using Deep Learning Models with Transfer Learning Techniques
    Vardhan, Kalla Bharath
    Nidhish, Mandava
    Kiran, C. Surya
    Shameem, D. Nahid
    Charan, V. Sai
    Bhavadharini, R. M.
    EAI ENDORSED TRANSACTIONS ON SCALABLE INFORMATION SYSTEMS, 2025, 12 (01):
  • [6] Channel Pruning Method for Signal Modulation Recognition Deep Learning Models
    Chen, Zhuangzhi
    Wang, Zhangwei
    Gao, Xuzhang
    Zhou, Jinchao
    Xu, Dongwei
    Zheng, Shilian
    Xuan, Qi
    Yang, Xiaoniu
    IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING, 2024, 10 (02) : 442 - 453
  • [7] A Novel Approach Using Transfer Learning Architectural Models Based Deep Learning Techniques for Identification and Classification of Malignant Skin Cancer
    Subramanian, Balambigai
    Muthusamy, Suresh
    Thangaraj, Kokilavani
    Panchal, Hitesh
    Kasirajan, Elavarasi
    Marimuthu, Abarna
    Ravi, Abinaya
    WIRELESS PERSONAL COMMUNICATIONS, 2024, 134 (04) : 2183 - 2201
  • [8] Speech Emotion Recognition Using Deep Learning Transfer Models and Explainable Techniques
    Kim, Tae-Wan
    Kwak, Keun-Chang
    APPLIED SCIENCES-BASEL, 2024, 14 (04):
  • [9] Improving Efficiency of Brain Tumor Classification Models Using Pruning Techniques
    Sivakumar, M.
    Padmapriya, S. T.
    CURRENT MEDICAL IMAGING, 2024,
  • [10] Rewarded Meta-Pruning: Meta Learning with Rewards for Channel Pruning
    Shibu, Athul
    Kumar, Abhishek
    Jung, Heechul
    Lee, Dong-Gyu
    Yang, Xinsong
    MATHEMATICS, 2023, 11 (23)