DEEP LEARNING BASED METHOD FOR PRUNING DEEP NEURAL NETWORKS

Cited by: 18
Authors
Li, Lianqiang [1 ]
Zhu, Jie [1 ]
Sun, Ming-Ting [2 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Elect Engn, Shanghai, Peoples R China
[2] Univ Washington, Dept Elect & Comp Engn, Seattle, WA 98195 USA
Source
2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA & EXPO WORKSHOPS (ICMEW) | 2019
Funding
National Natural Science Foundation of China;
Keywords
Network pruning; filter-level; deep learning;
DOI
10.1109/ICMEW.2019.00-68
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
In order to deploy Deep Neural Networks (DNNs) on mobile devices, network pruning has been widely explored to reduce the complexity of DNNs in terms of computational cost and parameter-storage load. In this paper, we propose a novel filter-level pruning method that utilizes a deep learning method to obtain compact DNNs. Specifically, we first use a DNN model to extract features from the filters. Then, we employ a clustering algorithm to group the extracted features into different clusters. By mapping the clustering results back to the filters, we obtain the "similarity" relationships among the filters. Finally, we keep the filter closest to the centroid in each cluster, prune the others, and retrain the pruned DNN model. Compared with previous methods that apply heuristics to the filters directly or manually select shallow features from the filters, our method takes advantage of deep learning, which can represent the raw filters more precisely. Experimental results show that our method outperforms several state-of-the-art pruning methods with negligible accuracy loss.
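The pipeline described in the abstract (extract a feature vector per filter, cluster the vectors, keep the filter nearest each cluster centroid, prune the rest, retrain) can be illustrated with a minimal sketch. The paper's actual feature extractor is a learned DNN; as a stand-in assumption, the sketch below uses flattened filter weights as per-filter features and k-means for clustering, so all function names and parameters are illustrative rather than the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans


def select_filters_to_keep(filter_features, n_clusters):
    """Cluster per-filter feature vectors and return, for each cluster,
    the index of the filter whose feature vector is closest to the centroid."""
    kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    labels = kmeans.fit_predict(filter_features)
    keep = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        dists = np.linalg.norm(
            filter_features[members] - kmeans.cluster_centers_[c], axis=1)
        keep.append(members[np.argmin(dists)])  # filter nearest the centroid
    return sorted(keep)


# Example: a conv layer with 64 filters of shape (in_channels=16, 3, 3).
# Flattened filter weights stand in for the paper's learned deep features.
filters = np.random.randn(64, 16, 3, 3)
features = filters.reshape(64, -1)              # hypothetical feature extractor
kept = select_filters_to_keep(features, n_clusters=32)
print(f"keeping {len(kept)} of 64 filters:", kept)
# The remaining filters would be removed and the slimmer network retrained.
```

Keeping only the centroid-nearest filter per cluster halves this layer in the example; in practice the number of clusters per layer sets the pruning ratio.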
Pages: 312-317
Number of pages: 6