ON THE ROLE OF STRUCTURED PRUNING FOR NEURAL NETWORK COMPRESSION

Cited by: 6
Authors
Bragagnolo, Andrea [1 ,2 ]
Tartaglione, Enzo [1 ]
Fiandrotti, Attilio [1 ]
Grangetto, Marco [1 ]
Affiliations
[1] Univ Turin, Comp Sci Dept, I-10149 Turin, Italy
[2] Synesthesia Srl, Turin, TO, Italy
Source
2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP) | 2021
关键词
Pruning; Deep learning; Compression; MPEG-7;
DOI
10.1109/ICIP42928.2021.9506708
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
This work explores the benefits of structured parameter pruning in the framework of the MPEG standardization efforts for neural network compression. First, less relevant parameters are pruned from the network; the remaining parameters are then quantized, and finally the quantized parameters are entropy coded. We consider an unstructured pruning strategy that maximizes the number of pruned parameters at the price of randomly sparse tensors, and a structured strategy that prunes fewer parameters yet yields regularly sparse tensors. We show that structured pruning enables better end-to-end compression despite a lower pruning ratio, because it boosts the efficiency of the arithmetic coder. As a bonus, once decompressed, the network has a lower memory footprint as well as a shorter inference time.
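The contrast between the two pruning strategies in the abstract can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's method: the function names are made up, unstructured pruning is approximated by a global magnitude threshold, and structured pruning by removing whole output rows with the smallest L2 norm.

```python
import numpy as np

def unstructured_prune(w, ratio):
    """Zero the smallest-magnitude individual weights (randomly sparse tensor)."""
    k = int(w.size * ratio)
    if k == 0:
        return w.copy()
    thresh = np.sort(np.abs(w), axis=None)[k - 1]
    out = w.copy()
    out[np.abs(out) <= thresh] = 0.0
    return out

def structured_prune(w, ratio):
    """Zero entire output channels (rows) with the smallest L2 norm,
    yielding the regular sparsity an entropy coder can exploit."""
    norms = np.linalg.norm(w, axis=1)
    k = int(len(norms) * ratio)
    out = w.copy()
    if k:
        out[np.argsort(norms)[:k], :] = 0.0
    return out

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 16))           # a toy 8x16 weight matrix
u = unstructured_prune(w, 0.5)
s = structured_prune(w, 0.5)

print((u == 0).mean(), (s == 0).mean())  # similar overall sparsity
# In the structured case every row is either fully zero or fully dense,
# so the zeros can also be dropped from the dense model after decoding:
print(np.all((s == 0).all(axis=1) | (s != 0).all(axis=1)))
```

The regular zero rows are why structured pruning helps twice: the arithmetic coder sees long runs of identical symbols, and after decompression the zeroed channels can simply be removed from the architecture, shrinking memory and inference time.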
Pages: 3527-3531
Page count: 5