Deep Convolutional Neural Network Compression based on the Intrinsic Dimension of the Training Data

Cited by: 0
Authors
Hadi, Abir Mohammad [1 ]
Won, Kwanghee [1 ]
Affiliations
[1] South Dakota State University, Brookings, SD 57007, USA
Source
APPLIED COMPUTING REVIEW | 2024, Vol. 24, No. 1
Keywords
Deep neural network pruning; Intrinsic data dimension; Deep reinforcement learning; Pruning method
DOI
10.1145/3599957.3606236
CLC Classification
TP [Automation technology; computer technology]
Subject Classification Code
0812
Abstract
Selecting the optimal deep learning architecture for a particular task and dataset remains an ongoing challenge. Typically, this decision involves an exhaustive neural architecture search or a multi-phase optimization comprising initial training, compression or pruning, and fine-tuning steps. In this study, we introduce an approach in which a deep reinforcement learning agent dynamically compresses a deep convolutional neural network throughout its training process. We integrate the intrinsic dimension of the training data to give the agent insight into the task's complexity. The agent employs two distinct ranking criteria, an L1-norm-based measure and an attention-based measure, to selectively prune filters from each layer as it deems necessary. In the experiments, we used the CIFAR-10 dataset and its 2-class and 5-class subsets to model task complexity and showed that the agent learns different policies depending on the intrinsic dimension. On average, the agent pruned 78.48%, 77.9%, and 83.12% of the filters across all layers of the VGG-16 network for the full CIFAR-10 dataset and its 5-class and 2-class subsets, respectively.
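As a concrete illustration of two of the ingredients the abstract names, the sketch below estimates the intrinsic dimension of a toy dataset and ranks convolutional filters by the L1 norm of their weights. This is a minimal sketch under stated assumptions, not the authors' implementation: the abstract does not say which intrinsic-dimension estimator the paper uses, so the sketch assumes the common TwoNN estimator (Facco et al., 2017), and the fixed prune_ratio below stands in for the per-layer decision that the paper's reinforcement learning agent makes. All function names are illustrative.

    # Minimal sketch: TwoNN intrinsic-dimension estimate + L1-norm filter ranking.
    # Assumptions: TwoNN as the estimator, a fixed prune_ratio instead of the
    # paper's RL agent; function names are hypothetical.
    import numpy as np
    import torch
    import torch.nn as nn

    def twonn_intrinsic_dimension(x: np.ndarray) -> float:
        """Estimate intrinsic dimension from the ratio of each point's two
        nearest-neighbor distances. x has shape (n_samples, n_features)."""
        d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)          # ignore self-distances
        d.sort(axis=1)                       # row-wise ascending distances
        mu = d[:, 1] / d[:, 0]               # 2nd-NN distance over 1st-NN distance
        mu = mu[np.isfinite(mu) & (mu > 1)]  # guard against duplicate points
        # Maximum-likelihood estimate: id = n / sum(log mu_i).
        return len(mu) / np.sum(np.log(mu))

    def l1_filter_ranking(conv: nn.Conv2d) -> torch.Tensor:
        """Rank output filters of a Conv2d layer by the L1 norm of their
        weights; smaller norm = weaker filter = earlier pruning candidate."""
        with torch.no_grad():
            scores = conv.weight.abs().sum(dim=(1, 2, 3))  # one score per filter
        return torch.argsort(scores)  # ascending: weakest filters first

    if __name__ == "__main__":
        # Toy data lying near a 2-D plane embedded in 10-D space; the
        # estimate should come out close to 2.
        rng = np.random.default_rng(0)
        z = rng.normal(size=(500, 2))
        x = z @ rng.normal(size=(2, 10)) + 0.01 * rng.normal(size=(500, 10))
        print(f"estimated intrinsic dimension: {twonn_intrinsic_dimension(x):.2f}")

        conv = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=3)
        order = l1_filter_ranking(conv)
        prune_ratio = 0.5  # assumed; the paper's agent chooses this per layer
        to_prune = order[: int(prune_ratio * conv.out_channels)]
        print(f"filters selected for pruning: {to_prune.tolist()[:8]} ...")

In the paper, the per-layer pruning ratio is an action chosen by the reinforcement learning agent conditioned on the estimated intrinsic dimension, rather than the fixed constant used here.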
Pages: 14-23
Number of pages: 10