Efficient densely connected convolutional neural networks

Cited by: 96
Authors
Li, Guoqing [1 ]
Zhang, Meng [1 ]
Li, Jiaojie [2 ]
Lv, Feng [2 ]
Tong, Guodong [1 ]
Affiliations
[1] Southeast Univ, Natl ASIC Engn Technol Res Ctr, Sch Elect Sci & Engn, Nanjing 210096, Peoples R China
[2] Southeast Univ, Sch Microelect, Nanjing 210096, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Convolutional neural networks; Classification; Parameter efficiency; Densely connected;
DOI
10.1016/j.patcog.2020.107610
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Recent work has shown that convolutional neural networks (CNNs) carry redundant parameters, which limits their deployment on mobile devices with constrained memory and computational resources. In this paper, two novel and efficient lightweight CNN architectures, called DenseDsc and Dense2Net, are proposed. Both networks are densely connected, and the dense connectivity facilitates feature re-use. Dense2Net adopts efficient group convolution, while DenseDsc adopts the even more efficient depthwise separable convolution. The novel dense blocks of DenseDsc and Dense2Net improve parameter efficiency. The proposed networks are evaluated on highly competitive classification benchmark datasets (CIFAR and ImageNet). The experimental results show that DenseDsc and Dense2Net achieve higher accuracy than DenseNet at similar parameter counts or FLOPs. Among efficient CNNs with fewer than 0.5 M parameters on CIFAR, Dense2Net and DenseDsc achieve state-of-the-art results on CIFAR-10 and CIFAR-100, respectively. Both networks are also highly competitive among efficient CNNs with fewer than 1.0 M parameters on CIFAR. Furthermore, Dense2Net achieves state-of-the-art results on ImageNet among manually designed CNNs with fewer than 10 M parameters. (C) 2020 Elsevier Ltd. All rights reserved.
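To illustrate the general idea summarized in the abstract, the following is a minimal PyTorch sketch of a densely connected block built from depthwise separable convolutions (the DenseDsc-style combination; Dense2Net would use group convolution instead). The layer layout, growth rate, and channel sizes are illustrative assumptions, not the authors' published DenseDsc or Dense2Net configuration.

```python
# Minimal sketch: dense connectivity (feature re-use via channel concatenation)
# combined with depthwise separable convolution. Hyperparameters are assumed
# for illustration only.
import torch
import torch.nn as nn


class DepthwiseSeparableLayer(nn.Module):
    """Depthwise 3x3 conv followed by a pointwise 1x1 conv (assumed layout)."""

    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.depthwise = nn.Conv2d(in_channels, in_channels, kernel_size=3,
                                   padding=1, groups=in_channels, bias=False)
        self.pointwise = nn.Conv2d(in_channels, growth_rate, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(growth_rate)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.relu(self.bn(self.pointwise(self.depthwise(x))))


class DenseBlock(nn.Module):
    """Densely connected block: each layer receives the concatenation of all
    earlier feature maps, so features are re-used rather than re-computed."""

    def __init__(self, num_layers: int, in_channels: int, growth_rate: int):
        super().__init__()
        self.layers = nn.ModuleList(
            DepthwiseSeparableLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)


if __name__ == "__main__":
    block = DenseBlock(num_layers=4, in_channels=16, growth_rate=12)
    y = block(torch.randn(1, 16, 32, 32))
    print(y.shape)  # torch.Size([1, 64, 32, 32]) -> 16 + 4 * 12 channels
```

Swapping the depthwise separable layer for a grouped 3x3 convolution (e.g. `nn.Conv2d(..., groups=4)`) would give a Dense2Net-style variant under the same dense-connectivity scheme.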
Pages: 9
References
43 in total
[1] [Anonymous]. Proceedings of CVPR, 2019.
[2] [Anonymous]. Proceedings of the 29th IEEE Conference on Computer Vision and Pattern Recognition, 2016.
[3] Gao, Shang-Hua; Cheng, Ming-Ming; Zhao, Kai; Zhang, Xin-Yu; Yang, Ming-Hsuan; Torr, Philip. Res2Net: A New Multi-Scale Backbone Architecture. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 43(2): 652-662.
[4] Gu, Jiuxiang; Wang, Zhenhua; Kuen, Jason; Ma, Lianyang; Shahroudy, Amir; Shuai, Bing; Liu, Ting; Wang, Xingxing; Wang, Gang; Cai, Jianfei; Chen, Tsuhan. Recent advances in convolutional neural networks. Pattern Recognition, 2018, 77: 354-377.
[5] Han, Y. 2016 International Conference on Information and Communication Technology Convergence (ICTC), 2016, p. 4. DOI: 10.1109/ICTC.2016.7763421.
[6] He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian. Identity Mappings in Deep Residual Networks. Computer Vision - ECCV 2016, Part IV, 2016, LNCS 9908: 630-645.
[7] Howard, A. G. arXiv preprint, 2017.
[8] Huang, Gao; Liu, Zhuang; van der Maaten, Laurens; Weinberger, Kilian Q. Densely Connected Convolutional Networks. 30th IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017), 2017: 2261-2269.
[9] Hubara, I. Advances in Neural Information Processing Systems, 2016, Vol. 29.
[10] Kim, Dae Ha; Lee, Min Kyu; Lee, Seung Hyun; Song, Byung Cheol. Macro unit-based convolutional neural network for very light-weight deep learning. Image and Vision Computing, 2019, 87: 68-75.