Multipath-DenseNet: A Supervised ensemble architecture of densely connected convolutional networks

Cited by: 24
Authors
Lodhi, Bilal [1]
Kang, Jaewoo [1]
Affiliations
[1] Korea Univ, Dept Comp Sci & Engn, Seoul, South Korea
Funding
National Research Foundation of Singapore
Keywords
Image classification; Neural network; Deep learning
DOI
10.1016/j.ins.2019.01.012
Chinese Library Classification
TP [Automation and computer technology]
Discipline code
0812
Abstract
Deep networks with skip-connections, such as ResNets, have achieved great results in recent years. DenseNet extends the ResNet skip-connection idea by connecting each layer in a convolutional neural network to all preceding layers, and achieves state-of-the-art accuracy. It is well known that deeper networks are more efficient and easier to train than shallow or wider networks. Despite the high performance of very deep networks, they are limited by vanishing gradients, diminishing forward flow, and slower training. In this paper, we propose to combine the benefits of network depth and width. We train supervised, independent shallow networks on the same input in a block fashion, using a state-of-the-art DenseNet block to increase the number of paths for gradient flow. Our proposed architecture, which we call Multipath-DenseNet, has several advantages over other deeper networks including DenseNet: it is both deeper and wider, reduces training time, and uses fewer parameters. We evaluate the proposed architecture on four object recognition datasets: CIFAR-10, CIFAR-100, SVHN, and ImageNet. The results show that Multipath-DenseNet achieves a significant improvement in performance over DenseNet on these benchmarks. (C) 2019 Elsevier Inc. All rights reserved.
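The two ideas the abstract combines can be illustrated with a toy NumPy model: dense connectivity (each layer consumes the concatenation of the input and all preceding layers' outputs) and the multipath scheme (several independent shallow dense blocks applied to the same input, then merged). This is a minimal sketch, not the paper's implementation; the function names, the use of a random linear map in place of a convolution, and the final concatenation-based merge are all illustrative assumptions.

```python
import numpy as np

def dense_block(x, num_layers, growth_rate, rng):
    # DenseNet-style connectivity: each layer receives the
    # concatenation of the block input and every earlier layer's output.
    features = [x]
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=-1)
        # Stand-in for a conv layer: random linear map + ReLU (illustrative).
        w = rng.standard_normal((inp.shape[-1], growth_rate)) * 0.1
        features.append(np.maximum(inp @ w, 0.0))
    return np.concatenate(features, axis=-1)

def multipath_block(x, num_paths, num_layers, growth_rate, seed=0):
    # Multipath idea: independent shallow dense blocks run on the
    # same input in parallel, adding gradient paths; outputs are merged
    # here by concatenation (an assumption for this sketch).
    rng = np.random.default_rng(seed)
    paths = [dense_block(x, num_layers, growth_rate, rng)
             for _ in range(num_paths)]
    return np.concatenate(paths, axis=-1)

x = np.ones((4, 16))  # batch of 4 feature vectors with 16 channels
out = multipath_block(x, num_paths=2, num_layers=3, growth_rate=8)
# Each path emits 16 + 3*8 = 40 channels; two paths give 80.
print(out.shape)  # (4, 80)
```

Note how channel count grows additively with depth inside a block (the "growth rate"), while the paths widen the network without deepening it, which is the depth/width trade-off the abstract describes.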
Pages: 63-72 (10 pages)