AlphaMEX: A smarter global pooling method for convolutional neural networks

Cited by: 31
|
Authors
Zhang, Boxue [1 ]
Zhao, Qi [1 ]
Feng, Wenquan [1 ]
Lyu, Shuchang [1 ]
Affiliations
[1] Beihang Univ, Sch Elect & Informat Engn, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
CNN; Global Pooling; Feature-map sparsity; AlphaMEX; Network compression; OBJECT RECOGNITION; FIRE DETECTION; SURVEILLANCE;
DOI
10.1016/j.neucom.2018.07.079
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Deep convolutional neural networks have achieved great success in image classification, and feature extractors learned from CNNs have been used in many computer vision tasks. The global pooling layer plays a very important role in deep convolutional neural networks. With the increasing use of the Batch Normalization and ReLU layer combination, the input feature maps of global pooling become sparse, which makes the original global pooling inefficient. In this paper, we propose AlphaMEX Global Pool, a novel end-to-end trainable global pooling operator for convolutional neural networks. A smooth nonlinear log-mean-exp function, called AlphaMEX, is designed to extract features effectively and make networks smarter. Compared to the original global pooling layer, the proposed method improves classification accuracy without adding layers or many redundant parameters. Experimental results on CIFAR-10/CIFAR-100, SVHN and ImageNet demonstrate the effectiveness of the proposed method. AlphaMEX-ResNet outperforms the original ResNet-110 by 8.3% on CIFAR10+, and the top-1 error rate of AlphaMEX-DenseNet (k = 12) reaches 5.03%, outperforming the original DenseNet (k = 12) by 4.0%. (c) 2018 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license.
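As the abstract describes, AlphaMEX replaces standard global average pooling with a smooth, trainable log-mean-exp operator. The sketch below is a minimal PyTorch illustration of that idea, assuming a single learnable sharpness parameter `beta` that interpolates between average pooling (beta near 0) and max pooling (large beta); the class name, parameter name, and initialization are illustrative and do not reproduce the paper's exact alpha parameterization.

```python
import math

import torch
import torch.nn as nn


class LogMeanExpPool(nn.Module):
    """Global pooling via a trainable log-mean-exp (smooth maximum).

    Illustrative sketch of the AlphaMEX idea, not the paper's exact
    formulation: pool_beta(x) = (1/beta) * log(mean(exp(beta * x))).
    """

    def __init__(self, init_beta: float = 1.0):
        super().__init__()
        # Learnable sharpness, trained end-to-end with the network.
        self.beta = nn.Parameter(torch.tensor(init_beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) feature maps; reduce each channel to one value.
        n, c, h, w = x.shape
        flat = x.reshape(n, c, h * w)
        # log-mean-exp = logsumexp - log(count), computed stably
        # with torch.logsumexp over the spatial dimension.
        lme = torch.logsumexp(self.beta * flat, dim=2) - math.log(h * w)
        return lme / self.beta  # (N, C), fed to the classifier


# Usage: drop-in replacement for a network's global average pool.
pool = LogMeanExpPool(init_beta=2.0)
features = torch.randn(8, 64, 7, 7)  # e.g. final ResNet feature maps
pooled = pool(features)              # shape: (8, 64)
```

Because the operator is differentiable in both the inputs and the sharpness parameter, the pooling behavior itself is learned during training, which is what lets it cope with the sparse feature maps produced by Batch Normalization + ReLU stacks.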
Pages: 36-48
Number of pages: 13
Related Papers
50 records in total
  • [31] Convolutional Neural Networks: A Roundup and Benchmark of Their Pooling Layer Variants
    Galanis, Nikolaos-Ioannis
    Vafiadis, Panagiotis
    Mirzaev, Kostas-Gkouram
    Papakostas, George A.
    ALGORITHMS, 2022, 15 (11)
  • [32] Convolutional Neural Networks with Generalized Attentional Pooling for Action Recognition
    Wang, Yunfeng
    Zhou, Wengang
    Zhang, Qilin
    Li, Houqiang
    2018 IEEE INTERNATIONAL CONFERENCE ON VISUAL COMMUNICATIONS AND IMAGE PROCESSING (IEEE VCIP), 2018,
  • [33] Mixed fuzzy pooling in convolutional neural networks for image classification
    Sharma, Teena
    Verma, Nishchal K.
    Masood, Shahrukh
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (06) : 8405 - 8421
  • [34] Implications of Pooling Strategies in Convolutional Neural Networks: A Deep Insight
    Sharma, Shallu
    Mehra, Rajesh
    FOUNDATIONS OF COMPUTING AND DECISION SCIENCES, 2019, 44 (03) : 303 - 330
  • [35] Rank-based pooling for deep convolutional neural networks
    Shi, Zenglin
    Ye, Yangdong
    Wu, Yunpeng
    NEURAL NETWORKS, 2016, 83 : 21 - 31
  • [36] Information Entropy Based Feature Pooling for Convolutional Neural Networks
    Wan, Weitao
    Chen, Jiansheng
    Li, Tianpeng
    Huang, Yiqing
    Tian, Jingqi
    Yu, Cheng
    Xue, Youze
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 3404 - 3413
  • [37] Deep Convolutional Neural Networks for Pedestrian Detection with Skip Pooling
    Liu, Jie
    Gao, Xingkun
    Bao, Nianyuan
    Tang, Jie
    Wu, Gangshan
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 2056 - 2063
  • [38] Weighted pooling for image recognition of deep convolutional neural networks
    Zhu, Xiaoning
    Meng, Qingyue
    Ding, Bojian
    Gu, Lize
    Yang, Yixian
    CLUSTER COMPUTING, 2019, 22 : 9371 - 9383
  • [39] Max-Pooling Dropout for Regularization of Convolutional Neural Networks
    Wu, Haibing
    Gu, Xiaodong
    NEURAL INFORMATION PROCESSING, PT I, 2015, 9489 : 46 - 54