TI-POOLING: transformation-invariant pooling for feature learning in Convolutional Neural Networks

Cited by: 156
Authors
Laptev, Dmitry [1 ]
Savinov, Nikolay [1 ]
Buhmann, Joachim M. [1 ]
Pollefeys, Marc [1 ]
Affiliations
[1] Swiss Federal Institute of Technology (ETH), Department of Computer Science, Zurich, Switzerland
Source
2016 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2016
Funding
Swiss National Science Foundation
Keywords
RECOGNITION; ART;
DOI
10.1109/CVPR.2016.38
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In this paper we present a deep neural network topology that incorporates a simple-to-implement transformation-invariant pooling operator (TI-POOLING). This operator efficiently exploits prior knowledge about nuisance variations in the data, such as rotation or scale changes. Most current methods rely on dataset augmentation to address this issue, but augmentation requires a larger number of model parameters and more training data, and results in significantly longer training and a higher chance of under- or overfitting. The main reason for these drawbacks is that the learned model must capture adequate features for all possible transformations of the input. We instead formulate the features of convolutional neural networks to be transformation-invariant. We achieve this by using parallel siamese architectures over the considered transformation set and applying the TI-POOLING operator to their outputs before the fully-connected layers. We show that this topology internally finds the optimal "canonical" instance of each input image for training and therefore limits redundancy in the learned features. This more efficient use of training data yields better performance on popular benchmark datasets, with fewer parameters, than standard convolutional neural networks with dataset augmentation and other baselines.
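The topology described in the abstract reduces to a small amount of code. Below is a minimal sketch, assuming a PyTorch implementation and a rotation transformation set; the class name TIPoolingNet, the layer sizes, and the four rotation angles are illustrative assumptions, not the paper's exact architecture. A single shared ("siamese") feature extractor is applied to every transformed copy of the input, and an element-wise maximum over the transformation dimension implements the TI-POOLING step before the fully-connected classifier.

```python
import torch
import torch.nn as nn
import torchvision.transforms.functional as TF

class TIPoolingNet(nn.Module):
    """Minimal TI-pooling sketch (hypothetical, not the authors' code):
    one shared CNN is applied to several transformed copies of the input,
    and an element-wise max over the transformation set is taken before
    the fully-connected layer."""

    def __init__(self, angles=(0.0, 90.0, 180.0, 270.0), num_classes=10):
        super().__init__()
        self.angles = angles  # illustrative transformation set (rotations)
        # Shared convolutional feature extractor; the same weights are
        # reused for every transformed copy (parallel siamese branches).
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        # One forward pass of the shared network per transformed copy.
        feats = [self.features(TF.rotate(x, a)) for a in self.angles]
        # TI-POOLING: element-wise maximum over the transformation
        # dimension, so each feature is taken from whichever copy
        # responds most strongly (the "canonical" instance).
        pooled = torch.stack(feats, dim=0).max(dim=0).values
        return self.classifier(pooled)

# Usage: a batch of four 28x28 grayscale images (rotated-MNIST style).
net = TIPoolingNet()
logits = net(torch.randn(4, 1, 28, 28))
print(logits.shape)  # torch.Size([4, 10])
```

Because the maximum is taken per feature, gradients flow only through the branch that produced it, which is how the network implicitly selects a canonical orientation of each input during training.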
Pages: 289 - 297
Number of pages: 9
Related papers
50 records
  • [1] Feature Pooling - A Feature Compression Method Used in Convolutional Neural Networks
    Pei, Ge; Gao, Hai-Chang; Zhou, Xin; Cheng, Nuo
    JOURNAL OF INFORMATION SCIENCE AND ENGINEERING, 2020, 36 (03): 577-596
  • [2] Information Entropy Based Feature Pooling for Convolutional Neural Networks
    Wan, Weitao; Chen, Jiansheng; Li, Tianpeng; Huang, Yiqing; Tian, Jingqi; Yu, Cheng; Xue, Youze
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019: 3404-3413
  • [3] Transformation-invariant Gabor convolutional networks
    Zhuang, Lei; Da, Feipeng; Gai, Shaoyan; Li, Mengxiang
    SIGNAL IMAGE AND VIDEO PROCESSING, 2020, 14 (07): 1413-1420
  • [4] Kernel Pooling for Convolutional Neural Networks
    Cui, Yin; Zhou, Feng; Wang, Jiang; Liu, Xiao; Lin, Yuanqing; Belongie, Serge
    30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017), 2017: 3049-3058
  • [5] Pooling in Graph Convolutional Neural Networks
    Cheung, Mark; Shi, John; Jiang, Lavender; Wright, Oren; Moura, Jose M. F.
    CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019: 462-466
  • [6] Cascaded pooling for Convolutional Neural Networks
    Devi, Nilakshi; Borah, Bhogeswar
    2018 FOURTEENTH INTERNATIONAL CONFERENCE ON INFORMATION PROCESSING (ICINPRO), 2018: 155-159
  • [7] Mixed Pooling for Convolutional Neural Networks
    Yu, Dingjun; Wang, Hanli; Chen, Peiqiu; Wei, Zhihua
    ROUGH SETS AND KNOWLEDGE TECHNOLOGY, RSKT 2014, 2014, 8818: 364-375
  • [8] Learning Pooling for Convolutional Neural Network
    Sun, Manli; Song, Zhanjie; Jiang, Xiaoheng; Pan, Jing; Pang, Yanwei
    NEUROCOMPUTING, 2017, 224: 96-104
  • [9] Universal pooling - A new pooling method for convolutional neural networks
    Hyun, Junhyuk; Seong, Hongje; Kim, Euntai
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 180