Rotation Invariant Local Binary Convolution Neural Networks

Cited: 20
Authors
Zhang, Xin [1 ]
Liu, Li [1 ,2 ]
Xie, Yuxiang [1 ]
Chen, Jie [2 ]
Wu, Lingda [3 ]
Pietikainen, Matti [2 ]
Affiliations
[1] Natl Univ Def Technol, Coll Informat Syst & Management, Changsha, Hunan, Peoples R China
[2] Univ Oulu, CMVS, Oulu, Finland
[3] Acad Equipment, Key Lab, Beijing, Peoples R China
Source
2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW 2017) | 2017
Funding
National Natural Science Foundation of China
DOI
10.1109/ICCVW.2017.146
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Although Convolutional Neural Networks (CNNs) are unprecedentedly powerful at learning effective representations, they remain parameter-expensive and are limited by their inability to handle orientation transformations of the input data. To alleviate this problem, we propose a deep architecture named the Rotation Invariant Local Binary Convolutional Neural Network (RI-LBCNN). RI-LBCNN is a deep convolutional neural network built from Local Binary orientation Modules (LBoMs). An LBoM is a three-layer steerable module composed of two parts (two layers for the first part and one for the second), which combines Local Binary Convolution (LBC) [19] with Active Rotating Filters (ARFs) [38]. By replacing the basic convolution layers in a DCNN with LBoMs, RI-LBCNN can be easily implemented, and LBoMs can be naturally inserted into other popular models without any extra modification to the optimization process; the proposed RI-LBCNN can therefore be trained end to end. Extensive experiments show that updating a network with the proposed LBoMs leads to a significant reduction in learnable parameters and a reasonable performance improvement on three benchmarks.
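The Local Binary Convolution component the abstract builds on can be sketched as follows. This is a minimal NumPy illustration of the general LBC idea, i.e. fixed, non-learnable sparse filters with entries in {-1, 0, +1} followed by a nonlinearity and a learnable linear (1x1) combination; it is not the authors' implementation, and the function names, filter count, and 0.5 sparsity level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernel):
    # Plain valid-mode 2D cross-correlation.
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def make_binary_filters(num, size=3, sparsity=0.5):
    # Fixed (never trained) filters with entries in {-1, 0, +1};
    # the mask zeroes out roughly half the entries.
    signs = rng.choice([-1.0, 1.0], size=(num, size, size))
    mask = rng.random((num, size, size)) < sparsity
    return signs * mask

def lbc_layer(img, filters, weights):
    # 1) convolve with the fixed binary filters,
    # 2) apply a ReLU nonlinearity,
    # 3) combine the response maps with learnable weights
    #    (the only trained parameters of the layer).
    maps = np.stack([np.maximum(conv2d(img, f), 0.0) for f in filters])
    return np.tensordot(weights, maps, axes=1)

img = rng.random((8, 8))
filters = make_binary_filters(4)          # fixed
weights = rng.random(4)                   # learnable in a real network
out = lbc_layer(img, filters, weights)
print(out.shape)                          # (6, 6)
```

The parameter saving comes from step 3: only the per-filter combination weights are learned, while the spatial filters themselves stay fixed and binary. The paper's LBoM additionally wraps this idea with Active Rotating Filters [38] to obtain rotation invariance.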
Pages: 1210-1219
Page count: 10
References
38 in total
  • [1] [Anonymous], 2016, BinaryNet: Training deep neural networks with weights and activa
  • [2] [Anonymous], 2015, ARXIV151000149
  • [3] [Anonymous], PROC CVPR IEEE
  • [4] [Anonymous], 2016, ARXIV PREPRINT ARXIV
  • [5] [Anonymous], 2016, BMVC
  • [6] [Anonymous], 2014, Training Deep Neural Networks with Low Precision Multiplications
  • [7] [Anonymous], 2013, NIPS
  • [8] [Anonymous], 2011, BIGLEARN NIPS WORKSH
  • [9] [Anonymous], 2015, IEEE I CONF COMP VIS, DOI DOI 10.1109/ICCV.2015.123
  • [10] [Anonymous], 2007, IEEE INT C ICML