XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks

Cited by: 2772
Authors
Rastegari, Mohammad [1]
Ordonez, Vicente [1]
Redmon, Joseph [2]
Farhadi, Ali [1,2]
Affiliations
[1] Allen Inst AI, Seattle, WA 98103 USA
[2] Univ Washington, Seattle, WA 98195 USA
Source
COMPUTER VISION - ECCV 2016, PT IV | 2016 / Vol. 9908
DOI
10.1007/978-3-319-46493-0_32
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104; 0812; 0835; 1405
Abstract
We propose two efficient approximations to standard convolutional neural networks: Binary-Weight-Networks and XNOR-Networks. In Binary-Weight-Networks, the filters are approximated with binary values, resulting in 32x memory savings. In XNOR-Networks, both the filters and the input to convolutional layers are binary. XNOR-Networks approximate convolutions using primarily binary operations. This results in 58x faster convolutional operations (in terms of the number of high-precision operations) and 32x memory savings. XNOR-Nets offer the possibility of running state-of-the-art networks on CPUs (rather than GPUs) in real time. Our binary networks are simple, accurate, efficient, and work on challenging visual tasks. We evaluate our approach on the ImageNet classification task. The classification accuracy of a Binary-Weight-Network version of AlexNet is the same as that of the full-precision AlexNet. We compare our method with recent network binarization methods, BinaryConnect and BinaryNets, and outperform these methods by large margins on ImageNet, by more than 16% in top-1 accuracy. Our code is available at: http://allenai.org/plato/xnornet.
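The binarization the abstract describes can be illustrated with a minimal NumPy sketch (not taken from the paper or its released code): weights are approximated as W ≈ αB with B = sign(W) and α the mean absolute value, and the dot product of two {-1,+1} vectors is computed with XNOR plus popcount. The function names and the unpacked bit representation below are illustrative assumptions; the paper's actual kernels operate on bit-packed data to obtain the reported speedups.

```python
import numpy as np

def binarize(v):
    """Approximate a real-valued vector v by alpha * b, with
    b = sign(v) in {-1, +1} and alpha = mean(|v|) (the scaling
    factor used in the paper's binary-weight approximation)."""
    b = np.where(v >= 0, 1, -1)
    alpha = np.abs(v).mean()
    return alpha, b

def xnor_popcount_dot(xb, wb):
    """Dot product of two {-1, +1} vectors via XNOR + popcount,
    after encoding -1 -> 0 and +1 -> 1 (illustrative only; a real
    implementation would pack bits into machine words)."""
    x_bits = (xb > 0).astype(np.uint8)
    w_bits = (wb > 0).astype(np.uint8)
    matches = np.count_nonzero(x_bits == w_bits)  # popcount of XNOR
    return 2 * matches - xb.size                  # matches - mismatches

# Illustrative usage: approximate a real-valued dot product.
rng = np.random.default_rng(0)
x = rng.standard_normal(256)
w = rng.standard_normal(256)

alpha_x, xb = binarize(x)
alpha_w, wb = binarize(w)

exact = float(x @ w)
approx = alpha_x * alpha_w * xnor_popcount_dot(xb, wb)
print(f"exact dot product:        {exact:.3f}")
print(f"XNOR-style approximation: {approx:.3f}")
```

In this sketch the expensive multiply-accumulate is replaced by a bitwise comparison and a count, which is the source of the speedup the abstract quotes; accuracy depends on how well α·sign(·) approximates the original real-valued tensors.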
Pages: 525-542
Number of pages: 18