In-Place Activated BatchNorm for Memory-Optimized Training of DNNs

Cited by: 182
Authors
Bulo, Samuel Rota [1]
Porzi, Lorenzo [1]
Kontschieder, Peter [1]
Affiliations
[1] Mapillary Res, Graz, Austria
Source
2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2018
DOI: 10.1109/CVPR.2018.00591
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
In this work we present In-Place Activated Batch Normalization (INPLACE-ABN) - a novel approach to drastically reduce the training memory footprint of modern deep neural networks in a computationally efficient way. Our solution substitutes the conventionally used succession of BatchNorm + Activation layers with a single plugin layer, hence avoiding invasive framework surgery while providing straightforward applicability for existing deep learning frameworks. We obtain memory savings of up to 50% by dropping intermediate results and recovering the required information during the backward pass through inversion of the stored forward results, with only a minor increase (0.8-2%) in computation time. We also demonstrate how frequently used checkpointing approaches can be made computationally as efficient as INPLACE-ABN. In our image classification experiments, we demonstrate results on ImageNet-1k on par with state-of-the-art approaches. On the memory-demanding task of semantic segmentation, we report competitive results for COCO-Stuff and set new state-of-the-art results for Cityscapes and Mapillary Vistas. Code can be found at https://github.com/mapillary/inplace_abn.
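The recovery trick described in the abstract can be sketched in a few lines: if the activation is invertible (e.g. leaky ReLU) and the BatchNorm affine parameters are known, the intermediate BN outputs need not be stored, because they can be reconstructed from the activation output during the backward pass. The sketch below is a hedged NumPy illustration of that idea under these assumptions, with made-up function names; it is not the authors' fused CUDA implementation.

```python
import numpy as np

def inplace_abn_forward(x, gamma, beta, slope=0.01, eps=1e-5):
    """BatchNorm over the batch axis followed by leaky ReLU.

    Only the final output z (plus the cheap per-channel statistics) needs
    to be kept for the backward pass; x, xhat and y can be freed, which is
    where the memory saving comes from.
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    xhat = (x - mean) / np.sqrt(var + eps)  # normalized input
    y = gamma * xhat + beta                 # BN affine output
    z = np.where(y >= 0, y, slope * y)      # leaky ReLU (invertible)
    return z, (mean, var)

def recover_intermediates(z, gamma, beta, slope=0.01):
    """Invert leaky ReLU to get y, then invert the BN affine transform
    to get xhat -- the quantities the BN backward pass needs."""
    y = np.where(z >= 0, z, z / slope)  # sign is preserved, so this is exact
    xhat = (y - beta) / gamma           # assumes gamma != 0
    return y, xhat
```

Because leaky ReLU preserves the sign of its input, the inversion is exact, and the recovered `xhat` matches what a conventional implementation would have stored.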
Pages: 5639-5647 (9 pages)