Arbitrary Style Transfer in Real-time with Adaptive Instance Normalization

Cited by: 3196
Authors
Huang, Xun [1 ]
Belongie, Serge
Affiliations
[1] Cornell Univ, Dept Comp Sci, Ithaca, NY 14853 USA
Source
2017 IEEE International Conference on Computer Vision (ICCV) | 2017
DOI
10.1109/ICCV.2017.167
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Gatys et al. recently introduced a neural algorithm that renders a content image in the style of another image, achieving so-called style transfer. However, their framework requires a slow iterative optimization process, which limits its practical application. Fast approximations with feed-forward neural networks have been proposed to speed up neural style transfer. Unfortunately, the speed improvement comes at a cost: the network is usually tied to a fixed set of styles and cannot adapt to arbitrary new styles. In this paper, we present a simple yet effective approach that for the first time enables arbitrary style transfer in real-time. At the heart of our method is a novel adaptive instance normalization (AdaIN) layer that aligns the mean and variance of the content features with those of the style features. Our method achieves speed comparable to the fastest existing approach, without the restriction to a pre-defined set of styles. In addition, our approach allows flexible user controls such as content-style trade-off, style interpolation, color & spatial controls, all using a single feed-forward neural network.
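The AdaIN operation described in the abstract aligns the per-channel mean and variance of the content feature map with those of the style feature map. Below is a minimal NumPy sketch of that alignment; the function name `adain`, the `eps` stabilizer, and the (N, C, H, W) feature-map layout are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def adain(content_feat, style_feat, eps=1e-5):
    """Adaptive instance normalization (sketch).

    content_feat, style_feat: arrays of shape (N, C, H, W), e.g. VGG
    feature maps of the content and style images.
    """
    # Instance statistics: mean/std over the spatial dimensions,
    # computed separately for each sample and each channel.
    c_mean = content_feat.mean(axis=(2, 3), keepdims=True)
    c_std = content_feat.std(axis=(2, 3), keepdims=True) + eps
    s_mean = style_feat.mean(axis=(2, 3), keepdims=True)
    s_std = style_feat.std(axis=(2, 3), keepdims=True) + eps

    # Normalize the content features, then re-scale and re-shift them
    # with the style statistics.
    normalized = (content_feat - c_mean) / c_std
    return s_std * normalized + s_mean

if __name__ == "__main__":
    content = np.random.randn(1, 512, 32, 32).astype(np.float32)
    style = np.random.randn(1, 512, 32, 32).astype(np.float32)
    print(adain(content, style).shape)  # (1, 512, 32, 32)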
Pages: 1510-1519
Number of pages: 10