ETNet: Error Transition Network for Arbitrary Style Transfer

Cited by: 0
Authors
Song, Chunjin [1 ]
Wu, Zhijie [1 ]
Zhou, Yang [1 ]
Gong, Minglun [2 ]
Huang, Hui [1 ]
Affiliations
[1] Shenzhen Univ, Shenzhen, Peoples R China
[2] Univ Guelph, Guelph, ON, Canada
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019) | 2019 / Vol. 32
DOI: not available
CLC Classification: TP18 [Artificial Intelligence Theory]
Subject Classification: 081104; 0812; 0835; 1405
Abstract
Numerous valuable efforts have been devoted to achieving arbitrary style transfer since the seminal work of Gatys et al. However, existing state-of-the-art approaches often generate insufficiently stylized results in challenging cases. We believe a fundamental reason is that these approaches try to generate the stylized result in a single shot and hence fail to fully satisfy the constraints on semantic structures in the content images and style patterns in the style images. Inspired by work on error correction, we instead propose a self-correcting model that predicts what is wrong with the current stylization and refines it accordingly in an iterative manner. For each refinement, we transition the error features across both the spatial and scale domains and invert the processed features into a residual image, using a network we call the Error Transition Network (ETNet). The proposed model improves over state-of-the-art methods, producing better semantic structures and more adaptive style-pattern details. Extensive qualitative and quantitative experiments show that the combination of a progressive strategy and error correction leads to better results. Code and models are available at https://github.com/zhijieW94/ETNet.
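The abstract's core idea — predict what is wrong with the current stylization, convert that error into a residual image, and add it back, repeating over several refinement steps — can be sketched as a simple loop. This is an illustrative toy, not the authors' method: the real ETNet uses learned deep networks operating on feature pyramids, whereas here a hypothetical `predict_residual` function stands in for the error-transition step, using a crude content/style blend as the "constraint" the stylization should satisfy.

```python
import numpy as np

def predict_residual(stylized, content, style, step_size=0.2):
    """Stand-in for the Error Transition Network: estimate what is 'wrong'
    with the current stylization and return a residual image that reduces
    the error. Here the 'error' is simply the gap to a crude blend of the
    content and style images (a hypothetical placeholder constraint)."""
    target = 0.5 * content + 0.5 * style
    return step_size * (target - stylized)

def refine(content, style, num_steps=3):
    """Iterative self-correction: start from the content image, then
    repeatedly add the predicted residual, mirroring the paper's
    progressive refinement scheme."""
    stylized = content.astype(float).copy()
    for _ in range(num_steps):
        stylized = stylized + predict_residual(stylized, content, style)
    return stylized
```

Because each step shrinks the remaining error by a fixed factor, more refinement steps bring the result closer to satisfying the (toy) constraints — the same intuition the paper applies with learned error predictors at multiple scales.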
Pages: 10