Arbitrary Style Transfer via Multi-Adaptation Network

Cited by: 110
Authors
Deng, Yingying [1 ,2 ]
Tang, Fan [2 ]
Dong, Weiming [2 ,3 ]
Sun, Wen [1 ,4 ]
Huang, Feiyue [5 ]
Xu, Changsheng [2 ,3 ]
Affiliations
[1] UCAS, Sch Artificial Intelligence, Beijing, Peoples R China
[2] Chinese Acad Sci, NLPR, Inst Automat, Beijing, Peoples R China
[3] CASIA LLVis Joint Lab, Beijing, Peoples R China
[4] Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
[5] Tencent, Youtu Lab, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Arbitrary style transfer; Feature disentanglement; Adaptation;
DOI
10.1145/3394171.3414015
CLC classification number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Arbitrary style transfer is a significant topic with both research value and application prospects. Given a content image and a reference style painting, a desired style transfer would render the content image with the color tone and vivid stroke patterns of the style painting while preserving the detailed content structure. Style transfer approaches typically first learn content and style representations from the content and style references and then generate the stylized image guided by these representations. In this paper, we propose a multi-adaptation network that consists of two self-adaptation (SA) modules and one co-adaptation (CA) module: the SA modules adaptively disentangle the content and style representations, i.e., the content SA module uses position-wise self-attention to enhance the content representation, and the style SA module uses channel-wise self-attention to enhance the style representation; the CA module rearranges the distribution of the style representation according to the distribution of the content representation by computing the local similarity between the disentangled content and style features in a non-local fashion. Moreover, a new disentanglement loss function enables our network to extract the main style patterns and the exact content structures so as to adapt to various input images. Extensive qualitative and quantitative experiments demonstrate that the proposed multi-adaptation network produces better results than state-of-the-art style transfer methods.
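The module descriptions in the abstract map naturally onto attention operations over encoder features. Below is a minimal PyTorch sketch (not the authors' released code) of what a position-wise content SA module, a channel-wise style SA module, and a non-local co-adaptation step could look like; the 1x1 convolution projections, channel-reduction ratio, residual connections, and the omission of feature normalization and of the disentanglement loss are illustrative assumptions.

```python
# Minimal sketch of SA/CA-style modules, assuming VGG-like features of shape (B, C, H, W).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContentSA(nn.Module):
    """Content self-adaptation: position-wise self-attention over spatial locations."""
    def __init__(self, channels):
        super().__init__()
        self.q = nn.Conv2d(channels, channels // 8, 1)
        self.k = nn.Conv2d(channels, channels // 8, 1)
        self.v = nn.Conv2d(channels, channels, 1)

    def forward(self, x):                                   # x: (B, C, H, W)
        b, c, h, w = x.shape
        q = self.q(x).flatten(2).transpose(1, 2)            # (B, HW, C//8)
        k = self.k(x).flatten(2)                             # (B, C//8, HW)
        attn = F.softmax(torch.bmm(q, k), dim=-1)            # (B, HW, HW): position-wise
        v = self.v(x).flatten(2)                              # (B, C, HW)
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return x + out                                        # residual enhancement (assumption)


class StyleSA(nn.Module):
    """Style self-adaptation: channel-wise self-attention."""
    def __init__(self, channels):
        super().__init__()
        self.v = nn.Conv2d(channels, channels, 1)

    def forward(self, x):                                    # x: (B, C, H, W)
        b, c, h, w = x.shape
        f = x.flatten(2)                                      # (B, C, HW)
        attn = F.softmax(torch.bmm(f, f.transpose(1, 2)), dim=-1)  # (B, C, C): channel-wise
        out = torch.bmm(attn, self.v(x).flatten(2)).view(b, c, h, w)
        return x + out


class CoAdaptation(nn.Module):
    """Co-adaptation: rearrange style features according to content features
    via non-local similarity between the two feature maps."""
    def __init__(self, channels):
        super().__init__()
        self.fc = nn.Conv2d(channels, channels // 8, 1)       # content projection
        self.fs = nn.Conv2d(channels, channels // 8, 1)       # style projection
        self.fv = nn.Conv2d(channels, channels, 1)

    def forward(self, content, style):
        b, c, h, w = content.shape
        q = self.fc(content).flatten(2).transpose(1, 2)       # (B, HcWc, C//8)
        k = self.fs(style).flatten(2)                          # (B, C//8, HsWs)
        sim = F.softmax(torch.bmm(q, k), dim=-1)               # content-to-style similarity
        v = self.fv(style).flatten(2)                          # (B, C, HsWs)
        out = torch.bmm(v, sim.transpose(1, 2)).view(b, c, h, w)
        return out                                             # style features rearranged to content layout


if __name__ == "__main__":
    fc = torch.randn(1, 512, 32, 32)   # content features (e.g., from a VGG encoder)
    fs = torch.randn(1, 512, 32, 32)   # style features
    fcs = CoAdaptation(512)(ContentSA(512)(fc), StyleSA(512)(fs))
    print(fcs.shape)                    # torch.Size([1, 512, 32, 32])
```

The stylized feature map produced by the co-adaptation step would then be passed to a decoder to synthesize the output image; that decoder and the training losses are outside the scope of this sketch.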
Pages: 2719-2727
Number of pages: 9