Arbitrary Style Transfer with Parallel Self-Attention

Cited by: 3
Authors
Zhang, Tiange [1 ]
Gao, Ying [1 ]
Gao, Feng [1 ]
Qi, Lin [1 ]
Dong, Junyu [1 ]
Affiliations
[1] Ocean Univ China, Sch Informat Sci & Engn, Qingdao 266100, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
style transfer; attention mechanism; instance normalization; Laplacian matrix;
DOI
10.1109/ICPR48806.2021.9412049
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Neural style transfer aims to create artistic images by synthesizing patterns from a given style image. Recently, the Adaptive Instance Normalization (AdaIN) layer was proposed to achieve real-time arbitrary style transfer. However, we observe that if crucial features in the AdaIN-based pipeline are further emphasized during transfer, both content and style information are better reflected in the stylized images. Furthermore, it is essential to preserve more details and reduce unexpected artifacts in order to generate appealing results. In this paper, we introduce an improved arbitrary style transfer method based on the self-attention mechanism. A self-attention module is designed to learn what and where to emphasize in the input image. In addition, an extra Laplacian loss is applied to preserve structural details of the content while eliminating artifacts. Experimental results demonstrate that the proposed method outperforms AdaIN and generates more appealing results.
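The abstract combines three ingredients: AdaIN statistics matching, a self-attention module over feature maps, and a Laplacian loss that preserves content structure. The PyTorch sketch below illustrates one plausible form of each; the adain, SelfAttention, and laplacian_loss definitions are illustrative assumptions, not the authors' released implementation, and the paper's parallel self-attention design may differ in detail.

```python
# Minimal sketch (assumed implementations, not the paper's code):
# AdaIN, a SAGAN-style spatial self-attention block, and a Laplacian loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


def adain(content_feat, style_feat, eps=1e-5):
    """AdaIN: align per-channel mean/std of content features with the style features."""
    b, c = content_feat.shape[:2]
    c_mean = content_feat.view(b, c, -1).mean(-1).view(b, c, 1, 1)
    c_std = content_feat.view(b, c, -1).std(-1).view(b, c, 1, 1) + eps
    s_mean = style_feat.view(b, c, -1).mean(-1).view(b, c, 1, 1)
    s_std = style_feat.view(b, c, -1).std(-1).view(b, c, 1, 1) + eps
    return s_std * (content_feat - c_mean) / c_std + s_mean


class SelfAttention(nn.Module):
    """Single-head spatial self-attention over feature maps; a stand-in
    for the paper's self-attention module that learns what/where to emphasize."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)   # B x HW x C'
        k = self.key(x).view(b, -1, h * w)                       # B x C' x HW
        attn = torch.softmax(torch.bmm(q, k), dim=-1)            # B x HW x HW
        v = self.value(x).view(b, c, h * w)                      # B x C x HW
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x


def laplacian_loss(content_img, stylized_img):
    """Penalize differences between the Laplacians of content and stylized
    images, encouraging the output to keep the content's structural details."""
    kernel = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]],
                          device=content_img.device).view(1, 1, 3, 3)
    kernel = kernel.repeat(content_img.shape[1], 1, 1, 1)
    lap_c = F.conv2d(content_img, kernel, groups=content_img.shape[1])
    lap_s = F.conv2d(stylized_img, kernel, groups=stylized_img.shape[1])
    return F.mse_loss(lap_s, lap_c)
```

In a typical AdaIN-style pipeline, the attention block would be applied to encoder features before or after the AdaIN operation, and the Laplacian term would be added to the usual content and style losses with a weighting hyperparameter; the exact placement and weights here are assumptions.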
Pages: 1406-1413
Page count: 8
Related Papers
50 items total
  • [21] Zhang, Ze; Farnsworth, Michael; Song, Boyang; Tiwari, Divya; Tiwari, Ashutosh. Deep Transfer Learning With Self-Attention for Industry Sensor Fusion Tasks. IEEE SENSORS JOURNAL, 2022, 22(15): 15235-15247.
  • [22] Ebert, Nikolas; Reichardt, Laurenz; Stricker, Didier; Wasenmueller, Oliver. Light-Weight Vision Transformer with Parallel Local and Global Self-Attention. 2023 IEEE 26TH INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS (ITSC), 2023: 452-459.
  • [23] Pan, Xuran; Ge, Chunjiang; Lu, Rui; Song, Shiji; Chen, Guanfu; Huang, Zeyi; Huang, Gao. On the Integration of Self-Attention and Convolution. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022: 805-815.
  • [24] Pradhan, Ankit; Yatam, Venu Madhav; Bera, Padmalochan. Self-Attention for Cyberbullying Detection. 2020 INTERNATIONAL CONFERENCE ON CYBER SITUATIONAL AWARENESS, DATA ANALYTICS AND ASSESSMENT (CYBER SA 2020), 2020.
  • [25] Yang, Qichao; Tang, Baoping; Shen, Yizhe; Li, Qikang. Self-Attention Parallel Fusion Network for Wind Turbine Gearboxes Fault Diagnosis. IEEE SENSORS JOURNAL, 2023, 23(19): 23210-23220.
  • [26] Ebert, Nikolas; Stricker, Didier; Wasenmueller, Oliver. PLG-ViT: Vision Transformer with Parallel Local and Global Self-Attention. SENSORS, 2023, 23(07).
  • [27] Keles, Feyza Duman; Wijewardena, Pruthuvi Mahesakya; Hegde, Chinmay. On The Computational Complexity of Self-Attention. INTERNATIONAL CONFERENCE ON ALGORITHMIC LEARNING THEORY, VOL 201, 2023, 201: 597-619.
  • [28] Kim, Hyunjik; Papamakarios, George; Mnih, Andriy. The Lipschitz Constant of Self-Attention. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139.
  • [29] Cunningham, Sheila J. The function of the self-attention network. COGNITIVE NEUROSCIENCE, 2016, 7(1-4): 21-22.
  • [30] Lee, Junhyun; Lee, Inyeop; Kang, Jaewoo. Self-Attention Graph Pooling. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97.