Arbitrary Style Transfer with Parallel Self-Attention

Cited by: 3
Authors
Zhang, Tiange [1 ]
Gao, Ying [1 ]
Gao, Feng [1 ]
Qi, Lin [1 ]
Dong, Junyu [1 ]
Affiliations
[1] Ocean University of China, School of Information Science and Engineering, Qingdao 266100, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
style transfer; attention mechanism; instance normalization; Laplacian matrix
DOI
10.1109/ICPR48806.2021.9412049
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Neural style transfer aims to create artistic images by synthesizing patterns from a given style image. Recently, the Adaptive Instance Normalization (AdaIN) layer was proposed to achieve real-time arbitrary style transfer. However, we observe that if the crucial features AdaIN operates on are further emphasized during transfer, both content and style information are better reflected in the stylized image. Moreover, preserving fine details and reducing unexpected artifacts is essential for generating appealing results. In this paper, we introduce an improved arbitrary style transfer method based on the self-attention mechanism. A self-attention module is designed to learn what and where to emphasize in the input image, and an additional Laplacian loss preserves the structural details of the content while eliminating artifacts. Experimental results demonstrate that the proposed method outperforms AdaIN and generates more appealing results.
Pages: 1406-1413
Page count: 8
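The abstract names three building blocks: an AdaIN layer that re-normalizes content features with style statistics, a self-attention module that learns what and where to emphasize, and a Laplacian loss that preserves content structure. The following minimal PyTorch sketch shows common formulations of each block; the module shapes, the attention design, and the loss weighting are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of the three components named in the abstract: AdaIN,
# a self-attention block, and a Laplacian loss. All shapes and names are
# illustrative assumptions, not the paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

def adain(content_feat, style_feat, eps=1e-5):
    """Align the channel-wise mean/std of content features to the style features."""
    b, c = content_feat.shape[:2]
    c_mean = content_feat.view(b, c, -1).mean(dim=2).view(b, c, 1, 1)
    c_std = content_feat.view(b, c, -1).std(dim=2).view(b, c, 1, 1) + eps
    s_mean = style_feat.view(b, c, -1).mean(dim=2).view(b, c, 1, 1)
    s_std = style_feat.view(b, c, -1).std(dim=2).view(b, c, 1, 1) + eps
    return s_std * (content_feat - c_mean) / c_std + s_mean

class SelfAttention(nn.Module):
    """Standard non-local self-attention over a feature map (assumed form)."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)  # B x HW x C'
        k = self.key(x).view(b, -1, h * w)                     # B x C' x HW
        attn = F.softmax(torch.bmm(q, k), dim=-1)              # B x HW x HW
        v = self.value(x).view(b, c, h * w)                    # B x C x HW
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x  # residual connection

def laplacian_loss(content_img, stylized_img):
    """Penalize differences between Laplacian-filtered images to keep structure."""
    kernel = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]])
    kernel = kernel.view(1, 1, 3, 3).repeat(3, 1, 1, 1).to(content_img.device)
    lap_c = F.conv2d(content_img, kernel, padding=1, groups=3)
    lap_s = F.conv2d(stylized_img, kernel, padding=1, groups=3)
    return F.mse_loss(lap_s, lap_c)
```

In a typical AdaIN-style training loop, the Laplacian term would simply be added to the usual content and style losses, e.g. `loss = l_content + w_style * l_style + w_lap * laplacian_loss(content, stylized)`, where the weights `w_style` and `w_lap` are hypothetical placeholders here.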