Non-parallel text style transfer with domain adaptation and an attention model

Cited by: 0
Authors
Mingxuan Hu
Min He
Affiliations
[1] School of Information Science and Engineering, Yunnan University
Source
Applied Intelligence | 2021 / Vol. 51
Keywords
Text style transfer; Non-parallel text; Domain adaptation; Attention model;
DOI
Not available
Abstract
Text style transfer aims to convert a specific style in a given sentence to another target style while preserving the style-independent content of the original sentence; the task is particularly challenging with non-parallel text. In this paper, we combine domain adaptation learning and an attention model into a new framework for this task. Domain adaptation leverages related information from the source domain to improve the generative model's capacity to reconstruct data. The attention model assigns importance weights, with respect to the target style, to the words being generated; the generative model can therefore concentrate on generating the words with the highest importance weights and accomplish text style transfer effectively. We evaluate our framework on the Yelp, Amazon and Captions corpora. Automatic and human evaluations demonstrate the effectiveness of our framework compared with previous work under non-parallel and limited training data. Code is available at https://github.com/mingxuan007/text-style-transfer-with-adversarial-network-and-domain-adaptation.
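The attention-weighting idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (see the GitHub link above for that); the model, layer sizes, and names below are illustrative assumptions: a style classifier with a token-level attention layer whose softmax weights indicate which words carry the target style, and which a generator could use to focus its updates on style-bearing positions.

    # Illustrative sketch only -- not the authors' code (that is at the
    # GitHub link above). A style classifier with a token-level attention
    # layer: the softmax weights estimate how important each word is for
    # the target style.
    import torch
    import torch.nn as nn

    class AttentiveStyleClassifier(nn.Module):
        def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_styles=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            self.attn = nn.Linear(hidden_dim, 1)        # one score per token
            self.classify = nn.Linear(hidden_dim, num_styles)

        def forward(self, token_ids):
            h, _ = self.encoder(self.embed(token_ids))                 # (B, T, H)
            weights = torch.softmax(self.attn(h).squeeze(-1), dim=-1)  # (B, T)
            context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)    # (B, H)
            return self.classify(context), weights

    # The returned per-token weights mark the style-bearing words; a
    # generator could upweight those positions in its loss.
    model = AttentiveStyleClassifier(vocab_size=10000)
    tokens = torch.randint(0, 10000, (4, 12))  # batch of 4 twelve-token sentences
    style_logits, word_importance = model(tokens)
    print(word_importance.shape)               # torch.Size([4, 12])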
Pages: 4609-4622
Page count: 13
Related papers
50 records in total
  • [1] Non-parallel text style transfer with domain adaptation and an attention model
    Hu, Mingxuan
    He, Min
    Applied Intelligence, 2021, 51(7): 4609-4622
  • [2] Correction to: Non-parallel text style transfer with domain adaptation and an attention model
    Hu, Mingxuan
    He, Min
    Applied Intelligence, 2021, 51: 8564
  • [3] Non-parallel text style transfer with domain adaptation and an attention model (correction; vol 51, pg 4609, 2021)
    Hu, Mingxuan
    He, Min
    Applied Intelligence, 2021, 51(11): 8564
  • [4] Disentangled Representation Learning for Non-Parallel Text Style Transfer
    John, Vineet
    Mou, Lili
    Bahuleyan, Hareesh
    Vechtomova, Olga
    Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 424-434
  • [5] An Unsupervised Framework With Attention Mechanism and Embedding Perturbed Encoder for Non-Parallel Text Sentiment Style Transfer
    Liu, Yuanzhi
    He, Min
    Yang, Qingqing
    Jeon, Gwanggil
    IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2023, 31: 2134-2144
  • [6] Utilizing Non-Parallel Text for Style Transfer by Making Partial Comparisons
    Yin, Di
    Huang, Shujian
    Dai, Xin-Yu
    Chen, Jiajun
    Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI 2019), 2019: 5379-5386
  • [7] Style Transfer from Non-Parallel Text by Cross-Alignment
    Shen, Tianxiao
    Lei, Tao
    Barzilay, Regina
    Jaakkola, Tommi
    Advances in Neural Information Processing Systems 30 (NIPS 2017), 2017
  • [8] Non-Parallel Text Style Transfer using Self-Attentional Discriminator as Supervisor
    Feng, Kuan
    Zhu, Yanmin
    Yu, Jiadi
    2021 IEEE International Conference on Big Data (Big Data), 2021: 416-426
  • [9] A Multi-Discriminator CycleGAN for Unsupervised Non-Parallel Speech Domain Adaptation
    Hosseini-Asl, Ehsan
    Zhou, Yingbo
    Xiong, Caiming
    Socher, Richard
    Proceedings of the 19th Annual Conference of the International Speech Communication Association (INTERSPEECH 2018), 2018: 3758-3762
  • [10] Non-Parallel Many-to-Many Voice Conversion by Knowledge Transfer from a Text-to-Speech Model
    Yu, Xinyuan
    Mak, Brian
    2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), 2021: 5924-5928