Reinforcement Learning Based Text Style Transfer without Parallel Training Corpus

Cited by: 0
Authors
Gong, Hongyu [1 ]
Bhat, Suma [1 ]
Wu, Lingfei [2 ]
Xiong, Jinjun [2 ]
Hwu, Wen-Mei [1 ]
Affiliations
[1] Univ Illinois, Champaign, IL 61801 USA
[2] IBM Corp, TJ Watson Res Ctr, Ossining, NY USA
Source
2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1 | 2019
Keywords
MACHINE;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Text style transfer rephrases a text from a source style (e.g., informal) to a target style (e.g., formal) while keeping its original meaning. Despite the success existing works have achieved using a parallel corpus for the two styles, transferring text style has proven significantly more challenging when there is no parallel training corpus. In this paper, we address this challenge by using a reinforcement-learning-based generator-evaluator architecture. Our generator employs an attention-based encoder-decoder to transfer a sentence from the source style to the target style. Our evaluator is an adversarially trained style discriminator with semantic and syntactic constraints that scores the generated sentence for style, meaning preservation, and fluency. Experimental results on two different style transfer tasks (sentiment transfer and formality transfer) show that our model outperforms state-of-the-art approaches. Furthermore, we perform a manual evaluation that demonstrates the effectiveness of the proposed method using subjective metrics of generated text quality.
Pages: 3168-3180
Page count: 13
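
The abstract above describes a generator-evaluator architecture trained with reinforcement learning, where the evaluator returns style, meaning-preservation, and fluency scores that reward the generator. The record does not give the exact reward formulation, so the following is only a minimal sketch, assuming a REINFORCE-style policy-gradient update driven by a weighted sum of the three evaluator scores; the function names, weights, and baseline are illustrative assumptions, not the authors' implementation.

import torch

# All names below are hypothetical placeholders for the components named in the abstract.
def combined_reward(style_score, semantic_score, fluency_score,
                    w_style=1.0, w_sem=1.0, w_flu=1.0):
    # Weighted sum of the three evaluator signals (the weights are assumptions).
    return w_style * style_score + w_sem * semantic_score + w_flu * fluency_score

def reinforce_loss(token_log_probs, reward, baseline=0.0):
    # REINFORCE loss for one sampled target-style sentence.
    # token_log_probs: 1-D tensor of log p(y_t | y_<t, x) for the sampled tokens.
    # reward: scalar sentence-level reward from the evaluator.
    advantage = reward - baseline                 # baseline for variance reduction
    return -(advantage * token_log_probs.sum())

# Toy usage: random numbers stand in for real generator/evaluator outputs.
log_probs = torch.rand(12, requires_grad=True).log()   # 12 sampled tokens
reward = combined_reward(style_score=0.8, semantic_score=0.6, fluency_score=0.7)
loss = reinforce_loss(log_probs, reward, baseline=0.5)
loss.backward()   # in a real loop, an optimizer step on the generator would follow

In a full training loop, the style score would come from the adversarially trained discriminator and the semantic and fluency scores from the corresponding constraints, with the generator's attention-based encoder-decoder providing the token log-probabilities.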