Rephrasing the Reference for Non-autoregressive Machine Translation

Cited by: 0
Authors
Shao, Chenze [1 ,2 ]
Zhang, Jinchao [3 ]
Zhou, Jie [3 ]
Feng, Yang [1 ,2 ,3 ]
Affiliations
[1] Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
[3] Tencent Inc, Pattern Recognit Ctr, WeChat AI, Shenzhen, Peoples R China
Keywords
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Non-autoregressive neural machine translation (NAT) models suffer from the multi-modality problem: a source sentence may have multiple valid translations, so the reference sentence can be an inappropriate training target when the NAT output is closer to another valid translation. In response, we introduce a rephraser that provides a better training target for NAT by rephrasing the reference sentence according to the NAT output. Since NAT is trained on the rephraser output rather than the reference sentence, the rephraser output should fit the NAT output well while not deviating too far from the reference; both requirements can be quantified as reward functions and optimized by reinforcement learning. Experiments on major WMT benchmarks and NAT baselines show that our approach consistently improves the translation quality of NAT. Specifically, our best variant achieves performance comparable to the autoregressive Transformer while being 14.7 times more efficient in inference.
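The sketch below gives one plausible, minimal reading of that reward: the rephrased target is scored for how well it fits the NAT output and for how little it deviates from the reference, and the two scores are combined into a single scalar that reinforcement learning could maximize. The n-gram overlap measure, the weight alpha, and all names here are illustrative assumptions, not the paper's actual reward functions.

```python
# A minimal, hypothetical sketch of the reward idea in the abstract.
# The overlap measure, the weight `alpha`, and every name below are
# illustrative assumptions, not the paper's exact formulation.
from collections import Counter


def ngram_overlap(hyp, ref, max_n=4):
    """Average modified n-gram precision of `hyp` against `ref` (token lists),
    a crude stand-in for a sentence-level BLEU-style reward."""
    precisions = []
    for n in range(1, max_n + 1):
        hyp_ngrams = Counter(tuple(hyp[i:i + n]) for i in range(len(hyp) - n + 1))
        ref_ngrams = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
        total = sum(hyp_ngrams.values())
        if total == 0:
            continue
        matched = sum(min(count, ref_ngrams[gram]) for gram, count in hyp_ngrams.items())
        precisions.append(matched / total)
    return sum(precisions) / len(precisions) if precisions else 0.0


def rephraser_reward(rephrased, nat_output, reference, alpha=0.7):
    """Combine the two requirements on the rephraser output: fit the NAT
    output (first term) and stay close to the reference (second term).
    `alpha` is an assumed weight controlling the trade-off."""
    fit = ngram_overlap(rephrased, nat_output)
    faithfulness = ngram_overlap(rephrased, reference)
    return alpha * fit + (1 - alpha) * faithfulness


reference = "a cat is on the mat".split()
nat_output = "there is a cat on a mat".split()   # NAT reordered the sentence and made a small error
rephrased = "there is a cat on the mat".split()  # reference content in the NAT output's word order

print(rephraser_reward(rephrased, nat_output, reference))  # ~0.58
print(rephraser_reward(reference, nat_output, reference))  # ~0.48: the raw reference fits NAT worse
```

With this assumed weighting toward fit, the reordered target outscores the raw reference as a training target, which is the property a policy-gradient method such as REINFORCE would then push the rephraser to maximize.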
Pages: 13538 - 13546
Page count: 9
Related Papers
50 records in total
  • [1] Integrating Translation Memories into Non-Autoregressive Machine Translation
    Xu, Jitao
    Crego, Josep
    Yvon, Francois
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 1326 - 1338
  • [2] Enhanced encoder for non-autoregressive machine translation
    Wang, Shuheng
    Shi, Shumin
    Huang, Heyan
    MACHINE TRANSLATION, 2021, 35 (04) : 595 - 609
  • [3] Directed Acyclic Transformer for Non-Autoregressive Machine Translation
    Huang, Fei
    Zhou, Hao
    Liu, Yang
    Li, Hang
    Huang, Minlie
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [4] Non-Autoregressive Machine Translation with Auxiliary Regularization
    Wang, Yiren
    Tian, Fei
    He, Di
    Qin, Tao
    Zhai, ChengXiang
    Liu, Tie-Yan
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 5377 - 5384
  • [5] A Survey of Non-Autoregressive Neural Machine Translation
    Li, Feng
    Chen, Jingxian
    Zhang, Xuejun
    ELECTRONICS, 2023, 12 (13)
  • [6] Non-Autoregressive Machine Translation as Constrained HMM
    Li, Haoran
    Jie, Zhanming
    Lu, Wei
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: ACL 2024, 2024, : 12361 - 12372
  • [7] Non-Autoregressive Machine Translation with Latent Alignments
    Saharia, Chitwan
    Chan, William
    Saxena, Saurabh
    Norouzi, Mohammad
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 1098 - 1108
  • [8] Modeling Coverage for Non-Autoregressive Neural Machine Translation
    Shan, Yong
    Feng, Yang
    Shao, Chenze
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [9] Incorporating history and future into non-autoregressive machine translation
    Wang, Shuheng
    Huang, Heyan
    Shi, Shumin
    COMPUTER SPEECH AND LANGUAGE, 2022, 77
  • [10] Non-Autoregressive Machine Translation: It's Not as Fast as it Seems
    Helcl, Jindrich
    Haddow, Barry
    Birch, Alexandra
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 1780 - 1790