A Simple and Efficient Multi-Task Learning Approach for Conditioned Dialogue Generation

Cited: 0
|
Authors
Zeng, Yan [1 ]
Nie, Jian-Yun [1 ]
Affiliations
[1] Univ Montreal, DIRO, Montreal, PQ, Canada
Keywords
DOI
(not available)
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Conditioned dialogue generation suffers from the scarcity of labeled responses. In this work, we exploit labeled non-dialogue text data related to the condition, which are much easier to collect. We propose a multi-task learning approach to leverage both labeled dialogue and text data. Three tasks jointly optimize the same pre-trained Transformer: conditioned dialogue generation on the labeled dialogue data, and conditioned language encoding and conditioned language generation on the labeled text data. Experimental results show that our approach outperforms state-of-the-art models by leveraging the labeled texts, and that it obtains a larger performance improvement than previous methods for leveraging text data.
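The abstract describes three tasks jointly optimizing one shared pre-trained Transformer. As a toy illustration only (not the paper's code), the sketch below replaces each task loss with a quadratic placeholder over a single shared parameter and minimizes their sum by gradient descent; all function names and loss shapes here are hypothetical stand-ins:

```python
# Minimal sketch of a joint multi-task objective: three tasks share the same
# parameter(s), and each step minimizes the sum of the per-task losses.
# The quadratic "losses" are placeholders for the real task losses
# (conditioned dialogue generation, language encoding, language generation).

def task_losses(theta):
    dialogue_gen = (theta - 1.0) ** 2   # stand-in: labeled dialogue data
    lang_encode  = (theta - 2.0) ** 2   # stand-in: labeled text data (encoding)
    lang_gen     = (theta - 3.0) ** 2   # stand-in: labeled text data (generation)
    return dialogue_gen, lang_encode, lang_gen

def joint_loss(theta):
    # Unweighted sum of the three task losses (equal task weighting assumed).
    return sum(task_losses(theta))

def grad(theta, eps=1e-5):
    # Central-difference numerical gradient of the joint objective.
    return (joint_loss(theta + eps) - joint_loss(theta - eps)) / (2 * eps)

def train(theta=0.0, lr=0.05, steps=200):
    # Plain gradient descent on the shared parameter.
    for _ in range(steps):
        theta -= lr * grad(theta)
    return theta

if __name__ == "__main__":
    theta = train()
    print(round(theta, 2))  # converges to the joint optimum at 2.0
```

The point of the toy is only that all tasks update the *same* parameters, so the solution balances the per-task optima; in the paper this shared model is the pre-trained Transformer.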
Pages: 4927-4939
Page count: 13
Related papers
50 records
  • [1] A Simple Approach to Balance Task Loss in Multi-Task Learning
    Liang, Sicong
    Deng, Chang
    Zhang, Yu
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 812 - 823
  • [2] Multi-task Learning for Natural Language Generation in Task-Oriented Dialogue
    Zhu, Chenguang
    Zeng, Michael
    Huang, Xuedong
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 1261 - 1266
  • [3] Multi-Task Learning of Generation and Classification for Emotion-Aware Dialogue Response Generation
    Ide, Tatsuya
    Kawahara, Daisuke
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 119 - 125
  • [4] Incorporating Background Knowledge into Dialogue Generation Using Multi-task Transformer Learning
    Yuan, Yiming
    Cai, Xiantao
    PROCEEDINGS OF THE 2021 IEEE 24TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN (CSCWD), 2021, : 1046 - 1051
  • [5] A Simple, Effective and Extendible Approach to Deep Multi-task Learning
    Gao, Yang
    Li, Yi-Fan
    Lin, Yu
    Tao, Hemeng
    Khan, Latifur
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 924 - 929
  • [6] Simple, Efficient and Convenient Decentralized Multi-task Learning for Neural Networks
    Pilet, Amaury Bouchra
    Frey, Davide
    Taiani, Francois
    ADVANCES IN INTELLIGENT DATA ANALYSIS XIX, IDA 2021, 2021, 12695 : 37 - 49
  • [7] A multi-task learning approach for improving travel recommendation with keywords generation
    Chen, Lei
    Cao, Jie
    Zhu, Guixiang
    Wang, Youquan
    Liang, Weichao
    KNOWLEDGE-BASED SYSTEMS, 2021, 233
  • [8] An efficient active learning method for multi-task learning
    Xiao, Yanshan
    Chang, Zheng
    Liu, Bo
    KNOWLEDGE-BASED SYSTEMS, 2020, 190
  • [9] Multi-Task Learning with Sequence-Conditioned Transporter Networks
    Lim, Michael H.
    Zeng, Andy
    Ichter, Brian
    Bandari, Maryam
    Coumans, Erwin
    Tomlin, Claire
    Schaal, Stefan
    Faust, Aleksandra
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2022), 2022, : 2489 - 2496
  • [10] Chinese Dialogue Analysis Using Multi-Task Learning Framework
    Zhang, Xuejing
    Lv, Xueqiang
    Zhou, Qiang
    2018 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2018, : 102 - 107