Knowledge-Grounded Dialogue Generation with Pre-trained Language Models

Cited by: 0
Authors
Zhao, Xueliang [1 ,2 ]
Wu, Wei [3 ]
Xu, Can [4 ]
Tao, Chongyang [4 ]
Zhao, Dongyan [1 ,2 ]
Yan, Rui [1 ,2 ,5 ]
Affiliations
[1] Peking Univ, Wangxuan Inst Comp Technol, Beijing, Peoples R China
[2] Peking Univ, Ctr Data Sci, AAIS, Beijing, Peoples R China
[3] Meituan, Beijing, Peoples R China
[4] Microsoft Corp, Beijing, Peoples R China
[5] Beijing Acad Artificial Intelligence BAAI, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP) | 2020
Funding
U.S. National Science Foundation
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We study knowledge-grounded dialogue generation with pre-trained language models. Because external knowledge is often redundant and such models operate under an input-capacity constraint, we propose equipping a response generator defined by a pre-trained language model with a knowledge selection module, together with an unsupervised approach that jointly optimizes knowledge selection and response generation on unlabeled dialogues. Empirical results on two benchmarks indicate that our model significantly outperforms state-of-the-art methods in both automatic evaluation and human judgment.
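The abstract names two components, a pre-trained response generator and a knowledge selection module trained jointly with it, without showing how they fit together. The sketch below is a minimal, hypothetical illustration of the selection step only, not the authors' implementation: the toy Encoder (an embedding with mean pooling), the KnowledgeSelector, the dot-product scorer, and the sizes VOCAB/DIM are all assumed stand-ins; in the paper the encoder role is played by a pre-trained language model, and selection is optimized jointly with generation on unlabeled dialogues.

```python
# Minimal sketch (assumptions, not the authors' code): score candidate
# knowledge passages against the dialogue context and pick one to feed,
# together with the context, into a response generator.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM = 1000, 64  # toy sizes; a real pre-trained LM fixes these


class Encoder(nn.Module):
    """Stand-in for a pre-trained encoder: embed tokens, mean-pool."""

    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)

    def forward(self, ids):                # ids: (batch, seq_len)
        return self.emb(ids).mean(dim=1)   # (batch, DIM)


class KnowledgeSelector(nn.Module):
    """Rank knowledge candidates by relevance to the dialogue context."""

    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder

    def forward(self, context_ids, knowledge_ids):
        c = self.encoder(context_ids)      # (1, DIM)
        k = self.encoder(knowledge_ids)    # (n_candidates, DIM)
        scores = k @ c.squeeze(0)          # (n_candidates,) dot products
        return F.softmax(scores, dim=0)    # selection distribution


if __name__ == "__main__":
    selector = KnowledgeSelector(Encoder())
    context = torch.randint(0, VOCAB, (1, 12))     # dialogue context tokens
    candidates = torch.randint(0, VOCAB, (5, 20))  # 5 knowledge snippets
    probs = selector(context, candidates)
    chosen = candidates[probs.argmax()]  # prepend to context for generation
    print("selection distribution:", probs.tolist())
```

Selecting a single passage rather than concatenating all candidates is what keeps the input within the length limit of a pre-trained language model; the unsupervised joint training described in the abstract supplies the learning signal for the selector without knowledge labels.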
Pages: 3377-3390
Number of pages: 14
Related Papers
50 records in total
  • [41] Bridging the Gap between Prior and Posterior Knowledge Selection for Knowledge-Grounded Dialogue Generation. Chen, Xiuyi; Meng, Fandong; Li, Peng; Chen, Feilong; Xu, Shuang; Xu, Bo; Zhou, Jie. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020: 3426-3437.
  • [42] Enhancing radiology report generation through pre-trained language models. Leonardi, Giorgio; Portinale, Luigi; Santomauro, Andrea. PROGRESS IN ARTIFICIAL INTELLIGENCE, 2024.
  • [43] Non-Autoregressive Text Generation with Pre-trained Language Models. Su, Yixuan; Cai, Deng; Wang, Yan; Vandyke, David; Baker, Simon; Li, Piji; Collier, Nigel. 16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2021), 2021: 234-243.
  • [44] GLM-Dialog: Noise-tolerant Pre-training for Knowledge-grounded Dialogue Generation. Zhang, Jing; Zhang, Xiaokang; Zhang-Li, Daniel; Yu, Jifan; Yao, Zijun; Ma, Zeyao; Xu, Yiqi; Wang, Haohua; Zhang, Xiaohan; Lin, Nianyi; Lu, Sunrui; Li, Juanzi; Tang, Jie. PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023: 5564-5575.
  • [45] Knowledge-Grounded Dialogue Generation with Term-level De-noising. Zheng, Wen; Milic-Frayling, Natasa; Zhou, Ke. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021: 2972-2983.
  • [46] Towards Fewer Hallucinations in Knowledge-Grounded Dialogue Generation via Augmentative and Contrastive Knowledge-Dialogue. Sun, Bin; Li, Yitong; Mi, Fei; Bie, FanHu; Li, Yiwei; Li, Kan. 61ST CONFERENCE OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 2, 2023: 1741-1750.
  • [47] Annotating Columns with Pre-trained Language Models. Suhara, Yoshihiko; Li, Jinfeng; Li, Yuliang; Zhang, Dan; Demiralp, Cagatay; Chen, Chen; Tan, Wang-Chiew. PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA (SIGMOD '22), 2022: 1493-1503.
  • [48] LaoPLM: Pre-trained Language Models for Lao. Lin, Nankai; Fu, Yingwen; Yang, Ziyu; Chen, Chuwei; Jiang, Shengyi. LREC 2022: THIRTEENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022: 6506-6512.
  • [49] HinPLMs: Pre-trained Language Models for Hindi. Huang, Xixuan; Lin, Nankai; Li, Kexin; Wang, Lianxi; Gan, Suifu. 2021 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2021: 241-246.
  • [50] Evaluating Commonsense in Pre-Trained Language Models. Zhou, Xuhui; Zhang, Yue; Cui, Leyang; Huang, Dandan. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34: 9733-9740.