Grounding Dialogue Systems via Knowledge Graph Aware Decoding with Pre-trained Transformers

Cited by: 3
Authors
Chaudhuri, Debanjan [2]
Rony, Md Rashad Al Hasan [1]
Lehmann, Jens [1,2]
Affiliations
[1] Fraunhofer IAIS, Dresden, Germany
[2] Univ Bonn, Smart Data Analyt Grp, Bonn, Germany
Source
SEMANTIC WEB, ESWC 2021 | 2021 / Vol. 12731
Keywords
Knowledge graph; Dialogue system; Graph encoding; Knowledge integration;
DOI
10.1007/978-3-030-77385-4_19
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Generating knowledge grounded responses in both goal and non-goal oriented dialogue systems is an important research challenge. Knowledge Graphs (KGs) can be viewed as an abstraction of the real world, which can potentially facilitate a dialogue system in producing knowledge grounded responses. However, integrating KGs into the dialogue generation process in an end-to-end manner is a non-trivial task. This paper proposes a novel architecture for integrating KGs into the response generation process by training a BERT model that learns to answer using the elements of the KG (entities and relations) in a multi-task, end-to-end setting. The k-hop subgraph of the KG is incorporated into the model during training and inference using the Graph Laplacian. Empirical evaluation suggests that the model achieves better knowledge groundedness (measured via Entity F1 score) compared to other state-of-the-art models for both goal and non-goal oriented dialogues.
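The abstract's core mechanism, incorporating a k-hop subgraph via the Graph Laplacian, can be illustrated with a minimal sketch. This is an assumption-laden toy example (hand-built adjacency matrix, standard Laplacian definitions); the paper's actual subgraph extraction and its integration into the BERT model are not reproduced here:

```python
import numpy as np

# Toy adjacency matrix of a small (e.g. 1-hop) entity subgraph.
# Nodes 0-3 stand in for KG entities; edges for relations.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
], dtype=float)

degrees = A.sum(axis=1)
D = np.diag(degrees)

# Unnormalized graph Laplacian: L = D - A
L = D - A

# Symmetric normalized Laplacian: L_sym = I - D^{-1/2} A D^{-1/2}
D_inv_sqrt = np.diag(1.0 / np.sqrt(degrees))
L_sym = np.eye(len(A)) - D_inv_sqrt @ A @ D_inv_sqrt

print(L)       # rows sum to zero by construction
print(L_sym)   # eigenvalues lie in [0, 2]
```

Either Laplacian can serve as a structural prior over the subgraph (e.g. to propagate or smooth entity representations); which variant and how it enters the model are design choices specific to the paper.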
Pages: 323-339
Page count: 17
Related Papers
38 records total
  • [1] Interpretable Biomedical Reasoning via Deep Fusion of Knowledge Graph and Pre-trained Language Models
    Xu Y.
    Yang Z.
    Lin Y.
    Hu J.
    Dong S.
    Beijing Daxue Xuebao (Ziran Kexue Ban)/Acta Scientiarum Naturalium Universitatis Pekinensis, 2024, 60(01): 62-70
  • [2] Billion-scale pre-trained knowledge graph model for conversational chatbot
    Wong, Chi-Man
    Feng, Fan
    Zhang, Wen
    Chen, Huajun
    Vong, Chi-Man
    Chen, Chuangquan
    NEUROCOMPUTING, 2024, 606
  • [3] Assisted Process Knowledge Graph Building Using Pre-trained Language Models
    Bellan, Patrizio
    Dragoni, Mauro
    Ghidini, Chiara
    AIXIA 2022 - ADVANCES IN ARTIFICIAL INTELLIGENCE, 2023, 13796: 60-74
  • [4] NMT Enhancement based on Knowledge Graph Mining with Pre-trained Language Model
    Yang, Hao
    Qin, Ying
    Deng, Yao
    Wang, Minghan
    2020 22ND INTERNATIONAL CONFERENCE ON ADVANCED COMMUNICATION TECHNOLOGY (ICACT): DIGITAL SECURITY GLOBAL AGENDA FOR SAFE SOCIETY!, 2020: 185-189
  • [5] A Pre-trained Universal Knowledge Graph Reasoning Model Based on Rule Prompts
    Cui, Yuanning
    Sun, Zequn
    Hu, Wei
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2024, 61(08): 2030-2044
  • [6] KG-prompt: Interpretable knowledge graph prompt for pre-trained language models
    Chen, Liyi
    Liu, Jie
    Duan, Yutai
    Wang, Runze
    KNOWLEDGE-BASED SYSTEMS, 2025, 311
  • [7] Construction and application of knowledge graph for grid dispatch fault handling based on pre-trained model
    Ji, Zhixiang
    Wang, Xiaohui
    Zhang, Jie
    Wu, Di
    GLOBAL ENERGY INTERCONNECTION-CHINA, 2023, 6(04): 493-504
  • [8] Billion-scale Pre-trained E-commerce Product Knowledge Graph Model
    Zhang, Wen
    Wong, Chi-Man
    Ye, Ganqiang
    Wen, Bo
    Zhang, Wei
    Chen, Huajun
    2021 IEEE 37TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2021), 2021: 2476-2487
  • [9] VEG-MMKG: Multimodal knowledge graph construction for vegetables based on pre-trained model extraction
    Lv, Bowen
    Wu, Huarui
    Chen, Wenbai
    Chen, Cheng
    Miao, Yisheng
    Zhao, Chunjiang
    COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2024, 226
  • [10] XDAI: A Tuning-free Framework for Exploiting Pre-trained Language Models in Knowledge Grounded Dialogue Generation
    Yu, Jifan
    Zhang, Xiaohan
    Xu, Yifan
    Lei, Xuanyu
    Guan, Xinyu
    Zhang, Jing
    Hou, Lei
    Li, Juanzi
    Tang, Jie
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022: 4422-4432