Enhancing text generation from knowledge graphs with cross-structure attention distillation

Cited by: 3
Authors
Shi, Xiayang [1 ]
Xia, Zhenlin [1 ]
Cheng, Pei [1 ]
Li, Yinlin [2 ]
Affiliations
[1] Zhengzhou Univ Light Ind, Zhengzhou 45000, Henan, Peoples R China
[2] Chinese Acad Sci, Inst Automat, State Key Lab Multimodal Artificial Intelligence S, Beijing 100190, Peoples R China
Keywords
KG-to-text; Pre-trained models; Knowledge distillation
DOI
10.1016/j.engappai.2024.108971
CLC classification number
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
Existing large-scale pre-trained language models (PLMs) can effectively enhance knowledge-graph-to-text (KG-to-text) generation by processing a linearized version of the graph. However, recent work ignores the interaction between the linear representation of the knowledge graph and the generated text. To address this problem, we propose a distillation model that uses a cross-structure attention mechanism to generate text from knowledge graphs. This mechanism captures rich contextual semantic representations between the linearized structured data and the text in the teacher model. We then train a student model to mimic the teacher's output logits through online distillation. Experimental results show that our distillation model outperforms many pre-trained natural language generation (NLG) models on several KG-to-text datasets.
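The logit-matching step the abstract describes can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the temperature value, and the use of a Hinton-style temperature-softened KL divergence between teacher and student logits are assumptions for illustration only.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of raw logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradient magnitude is comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In online distillation this loss would be computed per decoding step and added to the student's usual cross-entropy objective, with teacher and student trained (or at least run) in the same loop rather than in two separate phases.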
Pages: 11