Improving PLMs for Graph-to-Text Generation by Relational Orientation Attention

Cited by: 1
Authors
Wang, Tao [1 ,2 ]
Shen, Bo [1 ,2 ]
Zhang, Jinglin [1 ,2 ]
Zhong, Yu [1 ,2 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Elect & Informat Engn, Beijing 100044, Peoples R China
[2] Beijing Municipal Commiss Educ, Key Lab Commun & Informat Syst, Beijing 100044, Peoples R China
Keywords
Graph-to-text generation; Pretrained language model; Relational orientation attention; Pretraining task
DOI
10.1007/s11063-023-11292-3
CLC classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Pretrained language models (PLMs) have recently been employed for graph-to-text generation with impressive performance. However, when PLMs convert graph data into sequence data, the linearized graphs lose triplet structure information and carry insufficient syntactic information. These defects prevent PLMs from absorbing all the information that knowledge graphs hold and from exerting their full potential in graph-to-text generation. To address these issues, we propose two targeted solutions. First, a relational orientation attention (ROA) module is proposed to incorporate triplet structure information into knowledge graph representations. During graph encoding, ROA establishes structural associations between entities and relations by fusing relevant relation information into entity representations. Second, (knowledge subgraph, text) pairs are used to pretrain PLMs on text and triplet reconstruction tasks. Pretraining on linearized graph data enables PLMs to transfer learning more seamlessly between graphs and texts. Experiments on the WebNLG, WebQuestions, and PathQuestions datasets demonstrate that relational orientation attention and the pretraining tasks (text and triplet reconstruction) capture triplet structure information and boost the learning ability of PLMs on structured data. Additional analysis reveals that PLMs equipped with the proposed approaches have superior few-shot learning capability.
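The abstract does not give the exact formulation of the ROA module, but the core idea it describes (fusing the representations of an entity's incident relations into the entity representation during graph encoding) can be sketched with ordinary scaled dot-product attention. The function name, the projection matrices, and the residual fusion below are all illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def relational_orientation_attention(entity, relations, w_q, w_k, w_v):
    """Hypothetical ROA-style fusion: attend from one entity embedding
    over the embeddings of its incident relations, then fold the
    relation summary back into the entity representation.

    entity:    (d,)   embedding of a single entity node
    relations: (m, d) embeddings of the m relations incident to it
    w_q, w_k, w_v: (d, d) learned projections (random here)
    """
    d = entity.shape[0]
    q = entity @ w_q                 # query from the entity, (d,)
    k = relations @ w_k              # keys from relations, (m, d)
    v = relations @ w_v              # values from relations, (m, d)
    scores = k @ q / np.sqrt(d)      # attention logits, (m,)
    alpha = softmax(scores)          # weights over relations, (m,)
    fused = alpha @ v                # relation-aware summary, (d,)
    return entity + fused            # residual fusion into the entity

rng = np.random.default_rng(0)
d, m = 16, 3
entity = rng.normal(size=d)
relations = rng.normal(size=(m, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
out = relational_orientation_attention(entity, relations, w_q, w_k, w_v)
print(out.shape)  # (16,)
```

In a full encoder this fusion would run per entity token over the relations of the triplets it participates in, so the entity representation carries triplet structure even after the graph is linearized for the PLM.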
Pages: 7967-7983
Page count: 17
Related Papers
50 records in total
  • [1] Improving PLMs for Graph-to-Text Generation by Relational Orientation Attention
    Tao Wang
    Bo Shen
    Jinglin Zhang
    Yu Zhong
    Neural Processing Letters, 2023, 55 : 7967 - 7983
  • [2] Graph-to-Text Generation with Bidirectional Dual Cross-Attention and Concatenation
    Jimale, Elias Lemuye
    Chen, Wenyu
    Al-antari, Mugahed A.
    Gu, Yeong Hyeon
    Agbesi, Victor Kwaku
    Feroze, Wasif
    Akmel, Feidu
    Assefa, Juhar Mohammed
    Shahzad, Ali
    MATHEMATICS, 2025, 13 (06)
  • [3] Promoting Graph Awareness in Linearized Graph-to-Text Generation
    Hoyle, Alexander
    Marasovic, Ana
    Smith, Noah A.
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 944 - 956
  • [4] Topic Controlled Steganography via Graph-to-Text Generation
    Sun, Bowen
    Li, Yamin
    Zhang, Jun
    Xu, Honghong
    Ma, Xiaoqiang
    Xia, Ping
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2023, 136 (01): : 157 - 176
  • [5] FactSpotter: Evaluating the Factual Faithfulness of Graph-to-Text Generation
    Zhang, Kun
    Balalau, Oana
    Manolescu, Ioana
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 10025 - 10042
  • [6] Stage-wise Fine-tuning for Graph-to-Text Generation
    Wang, Qingyun
    Yavuz, Semih
    Lin, Xi Victoria
    Ji, Heng
    Rajani, Nazneen Fatema
    ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING: PROCEEDINGS OF THE STUDENT RESEARCH WORKSHOP, 2021, : 16 - 22
  • [7] Few-shot Knowledge Graph-to-Text Generation with Pretrained Language Models
    Li, Junyi
    Tang, Tianyi
    Zhao, Wayne Xin
    Wei, Zhicheng
    Yuan, Nicholas Jing
    Wen, Ji-Rong
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 1558 - 1568
  • [8] Fusing graph structural information with pre-trained generative model for knowledge graph-to-text generation
    Shi, Xiayang
    Xia, Zhenlin
    Li, Yinlin
    Wang, Xuhui
    Niu, Yufeng
    KNOWLEDGE AND INFORMATION SYSTEMS, 2025, 67 (03) : 2619 - 2640
  • [9] A Comparative Study of Knowledge Graph-to-Text Generation Architectures in the Context of Conversational Agents
    Ghanem, Hussam
    Cruz, Christophe
    COMPLEX NETWORKS & THEIR APPLICATIONS XII, VOL 1, COMPLEX NETWORKS 2023, 2024, 1141 : 413 - 426
  • [10] Structure-aware Knowledge Graph-to-Text Generation with Planning Selection and Similarity Distinction
    Zhao, Feng
    Zou, Hongzhi
    Yan, Cheng
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023, : 8693 - 8703