Improving PLMs for Graph-to-Text Generation by Relational Orientation Attention

Cited by: 1
Authors
Wang, Tao [1 ,2 ]
Shen, Bo [1 ,2 ]
Zhang, Jinglin [1 ,2 ]
Zhong, Yu [1 ,2 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Elect & Informat Engn, Beijing 100044, Peoples R China
[2] Beijing Municipal Commiss Educ, Key Lab Commun & Informat Syst, Beijing 100044, Peoples R China
Keywords
Graph-to-text generation; Pretrained language model; Relational orientation attention; Pretraining task
DOI
10.1007/s11063-023-11292-3
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Pretrained language models (PLMs) have recently been applied to graph-to-text generation with impressive performance. However, when PLMs convert graph data into sequence data, the linearized graphs lose triplet structure information and carry insufficient syntactic information. These defects prevent PLMs from absorbing all the information that knowledge graphs hold and from reaching their full potential in graph-to-text generation. To address these issues, we propose two targeted solutions. First, a relational orientation attention (ROA) module is proposed to incorporate triplet structure information into knowledge graph representations. During graph encoding, ROA establishes structural associations between entities and relations by fusing relevant relation information into entity representations. Second, (knowledge subgraph, text) pairs are used to pretrain PLMs on text and triplet reconstruction tasks. Pretraining on linearized graph data enables PLMs to transfer learning more seamlessly between graphs and texts. Experiments on the WebNLG, WebQuestions, and PathQuestions datasets demonstrate that relational orientation attention and the two pretraining tasks (text and triplet reconstruction) capture triplet structure information and improve the ability of PLMs to learn from structured data. Further analysis reveals that PLMs equipped with the proposed approaches also have superior few-shot learning capability.
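The abstract describes the ROA mechanism only at a high level. The following is a minimal PyTorch sketch of the core idea it states, fusing attention-weighted relation information into the representations of the entities each relation is incident to; the class name, tensor shapes, masking scheme, and residual fusion step are illustrative assumptions, not the authors' implementation.

# Minimal sketch of a relational-orientation-attention-style module.
# Assumes PyTorch; all names and shapes are illustrative, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationalOrientationAttention(nn.Module):
    """For each entity, attend over its incident relations and fuse the
    attention-weighted relation summary back into the entity embedding."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)
        self.scale = hidden_dim ** 0.5

    def forward(self, entities, relations, adjacency):
        # entities:  (E, H) entity embeddings
        # relations: (R, H) relation embeddings
        # adjacency: (E, R) 1 where a relation is incident to an entity, else 0
        q = self.query(entities)                     # (E, H)
        k = self.key(relations)                      # (R, H)
        v = self.value(relations)                    # (R, H)
        scores = q @ k.transpose(0, 1) / self.scale  # (E, R)
        # Restrict each entity's attention to its incident relations.
        scores = scores.masked_fill(adjacency == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)
        # Entities with no incident relation yield NaN rows; zero them out.
        weights = torch.nan_to_num(weights)
        fused = weights @ v                          # (E, H) relation summary
        return entities + fused                      # residual fusion

# Toy usage: 3 entities, 2 relations, e.g. (e0, r0, e1) and (e1, r1, e2).
roa = RelationalOrientationAttention(hidden_dim=8)
entities = torch.randn(3, 8)
relations = torch.randn(2, 8)
adjacency = torch.tensor([[1, 0], [1, 1], [0, 1]])
print(roa(entities, relations, adjacency).shape)  # torch.Size([3, 8])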
Pages: 7967-7983
Page count: 17