Improving PLMs for Graph-to-Text Generation by Relational Orientation Attention

Cited: 1
Authors
Wang, Tao [1,2]
Shen, Bo [1,2]
Zhang, Jinglin [1,2]
Zhong, Yu [1,2]
Affiliations
[1] Beijing Jiaotong Univ, Sch Elect & Informat Engn, Beijing 100044, Peoples R China
[2] Beijing Municipal Commiss Educ, Key Lab Commun & Informat Syst, Beijing 100044, Peoples R China
Keywords
Graph-to-text generation; Pretrained language model; Relational orientation attention; Pretraining task
DOI
10.1007/s11063-023-11292-3
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Pretrained language models (PLMs) have recently been applied to graph-to-text generation with impressive performance. However, when PLMs consume graph data as linearized sequences, triplet structure information is lost and the syntactic information available to the model is insufficient. These defects prevent PLMs from absorbing all the information that knowledge graphs hold and from exerting their full potential in graph-to-text generation. To address these issues, we propose two targeted solutions. First, a relational orientation attention (ROA) module incorporates triplet structure information into knowledge graph representations: during graph encoding, ROA establishes structural associations between entities and relations by fusing relevant relation information into entity representations. Second, (knowledge subgraph, text) pairs are used to pretrain PLMs on text and triplet reconstruction tasks; pretraining on linearized graph data enables PLMs to transfer learning more seamlessly between graphs and texts. Experiments on the WebNLG, WebQuestions, and PathQuestions datasets demonstrate that relational orientation attention and the pretraining tasks (text and triplet reconstruction) capture triplet structure information and boost the ability of PLMs to learn from structured data. Additional analysis reveals that PLMs equipped with the designed approaches have superior few-shot learning capability.
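The abstract does not spell out ROA's exact formulation, so the following is only a minimal sketch of one plausible reading: each entity attends over the relations incident to it in the knowledge subgraph, and the attended relation context is fused back into the entity representation. All names here (RelationalOrientationAttention, incidence_mask, d_model) are illustrative assumptions, not the paper's API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationalOrientationAttention(nn.Module):
    """Illustrative ROA-style fusion: entities attend only over their
    incident relations; the attended relation context is concatenated
    with the entity vector and projected back to d_model."""

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)    # queries from entities
        self.k_proj = nn.Linear(d_model, d_model)    # keys from relations
        self.v_proj = nn.Linear(d_model, d_model)    # values from relations
        self.fuse = nn.Linear(2 * d_model, d_model)  # [entity ; relation context]
        self.scale = d_model ** -0.5

    def forward(self, entities, relations, incidence_mask):
        # entities:       (E, d) entity representations
        # relations:      (R, d) relation representations
        # incidence_mask: (E, R) bool, True where relation r touches entity e
        q = self.q_proj(entities)
        k = self.k_proj(relations)
        v = self.v_proj(relations)
        scores = (q @ k.transpose(0, 1)) * self.scale      # (E, R)
        scores = scores.masked_fill(~incidence_mask, float("-inf"))
        attn = F.softmax(scores, dim=-1)                   # attend over incident relations only
        attn = torch.nan_to_num(attn)                      # entities with no incident relation
        rel_context = attn @ v                             # (E, d)
        # fuse relation context into the entity representation
        return self.fuse(torch.cat([entities, rel_context], dim=-1))


# Toy usage: 3 entities, 2 relations from a small subgraph
roa = RelationalOrientationAttention(d_model=16)
ents = torch.randn(3, 16)
rels = torch.randn(2, 16)
mask = torch.tensor([[True, False],
                     [True, True],
                     [False, True]])
fused = roa(ents, rels, mask)  # (3, 16), structure-aware entity vectors
```

Likewise, the text and triplet reconstruction pretraining presupposes some linearization of the subgraph. One plausible scheme is sketched below; the <H>/<R>/<T> markers are assumptions, not the paper's actual vocabulary.

```python
def linearize_triples(triples):
    """Flatten [(head, relation, tail), ...] into one input string."""
    return " ".join(f"<H> {h} <R> {r} <T> {t}" for h, r, t in triples)

print(linearize_triples([("Alan_Turing", "birthPlace", "London")]))
# -> <H> Alan_Turing <R> birthPlace <T> London
```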
Pages: 7967-7983
Page count: 17