Assisted Process Knowledge Graph Building Using Pre-trained Language Models

Cited by: 1
Authors
Bellan, Patrizio [1 ,2 ]
Dragoni, Mauro [1 ]
Ghidini, Chiara [1 ]
Affiliations
[1] Fdn Bruno Kessler, Trento, Italy
[2] Free Univ Bozen Bolzano, Bolzano, Italy
Source
AIXIA 2022 - ADVANCES IN ARTIFICIAL INTELLIGENCE, 2023, Vol. 13796
Keywords
Process extraction from text; In-context learning; Knowledge graph; Pre-trained language model; Business process management;
DOI
10.1007/978-3-031-27181-6_5
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
The automated construction of knowledge graphs from procedural documents is a challenging research area. The lack of annotated data, as well as of raw text repositories describing real-world procedural documents, makes it extremely difficult to adopt deep learning approaches. Pre-trained language models have shown promising results for extracting knowledge directly from the models themselves. Although several works have explored this strategy to build knowledge graphs, the viability of constructing knowledge bases from such language models with a prompt-based learning strategy has not yet been investigated in depth. In this work, we present a prompt-based in-context learning strategy that extracts, from natural language process descriptions, conceptual information that can be converted into the equivalent knowledge graphs. The strategy is carried out in a multi-turn dialog fashion. We validate the accuracy of the proposed approach from both quantitative and qualitative perspectives. The results highlight the feasibility of the proposed approach within low-resource scenarios.
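To make the multi-turn, prompt-based extraction strategy described in the abstract concrete, the following Python sketch shows one plausible shape of such a pipeline. It is not the authors' implementation: the ask_model callable, the question list, and the predicate names (hasActivity, performedBy, directlyFollows) are illustrative assumptions standing in for any chat-style language-model client and prompt set.

from typing import Callable, List, Tuple

Triple = Tuple[str, str, str]

def extract_process_graph(
    process_text: str,
    ask_model: Callable[[List[dict]], str],  # hypothetical chat-completion callable
) -> List[Triple]:
    """Run a fixed multi-turn dialog over a process description and collect triples."""
    history = [
        {"role": "system", "content": "You extract business-process elements from text."},
        {"role": "user", "content": "Process description:\n" + process_text},
    ]
    # Illustrative question/predicate pairs; the paper's actual prompts may differ.
    questions = [
        ("the activities in the process", "hasActivity"),
        ("the actor performing each activity", "performedBy"),
        ("which activity directly follows which", "directlyFollows"),
    ]
    triples: List[Triple] = []
    for topic, predicate in questions:
        history.append({"role": "user",
                        "content": f"List, one per line, {topic}. Use 'A -> B' for pairs."})
        answer = ask_model(history)                     # one dialog turn
        history.append({"role": "assistant", "content": answer})
        for line in filter(None, (l.strip() for l in answer.splitlines())):
            if "->" in line:                            # pair answers become head/tail triples
                head, _, tail = line.partition("->")
                triples.append((head.strip(), predicate, tail.strip()))
            else:                                       # single items attach to the process node
                triples.append(("process", predicate, line))
    return triples

# Example wiring with a canned stand-in for a real LLM client
# (it returns the same two lines every turn, just to show the data flow):
if __name__ == "__main__":
    canned = lambda history: "receive order -> check stock\ncheck stock -> ship goods"
    text = "An order is received, the stock is checked, and the goods are shipped."
    for t in extract_process_graph(text, canned):
        print(t)

Keeping the full dialog history in the history list is what makes the interaction in-context and multi-turn: each question conditions the model on the original process description and all earlier answers.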
Pages: 60-74
Number of Pages: 15
Related Papers
50 records in total
  • [31] Billion-scale pre-trained knowledge graph model for conversational chatbot
    Wong, Chi-Man
    Feng, Fan
    Zhang, Wen
    Chen, Huajun
    Vong, Chi-Man
    Chen, Chuangquan
    NEUROCOMPUTING, 2024, 606
  • [32] Improving Extraction of Chinese Open Relations Using Pre-trained Language Model and Knowledge Enhancement
    Wen, Chaojie
    Jia, Xudong
    Chen, Tao
    DATA INTELLIGENCE, 2023, 5 (04) : 962 - 989
  • [33] A Pre-trained Universal Knowledge Graph Reasoning Model Based on Rule Prompts
    Cui, Yuanning
    Sun, Zequn
    Hu, Wei
    JISUANJI YANJIU YU FAZHAN / COMPUTER RESEARCH AND DEVELOPMENT, 2024, 61 (08) : 2030 - 2044
  • [34] Entity Resolution Based on Pre-trained Language Models with Two Attentions
    Zhu, Liang
    Liu, Hao
    Song, Xin
    Wei, Yonggang
    Wang, Yu
    WEB AND BIG DATA, PT III, APWEB-WAIM 2023, 2024, 14333 : 433 - 448
  • [35] On the Effectiveness of Pre-Trained Language Models for Legal Natural Language Processing: An Empirical Study
    Song, Dezhao
    Gao, Sally
    He, Baosheng
    Schilder, Frank
    IEEE ACCESS, 2022, 10 : 75835 - 75858
  • [36] Neural Transfer Learning For Vietnamese Sentiment Analysis Using Pre-trained Contextual Language Models
    An Pha Le
    Tran Vu Pham
    Thanh-Van Le
    Huynh, Duy V.
    2021 IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLIED NETWORK TECHNOLOGIES (ICMLANT II), 2021 : 84 - 88
  • [37] Automatic Short Answer Grading to Pedagogical Questions Using Knowledge Graphs and Pre-trained Models
    Teranishi, D.
    Araki, M.
    TRANSACTIONS OF THE JAPANESE SOCIETY FOR ARTIFICIAL INTELLIGENCE, 2022, 37 (04)
  • [38] Explainable reasoning over temporal knowledge graphs by pre-trained language model
    Li, Qing
    Wu, Guanzhong
    INFORMATION PROCESSING & MANAGEMENT, 2025, 62 (01)
  • [39] Improving CTC-Based Speech Recognition via Knowledge Transferring from Pre-trained Language Models
    Deng, Keqi
    Cao, Songjun
    Zhang, Yike
    Ma, Long
    Cheng, Gaofeng
    Xu, Ji
    Zhang, Pengyuan
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022 : 8517 - 8521
  • [40] Grounding Dialogue Systems via Knowledge Graph Aware Decoding with Pre-trained Transformers
    Chaudhuri, Debanjan
    Rony, Md Rashad Al Hasan
    Lehmann, Jens
    SEMANTIC WEB, ESWC 2021, 2021, 12731 : 323 - 339