Assisted Process Knowledge Graph Building Using Pre-trained Language Models

Cited by: 1
Authors
Bellan, Patrizio [1 ,2 ]
Dragoni, Mauro [1 ]
Ghidini, Chiara [1 ]
Affiliations
[1] Fdn Bruno Kessler, Trento, Italy
[2] Free Univ Bozen Bolzano, Bolzano, Italy
Source
AIXIA 2022 - ADVANCES IN ARTIFICIAL INTELLIGENCE, 2023, Vol. 13796
Keywords
Process extraction from text; In-context learning; Knowledge graph; Pre-trained language model; Business process management
DOI
10.1007/978-3-031-27181-6_5
Chinese Library Classification
TP18 (Artificial Intelligence Theory)
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The automated construction of knowledge graphs from procedural documents is a challenging research area. The lack of annotated data, as well as of raw text repositories describing real-world procedural documents, makes it extremely difficult to adopt deep learning approaches. Pre-trained language models have shown promising results for extracting knowledge directly from the models themselves. Although several works have explored this strategy to build knowledge graphs, the viability of knowledge base construction through prompt-based learning over such language models has not yet been investigated in depth. In this work, we present a prompt-based in-context learning strategy that extracts, from natural language process descriptions, conceptual information that can be converted into equivalent knowledge graphs. The strategy is carried out in a multi-turn dialog fashion. We validate the accuracy of the proposed approach from both quantitative and qualitative perspectives. The results highlight the feasibility of the proposed approach in low-resource scenarios.
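To make the multi-turn dialog idea concrete, below is a minimal sketch (not the authors' exact pipeline or prompts) of how prompt-based extraction over a pre-trained language model could assemble a small knowledge graph from a process description. The `ask_model` function, its canned answers, and the relation names `is_a` and `performs` are illustrative assumptions; in a real system `ask_model` would forward the accumulated dialog to a language model completion endpoint.

```python
# A minimal sketch of multi-turn, prompt-based extraction of process
# elements into a knowledge graph. The model call is stubbed with
# canned answers so the script runs end to end.

from typing import Callable

PROCESS_TEXT = (
    "The customer submits an order. The sales office checks the order "
    "and forwards it to the warehouse, which ships the goods."
)

# Hypothetical stand-in for a language model's answers to each turn.
CANNED = {
    "activities": "submits an order; checks the order; forwards it; ships the goods",
    "participants": "customer; sales office; warehouse",
    "performs": "customer -> submits an order; sales office -> checks the order; "
                "sales office -> forwards it; warehouse -> ships the goods",
}

def ask_model(dialog: list[str], question_key: str) -> str:
    """Pretend to continue the dialog; a real system would send the
    concatenated dialog turns to a language model and return its completion."""
    return CANNED[question_key]

def build_graph(text: str, ask: Callable[[list[str], str], str]) -> set[tuple[str, str, str]]:
    dialog = [f"Process description: {text}"]
    triples: set[tuple[str, str, str]] = set()

    # Turn 1: elicit the activities mentioned in the description.
    dialog.append("Q: List the activities mentioned in the description.")
    activities = ask(dialog, "activities").split("; ")
    dialog.append("A: " + "; ".join(activities))
    triples |= {(a, "is_a", "Activity") for a in activities}

    # Turn 2: elicit the participants, conditioned on the same dialog.
    dialog.append("Q: List the participants that perform these activities.")
    participants = ask(dialog, "participants").split("; ")
    dialog.append("A: " + "; ".join(participants))
    triples |= {(p, "is_a", "Participant") for p in participants}

    # Turn 3: link each activity to the participant that performs it.
    dialog.append("Q: For each activity, name the participant that performs it.")
    for pair in ask(dialog, "performs").split("; "):
        actor, activity = (s.strip() for s in pair.split("->"))
        triples.add((actor, "performs", activity))
    return triples

for triple in sorted(build_graph(PROCESS_TEXT, ask_model)):
    print(triple)
```

A real implementation would replace the stub with calls to an actual completion API and validate each answer, since model outputs may not follow the requested list format.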
Pages: 60-74 (15 pages)