LLM-Driven External Knowledge Integration Network for Rumor Detection

Cited: 0
Authors
Hu, Ziang [1 ,2 ]
Wei, Lingwei [1 ,2 ]
Li, Kun [1 ,2 ]
Liu, Zongzhen [1 ,2 ]
Wang, Yuhang [1 ,2 ]
Zhang, Xiaodan [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
Source
ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT VI, ICIC 2024 | 2024, Vol. 14880
Funding
National Key Research and Development Program of China;
Keywords
Rumor Detection; ChatGPT-based; External Knowledge; Evidence
DOI
10.1007/978-981-97-5678-0_1
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Detecting rumors requires a profound understanding of real-world background knowledge. Most existing small language model (SLM) methods have found that extracting external knowledge from knowledge bases (KBs), such as entity concepts and evidence related to an article, significantly enhances rumor detection performance. However, two limitations remain: 1) when extracting entity concepts from KBs, entity ambiguity may introduce inappropriate concepts; 2) prevailing methods for extracting evidential knowledge rely on individual KBs, limiting the scope of available information. Recent large language models (LLMs) have shown remarkable performance on various tasks, but leveraging LLMs' internal knowledge for rumor detection remains underexplored. To tackle these limitations, we propose a ChatGPT-based External Knowledge Integration Network (CHKIN) that uses ChatGPT to extract entity concepts and evidence to enhance rumor detection. First, CHKIN employs an LLM to extract entities and their concepts; because the LLM considers the surrounding context, this alleviates the ambiguity of entity concepts. Second, unlike methods that query KBs, CHKIN employs the LLM to gather evidence; the LLM's broader training knowledge yields more comprehensive and coherent evidence. Furthermore, CHKIN serves as a bridge between SLMs and LLMs. Experiments on two real-world datasets demonstrate that CHKIN outperforms three types of baseline methods: SLM-based, LLM-based, and shallow neural networks.
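The pipeline the abstract outlines (LLM-extracted entity concepts plus LLM-gathered evidence, concatenated as input for a downstream SLM classifier) can be sketched as follows. This is a minimal illustration only: the prompt wording, the gpt-3.5-turbo model choice, and the `[POST]/[CONCEPTS]/[EVIDENCE]` input format are assumptions for the sketch, not the paper's actual implementation.

```python
# Minimal sketch of the two LLM-driven knowledge-extraction steps described
# in the abstract. Prompts, model, and input format below are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def ask(prompt: str) -> str:
    """Send a single-turn prompt to the LLM and return its reply text."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed stand-in for "ChatGPT"
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return resp.choices[0].message.content


def extract_entity_concepts(post: str) -> str:
    # Step 1: the LLM sees the full post, so it can pick the concept sense
    # that fits the context (e.g., "Apple" as a company vs. a fruit),
    # alleviating the entity-ambiguity problem of KB lookups.
    return ask(
        "List the named entities in the following post and, using the "
        f"surrounding context, give a one-phrase concept for each:\n{post}"
    )


def gather_evidence(post: str) -> str:
    # Step 2: evidence comes from the LLM's parametric knowledge rather
    # than from a single external knowledge base.
    return ask(
        "Provide brief background facts that support or contradict the "
        f"claim in this post:\n{post}"
    )


def build_slm_input(post: str) -> str:
    # The concatenated text would then be fed to a small language model
    # (the SLM classifier) for the final rumor/non-rumor decision.
    concepts = extract_entity_concepts(post)
    evidence = gather_evidence(post)
    return f"[POST] {post} [CONCEPTS] {concepts} [EVIDENCE] {evidence}"
```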
Pages: 3-13 (11 pages)