CollRec: Pre-Trained Language Models and Knowledge Graphs Collaborate to Enhance Conversational Recommendation System

Cited by: 0
Authors
Liu, Shuang [1 ]
Ao, Zhizhuo [1 ]
Chen, Peng [2 ]
Kolmanic, Simon [3 ]
Affiliations
[1] Dalian Minzu Univ, Sch Comp Sci & Engn, Dalian 116600, Peoples R China
[2] Dalian Neusoft Univ Informat, Sch Comp & Software, Dalian 116023, Peoples R China
[3] Univ Maribor, Fac Elect Engn & Comp Sci, Maribor 2000, Slovenia
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Knowledge graphs; Oral communication; Task analysis; Recommender systems; Motion pictures; Costs; Accuracy; Large language models; Conversational recommendation system; knowledge graph; large language model; end-to-end generation; fine-tuning; ReDial; WebNLG 2020 Challenge
DOI
10.1109/ACCESS.2024.3434720
Chinese Library Classification
TP [Automation technology, computer technology];
Subject Classification Code
0812 ;
Abstract
Existing conversational recommender systems (CRSs) incorporate external information through knowledge graphs in ways that lack generality. Their recommendation and generation modules are loosely connected during training and only shallowly integrated during inference, relying on a simple switching or copying mechanism to merge recommended items into generated responses. These problems significantly degrade recommendation performance. To alleviate them, we propose CollRec, a novel unified framework in which pre-trained language models and knowledge graphs collaborate to enhance conversational recommendation. We fine-tune a pre-trained language model to efficiently extract knowledge graphs from conversational text descriptions, perform entity-based recommendation over the generated graph nodes and edges, and fine-tune a large-scale pre-trained language model to generate fluent and diverse responses. Experimental results on the WebNLG 2020 Challenge, ReDial, and Reddit-Movie datasets show that CollRec significantly outperforms state-of-the-art methods.
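The abstract describes a three-stage pipeline: a fine-tuned pre-trained language model extracts a knowledge graph from the dialogue, recommendation operates over the generated nodes and edges, and a second fine-tuned model produces the response. The minimal Python sketch below only illustrates that control flow under stated assumptions; the function names, toy graph, hard-coded triples, and scoring rule are hypothetical stand-ins rather than the authors' implementation, and the two fine-tuned language models are replaced by trivial placeholders so the example is runnable end to end.

```python
# Minimal, self-contained sketch of a CollRec-style three-stage pipeline.
# All names and data here are illustrative assumptions, not the paper's code.

from collections import Counter

# Stage 1 (stand-in): extract (head, relation, tail) triples from an utterance.
# In the paper a PLM fine-tuned on WebNLG 2020-style data would generate these.
def extract_triples(utterance: str) -> list[tuple[str, str, str]]:
    if "time travel" in utterance.lower():
        return [("user", "likes_genre", "science fiction"),
                ("user", "mentions", "Back to the Future")]
    return []

# Toy knowledge graph: entity -> linked movie items (hypothetical edges).
KG_EDGES = {
    "science fiction": ["Interstellar", "Looper", "Back to the Future"],
    "Back to the Future": ["Back to the Future Part II", "Looper"],
}

# Stage 2: entity-based recommendation over the generated graph nodes and edges.
def recommend_items(triples: list[tuple[str, str, str]], top_k: int = 2) -> list[str]:
    scores = Counter()
    for _, _, tail in triples:
        for item in KG_EDGES.get(tail, []):
            scores[item] += 1  # score items by how many extracted entities link to them
    return [item for item, _ in scores.most_common(top_k)]

# Stage 3 (stand-in): a fine-tuned generative PLM would produce the reply;
# a template keeps this sketch dependency-free.
def generate_response(items: list[str]) -> str:
    if not items:
        return "Tell me more about what you like."
    return f"You might enjoy {', '.join(items)}."

if __name__ == "__main__":
    utterance = "I loved the time travel plot in that movie."
    triples = extract_triples(utterance)
    print(generate_response(recommend_items(triples)))
```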
Pages: 104663-104675
Number of pages: 13