CollRec: Pre-Trained Language Models and Knowledge Graphs Collaborate to Enhance Conversational Recommendation System

Cited by: 0
Authors
Liu, Shuang [1 ]
Ao, Zhizhuo [1 ]
Chen, Peng [2 ]
Kolmanic, Simon [3 ]
Affiliations
[1] Dalian Minzu Univ, Sch Comp Sci & Engn, Dalian 116600, Peoples R China
[2] Dalian Neusoft Univ Informat, Sch Comp & Software, Dalian 116023, Peoples R China
[3] Univ Maribor, Fac Elect Engn & Comp Sci, Maribor 2000, Slovenia
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Knowledge graphs; Oral communication; Task analysis; Recommender systems; Motion pictures; Costs; Accuracy; Large language models; Conversational recommendation system; knowledge graph; large language model; end-to-end generation; fine-tuning; ReDial; WebNLG 2020 Challenge
DOI
10.1109/ACCESS.2024.3434720
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Existing conversational recommender systems (CRS) lack generality in how they incorporate external information through knowledge graphs. Their recommendation and generation modules are loosely coupled during training and only shallowly integrated during inference, with a simple switching or copying mechanism merging recommended items into generated responses. These problems significantly degrade recommendation performance. To address them, we propose CollRec, a novel unified framework in which pre-trained language models and knowledge graphs collaborate to enhance conversational recommendation. We fine-tune a pre-trained language model to efficiently extract knowledge graphs from conversational text descriptions, perform entity-based recommendation over the generated graph nodes and edges, and fine-tune a large-scale pre-trained language model to generate fluent and diverse responses. Experimental results on the WebNLG 2020 Challenge, ReDial, and Reddit-Movie datasets show that CollRec significantly outperforms state-of-the-art methods.
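The abstract describes a three-stage pipeline: text-to-graph extraction with a fine-tuned language model, entity-based recommendation over the extracted nodes and edges, and fine-tuned response generation. The sketch below illustrates that flow under stated assumptions: it uses Hugging Face transformers, with "t5-small" standing in for the paper's fine-tuned checkpoints, and the triple format, overlap-scoring rule, and helper names are hypothetical illustrations, not the authors' released code.

```python
# Minimal sketch of a CollRec-style pipeline (illustrative only).
# Assumption: "t5-small" is a placeholder; the paper's actual fine-tuned
# text-to-graph and response models are not publicly specified here.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

def extract_triples(dialogue, tokenizer, model):
    """Step 1: a fine-tuned seq2seq LM linearizes the dialogue into
    (head, relation, tail) triples, e.g. 'user | likes | The Matrix'.
    The 'graph:' prefix and '|'/';' separators are assumed conventions."""
    inputs = tokenizer("graph: " + dialogue, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=64)
    text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    triples = []
    for chunk in text.split(";"):
        parts = [p.strip() for p in chunk.split("|")]
        if len(parts) == 3:
            triples.append(tuple(parts))
    return triples

def recommend(triples, item_kg):
    """Step 2: entity-based recommendation -- score catalog items by how
    many extracted graph entities they share (a simple stand-in for the
    paper's KG-based recommender)."""
    mentioned = {e for head, _, tail in triples for e in (head, tail)}
    scores = {item: len(mentioned & set(neighbors))
              for item, neighbors in item_kg.items()}
    return sorted(scores, key=scores.get, reverse=True)

def respond(dialogue, items, tokenizer, model):
    """Step 3: a second fine-tuned LM generates a fluent response that is
    conditioned on the recommended items rather than spliced in via a
    switching or copying mechanism."""
    prompt = f"dialogue: {dialogue} recommend: {', '.join(items[:1])}"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=48)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    tok = AutoTokenizer.from_pretrained("t5-small")            # placeholder
    lm = AutoModelForSeq2SeqLM.from_pretrained("t5-small")     # placeholder
    dialogue = "I loved The Matrix, any similar sci-fi movies?"
    triples = extract_triples(dialogue, tok, lm)
    item_kg = {"Inception": ["The Matrix", "sci-fi"], "Titanic": ["romance"]}
    print(respond(dialogue, recommend(triples, item_kg), tok, lm))
```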
Pages: 104663-104675
Page count: 13
Related Papers
50 in total
  • [32] Grounding Ontologies with Pre-Trained Large Language Models for Activity Based Intelligence
    Azim, Anee
    Clark, Leon
    Lau, Caleb
    Cobb, Miles
    Jenner, Kendall
    SIGNAL PROCESSING, SENSOR/INFORMATION FUSION, AND TARGET RECOGNITION XXXIII, 2024, 13057
  • [33] Leveraging pre-trained language models for mining microbiome-disease relationships
    Karkera, Nikitha
    Acharya, Sathwik
    Palaniappan, Sucheendra K.
    BMC BIOINFORMATICS, 2023, 24 (01)
  • [34] Discrimination Bias Detection Through Categorical Association in Pre-Trained Language Models
    Dusi, Michele
    Arici, Nicola
    Gerevini, Alfonso Emilio
    Putelli, Luca
    Serina, Ivan
    IEEE ACCESS, 2024, 12 : 162651 - 162667
  • [35] The Use and Misuse of Pre-Trained Generative Large Language Models in Reliability Engineering
    Hu, Yunwei
    Goktas, Yavuz
    Yellamati, David Deepak
    De Tassigny, Catherine
    2024 ANNUAL RELIABILITY AND MAINTAINABILITY SYMPOSIUM, RAMS, 2024,
  • [36] Addressing Extraction and Generation Separately: Keyphrase Prediction With Pre-Trained Language Models
    Liu, Rui
    Lin, Zheng
    Wang, Weiping
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2021, 29 : 3180 - 3191
  • [37] Recent Advances in Natural Language Processing via Large Pre-trained Language Models: A Survey
    Min, Bonan
    Ross, Hayley
    Sulem, Elior
    Ben Veyseh, Amir Pouran
    Nguyen, Thien Huu
    Sainz, Oscar
    Agirre, Eneko
    Heintz, Ilana
    Roth, Dan
    ACM COMPUTING SURVEYS, 2024, 56 (02)
  • [38] Recommending metamodel concepts during modeling activities with pre-trained language models
    Weyssow, Martin
    Sahraoui, Houari
    Syriani, Eugene
    SOFTWARE AND SYSTEMS MODELING, 2022, 21 (03) : 1071 - 1089
  • [39] CHATDESIGN: BOOTSTRAPPING GENERATIVE FLOOR PLAN DESIGN WITH PRE-TRAINED LARGE LANGUAGE MODELS
    Li, Jinmin
    Luo, Yilu
    Lu, Shuai
    Zhang, Jingyun
    Wang, Jun
    Guo, Rizen
    Wang, Shaoming
    PROCEEDINGS OF THE 29TH INTERNATIONAL CONFERENCE OF THE ASSOCIATION FOR COMPUTER-AIDED ARCHITECTURAL DESIGN RESEARCH IN ASIA, CAADRIA 2024, VOL 1, 2024, : 99 - 108
  • [40] Zero-Shot Recommendations with Pre-Trained Large Language Models for Multimodal Nudging
    Harrison, Rachel M.
    Dereventsov, Anton
    Bibin, Anton
    2023 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW 2023, 2023, : 1535 - 1542