ITSM-GCN: Informative Training Sample Mining for Graph Convolutional Network-based Collaborative Filtering

Cited by: 6
Authors
Gong, Kaiqi [1 ]
Song, Xiao [1 ]
Wang, Senzhang [2 ]
Liu, Songsong [1 ]
Li, Yong [1 ]
Affiliations
[1] Beihang University, Beijing, China
[2] Central South University, Changsha, Hunan, China
Source
PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022 | 2022
Keywords
Recommender systems; collaborative filtering; graph convolutional networks; positive sampling; negative sampling
DOI
10.1145/3511808.3557368
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Recently, the graph convolutional network (GCN) has become one of the most popular and effective approaches to collaborative filtering (CF). Existing GCN-based CF studies have made substantial progress on loss function design and embedding propagation. Despite these successes, we argue that existing methods have not yet adequately explored more effective sampling strategies, covering both positive and negative sampling. To address this limitation, we propose a novel framework named ITSM-GCN, which applies our Informative Training Sample Mining (ITSM) sampling strategy to the training of GCN-based CF models. Specifically, we first adopt and improve the dynamic negative sampling (DNS) strategy, which yields considerable gains in both training efficiency and recommendation performance. More importantly, we design two strategies for mining potentially positive training samples, namely a similarity-based sampler and a score-based sampler, to further enhance GCN-based CF. Extensive experiments show that ITSM-GCN significantly outperforms state-of-the-art GCN-based CF models, including LightGCN, SGL-ED, and SimpleX. For example, ITSM-GCN improves on SimpleX by 12.0%, 3.0%, and 1.2% on Recall@20 for Amazon-Books, Yelp2018, and Gowalla, respectively.
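The abstract only names the samplers, so the following is a minimal sketch of plain dynamic negative sampling, the DNS baseline the paper adopts and improves, not ITSM-GCN's own variant. It assumes dot-product scoring between user and item embeddings; the toy embedding matrices, the history dictionary, and the num_candidates parameter are illustrative placeholders introduced here for the example.

# Minimal sketch of plain dynamic negative sampling (DNS) for implicit-feedback CF.
# Assumes dot-product scores; all names and sizes below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

num_users, num_items, dim = 100, 500, 16
user_emb = rng.normal(size=(num_users, dim))   # stand-in for learned user embeddings
item_emb = rng.normal(size=(num_items, dim))   # stand-in for learned item embeddings

# toy interaction history: user id -> set of consumed item ids
history = {u: {int(i) for i in rng.choice(num_items, size=10, replace=False)}
           for u in range(num_users)}

def dns_negative(user, num_candidates=8):
    """Pick a hard negative: sample unobserved items, score them with the
    current model, and return the highest-scored candidate."""
    candidates = []
    while len(candidates) < num_candidates:
        j = int(rng.integers(num_items))
        if j not in history[user]:
            candidates.append(j)
    scores = item_emb[candidates] @ user_emb[user]   # current predicted scores
    return candidates[int(np.argmax(scores))]        # hardest (most informative) negative

# usage: build a (user, positive, hard negative) triple for a BPR-style update
u = 0
pos = next(iter(history[u]))
neg = dns_negative(u)
print(u, pos, neg)

Because the chosen negative depends on the current model scores, the sampler keeps selecting informative (hard) negatives as training progresses, which is the property the paper builds on before adding its positive-sample mining.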
Pages: 614-623
Number of pages: 10