Relational metric learning with high-order neighborhood interactions for social recommendation

Cited by: 5
Authors
Liu, Zhen [1 ]
Wang, Xiaodong [1 ]
Ma, Ying [2 ]
Yang, Xinxin [1 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Comp & Informat Technol, Beijing 100044, Peoples R China
[2] State Informat Ctr, Beijing 100045, Peoples R China
Keywords
Social recommendation; Relation modeling; Metric learning; High-order neighborhood interactions
DOI
10.1007/s10115-022-01680-x
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Social information has been widely incorporated into traditional recommendation systems to alleviate the data sparsity and cold-start issues. However, existing social recommendation methods typically have two common limitations: (a) they learn a unified representation for each user involved in both the item and social domains, which is insufficient for fine-grained user modeling, and (b) they ignore the high-order neighborhood information encoded in both user-item interactions and social relations. To overcome these two limitations, this paper proposes a novel social recommendation method, SoHRML, based on social relations under the metric learning framework. Specifically, user-item interactions and social relations are modeled as two types of relation vectors, with which each user can be translated into multiple item-aware and social-aware representations. In addition, to capture the rich information encoded in local neighborhoods, we model the relation vectors by high-order neighborhood interactions (HNI). In each domain, we design a dual layer-wise neighborhood aggregation (LNA) structure that contains dual graph attention networks (GATs) to aggregate the neighborhoods of users or items. High-order information encoded in both user-item interactions and social relations can be captured by stacking the layer-wise structure. Extensive experiments on three real-world datasets demonstrate the superiority of the proposed model, especially under cold-start scenarios; the performance gains over the best baseline range from 0.51% to 3.31% on two ranking-based metrics.
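The two ideas the abstract highlights, translating a user embedding by a relation vector before measuring a metric-space distance to the item, and deriving that relation vector from an attention-weighted aggregation of the user's neighborhood, can be sketched roughly as below. This is a minimal illustrative sketch under stated assumptions, not the SoHRML implementation: the single-head attention, the linear relation mapping, the embedding sizes, and all class and variable names are assumptions made here for illustration only.

# Minimal sketch (assumed names and dimensions; not the authors' code) of a
# relation-translated metric-learning score with attention-based neighborhood
# aggregation, in the spirit of the model described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NeighborhoodAttention(nn.Module):
    """Single-head, GAT-style attention over a node's neighbors."""

    def __init__(self, dim):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, center, neighbors):
        # center: (batch, dim); neighbors: (batch, n_neighbors, dim)
        expanded = center.unsqueeze(1).expand_as(neighbors)
        scores = self.attn(torch.cat([expanded, neighbors], dim=-1)).squeeze(-1)
        weights = F.softmax(F.leaky_relu(scores), dim=-1)              # (batch, n_neighbors)
        return torch.bmm(weights.unsqueeze(1), neighbors).squeeze(1)   # (batch, dim)


class RelationTranslatedMetric(nn.Module):
    """Score a (user, item) pair as the distance between the relation-translated
    user embedding and the item embedding (smaller distance = better match)."""

    def __init__(self, n_users, n_items, dim=64):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)
        self.aggregate = NeighborhoodAttention(dim)
        self.to_relation = nn.Linear(2 * dim, dim)

    def forward(self, users, items, neighbor_items):
        u = self.user_emb(users)                                  # (batch, dim)
        v = self.item_emb(items)                                  # (batch, dim)
        ctx = self.aggregate(u, self.item_emb(neighbor_items))    # neighborhood context
        r = self.to_relation(torch.cat([u, ctx], dim=-1))         # user-item relation vector
        return torch.norm(u + r - v, p=2, dim=-1)                 # translated distance


if __name__ == "__main__":
    model = RelationTranslatedMetric(n_users=100, n_items=200)
    users = torch.randint(0, 100, (8,))
    items = torch.randint(0, 200, (8,))
    neighbors = torch.randint(0, 200, (8, 5))   # 5 previously interacted items per user
    print(model(users, items, neighbors).shape)  # torch.Size([8])

In the full model described by the abstract, an analogous social-domain relation vector would translate the same user toward a social-aware representation, and stacking such aggregation layers would propagate high-order neighborhood information; the sketch above shows only a single item-domain layer.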
Pages: 1525-1547
Page count: 23