Low-Rank and Sparse Matrix Completion for Recommendation

Cited by: 8
Authors
Zhao, Zhi-Lin [1 ]
Huang, Ling [1 ]
Wang, Chang-Dong [1 ]
Lai, Jian-Huang [1 ]
Yu, Philip S. [2 ,3 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou, Peoples R China
[2] Univ Illinois, Dept Comp Sci, Chicago, IL USA
[3] Tsinghua Univ, Inst Data Sci, Beijing, Peoples R China
Source
NEURAL INFORMATION PROCESSING, ICONIP 2017, PT V | 2017 / Vol. 10638
Keywords
Recommendation algorithms; Low-rank; Sparse
DOI
10.1007/978-3-319-70139-4_1
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Recommendation algorithms are now widely used to increase business revenue and user satisfaction on many online platforms. However, most existing algorithms produce intermediate output when predicting ratings, and errors in that intermediate output propagate to the final results. Moreover, because most algorithms predict ratings for all unrated items, some predictions may be unreliable and useless, which lowers both the efficiency and the effectiveness of recommendation. To this end, we propose a Low-rank and Sparse Matrix Completion (LSMC) method that recovers the rating matrix directly to improve the quality of rating prediction. Following common practice, we assume the predicted rating matrix is low-rank, since each rating depends on only a few latent factors of the user and the item. Unlike existing methods, however, we also assume the matrix is sparse, so that unreliable predictions are removed and important ones are retained. In addition, a slack variable is used to prevent overfitting and weaken the influence of noisy data. Extensive experiments on four real-world datasets verify that the proposed method outperforms state-of-the-art recommendation algorithms.
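As a rough illustration of the low-rank-plus-sparse completion idea described in the abstract, the sketch below recovers a rating matrix by combining a gradient step on the observed entries with singular value thresholding (the low-rank proximal step) and entrywise soft thresholding (the sparsity proximal step). This is a minimal sketch of the generic formulation only, not the authors' LSMC algorithm: the regularization weights, function names, and the inexact proximal loop are illustrative assumptions, and the slack variable mentioned in the abstract is omitted.

```python
# Minimal sketch of generic low-rank + sparse matrix completion
# (illustrative only; not the paper's LSMC algorithm).
import numpy as np


def svt(X, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt


def soft_threshold(X, tau):
    """Entrywise soft thresholding: proximal operator of tau * l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)


def low_rank_sparse_completion(R, mask, lam_nuc=1.0, lam_l1=0.05,
                               step=1.0, n_iters=200):
    """Recover a rating matrix encouraged to be both low-rank and sparse.

    R    : observed rating matrix (unobserved entries may hold any value)
    mask : boolean matrix, True where a rating was actually observed
    """
    X = np.zeros_like(R, dtype=float)
    for _ in range(n_iters):
        # Gradient step on the squared error over observed entries only.
        grad = mask * (X - R)
        X = X - step * grad
        # Inexact proximal steps: low-rank shrinkage, then sparsity shrinkage.
        X = svt(X, step * lam_nuc)
        X = soft_threshold(X, step * lam_l1)
    return X


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic rank-2 ratings with roughly 60% of entries observed.
    truth = rng.random((30, 2)) @ rng.random((2, 20)) * 5
    mask = rng.random(truth.shape) < 0.6
    R = np.where(mask, truth, 0.0)
    X = low_rank_sparse_completion(R, mask)
    rmse = np.sqrt(np.mean((X[~mask] - truth[~mask]) ** 2))
    print(f"RMSE on unobserved entries: {rmse:.3f}")
```

The two thresholding operators are the standard building blocks for nuclear-norm and l1 regularization; on real rating data the weights lam_nuc and lam_l1 would have to be tuned, and a more careful solver (e.g., ADMM) would replace the sequential proximal heuristic used here.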
Pages: 3-13
Number of pages: 11