Deep learning based matrix completion

Cited by: 62
Authors
Fan, Jicong [1 ]
Chow, Tommy [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Elect Engn, Hong Kong, Hong Kong, Peoples R China
Keywords
Matrix completion; AutoEncoder; deep learning; out-of-sample extension; image inpainting; collaborative filtering; component analysis; factorization; algorithm
DOI
10.1016/j.neucom.2017.05.074
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Previous matrix completion methods are generally based on linear, shallow models in which the given incomplete matrices are assumed to be of low rank and the data are assumed to be generated by linear latent variable models. In this paper, we first propose a novel method called AutoEncoder based matrix completion (AEMC). The main idea of AEMC is to use the partially observed data to learn and construct a nonlinear latent variable model in the form of an AutoEncoder whose hidden layer has far fewer units than its visible layers. Meanwhile, the unknown entries of the data are recovered so as to fit this nonlinear latent variable model. Building on AEMC, we further propose a deep learning based matrix completion (DLMC) method. In DLMC, AEMC serves as a pre-training step for both the missing entries and the network parameters; the hidden layer of AEMC is then used to learn stacked AutoEncoders (SAEs) with greedy layer-wise training; finally, fine-tuning is carried out on the deep network formed by AEMC and the SAEs to obtain the missing entries of the data and the parameters of the network. In addition, we provide out-of-sample extensions for AEMC and DLMC to recover incomplete data arriving online. AEMC and DLMC are compared with state-of-the-art methods on synthetic matrix completion, image inpainting, and collaborative filtering tasks. The experimental results verify the effectiveness and superiority of the proposed methods. (C) 2017 Elsevier B.V. All rights reserved.
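The core AEMC idea summarized above, jointly optimizing the AutoEncoder parameters and the unknown entries so that the completed matrix fits a low-dimensional nonlinear latent variable model, can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes PyTorch, uses hypothetical names (aemc, hidden_dim), and a single sigmoid hidden layer for brevity.

import torch

def aemc(X_obs, mask, hidden_dim=20, iters=2000, lr=1e-2, seed=0):
    # X_obs: (m, n) matrix with placeholders (e.g. 0) at unobserved positions.
    # mask:  (m, n) tensor, 1 where an entry is observed, 0 where it is missing.
    torch.manual_seed(seed)
    mask = mask.float()
    m, n = X_obs.shape
    Z = torch.zeros(m, n, requires_grad=True)        # free variables for the missing entries
    encoder = torch.nn.Linear(n, hidden_dim)         # hidden layer has far fewer units than the visible layer
    decoder = torch.nn.Linear(hidden_dim, n)
    opt = torch.optim.Adam([Z, *encoder.parameters(), *decoder.parameters()], lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        X = mask * X_obs + (1 - mask) * Z            # current completed matrix: observed entries stay fixed
        X_hat = decoder(torch.sigmoid(encoder(X)))   # nonlinear latent variable model
        loss = ((X_hat - X) ** 2).sum()              # reconstruction error of the completed matrix
        loss.backward()
        opt.step()                                   # updates both the network weights and Z
    with torch.no_grad():
        return mask * X_obs + (1 - mask) * Z

In the full DLMC method described in the abstract, such a completion would serve as the pre-training step, after which further AutoEncoders are stacked with greedy layer-wise training and the whole deep network is fine-tuned.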
Pages: 540-549
Number of pages: 10