Performance guarantees of transformed Schatten-1 regularization for exact low-rank matrix recovery

Cited: 12
Authors
Wang, Zhi [1 ]
Hu, Dong [1 ]
Luo, Xiaohu [1 ]
Wang, Wendong [2 ]
Wang, Jianjun [3 ]
Chen, Wu [1 ]
Affiliations
[1] Southwest Univ, Coll Comp & Informat Sci, Chongqing 400715, Peoples R China
[2] Southwest Univ, Coll Artificial Intelligence, Chongqing 400715, Peoples R China
[3] Southwest Univ, Sch Math & Stat, Chongqing 400715, Peoples R China
Keywords
Low-rank matrix recovery; Transformed Schatten-1 penalty function; Nonconvex model; Equivalence; VARIABLE SELECTION; ALGORITHM; MINIMIZATION;
DOI
10.1007/s13042-021-01361-1
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Low-rank matrix recovery aims to recover a matrix of minimum rank subject to a linear system constraint. It arises in various real-world applications, such as recommender systems, image processing, and deep learning. Inspired by compressive sensing, the rank minimization problem can be relaxed to nuclear norm minimization. However, such a relaxation treats all singular values of the target matrix equally. To address this issue, the transformed Schatten-1 (TS1) penalty function was recently proposed and used to construct low-rank matrix recovery models. Unfortunately, existing solvers for TS1-based models cannot deliver both high convergence accuracy and fast convergence. To alleviate these problems, this paper further investigates the basic properties of the TS1 penalty function, and we describe a novel algorithm, called ATS1PGA, that is highly efficient in solving low-rank matrix recovery problems at a convergence rate of O(1/N), where N denotes the iteration count. In addition, we theoretically prove that the original rank minimization problem can be equivalently transformed into the TS1 optimization problem under certain conditions. Finally, extensive experimental results on real image data sets show that the proposed algorithm outperforms state-of-the-art methods in both accuracy and efficiency; in particular, it is about 30 times faster than the TS1 algorithm in solving low-rank matrix recovery problems.
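For readers unfamiliar with the TS1 penalty, it applies the transformed L1 function rho_a(t) = (a + 1)|t| / (a + |t|) to the singular values of the matrix, interpolating between the rank function (a → 0) and the nuclear norm (a → ∞). The sketch below is illustrative only, not the paper's implementation; the function name `ts1_penalty` and the default shape parameter `a = 1.0` are our own choices.

```python
import numpy as np

def ts1_penalty(X, a=1.0):
    """Transformed Schatten-1 penalty of matrix X with shape parameter a > 0.

    Applies rho_a(t) = (a + 1) * t / (a + t) to each singular value
    and sums the results. Small a approximates rank(X); large a
    approximates the nuclear norm (sum of singular values).
    """
    s = np.linalg.svd(X, compute_uv=False)  # singular values, all >= 0
    return float(np.sum((a + 1.0) * s / (a + s)))
```

For example, a matrix with singular values (2, 0) gives (a+1)·2/(a+2) with zero contribution from the null singular value, so the penalty counts large singular values almost equally while ignoring small ones, unlike the nuclear norm.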
Pages: 3379-3395
Page count: 17