Kronecker component with robust low-rank dictionary for image denoising

Cited by: 4
Authors
Zhang, Lei [1];
Liu, Cong [1]
Affiliations
[1] Univ Shanghai Sci & Technol, Shanghai 200082, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image denoising; Tensor factorization; Nuclear norm; Low-rank dictionary; Convex optimization; ELASTIC-NET; SPARSE; REGULARIZATION; REPRESENTATION; RECOGNITION; RECOVERY;
DOI
10.1016/j.displa.2022.102194
Chinese Library Classification
TP3 [Computing technology, computer technology];
Discipline code
0812;
Abstract
The Robust Kronecker Component Analysis (RKCA) model is a recently proposed denoising method that extends the basic sparse representation model for 2-D data to a tensor sparse representation for 3-D data. The model builds on the idea that the dictionary in the basic sparse representation model can be factored into two separable dictionaries via the Kronecker product and mode-n products, so that the original 3-D data can be decomposed into a sparse core and two dictionaries by Tucker factorization. RKCA enforces low rank on the two dictionaries through a Frobenius-norm constraint. In this paper, we introduce the nuclear norm into RKCA to capture the low-rank property of the two dictionaries more faithfully. First, we design a novel denoising model named Kronecker Component with Low-Rank Dictionary (KCLD), which replaces the Frobenius norm with the nuclear norm. We then design a more effective denoising model named Kronecker Component with Robust Low-Rank Dictionary (KCRD), which combines the Frobenius and nuclear norms; by fusing the advantages of both, KCRD yields a better low-rank dictionary. The two proposed models are optimized with an augmented Lagrange multiplier method applied to a convex relaxation with split variables. Finally, extensive experimental comparisons show that our models are more competitive and effective than other denoising methods on different types of images.
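Two ingredients of the abstract can be sketched in code: a mode-n product that assembles a 3-D tensor from a sparse core and two separable dictionaries (the Tucker/Kronecker structure), and singular value thresholding, the proximal operator of the nuclear norm that augmented-Lagrangian solvers typically use to pull a dictionary toward low rank. This is a minimal illustrative sketch; the dictionary names, sizes, and threshold below are assumptions, not the paper's actual implementation.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the
    nuclear norm. Each singular value is shrunk by tau and clipped
    at zero, which lowers the rank of the result."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt

def mode_n_product(T, A, n):
    """Mode-n product T x_n A: multiply matrix A into mode n of tensor T."""
    return np.moveaxis(np.tensordot(A, T, axes=(1, n)), 0, n)

# Toy illustration (hypothetical sizes): a 3-D data tensor built from
# a sparse core and two separable dictionaries via mode-n products,
# i.e. the Tucker structure X = core x_1 D1 x_2 D2.
rng = np.random.default_rng(0)
D1 = rng.standard_normal((8, 4))   # hypothetical dictionary 1
D2 = rng.standard_normal((8, 4))   # hypothetical dictionary 2
core = np.zeros((4, 4, 5))         # sparse core tensor
core[0, 1, :] = 1.0
X = mode_n_product(mode_n_product(core, D1, 0), D2, 1)  # shape (8, 8, 5)

# One nuclear-norm proximal step on a dictionary estimate,
# as would appear inside an augmented-Lagrangian update.
D1_lowrank = svt(D1, tau=0.5)
```

In KCLD/KCRD-style objectives, the `svt` step is what distinguishes the nuclear-norm penalty from the Frobenius-norm penalty of RKCA: the Frobenius norm shrinks all entries uniformly, while `svt` zeroes out small singular values and so directly promotes low rank.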
Pages: 13