Two new low rank tensor completion methods based on sum nuclear norm

Cited by: 4
Authors
Zhang, Hongbing [1 ]
Fan, Hongtao [1 ]
Li, Yajing [1 ]
Liu, Xinyi [1 ]
Ye, Yinlin [1 ]
Zhu, Xinyun [2 ]
Affiliations
[1] Northwest A&F Univ, Coll Sci, Yangling 712100, Shaanxi, Peoples R China
[2] Univ Texas Permian Basin, Dept Math, Odessa, TX 79762 USA
Funding
National Natural Science Foundation of China;
Keywords
Low rank tensor completion; Sum nuclear norm (SNN) method; QR decomposition; Total variation; Alternating direction method of multipliers; L2,1 norm; DECOMPOSITION; FACTORIZATION; MINIMIZATION;
DOI
10.1016/j.dsp.2023.103949
Chinese Library Classification
TM [Electrical Technology]; TN [Electronics and Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
The low-rank tensor completion (LRTC) problem has attracted great attention in computer vision and signal processing, and achieving high-quality image recovery remains an open challenge. This paper first proposes a new tensor L2,1-norm minimization model (TLNM) that integrates the sum nuclear norm (SNN) method, differing from classical tensor nuclear norm (TNN)-based tensor completion methods, with the L2,1 norm and QR decomposition for solving the LRTC problem. To better exploit the local prior information of the image, a total variation (TV) regularization term is introduced, yielding a second new model: tensor L2,1-norm minimization with total variation (TLNMTV). Both proposed models are convex and therefore have global optimal solutions. Moreover, we adopt the alternating direction method of multipliers (ADMM) to obtain a closed-form solution for each variable, which makes the algorithm efficient to implement. Numerical experiments show that the two proposed algorithms are convergent and outperform the compared methods. In particular, our method significantly outperforms the compared methods when the sampling rate is 2.5% for hyperspectral images. (c) 2023 Elsevier Inc. All rights reserved.
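The abstract's core ingredients (an SNN objective over mode unfoldings, solved by ADMM with closed-form per-variable updates) can be illustrated with a minimal sketch. Note this is a generic HaLRTC-style SNN completion solver, not the paper's TLNM/TLNMTV models (which additionally use the L2,1 norm, QR decomposition, and TV regularization); all function names, weights `alphas`, and the penalty `rho` below are illustrative assumptions.

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: move mode to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold for a tensor of the given shape."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def svt(M, tau):
    """Singular value thresholding: prox of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def snn_complete(T_obs, mask, alphas=(1/3, 1/3, 1/3), rho=1.0, iters=200):
    """ADMM sketch for min sum_i alphas[i]*||X_(i)||_* s.t. X agrees
    with T_obs on the observed entries (boolean `mask`)."""
    shape = T_obs.shape
    X = T_obs.copy()
    Ys = [np.zeros(shape) for _ in range(3)]   # dual variables
    for _ in range(iters):
        # Closed-form update of each auxiliary tensor via SVT per mode.
        Ms = [fold(svt(unfold(X + Ys[i] / rho, i), alphas[i] / rho), i, shape)
              for i in range(3)]
        # Average the mode estimates, then enforce observed entries.
        X = sum(Ms[i] - Ys[i] / rho for i in range(3)) / 3.0
        X[mask] = T_obs[mask]
        # Dual ascent step.
        for i in range(3):
            Ys[i] = Ys[i] - rho * (Ms[i] - X)
    return X
```

On a synthetic rank-1 tensor with half the entries observed, the sketch recovers the missing entries with small relative error; the closed-form SVT update per mode is what makes each ADMM iteration cheap.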
Pages: 24
Cited References (51 total)
[11] Harshman, R.A., 1970, MULTIMODAL FACTOR AN, V16.
[12] Hillar, C.J.; Lim, L.-H. Most Tensor Problems Are NP-Hard. Journal of the ACM, 2013, 60(6).
[13] Huang, Y.-M.; Yan, H.-Y.; Wen, Y.-W.; Yang, X. Rank minimization with applications to image noise removal. Information Sciences, 2018, 429:147-163.
[14] Jiang, T.-X.; Ng, M.K.; Zhao, X.-L.; Huang, T.-Z. Framelet Representation of Tensor Nuclear Norm for Third-Order Tensor Completion. IEEE Transactions on Image Processing, 2020, 29:7233-7244.
[15] Kilmer, M.E.; Martin, C.D. Factorization strategies for third-order tensors. Linear Algebra and Its Applications, 2011, 435(3):641-658.
[16] Kolda, T.G.; Bader, B.W. Tensor Decompositions and Applications. SIAM Review, 2009, 51(3):455-500.
[17] Li, Q.; Schonfeld, D. Multilinear Discriminant Analysis for Higher-Order Tensor Data Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, 36(12):2524-2537.
[18] Li, X.T., 2017, AAAI CONF ARTIF INTE, P2210.
[19] Li, X.; Ng, M.K.; Cong, G.; Ye, Y.; Wu, Q. MR-NTD: Manifold Regularization Nonnegative Tucker Decomposition for Tensor Data Dimension Reduction and Representation. IEEE Transactions on Neural Networks and Learning Systems, 2017, 28(8):1787-1800.
[20] Li, Y.-F.; Shang, K.; Huang, Z.-H. Low Tucker rank tensor recovery via ADMM based on exact and inexact iteratively reweighted algorithms. Journal of Computational and Applied Mathematics, 2018, 331:64-81.