Robust low tubal rank tensor completion via factor tensor norm minimization

Cited: 18
Authors
Jiang, Wei [1 ,2 ]
Zhang, Jun [3 ]
Zhang, Changsheng [1 ,2 ]
Wang, Lijun [4 ]
Qi, Heng [5 ]
Affiliations
[1] Wenzhou Univ, Coll Comp & Artificial Intelligence, Wenzhou 325035, Peoples R China
[2] Wenzhou Univ, Key Lab Intelligent Informat Safety & Emergency Zh, Wenzhou 325035, Peoples R China
[3] Liaoning Normal Univ, Sch Math, Dalian 116029, Peoples R China
[4] Inst Sci & Tech Informat China, Res Ctr Informat Sci Theory & Methodol, Beijing 100038, Peoples R China
[5] Dalian Univ Technol, Elect Informat & Elect Engn Dept, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Low tubal rank tensor completion; Schatten-p norm; Tensor double nuclear norm; Tensor Frobenius/nuclear hybrid norm; MATRIX; FACTORIZATION
DOI
10.1016/j.patcog.2022.109169
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Low tubal rank tensor recovery has recently received extensive attention. In this correspondence, we define the tensor double nuclear norm and the tensor Frobenius/nuclear hybrid norm as surrogates for the tensor tubal rank, and prove that they are equivalent to the tensor Schatten-p norm for p = 1/2 and p = 2/3, respectively. Based on these definitions, we propose two novel tractable tensor completion models, Double Nuclear norm regularized Tensor Completion (DNTC) and Frobenius/Nuclear hybrid norm regularized Tensor Completion (FNTC), which integrate the two norm minimizations and factorization methods into a joint learning framework. Furthermore, we adopt invertible linear transforms to obtain low tubal rank tensors, which makes the models more flexible and effective. Two efficient algorithms that exploit the convexity of the factor norms are designed to solve the proposed completion models. Comprehensive experiments on synthetic and real datasets show that the proposed methods achieve better results than several state-of-the-art approaches. © 2022 Elsevier Ltd. All rights reserved.
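For context, the two factor norms in the abstract can be understood through their matrix analogues. The identities below are a minimal sketch reconstructed from standard bilinear factor-norm results in the matrix literature, not the paper's own statement: the paper's tensor definitions replace the product UV^T with the transform-based t-product of Kernfeld, Kilmer, and Aeron [9], and the exact constants used there may differ.

% Sketch (matrix case). X = A \Sigma B^T is the SVD of X with singular
% values \sigma_i; \|X\|_{S_p} = (\sum_i \sigma_i^p)^{1/p} is the
% Schatten-p (quasi-)norm; the minima range over all factorizations
% X = U V^T with U, V of compatible sizes.
% Well-known bilinear characterization of the nuclear norm (p = 1):
\[
  \|X\|_* \;=\; \min_{X = U V^{\mathsf T}} \tfrac{1}{2}\bigl(\|U\|_F^2 + \|V\|_F^2\bigr).
\]
% Double nuclear surrogate, matching Schatten-1/2; equality is attained
% at U = A\Sigma^{1/2}, V = B\Sigma^{1/2}:
\[
  \|X\|_{S_{1/2}}^{1/2} \;=\; \min_{X = U V^{\mathsf T}} \tfrac{1}{2}\bigl(\|U\|_* + \|V\|_*\bigr).
\]
% Frobenius/nuclear hybrid surrogate, matching Schatten-2/3; equality is
% attained at U = A\Sigma^{2/3}, V = B\Sigma^{1/3}:
\[
  \|X\|_{S_{2/3}}^{2/3} \;=\; \min_{X = U V^{\mathsf T}} \tfrac{1}{3}\bigl(2\|U\|_* + \|V\|_F^2\bigr).
\]

Each right-hand side is convex in U for fixed V and vice versa, so alternating minimization schemes such as ADMM [2] apply to the factor subproblems; this is the convexity of the factor norms that the abstract alludes to.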
Pages: 12
References (39 in total)
[1] Bigoni, D.; Engsig-Karup, A. P.; Marzouk, Y. M. Spectral tensor-train decomposition. SIAM Journal on Scientific Computing, 2016, 38(4): A2405-A2439.
[2] Boyd, S.; Parikh, N.; Chu, E.; Peleato, B.; Eckstein, J. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 2010, 3(1): 1-122.
[3] Candès, E. J.; Recht, B. Exact matrix completion via convex optimization. Foundations of Computational Mathematics, 2009, 9(6): 717-772.
[4] Chen, L.; Jiang, X.; Liu, X.; Zhou, Z. Logarithmic norm regularized low-rank factorization for matrix and tensor completion. IEEE Transactions on Image Processing, 2021, 30: 3434-3449.
[5] Chen, L.; Jiang, X.; Liu, X.; Zhou, Z. Robust low-rank tensor recovery via nonconvex singular value minimization. IEEE Transactions on Image Processing, 2020, 29: 9044-9059.
[6] Du, S.; Xiao, Q.; Shi, Y.; Cucchiara, R.; Ma, Y. Unifying tensor factorization and tensor nuclear norm approaches for low-rank tensor completion. Neurocomputing, 2021, 458: 204-218.
[7] Friedland, S.; Lim, L.-H. Nuclear norm of higher-order tensors. Mathematics of Computation, 2018, 87(311): 1255-1281.
[8] Hou, J. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 4: 1.
[9] Kernfeld, E.; Kilmer, M.; Aeron, S. Tensor-tensor products with invertible linear transforms. Linear Algebra and its Applications, 2015, 485: 545-570.
[10] Kiers, H. A. L. Journal of Chemometrics, 2000, 14: 105. DOI: 10.1002/1099-128X(200005/06)14:3<105::AID-CEM582>3.0.CO