Tensor Wheel Decomposition and Its Tensor Completion Application

Cited by: 0
Authors
Wu, Zhong-Cheng [1 ]
Huang, Ting-Zhu [1 ]
Deng, Liang-Jian [1 ]
Dou, Hong-Xia [2 ]
Meng, Deyu [3 ,4 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Math Sci, Chengdu, Peoples R China
[2] Xihua Univ, Sch Sci, Chengdu, Peoples R China
[3] Xi An Jiao Tong Univ, Sch Math & Stat, Xian, Peoples R China
[4] Pazhou Lab Huangpu, Huangpu, Peoples R China
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022) | 2022
Keywords
MATRIX PRODUCT STATES; RENORMALIZATION-GROUP; MINIMIZATION; IMAGE
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Recently, tensor network (TN) decompositions have gained prominence in computer vision and delivered promising results on high-order data recovery tasks. However, current TN models are being developed toward ever more intricate structures in pursuit of incremental improvements, which dramatically increases the number of ranks and thus makes hyper-parameter selection laborious, especially in higher-order cases. In this paper, we propose a novel TN decomposition, dubbed tensor wheel (TW) decomposition, in which a high-order tensor is represented by a set of latent factors arranged in a specific wheel topology. This decomposition is constructed by analyzing the graph structure, aiming to characterize the complex interactions within the target tensor more accurately while keeping the number of hyper-parameters small, theoretically alleviating the above deficiencies. Furthermore, to investigate the potential of TW decomposition, we present one numerical application, tensor completion (TC), and develop an efficient proximal alternating minimization-based solving algorithm with guaranteed convergence. Experimental results show that the proposed method significantly outperforms other state-of-the-art tensor decomposition-based methods on synthetic and real-world data, demonstrating the merits of TW decomposition. The code is available at: https://github.com/zhongchengwu/code_TWDec.
Pages: 13
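
To make the wheel topology from the abstract concrete, below is a minimal numpy sketch that reconstructs an order-3 tensor from hypothetical TW factors. The names and shapes (body factors G1-G3 with ring ranks R_k and spoke ranks L_k, plus a central core C) are assumptions inferred from the abstract's description of a wheel graph, not the paper's actual notation or code.

```python
import numpy as np

# Minimal sketch under assumed shapes (not the paper's notation): an
# order-3 tensor of size I1 x I2 x I3 is represented by three "body"
# factors G_k of shape (R_k, I_k, L_k, R_{k+1}) connected in a ring
# (R_4 wraps back to R_1), each also linked by a "spoke" edge of size
# L_k to a central core C of shape (L_1, L_2, L_3) -- a wheel topology.
I = (4, 5, 6)   # mode sizes of the represented tensor
R = (2, 3, 2)   # ring (circle) ranks
L = (2, 2, 2)   # spoke ranks connecting each factor to the core

rng = np.random.default_rng(0)
G1 = rng.standard_normal((R[0], I[0], L[0], R[1]))
G2 = rng.standard_normal((R[1], I[1], L[1], R[2]))
G3 = rng.standard_normal((R[2], I[2], L[2], R[0]))
C = rng.standard_normal(L)

# Contract ring indices a, b, c and spoke indices p, q, s with the core;
# the open indices i, j, k form the reconstructed order-3 tensor.
X = np.einsum('aipb,bjqc,cksa,pqs->ijk', G1, G2, G3, C)
print(X.shape)  # (4, 5, 6)
```

For the TC application, factors like these would be fitted to the observed entries of an incomplete tensor, e.g., via the proximal alternating minimization scheme the abstract mentions; that solver is beyond this sketch.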