Tensor Completion via Fully-Connected Tensor Network Decomposition with Regularized Factors

Cited by: 31
Authors
Zheng, Yu-Bang [1,2]
Huang, Ting-Zhu [1]
Zhao, Xi-Le [1]
Zhao, Qibin [2,3]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Math Sci, Chengdu, Peoples R China
[2] RIKEN Ctr Adv Intelligence Project AIP, Tensor Learning Team, Tokyo, Japan
[3] Guangdong Univ Technol, Sch Automat, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Tensor decomposition; Tensor completion; Low-rankness; Image processing; Proximal alternating minimization; Rank approximation; Matrix factorization; Nonconvex; Recovery; Minimization;
DOI
10.1007/s10915-022-01841-8
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
The recently proposed fully-connected tensor network (FCTN) decomposition has a powerful ability to capture the low-rankness of tensors and has achieved great success in tensor completion. However, the FCTN decomposition-based method is highly sensitive to the choice of the FCTN-rank and cannot provide satisfactory results in recovering local details. In this paper, we propose a novel tensor completion model by introducing a factor-based regularization into the framework of the FCTN decomposition. The regularization makes the performance robust to the choice of the FCTN-rank and simultaneously enforces the global low-rankness and the local continuity of the target tensor. More specifically, by showing that the unfolding matrices of the FCTN factors can reasonably be assumed to be low-rank in the gradient domain and imposing a low-rank matrix factorization (LRMF) on them, the proposed model enhances the robustness to the choice of the FCTN-rank. By applying Tikhonov regularization to the LRMF factors, the proposed model promotes local continuity and preserves the local details of the target tensor. To solve the resulting optimization problem, we develop an efficient proximal alternating minimization (PAM)-based algorithm and theoretically establish its convergence. To reduce the running time of the algorithm, we design an automatic rank-increasing strategy. Numerical experiments demonstrate that the proposed method outperforms its competitors.
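To make the two ingredients named in the abstract concrete, the following is a minimal sketch, not the authors' implementation: it factorizes a matrix M as U V^T under Tikhonov regularization on U and V, and minimizes the objective block-wise with the proximal terms that characterize PAM, so each block update has a ridge-type closed form. In the paper this construction is applied to gradient-domain unfoldings of the FCTN factors inside the full completion model; here the function name pam_tikhonov_lrmf, the data M, and the parameters r, alpha, and rho are purely illustrative assumptions.

# Sketch (assumed, not from the paper): Tikhonov-regularized LRMF solved by PAM.
# Objective: 0.5*||M - U V^T||_F^2 + 0.5*alpha*(||U||_F^2 + ||V||_F^2),
# with proximal terms 0.5*rho*||U - U_k||_F^2 and 0.5*rho*||V - V_k||_F^2.
import numpy as np

def pam_tikhonov_lrmf(M, r, alpha=1e-2, rho=1e-3, iters=100):
    """Approximate M (m x n) by U @ V.T with U (m x r) and V (n x r)."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    I = np.eye(r)
    for _ in range(iters):
        # U-update: U = (M V + rho*U_k) (V^T V + (alpha + rho) I)^{-1}
        G = V.T @ V + (alpha + rho) * I          # symmetric r x r system matrix
        U = np.linalg.solve(G, (M @ V + rho * U).T).T
        # V-update (uses the new U): V = (M^T U + rho*V_k) (U^T U + (alpha + rho) I)^{-1}
        H = U.T @ U + (alpha + rho) * I
        V = np.linalg.solve(H, (M.T @ U + rho * V).T).T
    return U, V

if __name__ == "__main__":
    # Usage on a synthetic rank-4 matrix.
    A = np.random.default_rng(1).standard_normal((60, 4))
    B = np.random.default_rng(2).standard_normal((50, 4))
    M = A @ B.T
    U, V = pam_tikhonov_lrmf(M, r=4)
    print("relative error:", np.linalg.norm(M - U @ V.T) / np.linalg.norm(M))

With alpha, rho > 0 each subproblem is strongly convex and solvable in closed form, which is the property that PAM-type convergence analyses for this class of nonconvex block objectives rely on.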
Pages: 35