A low-rank and sparse enhanced Tucker decomposition approach for tensor completion

Cited by: 6
|
Authors
Pan, Chenjian [1 ,2 ]
Ling, Chen [2 ]
He, Hongjin [1 ]
Qi, Liqun [3 ]
Xu, Yanwei [4 ]
Affiliations
[1] Ningbo Univ, Sch Math & Stat, Ningbo 315211, Peoples R China
[2] Hangzhou Dianzi Univ, Sch Sci, Hangzhou 310018, Peoples R China
[3] Hong Kong Polytech Univ, Dept Appl Math, Kowloon, Hong Kong, Peoples R China
[4] 2012 Labs Huawei Tech Investment Co Ltd, Future Network Theory Lab, Shatin, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Tensor completion; Tucker decomposition; Nuclear norm; Internet traffic data; Image inpainting; Thresholding algorithm; Matrix factorization; Recovery;
DOI
10.1016/j.amc.2023.128432
Chinese Library Classification
O29 [Applied Mathematics];
Discipline code
070104;
Abstract
In this paper, we introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion. Our model includes a sparse regularization term that promotes a sparse core in the Tucker decomposition, which is beneficial for tensor data compression. Moreover, we impose low-rank regularization terms on the factor matrices of the Tucker decomposition to induce low-rankness of the tensor at a low computational cost. Numerically, we propose a customized splitting method with easy subproblems to solve the underlying model. Notably, our model can handle different types of real-world data sets, since it exploits the potential periodicity and inherent correlation properties appearing in tensors. A series of computational experiments on real-world data sets, including internet traffic data and color images, demonstrates that our model achieves higher recovery accuracy than many existing state-of-the-art matricization and tensorization approaches.
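The Tucker model underlying the abstract represents a tensor as a (here, sparse) core multiplied by a factor matrix along each mode, X ≈ G ×₁ U₁ ×₂ U₂ ×₃ U₃. The following NumPy sketch illustrates only that reconstruction step, not the paper's regularized completion solver; all function names are illustrative:

```python
import numpy as np

def mode_unfold(t, mode):
    """Mode-`mode` unfolding: move the chosen axis to the front and flatten the rest."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def mode_product(t, m, mode):
    """Multiply tensor `t` by matrix `m` along axis `mode` (the n-mode product)."""
    folded = m @ mode_unfold(t, mode)
    new_shape = (m.shape[0],) + tuple(s for i, s in enumerate(t.shape) if i != mode)
    return np.moveaxis(folded.reshape(new_shape), 0, mode)

def tucker_reconstruct(core, factors):
    """Form core ×_1 U_1 ×_2 U_2 ... for one factor matrix per mode."""
    t = core
    for mode, u in enumerate(factors):
        t = mode_product(t, u, mode)
    return t

# Tiny demo: a sparse 2x2x2 core with one nonzero entry and random 5x2 factors
# yield a rank-1 5x5x5 tensor -- the compression the abstract alludes to.
rng = np.random.default_rng(0)
core = np.zeros((2, 2, 2))
core[0, 0, 0] = 3.0
factors = [rng.standard_normal((5, 2)) for _ in range(3)]
full = tucker_reconstruct(core, factors)
```

A sparse core means few nonzero interaction terms survive, so the full tensor is stored via the small core plus thin factor matrices rather than all of its entries.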
Pages: 15