Low-rank tensor completion based on tensor train rank with partially overlapped sub-blocks and total variation

Cited by: 0
Authors
He, Jingfei [1 ]
Yang, Zezhong [1 ]
Zheng, Xunan [1 ]
Zhang, Xiaoyue [1 ]
Li, Ao [1 ]
Affiliations
[1] Hebei Univ Technol, Sch Elect & Informat Engn, Tianjin Key Lab Elect Mat & Devices, 5340 Xiping Rd, Tianjin 300401, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Low rank tensor completion; TT rank; Total variation; Partially overlapping sub-block; Tensor augmentation; MATRIX FACTORIZATION; NUCLEAR NORM; IMAGE; RECOVERY;
DOI
10.1016/j.image.2024.117193
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Recently, low-rank tensor completion methods based on tensor train (TT) rank have achieved promising performance. Ket augmentation (KA) is commonly used in TT rank-based methods to improve performance by converting low-dimensional tensors into higher-dimensional ones. However, KA destroys the structure and image continuity of the original low-dimensional tensor, which introduces block artifacts. To tackle this issue, this paper proposes a low-rank tensor completion method based on TT rank with tensor augmentation by partially overlapped sub-blocks (TAPOS) and total variation (TV). TAPOS preserves the image continuity of the original tensor and enhances the low-rankness of the generated higher-dimensional tensor, and a weighted de-augmentation scheme assigns different weights to the elements of the sub-blocks to further reduce block artifacts. To further suppress block artifacts and improve reconstruction accuracy, a TV term is added to the TAPOS-based model as a piecewise-smooth prior. A parallel matrix decomposition method is used to estimate the TT rank and reduce the computational cost. Numerical experiments show that the proposed method outperforms existing state-of-the-art methods.
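Two mechanisms in the abstract lend themselves to a short illustration: augmentation by partially overlapped sub-blocks with weighted de-augmentation, and the TV prior. Below is a minimal NumPy sketch of these ideas; the function names (tapos_augment, tapos_deaugment, tv_aniso), the block/overlap sizes, and the uniform averaging weights are illustrative assumptions, not the paper's actual scheme, which assigns non-uniform weights and couples the de-augmentation with TT rank minimization.

```python
import numpy as np

def tapos_augment(img, block=8, overlap=2):
    # Split a 2-D array into partially overlapped sub-blocks; stacking
    # the sub-blocks yields a higher-order tensor for TT-based completion.
    step = block - overlap
    H, W = img.shape
    blocks, coords = [], []
    for i in range(0, H - block + 1, step):
        for j in range(0, W - block + 1, step):
            blocks.append(img[i:i + block, j:j + block])
            coords.append((i, j))
    return np.stack(blocks), coords  # shape: (n_blocks, block, block)

def tapos_deaugment(blocks, coords, shape):
    # De-augmentation: every pixel is the average of all sub-block
    # estimates covering it (uniform weights here; the paper uses a
    # non-uniform weighting to further reduce block artifacts).
    block = blocks.shape[1]
    out = np.zeros(shape)
    weight = np.zeros(shape)
    for b, (i, j) in zip(blocks, coords):
        out[i:i + block, j:j + block] += b
        weight[i:i + block, j:j + block] += 1.0
    return out / np.maximum(weight, 1.0)

def tv_aniso(x):
    # Anisotropic total variation of a 2-D array: the sum of absolute
    # differences between neighboring pixels, i.e. the piecewise-smooth prior.
    return np.abs(np.diff(x, axis=0)).sum() + np.abs(np.diff(x, axis=1)).sum()

# Round trip on fully observed data: covered pixels are recovered exactly.
img = np.random.rand(64, 64)
blocks, coords = tapos_augment(img)
rec = tapos_deaugment(blocks, coords, img.shape)
assert np.allclose(rec[:62, :62], img[:62, :62])
```

Because the sub-blocks overlap, every interior pixel is estimated by several blocks, so weighting those estimates smooths the boundaries that disjoint KA blocks would otherwise leave as artifacts.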
Pages: 10
Related Papers
50 records in total
  • [31] Yu, Gaohang; Wan, Shaochun; Ling, Chen; Qi, Liqun; Xu, Yanwei. Tensor Train Factorization with Spatio-temporal Smoothness for Streaming Low-rank Tensor Completion. FRONTIERS OF MATHEMATICS, 2024, 19(05): 933-959.
  • [32] Zhang, Tianheng; Zhao, Jianli; Sun, Qiuxia; Zhang, Bin; Chen, Jianjian; Gong, Maoguo. Low-rank tensor completion via combined Tucker and Tensor Train for color image recovery. APPLIED INTELLIGENCE, 2022, 52(07): 7761-7776.
  • [33] Chen, Chuan; Wu, Zhe-Bin; Chen, Zi-Tai; Zheng, Zi-Bin; Zhang, Xiong-Jun. Auto-weighted robust low-rank tensor completion via tensor-train. INFORMATION SCIENCES, 2021, 567: 100-115.
  • [34] Zhang, Xiongjun. A Nonconvex Relaxation Approach to Low-Rank Tensor Completion. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30(06): 1659-1671.
  • [35] Zhang, Lefei; Song, Liangchen; Du, Bo; Zhang, Yipeng. Nonlocal Low-Rank Tensor Completion for Visual Data. IEEE TRANSACTIONS ON CYBERNETICS, 2021, 51(02): 673-685.
  • [36] Zhao, Xi-Le; Nie, Xin; Zheng, Yu-Bang; Ji, Teng-Yu; Huang, Ting-Zhu. Low-Rank Tensor Completion via Tensor Nuclear Norm With Hybrid Smooth Regularization. IEEE ACCESS, 2019, 7: 131888-131901.
  • [37] Zhang, Anru. CROSS: EFFICIENT LOW-RANK TENSOR COMPLETION. ANNALS OF STATISTICS, 2019, 47(02): 936-964.
  • [38] Huang, Huyan; Liu, Yipeng; Long, Zhen; Zhu, Ce. Robust Low-Rank Tensor Ring Completion. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2020, 6: 1117-1126.
  • [39] Wang, Yao; Han, Yishan; Wang, Kaidong; Zhao, Xi-Le. Total variation regularized nonlocal low-rank tensor train for spectral compressive imaging. SIGNAL PROCESSING, 2022, 195.
  • [40] Xue, Jize; Zhao, Yongqiang; Huang, Shaoguang; Liao, Wenzhi; Chan, Jonathan Cheung-Wai; Kong, Seong G. Multilayer Sparsity-Based Tensor Decomposition for Low-Rank Tensor Completion. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33(11): 6916-6930.