Low-rank tensor completion based on tensor train rank with partially overlapped sub-blocks and total variation

Cited by: 0
Authors
He, Jingfei [1 ]
Yang, Zezhong [1 ]
Zheng, Xunan [1 ]
Zhang, Xiaoyue [1 ]
Li, Ao [1 ]
Affiliations
[1] Hebei Univ Technol, Sch Elect & Informat Engn, Tianjin Key Lab Elect Mat & Devices, 5340 Xiping Rd, Tianjin 300401, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Low rank tensor completion; TT rank; Total variation; Partially overlapping sub-block; Tensor augmentation; MATRIX FACTORIZATION; NUCLEAR NORM; IMAGE; RECOVERY;
DOI
10.1016/j.image.2024.117193
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology and Communication Technology];
Subject classification code
0808 ; 0809 ;
Abstract
Recently, low-rank tensor completion methods based on the tensor train (TT) rank have achieved promising performance. Ket augmentation (KA) is commonly used in TT rank-based methods to improve performance by converting low-dimensional tensors into higher-dimensional ones. However, KA destroys the structure and image continuity of the original low-dimensional tensor, which introduces block artifacts. To tackle this issue, this paper proposes a low-rank tensor completion method based on the TT rank with tensor augmentation by partially overlapped sub-blocks (TAPOS) and total variation (TV). TAPOS preserves the image continuity of the original tensor and enhances the low-rankness of the generated higher-dimensional tensor, and a weighted de-augmentation method assigns different weights to the elements of the sub-blocks to further reduce block artifacts. To further alleviate block artifacts and improve reconstruction accuracy, TV regularization is incorporated into the TAPOS-based model to impose a piecewise-smooth prior. A parallel matrix decomposition method is introduced to estimate the TT rank and reduce the computational cost. Numerical experiments show that the proposed method outperforms existing state-of-the-art methods.
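The core idea of augmentation by partially overlapped sub-blocks and weighted de-augmentation can be sketched roughly as follows. This is an illustrative simplification, not the paper's actual TAPOS scheme: the block size, stride, and uniform averaging weights are assumptions for demonstration (the paper assigns non-uniform weights to sub-block elements), and the TT-rank completion step itself is omitted.

```python
import numpy as np

def augment_overlapped(img, block=4, stride=2):
    """Extract partially overlapped sub-blocks from a 2D array and stack
    them into a higher-order tensor (illustrative TAPOS-like step)."""
    H, W = img.shape
    blocks = []
    for i in range(0, H - block + 1, stride):
        for j in range(0, W - block + 1, stride):
            blocks.append(img[i:i + block, j:j + block])
    return np.stack(blocks)  # shape: (num_blocks, block, block)

def deaugment_weighted(blocks, shape, block=4, stride=2):
    """De-augmentation: accumulate overlapping sub-blocks and normalize by
    how often each pixel was covered (uniform weights for simplicity)."""
    acc = np.zeros(shape)
    wgt = np.zeros(shape)
    k = 0
    H, W = shape
    for i in range(0, H - block + 1, stride):
        for j in range(0, W - block + 1, stride):
            acc[i:i + block, j:j + block] += blocks[k]
            wgt[i:i + block, j:j + block] += 1.0
            k += 1
    return acc / wgt  # overlapped regions are averaged, smoothing block seams

img = np.arange(64, dtype=float).reshape(8, 8)
rec = deaugment_weighted(augment_overlapped(img), img.shape)
assert np.allclose(rec, img)  # round trip is exact when blocks are consistent
```

Because overlapping regions are blended rather than tiled edge-to-edge, pixels near sub-block boundaries receive contributions from several sub-blocks after completion, which is what suppresses the block artifacts that non-overlapping KA tiling produces.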
Pages: 10
Related papers
50 records in total
  • [21] Bengua, Johann A.; Phien, Ho N.; Hoang Duong Tuan; Do, Minh N. Efficient Tensor Completion for Color Image and Video Recovery: Low-Rank Tensor Train. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2017, 26(05): 2466-2479
  • [22] Qiu, Duo; Bai, Minru; Ng, Michael K.; Zhang, Xiongjun. Robust low-rank tensor completion via transformed tensor nuclear norm with total variation regularization. NEUROCOMPUTING, 2021, 435: 197-215
  • [23] Su, Xinhua; Ge, Huanmin; Liu, Zeting; Shen, Yanfei. Low-Rank tensor completion based on nonconvex regularization. SIGNAL PROCESSING, 2023, 212
  • [24] Jiang, Fei; Liu, Xiao-Yang; Lu, Hongtao; Shen, Ruimin. Anisotropic Total Variation Regularized Low-Rank Tensor Completion Based on Tensor Nuclear Norm for Color Image Inpainting. 2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2018: 1363-1367
  • [25] Derksen, Harm. On the equivalence between low-rank matrix completion and tensor rank. LINEAR & MULTILINEAR ALGEBRA, 2018, 66(04): 645-667
  • [26] Cheng, Miaomiao; Jing, Liping; Ng, Michael K. A Weighted Tensor Factorization Method for Low-rank Tensor Completion. 2019 IEEE FIFTH INTERNATIONAL CONFERENCE ON MULTIMEDIA BIG DATA (BIGMM 2019), 2019: 30-38
  • [27] Qiu, Yuning; Zhou, Guoxu; Zhao, Qibin; Xie, Shengli. Noisy Tensor Completion via Low-Rank Tensor Ring. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35(01): 1127-1141
  • [28] Gong, Xiao; Chen, Wei; Chen, Jie; Ai, Bo. Tensor Denoising Using Low-Rank Tensor Train Decomposition. IEEE SIGNAL PROCESSING LETTERS, 2020, 27: 1685-1689
  • [29] Zhang, Hongbing; Zheng, Bing. Low-Rank Tensor Completion via Tensor Joint Rank With Logarithmic Composite Norm. NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2025, 32(02)
  • [30] Liu, Zihuan; Lee, Cheuk Yin; Zhang, Heping. Tensor Quantile Regression with Low-Rank Tensor Train Estimation. ANNALS OF APPLIED STATISTICS, 2024, 18(02): 1294-1318