Low-rank tensor completion based on tensor train rank with partially overlapped sub-blocks and total variation

Cited by: 0
Authors
He, Jingfei [1 ]
Yang, Zezhong [1 ]
Zheng, Xunan [1 ]
Zhang, Xiaoyue [1 ]
Li, Ao [1 ]
Affiliations
[1] Hebei Univ Technol, Sch Elect & Informat Engn, Tianjin Key Lab Elect Mat & Devices, 5340 Xiping Rd, Tianjin 300401, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Low rank tensor completion; TT rank; Total variation; Partially overlapping sub-block; Tensor augmentation; MATRIX FACTORIZATION; NUCLEAR NORM; IMAGE; RECOVERY;
DOI
10.1016/j.image.2024.117193
CLC Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Recently, low-rank tensor completion methods based on tensor train (TT) rank have achieved promising performance. Ket augmentation (KA) is commonly used in TT rank-based methods to improve performance by converting low-dimensional tensors into higher-dimensional ones. However, KA destroys the structure and image continuity of the original low-dimensional tensor, which introduces block artifacts. To tackle this issue, this paper proposes a low-rank tensor completion method based on TT rank with tensor augmentation by partially overlapped sub-blocks (TAPOS) and total variation (TV). TAPOS preserves the image continuity of the original tensor and enhances the low-rankness of the generated higher-dimensional tensor, and a weighted de-augmentation step assigns different weights to the elements of the sub-blocks to further reduce block artifacts. To further alleviate block artifacts and improve reconstruction accuracy, TV is introduced into the TAPOS-based model to impose a piecewise-smoothness prior. A parallel matrix decomposition method is used to estimate the TT rank and reduce the computational cost. Numerical experiments show that the proposed method outperforms existing state-of-the-art methods.
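The core idea behind augmentation by partially overlapped sub-blocks and weighted de-augmentation can be sketched in a minimal 2-D form. This is an illustrative NumPy sketch only: the function names, block size, step, and uniform averaging weights are assumptions for exposition, not the paper's exact scheme, which reshapes the sub-blocks into a higher-dimensional tensor and may use non-uniform weights.

```python
import numpy as np

def to_overlapping_blocks(img, block, step):
    """Split a 2-D array into partially overlapping sub-blocks.

    Returns an array of shape (n_blocks, block, block). Because
    consecutive blocks share `block - step` rows/columns, local image
    continuity is preserved across block boundaries."""
    H, W = img.shape
    rows = range(0, H - block + 1, step)
    cols = range(0, W - block + 1, step)
    return np.stack([img[r:r + block, c:c + block]
                     for r in rows for c in cols])

def from_overlapping_blocks(blocks, shape, block, step):
    """Weighted de-augmentation: each pixel is the weighted average of
    its copies across all blocks that cover it (uniform weights here),
    which suppresses artifacts at sub-block boundaries."""
    H, W = shape
    out = np.zeros(shape)
    weight = np.zeros(shape)
    idx = 0
    for r in range(0, H - block + 1, step):
        for c in range(0, W - block + 1, step):
            out[r:r + block, c:c + block] += blocks[idx]
            weight[r:r + block, c:c + block] += 1.0
            idx += 1
    return out / weight

# Round trip on a small 6x6 image with 4x4 blocks and step 2:
img = np.arange(36, dtype=float).reshape(6, 6)
blocks = to_overlapping_blocks(img, block=4, step=2)      # 4 blocks
rec = from_overlapping_blocks(blocks, img.shape, 4, 2)
assert np.allclose(rec, img)  # exact for uncorrupted data
```

In a completion pipeline, the low-rank recovery would run on (a tensorized form of) `blocks` between the two calls; the averaging in de-augmentation is what smooths disagreements between overlapping estimates.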
Pages: 10
Related Papers
50 records in total
  • [1] Low-rank tensor completion based on tensor train rank with partially overlapped sub-blocks
    He, Jingfei
    Zheng, Xunan
    Gao, Peng
    Zhou, Yatong
    SIGNAL PROCESSING, 2022, 190
  • [2] Low-Rank Tensor Completion Using Matrix Factorization Based on Tensor Train Rank and Total Variation
    Ding, Meng
    Huang, Ting-Zhu
    Ji, Teng-Yu
    Zhao, Xi-Le
    Yang, Jing-Hua
    JOURNAL OF SCIENTIFIC COMPUTING, 2019, 81 (02): 941-964
  • [3] Low-Rank Tensor Completion by Approximating the Tensor Average Rank
    Wang, Zhanliang
    Dong, Junyu
    Liu, Xinguo
    Zeng, Xueying
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021: 4592-4600
  • [4] Robust Low-Rank and Sparse Tensor Decomposition for Low-Rank Tensor Completion
    Shi, Yuqing
    Du, Shiqiang
    Wang, Weilan
    PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021: 7138-7143
  • [5] Tensor Factorization for Low-Rank Tensor Completion
    Zhou, Pan
    Lu, Canyi
    Lin, Zhouchen
    Zhang, Chao
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (03): 1152-1163
  • [6] Tensor completion using total variation and low-rank matrix factorization
    Ji, Teng-Yu
    Huang, Ting-Zhu
    Zhao, Xi-Le
    Ma, Tian-Hui
    Liu, Gang
    INFORMATION SCIENCES, 2016, 326: 243-257
  • [7] Low-Rank Tensor Completion with Total Variation for Visual Data Inpainting
    Li, Xutao
    Ye, Yunming
    Xu, Xiaofei
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 2210-2216
  • [8] A Low-Rank Total-Variation Regularized Tensor Completion Algorithm
    Song, Liangchen
    Du, Bo
    Zhang, Lefei
    Zhang, Liangpei
    COMPUTER VISION, PT II, 2017, 772: 311-322
  • [9] Tensor Completion using Low-Rank Tensor Train Decomposition by Riemannian Optimization
    Wang, Junli
    Zhao, Guangshe
    Wang, Dingheng
    Li, Guoqi
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019: 3380-3384