Fast Tensor Nuclear Norm for Structured Low-Rank Visual Inpainting

Cited by: 38
Authors
Xu, Honghui [1 ]
Zheng, Jianwei [1 ]
Yao, Xiaomin [1 ]
Feng, Yuchao [1 ]
Chen, Shengyong [2 ]
Affiliations
[1] Zhejiang Univ Technol, Coll Comp Sci & Technol, Hangzhou 310023, Peoples R China
[2] Tianjin Univ Technol, Coll Comp Sci & Engn, Tianjin 300384, Peoples R China
Keywords
Tensors; Visualization; Correlation; Optimization; Learning systems; Three-dimensional displays; Singular value decomposition; Visual inpainting; low-rank tensor completion; tensor nuclear norm (TNN); alternating direction method of multiplier; MATRIX; COMPLETION; DECOMPOSITION; SPARSE; SPACE; TRAIN;
DOI
10.1109/TCSVT.2021.3067022
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Codes
0808 ; 0809 ;
Abstract
Low-rank modeling has achieved great success in visual data completion. However, the low-rank assumption on the original visual data may hold only approximately, which leads to suboptimal recovery of the underlying details, especially when the missing rate is extremely high. In this paper, we go further by providing a detailed analysis of the rank distributions in Hankel-structured and clustered cases, and find that both non-local similarity and patch-based structuralization play a positive role. This motivates us to develop a new Hankel low-rank tensor recovery method that faithfully captures the underlying details at the cost of a slightly heavier computational burden. First, benefiting from the correlation across spectral bands and the smoothness of local spatial neighborhoods, we divide the visual data into overlapping 3D patches and group similar ones into clusters, exploiting the non-local similarity. Second, the 3D patches are individually mapped to structured Hankel tensors to better reveal the low-rank property of the image. Finally, we solve the tensor completion model via the well-known alternating direction method of multipliers (ADMM). Because the Hankelization operation inevitably expands the data size, we further propose a fast randomized skinny tensor singular value decomposition (rst-SVD) to accelerate the per-iteration running efficiency. Extensive experimental results on real-world datasets verify the superiority of our method over state-of-the-art visual inpainting approaches.
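The abstract describes the rst-SVD acceleration only at a high level. As a minimal sketch of the general idea, the Python snippet below computes a randomized skinny tensor SVD slice-wise in the Fourier domain under the standard t-product framework; it is not the authors' implementation, and the function name randomized_tsvd, the target rank r, and the oversampling parameter p are illustrative assumptions.

```python
import numpy as np

def randomized_tsvd(A, r, p=5):
    """Approximate skinny t-SVD of a 3-way tensor A (n1 x n2 x n3) using a
    per-frontal-slice randomized SVD in the Fourier domain (a generic sketch,
    not the paper's exact rst-SVD procedure)."""
    n1, n2, n3 = A.shape
    k = min(r + p, n1, n2)                      # sketch size = rank + oversampling
    Af = np.fft.fft(A, axis=2)                  # DFT along the third mode (t-product)
    G = np.random.randn(n2, k)                  # one Gaussian test matrix, shared by all slices
    Uf = np.zeros((n1, k, n3), dtype=complex)
    Sf = np.zeros((k, k, n3), dtype=complex)
    Vf = np.zeros((n2, k, n3), dtype=complex)
    for i in range(n3):                         # handle each frontal slice independently
        Ai = Af[:, :, i]
        Q, _ = np.linalg.qr(Ai @ G)             # approximate range of the slice
        B = Q.conj().T @ Ai                     # small k x n2 projected matrix
        Ub, s, Vh = np.linalg.svd(B, full_matrices=False)
        Uf[:, :, i] = Q @ Ub
        Sf[:, :, i] = np.diag(s)
        Vf[:, :, i] = Vh.conj().T
    # Back to the original domain; for real A the factors are (approximately) real
    # because the shared test matrix roughly preserves conjugate symmetry across slices.
    U = np.fft.ifft(Uf, axis=2).real
    S = np.fft.ifft(Sf, axis=2).real
    V = np.fft.ifft(Vf, axis=2).real
    return U, S, V                              # A is approximated by U * S * V^T under the t-product
```

In a TNN-based ADMM solver such a routine would presumably replace the exact t-SVD inside the singular value thresholding step, which is precisely where the size expansion caused by Hankelization makes a full decomposition expensive.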
Pages: 538-552
Number of pages: 15
Related Papers
50 records in total
  • [21] Nonlocal Low-Rank Tensor Completion for Visual Data
    Zhang, Lefei
    Song, Liangchen
    Du, Bo
    Zhang, Yipeng
    IEEE TRANSACTIONS ON CYBERNETICS, 2021, 51 (02) : 673 - 685
  • [22] Low-Rank Quaternion Tensor Completion for Color Video Inpainting via a Novel Factorization Strategy
    Qin, Zhenzhi
    Ming, Zhenyu
    Sun, Defeng
    Zhang, Liping
    MATHEMATICS OF COMPUTATION, 2024,
  • [23] Low-Rank Autoregressive Tensor Completion for Spatiotemporal Traffic Data Imputation
    Chen, Xinyu
    Lei, Mengying
    Saunier, Nicolas
    Sun, Lijun
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (08) : 12301 - 12310
  • [24] Multiresolution Low-Rank Tensor Formats
    Mickelin, Oscar
    Karaman, Sertac
    SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2020, 41 (03) : 1086 - 1114
  • [25] Optimality conditions for Tucker low-rank tensor optimization
    Luo, Ziyan
    Qi, Liqun
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2023, 86 (03) : 1275 - 1298
  • [26] Fast Randomized Tensor Singular Value Thresholding for Low-Rank Tensor Optimization
    Che, Maolin
    Wang, Xuezhong
    Wei, Yimin
    Zhao, Xile
    NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS, 2022, 29 (06)
  • [27] Structured Low-Rank Tensor Completion for IoT Spatiotemporal High-Resolution Sensing Data Reconstruction
    Zhang, Xiaoyue
    He, Jingfei
    Pan, XuanAng
    Chi, Yue
    Zhou, Yatong
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (05) : 8299 - 8310
  • [28] Low-rank high-order tensor recovery via joint transformed tensor nuclear norm and total variation regularization
    Luo, Xiaohu
    Ma, Weijun
    Wang, Wendong
    Zheng, Yuanshi
    Wang, Jianjun
    NEUROCOMPUTING, 2025, 624
  • [29] Multimodal Core Tensor Factorization and its Applications to Low-Rank Tensor Completion
    Zeng, Haijin
    Xue, Jize
    Luong, Hiap Q.
    Philips, Wilfried
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 7010 - 7024
  • [30] Low-Rank Tensor Completion Based on Self-Adaptive Learnable Transforms
    Wu, Tongle
    Gao, Bin
    Fan, Jicong
    Xue, Jize
    Woo, W. L.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (07) : 8826 - 8838