A High-Order Tensor Completion Algorithm Based on Fully-Connected Tensor Network Weighted Optimization

Cited by: 0
Authors
Yang, Peilin [1 ]
Huang, Yonghui [1 ]
Qiu, Yuning [1 ]
Sun, Weijun [1 ]
Zhou, Guoxu [1 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China
Source
PATTERN RECOGNITION AND COMPUTER VISION, PT I, PRCV 2022 | 2022 / Vol. 13534
Keywords
FCTN-WOPT; Tensor decomposition; Tensor completion; Deep learning; Gradient descent; RANK;
DOI
10.1007/978-3-031-18907-4_32
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Tensor completion aims to recover the missing entries of partially observed data and is a topic of wide interest in deep learning and signal processing. Among higher-order tensor decomposition algorithms, the recently proposed fully-connected tensor network (FCTN) decomposition is the most advanced. In this paper, by leveraging the superior expressive power of the FCTN decomposition, we propose a new tensor completion method named fully-connected tensor network weighted optimization (FCTN-WOPT). The algorithm composes the completed tensor from factors initialized by the FCTN decomposition. We build a loss function that couples the weight tensor, the completed tensor, and the incomplete tensor, and then update the factors with the L-BFGS gradient-based algorithm, which reduces memory usage and speeds up iterations. Finally, we test the completion on synthetic data and real data (both image and video data); the results show the advanced performance of our FCTN-WOPT when applied to higher-order tensor completion.
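The weighted-optimization idea in the abstract can be sketched for the simplest (3rd-order) FCTN case: factors connected pairwise through ranks (r12, r13, r23) are contracted into a full tensor, and a loss over only the observed entries (selected by a 0/1 weight tensor) is minimized with L-BFGS. This is an illustrative sketch, not the authors' implementation; the function names, default ranks, and initialization scale are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def fctn_reconstruct(G1, G2, G3):
    # Contract three FCTN factors along their pairwise shared ranks:
    # X[i,j,k] = sum_{a,b,c} G1[i,a,b] * G2[a,j,c] * G3[b,c,k]
    return np.einsum('iab,ajc,bck->ijk', G1, G2, G3)

def fctn_wopt(T, W, ranks=(2, 2, 2), maxiter=300, seed=0):
    """Weighted FCTN completion sketch for a 3rd-order tensor.

    T     : observed tensor (missing entries may hold any value)
    W     : 0/1 weight tensor, 1 where T is observed
    ranks : (r12, r13, r23) FCTN ranks (illustrative defaults)
    """
    I, J, K = T.shape
    r12, r13, r23 = ranks
    shapes = [(I, r12, r13), (r12, J, r23), (r13, r23, K)]
    rng = np.random.default_rng(seed)
    x0 = np.concatenate([0.1 * rng.standard_normal(s).ravel() for s in shapes])

    def unpack(x):
        factors, pos = [], 0
        for s in shapes:
            n = int(np.prod(s))
            factors.append(x[pos:pos + n].reshape(s))
            pos += n
        return factors

    def fun(x):
        G1, G2, G3 = unpack(x)
        # Weighted residual: only observed entries contribute to the loss.
        R = W * (fctn_reconstruct(G1, G2, G3) - T)
        f = 0.5 * np.sum(R ** 2)
        # Analytic gradients: contract the residual with the other two factors.
        g1 = np.einsum('ijk,ajc,bck->iab', R, G2, G3)
        g2 = np.einsum('ijk,iab,bck->ajc', R, G1, G3)
        g3 = np.einsum('ijk,iab,ajc->bck', R, G1, G2)
        return f, np.concatenate([g.ravel() for g in (g1, g2, g3)])

    res = minimize(fun, x0, jac=True, method='L-BFGS-B',
                   options={'maxiter': maxiter})
    return fctn_reconstruct(*unpack(res.x))
```

L-BFGS is used here for the same reason the abstract gives: it only stores a few recent gradient pairs instead of a full Hessian, so memory stays modest while convergence is much faster than plain gradient descent.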
Pages: 411 - 422
Page count: 12