A random sampling algorithm for fully-connected tensor network decomposition with applications

Cited by: 1
Authors
Wang, Mengyu [1 ]
Cui, Honghua [2 ]
Li, Hanyu [1 ]
Affiliations
[1] Chongqing Univ, Coll Math & Stat, Chongqing 401331, Peoples R China
[2] Xiamen Univ, Wang Yanan Inst Studies Econ, Fujian 361005, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Fully-connected tensor network decomposition; Alternating least squares; Randomized algorithm; Leverage sampling; Tensor-on-vector regression; Multi-view subspace clustering; Nonnegative tensor approximation; Tensor completion
DOI
10.1007/s40314-024-02751-1
Chinese Library Classification
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
Fully-connected tensor network (FCTN) decomposition is a generalization of the popular tensor train and tensor ring decompositions and has been applied to various fields with great success. The standard method for computing this decomposition is the well-known alternating least squares (ALS). However, it is very expensive, especially for large-scale tensors. To reduce the cost, we propose an ALS-based randomized algorithm. Specifically, by defining a new tensor product called the subnetwork product and suitably adjusting the sizes of the FCTN factors, we first work out the structure of the coefficient matrices of the ALS subproblems arising in FCTN decomposition. Then, using this structure and the properties of the subnetwork product, we devise a randomized algorithm based on leverage sampling. This algorithm samples directly on the FCTN factors and hence avoids forming the full coefficient matrices of the ALS subproblems. We analyze the computational complexity of the algorithm and report its numerical performance. Experimental results show that it requires much less computation time than the deterministic ALS method to reach comparable accuracy. Furthermore, we apply the algorithm to four well-known problems, namely tensor-on-vector regression, multi-view subspace clustering, nonnegative tensor approximation, and tensor completion, with consistently good performance.
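The core idea the abstract describes, solving each ALS subproblem approximately by sampling rows of the (tall) coefficient matrix according to its leverage scores, can be illustrated on a generic overdetermined least squares problem. The sketch below is a minimal, self-contained illustration with assumed names and sizes, not the authors' implementation: in the actual algorithm the coefficient matrix is never formed, and the leverage scores are estimated from the FCTN factors themselves.

```python
# Minimal sketch of leverage-score sampling for min_x ||A x - b||_2,
# the building block behind the paper's randomized ALS updates.
# All names and sizes here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
m, n, s = 2000, 10, 200           # tall system; s = number of sampled rows

A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true + 0.01 * rng.standard_normal(m)

# Leverage scores: squared row norms of an orthonormal basis Q of range(A);
# they sum to n and measure each row's influence on the LS solution.
Q, _ = np.linalg.qr(A)
lev = np.sum(Q**2, axis=1)
p = lev / lev.sum()               # sampling probabilities

# Sample s rows with replacement and rescale each by 1/sqrt(s * p_i),
# so the sketched problem is an unbiased estimate of the full one.
idx = rng.choice(m, size=s, p=p)
scale = 1.0 / np.sqrt(s * p[idx])
A_s = scale[:, None] * A[idx]
b_s = scale * b[idx]

x_full = np.linalg.lstsq(A, b, rcond=None)[0]
x_samp = np.linalg.lstsq(A_s, b_s, rcond=None)[0]
rel_err = np.linalg.norm(x_samp - x_full) / np.linalg.norm(x_full)
print(rel_err)                    # small: sketch tracks the full solution
```

The saving comes from solving an s x n system instead of an m x n one; in the FCTN setting, m is a product of tensor dimensions, so avoiding the full matrix is what makes the method scale.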
Pages: 22