CANDECOMP/PARAFAC Decomposition of High-Order Tensors Through Tensor Reshaping

Cited by: 39
Authors
Phan, Anh-Huy [1 ]
Tichavsky, Petr [2 ]
Cichocki, Andrzej [1 ,3 ]
Affiliations
[1] RIKEN, Lab Adv Brain Signal Proc, Brain Sci Inst, Wako, Saitama 3510198, Japan
[2] Acad Sci Czech Republ, Inst Informat Theory & Automat, CR-18208 Prague, Czech Republic
[3] Polish Acad Sci, Syst Res Inst, PL-01447 Warsaw, Poland
Keywords
Tensor factorization; canonical decomposition; PARAFAC; ALS; structured CPD; tensor unfolding; Cramér-Rao induced bound (CRIB); Cramér-Rao lower bound (CRLB); UNDERDETERMINED MIXTURES; BLIND IDENTIFICATION; POLYADIC DECOMPOSITION; LEAST-SQUARES; UNIQUENESS; ALGORITHMS; PARAFAC; RANK; APPROXIMATION; COMPLEXITY;
DOI
10.1109/TSP.2013.2269046
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
In general, algorithms for order-3 CANDECOMP/PARAFAC (CP), also known as canonical polyadic decomposition (CPD), are easy to implement and can be extended to higher order CPD. Unfortunately, the algorithms become computationally demanding, and they are often not applicable to higher order and relatively large-scale tensors. In this paper, by exploiting the uniqueness of CPD and the relation between a tensor in Kruskal form and its unfolded tensor, we propose a fast approach to deal with this problem. Instead of directly factorizing the high-order data tensor, the method decomposes an unfolded tensor of lower order, e.g., an order-3 tensor. On the basis of the estimated order-3 tensor, a structured Kruskal tensor, of the same dimension as the data tensor, is then generated and decomposed to find the final solution using fast algorithms for the structured CPD. In addition, strategies to unfold tensors are suggested and practically verified in the paper.
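The core reshaping step the abstract describes can be illustrated with plain index arithmetic: merging two modes of a tensor into one mode does not change any entry, only where it sits. A minimal sketch (pure Python, no tensor library assumed; the shapes and values are illustrative, not from the paper):

```python
# Unfold an order-3 tensor of shape (I, J, K) into an order-2 array of
# shape (I*J, K) by merging the first two modes -- the same mode-grouping
# idea used to reduce a high-order tensor to a lower-order one before CPD.

I, J, K = 2, 3, 4

# Build a small tensor whose entry encodes its own multi-index.
tensor = [[[i * 100 + j * 10 + k for k in range(K)]
           for j in range(J)]
          for i in range(I)]

# Merge modes 1 and 2: entry (i, j, k) moves to row i*J + j, column k.
unfolded = [[tensor[r // J][r % J][k] for k in range(K)]
            for r in range(I * J)]

# Every value is preserved; only the indexing changed.
print(unfolded[1 * J + 2][3])  # same entry as tensor[1][2][3]
```

After a CP model of the unfolded (lower-order) tensor is found, the merged factor matrices are re-expanded to the original modes, which is where the structured Kruskal tensor in the abstract comes in.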
Pages: 4847 - 4860
Page count: 14
Related Papers
39 items in total
  • [21] A Blind Block Term Decomposition of High Order Tensors
    Cai, Yunfeng
    Li, Ping
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 6868 - 6876
  • [22] Linked Component Analysis From Matrices to High-Order Tensors: Applications to Biomedical Data
    Zhou, Guoxu
    Zhao, Qibin
    Zhang, Yu
    Adali, Tulay
    Xie, Shengli
    Cichocki, Andrzej
    PROCEEDINGS OF THE IEEE, 2016, 104 (02) : 310 - 331
  • [23] An Optimal High-Order Tensor Method for Convex Optimization
    Jiang, Bo
    Wang, Haoyue
    Zhang, Shuzhong
    MATHEMATICS OF OPERATIONS RESEARCH, 2021, 46 (04) : 1390 - 1412
  • [24] High-order tensor completion via gradient-based optimization under tensor train format
    Yuan, Longhao
    Zhao, Qibin
    Gui, Lihua
    Cao, Jianting
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2019, 73 : 53 - 61
  • [25] High-order sum-of-squares structured tensors: theory and applications
    Chen, Haibin
    Wang, Yiju
    Zhou, Guanglu
    FRONTIERS OF MATHEMATICS IN CHINA, 2020, 15 (02) : 255 - 284
  • [26] Symmetric rank-1 approximation of symmetric high-order tensors
    Wu, Leqin
    Liu, Xin
    Wen, Zaiwen
    OPTIMIZATION METHODS & SOFTWARE, 2020, 35 (02): : 416 - 438
  • [27] Optimal Sparse Singular Value Decomposition for High-Dimensional High-Order Data
    Zhang, Anru
    Han, Rungang
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2019, 114 (528) : 1708 - 1725
  • [28] High-order tensor estimation via trains of coupled third-order CP and Tucker decompositions
    Zniyed, Yassine
    Boyer, Remy
    de Almeida, Andre L. F.
    Favier, Gerard
    LINEAR ALGEBRA AND ITS APPLICATIONS, 2020, 588 : 304 - 337
  • [29] Optimal High-Order Tensor SVD via Tensor-Train Orthogonal Iteration
    Zhou, Yuchen
    Zhang, Anru R.
    Zheng, Lili
    Wang, Yazhen
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2022, 68 (06) : 3991 - 4019
  • [30] Efficient evaluation of high-order moments and cumulants in tensor network states
    West, Colin G.
    Garcia-Saez, Artur
    Wei, Tzu-Chieh
    PHYSICAL REVIEW B, 2015, 92 (11)