TENSOR RANK AND THE ILL-POSEDNESS OF THE BEST LOW-RANK APPROXIMATION PROBLEM

Cited by: 587
Authors
de Silva, Vin [2 ]
Lim, Lek-Heng [1 ]
Affiliations
[1] Stanford Univ, Inst Computat & Math Engn, Stanford, CA 94305 USA
[2] Pomona Coll, Dept Math, Claremont, CA 91711 USA
Keywords
numerical multilinear algebra; tensors; multidimensional arrays; multiway arrays; tensor rank; tensor decompositions; low-rank tensor approximations; hyperdeterminants; Eckart-Young theorem; principal component analysis; PARAFAC; CANDECOMP; Bregman divergence of tensors;
DOI
10.1137/06066518X
Chinese Library Classification
O29 [Applied Mathematics]
Discipline classification code
070104
Abstract
There has been continued interest in seeking a theorem describing optimal low-rank approximations to tensors of order 3 or higher that parallels the Eckart-Young theorem for matrices. In this paper, we argue that the naive approach to this problem is doomed to failure because, unlike matrices, tensors of order 3 or higher can fail to have best rank-r approximations. The phenomenon is much more widespread than one might suspect: examples of this failure can be constructed over a wide range of dimensions, orders, and ranks, regardless of the choice of norm (or even Bregman divergence). Moreover, we show that in many instances these counterexamples have positive volume: they cannot be regarded as isolated phenomena. In one extreme case, we exhibit a tensor space in which no rank-3 tensor has an optimal rank-2 approximation. The notable exceptions to this misbehavior are rank-1 tensors and order-2 tensors (i.e., matrices). In a more positive spirit, we propose a natural way of overcoming the ill-posedness of the low-rank approximation problem, by using weak solutions when true solutions do not exist. For this to work, it is necessary to characterize the set of weak solutions, and we do this in the case of rank 2, order 3 (in arbitrary dimensions). In our work we emphasize the importance of closely studying concrete low-dimensional examples as a first step toward more general results. To this end, we present a detailed analysis of equivalence classes of 2 × 2 × 2 tensors, and we develop methods for extending results upward to higher orders and dimensions. Finally, we link our work to existing studies of tensors from an algebraic geometric point of view. The rank of a tensor can in theory be given a semialgebraic description; in other words, it can be determined by a system of polynomial inequalities. We study some of these polynomials in cases of interest to us; in particular, we make extensive use of the hyperdeterminant on R^(2×2×2).
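The central failure mode the abstract describes can be checked numerically. The sketch below (not part of the record; the names `outer3`, `W`, and `A_n` are my own) uses the classic example of a rank-3 order-3 tensor that is a limit of rank-2 tensors, so the infimum of the rank-2 approximation error is 0 yet is never attained by any rank-2 tensor:

```python
import numpy as np

def outer3(a, b, c):
    # rank-1 order-3 tensor a (x) b (x) c
    return np.einsum('i,j,k->ijk', a, b, c)

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])

# W = x(x)x(x)y + x(x)y(x)x + y(x)x(x)x is a rank-3 tensor in R^(2x2x2).
W = outer3(x, x, y) + outer3(x, y, x) + outer3(y, x, x)

# A_n = n*(x + y/n)^(x)3 - n*x^(x)3 is a sum of two rank-1 terms (rank <= 2)
# for every n, yet ||W - A_n|| -> 0 as n grows: no best rank-2 approximation
# of W exists, since the infimum 0 would force W itself to have rank <= 2.
errs = []
for n in [1, 10, 100, 1000]:
    u = x + y / n
    A_n = n * outer3(u, u, u) - n * outer3(x, x, x)
    errs.append(np.linalg.norm(W - A_n))
    print(n, errs[-1])
```

Expanding the product shows W - A_n = -(x⊗y⊗y + y⊗x⊗y + y⊗y⊗x)/n - y⊗y⊗y/n², so the Frobenius error is sqrt(3/n² + 1/n⁴) and shrinks like 1/n.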
Pages: 1084-1127
Page count: 44