Tensor-tensor algebra for optimal representation and compression of multiway data

Cited by: 59
Authors
Kilmer, Misha E. [1 ]
Horesh, Lior [2 ]
Avron, Haim [3 ]
Newman, Elizabeth [4 ]
Affiliations
[1] Tufts Univ, Dept Math, Medford, MA 02155 USA
[2] IBM Res, Math AI, Yorktown Hts, NY 10598 USA
[3] Tel Aviv Univ, Sch Math Sci, IL-6997801 Tel Aviv, Israel
[4] Emory Univ, Dept Math, Atlanta, GA 30322 USA
Keywords
tensor; compression; multiway data; SVD; rank; factorization
DOI
10.1073/pnas.2015851118
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
With the advent of machine learning and its overarching pervasiveness, it is imperative to devise ways to represent large datasets efficiently while distilling intrinsic features necessary for subsequent analysis. The primary workhorse for data dimensionality reduction and feature extraction has been the matrix singular value decomposition (SVD), which presupposes that the data have been arranged in matrix format. A primary goal of this study is to show that high-dimensional datasets are more compressible when treated as tensors (i.e., multiway arrays) and compressed via tensor-SVDs under the tensor-tensor product construct and its generalizations. We begin by proving Eckart-Young optimality results for families of tensor-SVDs under two different truncation strategies. Since such optimality properties can be proven in both matrix and tensor-based algebras, a fundamental question arises: does the tensor construct subsume the matrix construct in terms of representation efficiency? The answer is positive, as we prove by showing that a tensor-tensor representation of an equal-dimensional spanning space can be superior to its matrix counterpart. We then use these optimality results to investigate how the compressed representation provided by the truncated tensor-SVD relates, both theoretically and empirically, to its two closest tensor-based analogs: the truncated higher-order SVD (HOSVD) and the truncated tensor-train SVD.
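The tensor-SVD discussed above is defined under the tensor-tensor product (t-product), in which third-order tensors are multiplied slice by slice after a discrete Fourier transform along the third mode; the paper's framework generalizes this transform to other invertible linear maps. As a minimal sketch of the classical DFT-based case only (this is not the authors' code, and the names tprod, ttranspose, and tsvd are our own), the following NumPy fragment computes a truncated t-SVD and reconstructs a tensor of low tubal rank:

    import numpy as np

    def tprod(A, B):
        # t-product of A (n1 x p x n3) with B (p x n2 x n3): slice-wise
        # matrix products in the Fourier domain along the third mode.
        Ah, Bh = np.fft.fft(A, axis=2), np.fft.fft(B, axis=2)
        Ch = np.einsum('ipk,pjk->ijk', Ah, Bh)
        return np.real(np.fft.ifft(Ch, axis=2))

    def ttranspose(A):
        # t-transpose: transpose each frontal slice and reverse the order
        # of slices 2..n3, so that (A * B)^T = B^T * A^T under tprod.
        At = np.transpose(A, (1, 0, 2))
        return np.concatenate([At[:, :, :1], At[:, :, :0:-1]], axis=2)

    def tsvd(A, k):
        # Truncated t-SVD (assumes k <= min(n1, n2)): a rank-k matrix SVD
        # of each Fourier-domain slice; conjugate symmetry is enforced so
        # the back-transformed factors are real for real input A.
        n1, n2, n3 = A.shape
        Ah = np.fft.fft(A, axis=2)
        Uh = np.empty((n1, k, n3), dtype=complex)
        Sh = np.zeros((k, k, n3), dtype=complex)
        Vh = np.empty((n2, k, n3), dtype=complex)
        half = n3 // 2 + 1
        for i in range(half):
            U, s, VH = np.linalg.svd(Ah[:, :, i], full_matrices=False)
            Uh[:, :, i], Vh[:, :, i] = U[:, :k], VH[:k, :].conj().T
            np.fill_diagonal(Sh[:, :, i], s[:k])
        for i in range(half, n3):  # mirror the remaining slices
            Uh[:, :, i] = Uh[:, :, n3 - i].conj()
            Sh[:, :, i] = Sh[:, :, n3 - i]
            Vh[:, :, i] = Vh[:, :, n3 - i].conj()
        ifft = lambda X: np.real(np.fft.ifft(X, axis=2))
        return ifft(Uh), ifft(Sh), ifft(Vh)

    # Usage: a tensor of exact tubal rank 5 is recovered to machine precision.
    rng = np.random.default_rng(0)
    A = tprod(rng.standard_normal((40, 5, 8)), rng.standard_normal((5, 30, 8)))
    U, S, V = tsvd(A, k=5)
    Ak = tprod(tprod(U, S), ttranspose(V))
    print(np.linalg.norm(A - Ak) / np.linalg.norm(A))  # ~1e-15

Because the DFT preserves the Frobenius norm up to a constant and each transform-domain slice receives its own optimal rank-k matrix truncation, the truncated t-SVD inherits a matrix-style Eckart-Young guarantee; this is the flavor of optimality result the abstract refers to.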
Pages: 12
References
27 entries in total
  • [1] Hitchcock, F. L., 1927, The Expression of a Tensor or a Polyadic as a Sum of Products, J. Math. Phys. Camb., DOI: 10.1002/sapm192761164
  • [2] Hackbusch, W., 2012, Tensor Spaces and Numerical Tensor Calculus, Springer
  • [3] Ballester-Ripoll, R., 2018, TTHRESH: Tensor Compression for Multidimensional Visual Data
  • [4] Analysis of individual differences in multidimensional scaling via an N-way generalization of "Eckart-Young" decomposition
    Carroll, J. D.
    Chang, J. J.
    [J]. PSYCHOMETRIKA, 1970, 35 (03) : 283 - 319
  • [5] A multilinear singular value decomposition
    De Lathauwer, L
    De Moor, B
    Vandewalle, J
    [J]. SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2000, 21 (04) : 1253 - 1278
  • [6] The approximation of one matrix by another of lower rank
    Eckart, Carl
    Young, Gale
    [J]. PSYCHOMETRIKA, 1936, 1 (03) : 211 - 218
  • [7] 5D seismic data completion and denoising using a novel class of tensor decompositions
    Ely, Gregory
    Aeron, Shuchin
    Hao, Ning
    Kilmer, Misha E.
    [J]. GEOPHYSICS, 2015, 80 (04) : V83 - V95
  • [8] From few to many: Illumination cone models for face recognition under variable lighting and pose
    Georghiades, AS
    Belhumeur, PN
    Kriegman, DJ
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2001, 23 (06) : 643 - 660
  • [9] Hierarchical singular value decomposition of tensors
    Grasedyck, Lars
    [J]. SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2010, 31 (04) : 2029 - 2054
  • [10] Facial Recognition Using Tensor-Tensor Decompositions
    Hao, Ning
    Kilmer, Misha E.
    Braman, Karen
    Hoover, Randy C.
    [J]. SIAM JOURNAL ON IMAGING SCIENCES, 2013, 6 (01) : 437 - 463