Tensor-Train Parameterization for Ultra Dimensionality Reduction

Times Cited: 0
Authors
Bai, Mingyuan [1 ]
Choy, S. T. Boris [1 ]
Song, Xin [1 ,2 ]
Gao, Junbin [1 ]
Affiliations
[1] Univ Sydney, Business Sch, Discipline Business Analyt, Camperdown, NSW 2006, Australia
[2] China Univ Geosci, Sch Comp Sci, Wuhan 430074, Peoples R China
Keywords
tensor; high-dimensional data; dimensionality reduction; locality preserving projections; robustness
DOI
10.1109/ICBK.2019.00011
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Dimensionality reduction is a long-standing yet crucial field in machine learning. Among dimensionality reduction techniques, locality preserving projections (LPP) is an important method that uses the graph structure of the data to mitigate sensitivity to outliers. However, the performance of LPP is still largely undermined by extreme outliers. Moreover, when the input data are matrices or tensors, LPP can only process them by flattening them into very long vectors, which destroys their structural information. LPP also assumes that the data dimension is smaller than the number of instances, so it is unsuitable for high-dimensional data analysis. The tensor-train decomposition, by contrast, captures these spatial relations efficiently and effectively. This paper therefore proposes a tensor-train parameterization for ultra dimensionality reduction (TTPUDR), in which the conventional LPP mapping is tensorized via tensor-trains, and the squared Frobenius norm in the traditional LPP objective is replaced by the Frobenius norm to enhance the robustness of the model. Manifold optimization is employed to learn the model. We evaluate TTPUDR on classification problems against state-of-the-art methods and classical baselines, and TTPUDR significantly outperforms them.
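The abstract's key building block is the tensor-train (TT) decomposition, which avoids flattening tensor inputs into long vectors. As a minimal illustrative sketch (not the authors' TTPUDR implementation), the following NumPy code implements the standard TT-SVD scheme: a sweep of truncated SVDs that factors a dense tensor into a chain of 3-way cores, with a hypothetical `max_rank` parameter capping the TT ranks.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Factor a dense tensor into tensor-train (TT) cores via the
    standard TT-SVD scheme: a sweep of truncated SVDs over the modes."""
    dims = tensor.shape
    cores = []
    r_prev = 1
    mat = np.asarray(tensor, dtype=float)
    for k in range(len(dims) - 1):
        # Unfold: rows group the processed modes, columns the remaining ones.
        mat = mat.reshape(r_prev * dims[k], -1)
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(S))            # cap the TT rank
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        mat = S[:r, None] * Vt[:r]           # carry the remainder forward
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back into a dense tensor (for checking)."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=1)  # contract the shared rank axis
    return out.squeeze(axis=(0, -1))
```

With `max_rank` large enough the reconstruction is exact; truncating it yields the low-rank, structure-preserving compression that methods such as TTPUDR build on.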
Pages: 17-24 (8 pages)