Large Margin Low Rank Tensor Analysis

Cited by: 18
Authors
Zhong, Guoqiang [1]
Cheriet, Mohamed [1]
Affiliations
[1] Ecole Technol Super, Synchromedia Lab Multimedia Commun Telepresence, Montreal, PQ H3C 1K3, Canada
Keywords
Dimensionality reduction; Discriminant analysis; Component analysis
DOI
10.1162/NECO_a_00570
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We present a supervised model for tensor dimensionality reduction called large margin low rank tensor analysis (LMLRTA). In contrast to traditional vector-representation-based dimensionality reduction methods, LMLRTA can take tensors of any order as input. Moreover, unlike previous tensor dimensionality reduction methods, which can learn only low-dimensional embeddings with an a priori specified dimensionality, LMLRTA automatically and jointly learns the dimensionality and the low-dimensional representations from the data. LMLRTA delivers low rank projection matrices while encouraging data of the same class to be close and data of different classes to be separated by a large margin in the low-dimensional tensor space. LMLRTA can be optimized using an iterative fixed-point continuation algorithm, which is guaranteed to converge to a locally optimal solution of the optimization problem. We evaluate LMLRTA on an object recognition application, where the data are represented as 2D tensors, and on a face recognition application, where the data are represented as 3D tensors. Experimental results show the superiority of LMLRTA over state-of-the-art approaches.
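To make the two mechanisms in the abstract concrete, the following is a minimal NumPy sketch, my own illustration rather than the authors' implementation: mode-k products give the multilinear projection that lets the method accept tensors of any order, and singular-value shrinkage is the step by which a fixed-point continuation scheme drives a projection matrix toward low rank, so the output dimensionality is learned rather than fixed in advance. The function names are hypothetical, and the margin-loss gradient is a random placeholder standing in for the paper's actual objective.

```python
import numpy as np

def mode_k_product(X, U, k):
    # Contract mode k of tensor X with the columns of matrix U; the
    # new mode appears first in the tensordot result, so move it back
    # to position k. One such product per mode implements a full
    # multilinear projection of a tensor of any order.
    Xk = np.tensordot(U, X, axes=(1, k))
    return np.moveaxis(Xk, 0, k)

def shrink_singular_values(P, tau):
    # Soft-threshold the singular values of P by tau. Iterating this
    # inside a gradient scheme (fixed-point continuation) zeroes the
    # small singular values, so the rank of each projection matrix,
    # and with it the output dimensionality, is learned from the data.
    U, s, Vt = np.linalg.svd(P, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

# Toy usage: one FPC-style update of a mode-0 projection matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((32, 32, 3))  # a 3rd-order tensor, e.g. a color image
P = rng.standard_normal((10, 32))     # mode-0 projection, 32 -> at most 10 dims
grad = rng.standard_normal(P.shape)   # placeholder for the margin-loss gradient
step, tau = 0.1, 0.5
P = shrink_singular_values(P - step * grad, step * tau)  # gradient step + shrinkage
Y = mode_k_product(X, P, 0)           # low-dimensional representation along mode 0
print(Y.shape)                        # (10, 32, 3)
```

Because the shrinkage may zero some singular values of P, the effective dimensionality of Y can end up below 10, which is the sense in which the rank, and hence the embedding size, is determined jointly with the representations.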
Pages: 761-780 (20 pages)