Generalized Learning Vector Quantization With Log-Euclidean Metric Learning on Symmetric Positive-Definite Manifold

Cited by: 13
Authors
Tang, Fengzhen [1 ,2 ]
Tino, Peter [3 ]
Yu, Haibin [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Shenyang Inst Automat, State Key Lab Robot, Shenyang 110016, Peoples R China
[2] Chinese Acad Sci, Inst Robot & Intelligent Mfg, Shenyang 110169, Peoples R China
[3] Univ Birmingham, Sch Comp Sci, Birmingham B15 2TT, W Midlands, England
Funding
National Natural Science Foundation of China;
Keywords
Prototypes; Measurement; Manifolds; Cost function; Training; Tensors; Vector quantization; Generalized learning vector quantization (GLVQ); log-Euclidean metric (LEM); metric learning; Riemannian geodesic distance; Riemannian manifold; BRAIN-COMPUTER INTERFACES; RIEMANNIAN MANIFOLD;
DOI
10.1109/TCYB.2022.3178412
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
In many classification scenarios, the data to be analyzed can be naturally represented as points living on the curved Riemannian manifold of symmetric positive-definite (SPD) matrices. Due to its non-Euclidean geometry, usual Euclidean learning algorithms may deliver poor performance on such data. We propose a principled reformulation of the successful Euclidean generalized learning vector quantization (GLVQ) methodology to deal with such data, accounting for the nonlinear Riemannian geometry of the manifold through the log-Euclidean metric (LEM). We first generalize GLVQ to the manifold of SPD matrices by exploiting the LEM-induced geodesic distance (GLVQ-LEM). We then extend GLVQ-LEM with metric learning. In particular, we study both 1) a more straightforward implementation of the metric learning idea by adapting the metric in the space of vectorized log-transformed SPD matrices and 2) the full formulation of metric learning without matrix vectorization, thus preserving the second-order tensor structure. To obtain the distance metric in the full LEM learning (LEML) approaches, two algorithms are proposed. One method restricts the distance metric to be full rank, treating the distance metric tensor as an SPD matrix, and readily uses the LEM framework (GLVQ-LEML-LEM). The other imposes no such restriction, treating the distance metric tensor as a fixed-rank positive semidefinite matrix living on a quotient manifold whose total space is equipped with a flat geometry (GLVQ-LEML-FM). Experiments on multiple datasets of different natures demonstrate the good performance of the proposed methods.
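The LEM-induced geodesic distance the abstract refers to has a well-known closed form (Arsigny et al., 2006, reference [5] below): the Frobenius norm of the difference of matrix logarithms, d(A, B) = ||log(A) - log(B)||_F. A minimal NumPy sketch of this distance follows; the function names are illustrative and not taken from the paper's implementation:

```python
import numpy as np

def spd_log(M):
    # Matrix logarithm of an SPD matrix via eigendecomposition:
    # log(M) = V diag(log(w)) V^T, with M = V diag(w) V^T.
    w, V = np.linalg.eigh(M)
    return (V * np.log(w)) @ V.T

def lem_distance(A, B):
    # Log-Euclidean geodesic distance: ||log(A) - log(B)||_F.
    return np.linalg.norm(spd_log(A) - spd_log(B), "fro")

A = np.diag([2.0, 2.0])
B = np.eye(2)
d = lem_distance(A, B)  # = sqrt(2) * ln(2) ≈ 0.9803
```

Because the log map sends the SPD manifold to the flat space of symmetric matrices, distances (and prototype updates in GLVQ-LEM) can be computed with ordinary Euclidean operations on the log-transformed matrices.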
Pages: 5178-5190
Number of pages: 13
References
32 records in total
  • [1] [Anonymous], 2015, ARXIV150102393
  • [2] [Anonymous], 1986, TKKFA601 HEL U TECHN
  • [3] [Anonymous], 2014, P 16 INT C MULT INT
  • [4] Geometric means in a novel vector space structure on symmetric positive-definite matrices
    Arsigny, Vincent
    Fillard, Pierre
    Pennec, Xavier
    Ayache, Nicholas
    [J]. SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2007, 29 (01) : 328 - 347
  • [5] Log-Euclidean metrics for fast and simple calculus on diffusion tensors
    Arsigny, Vincent
    Fillard, Pierre
    Pennec, Xavier
    Ayache, Nicholas
    [J]. MAGNETIC RESONANCE IN MEDICINE, 2006, 56 (02) : 411 - 421
  • [6] Multiclass Brain-Computer Interface Classification by Riemannian Geometry
    Barachant, Alexandre
    Bonnet, Stephane
    Congedo, Marco
    Jutten, Christian
    [J]. IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2012, 59 (04) : 920 - 928
  • [7] Learning matrix quantization and relevance learning based on Schatten-p-norms
    Bohnsack, A.
    Domaschke, K.
    Kaden, M.
    Lange, M.
    Villmann, T.
    [J]. NEUROCOMPUTING, 2016, 192 : 104 - 114
  • [8] Brunner C., 2008, BCI COMPETITION 2008, P136
  • [9] Congedo M, 2017, BRAIN-COMPUT INTERFA, V4, P155, DOI 10.1080/2326263X.2017.1297192
  • [10] Flowing on Riemannian Manifold: Domain Adaptation by Shifting Covariance
    Cui, Zhen
    Li, Wen
    Xu, Dong
    Shan, Shiguang
    Chen, Xilin
    Li, Xuelong
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2014, 44 (12) : 2264 - 2273