Constrained large Margin Local Projection algorithms and extensions for multimodal dimensionality reduction

Cited: 26
Authors
Zhang, Zhao [1 ]
Zhao, Mingbo [1 ]
Chow, Tommy W. S. [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Elect Engn, Kowloon, Hong Kong, Peoples R China
Keywords
Dimensionality reduction; Large margin projection; Manifold visualization; Pairwise constraints; Locality preservation; Multimodality preservation; Kernel method; Pattern classification; PRESERVING DISCRIMINANT-ANALYSIS; CRITERION; EIGENMAPS; EFFICIENT;
DOI
10.1016/j.patcog.2012.05.015
CLC classification code
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
A Constrained large Margin Local Projection (CMLP) technique for multimodal dimensionality reduction is proposed. We elaborate the CMLP criterion from a pairwise-constrained margin perspective. Four effective CMLP solution schemes are presented, together with comparative analyses; an equivalent weighted least-squares formulation of CMLP is also detailed. CMLP originates from the criterion of Locality Preserving Projections (LPP), but offers a number of attractive advantages over LPP. To preserve the intrinsic proximity relations of inter-class and intra-class similarity pairs, localized pairwise Cannot-Link and Must-Link constraints are applied to specify the types of those neighboring pairs. Under the CMLP criterion, margins between inter- and intra-class clusters are significantly enlarged, so multimodal distributions are effectively preserved. To further optimize the CMLP criterion, one feasible improvement strategy is described. With kernel methods, we present the kernelized extensions of our approaches. Mathematical comparisons and analyses between this work and related works are also detailed. Extensive simulations, including multivariate manifold visualization and classification on the benchmark UCI, ORL, YALE, UMIST, MIT CBCL, and USPS datasets, are conducted to verify the effectiveness of our techniques. The presented results reveal that our methods are highly competitive with, and even outperform, some widely used state-of-the-art algorithms. (C) 2012 Elsevier Ltd. All rights reserved.
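The pairwise-constrained, locality-preserving idea summarized in the abstract can be illustrated as a graph-Laplacian eigenproblem: Must-Link pairs are pulled together while Cannot-Link pairs are pushed apart in the embedded space. The sketch below is a minimal simplification of that idea; the function name `constrained_lpp`, the `gamma` trade-off weight, and the `eps` regularizer are illustrative assumptions, not the authors' actual CMLP formulation.

```python
import numpy as np
from scipy.linalg import eigh

def constrained_lpp(X, must_link, cannot_link, d=2, gamma=1.0, eps=1e-6):
    """Illustrative sketch (not the paper's exact CMLP method).

    Projects X (n_samples x n_features) to d dimensions by minimizing
    embedded distances over Must-Link pairs while maximizing them over
    Cannot-Link pairs, via a generalized symmetric eigenproblem.
    """
    n, m = X.shape
    W_ml = np.zeros((n, n))
    W_cl = np.zeros((n, n))
    for i, j in must_link:                  # intra-class similarity pairs
        W_ml[i, j] = W_ml[j, i] = 1.0
    for i, j in cannot_link:                # inter-class dissimilarity pairs
        W_cl[i, j] = W_cl[j, i] = 1.0
    D_ml = np.diag(W_ml.sum(axis=1))
    L_ml = D_ml - W_ml                      # Laplacian of the Must-Link graph
    L_cl = np.diag(W_cl.sum(axis=1)) - W_cl
    # Objective: pull Must-Link pairs together, push Cannot-Link pairs apart.
    M = X.T @ (L_ml - gamma * L_cl) @ X
    B = X.T @ D_ml @ X + eps * np.eye(m)    # regularized scale constraint
    _, vecs = eigh(M, B)                    # eigenvalues in ascending order
    return vecs[:, :d]                      # directions with smallest objective
```

Because `eigh` sorts eigenvalues in ascending order, the first `d` eigenvectors minimize the constrained objective, playing the role of the projection matrix; the full CMLP criterion in the paper refines this with localized constraints and margin enlargement.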
Pages: 4466-4493
Number of pages: 28