Robust Low Rank and Sparse Representation for Multiple Kernel Dimensionality Reduction

Cited by: 7
Authors
Yan, Wenzhu [1 ,2 ]
Yang, Ming [1 ,2 ]
Li, Yanmeng [3 ]
Affiliations
[1] Nanjing Normal Univ, Sch Comp & Elect Informat, Nanjing 210046, Peoples R China
[2] Nanjing Normal Univ, Sch Artificial Intelligence, Nanjing 210046, Peoples R China
[3] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Data representation; dimensionality reduction; feature selection; kernel; low-rank embedding; l(2,1) norm; component analysis; projections; recognition; algorithm
DOI
10.1109/TCSVT.2021.3087643
Chinese Library Classification
TM (Electrical Engineering); TN (Electronics and Communication Technology);
Discipline codes
0808; 0809
Abstract
In the fields of pattern recognition and data mining, two problems need to be addressed. First, the curse of dimensionality degrades the performance of many practical data processing techniques. Second, because of noise and outliers, feature extraction cannot be performed effectively on corrupted data. Recently, some representation-based methods have produced promising results. However, these methods cannot handle the case in which nonlinear similarity exists, and they fail to provide a quantitative interpretation of feature importance. In this paper, we propose a novel low-rank and sparse representation method that realizes dimensionality reduction and robustly extracts latent low-dimensional discriminative features. Specifically, we first adopt multiple kernel learning to map the original data into an embedded reproducing kernel Hilbert space (RKHS), and then learn a kernel-based similarity discriminative projection that explores the within-class and between-class variability. Notably, this low-dimensional feature learning strategy is directly integrated into the low-rank matrix recovery of the kernel matrix. Next, we impose l(2,1)-norm regularization on the error matrix to eliminate noise, and on the projection matrix to make the selected features more compact and interpretable. The resulting non-convex optimization problem is solved efficiently by the alternating direction method of multipliers (ADMM). Extensive experiments on seven benchmark datasets demonstrate the effectiveness of our method.
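The abstract describes an ADMM solver combining low-rank recovery with l(2,1)-norm regularization. The paper's full objective is not reproduced in this record, but a minimal NumPy sketch of the two proximal operators such an ADMM scheme typically relies on may clarify the mechanics: row-wise soft thresholding for the l(2,1) norm (which zeroes whole rows, the source of the row-sparse, interpretable projection), and singular value thresholding for the nuclear-norm/low-rank subproblem. Function names here are illustrative, not taken from the paper.

```python
import numpy as np

def l21_norm(M):
    # l(2,1) norm: sum of the Euclidean norms of the rows of M.
    return np.sum(np.linalg.norm(M, axis=1))

def prox_l21(M, tau):
    # Proximal operator of tau * ||.||_{2,1}: row-wise soft thresholding.
    # Rows with l2 norm <= tau are set to zero, so whole rows (features)
    # are discarded at once -- the basis of row-sparse feature selection.
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return scale * M

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * ||.||_*,
    # the standard step for a low-rank matrix recovery subproblem.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Example: a matrix with one dominant row and one near-zero row.
M = np.array([[3.0, 4.0], [0.1, 0.0]])
print(l21_norm(M))       # 5.0 + 0.1 = 5.1
print(prox_l21(M, 0.5))  # first row shrunk to [2.7, 3.6], second row zeroed
```

Within an ADMM iteration, each subproblem is handled by the corresponding proximal step, so these closed-form operators are what make the non-convex objective tractable in practice.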
Pages: 1-15
Page count: 15