Supervised Discriminative Sparse PCA with Adaptive Neighbors for Dimensionality Reduction

Cited by: 1
Authors
Shi, Zhenhua [1 ]
Wu, Dongrui [1 ]
Huang, Jian [1 ]
Wang, Yu-Kai [2 ]
Lin, Chin-Teng [2 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Sch Artificial Intelligence & Automat, Wuhan, Peoples R China
[2] Univ Technol Sydney, Fac Engn & Informat Technol, Sydney, NSW, Australia
Source
2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2020
Funding
National Natural Science Foundation of China;
Keywords
Principal component analysis; adaptive neighbors; linear dimensionality reduction; GRAPH;
DOI
10.1109/ijcnn48605.2020.9206927
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Dimensionality reduction is an important operation in information visualization, feature extraction, clustering, regression, and classification, especially for processing noisy high-dimensional data. However, most existing approaches preserve either the global or the local structure of the data, but not both. Approaches that preserve only the global data structure, such as principal component analysis (PCA), are usually sensitive to outliers. Approaches that preserve only the local data structure, such as locality preserving projections, are usually unsupervised (and hence cannot use label information) and use a fixed similarity graph. We propose a novel linear dimensionality reduction approach, supervised discriminative sparse PCA with adaptive neighbors (SDSPCAAN), to integrate neighborhood-free supervised discriminative sparse PCA and projected clustering with adaptive neighbors. As a result, both global and local data structures, as well as the label information, are used for better dimensionality reduction. Classification experiments on nine high-dimensional datasets validated the effectiveness and robustness of our proposed SDSPCAAN.
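The abstract describes the idea only at a high level. As a rough illustrative sketch (not the authors' algorithm), the Python code below combines the three ingredients the abstract names: a PCA-style total-scatter term for global structure, a between-class scatter term for label information, and a k-NN graph Laplacian term for local structure, then takes the top eigenvectors of the combined symmetric matrix as the linear projection. The weights alpha and beta, the function names, and the use of a fixed k-NN graph (the paper learns the neighbor graph adaptively) are all assumptions made here for illustration.

import numpy as np


def knn_graph_laplacian(X, k=5):
    """Unnormalized Laplacian of a symmetrized 0/1 k-NN graph (illustrative stand-in
    for the adaptive neighbor graph learned in the paper)."""
    n = X.shape[0]
    # squared Euclidean distances between all pairs of rows
    d2 = np.sum(X**2, axis=1, keepdims=True) + np.sum(X**2, axis=1) - 2 * X @ X.T
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]   # k nearest neighbors, skipping the point itself
        W[i, idx] = 1.0
    W = np.maximum(W, W.T)                 # symmetrize the adjacency matrix
    return np.diag(W.sum(axis=1)) - W      # L = D - W


def sketch_projection(X, y, d=2, alpha=1.0, beta=0.1, k=5):
    """Top-d eigenvectors of S_t + alpha*S_b - beta*X^T L X (rows of X are samples)."""
    Xc = X - X.mean(axis=0)
    S_t = Xc.T @ Xc                        # global structure: PCA-style total scatter
    S_b = np.zeros_like(S_t)               # label information: between-class scatter
    for c in np.unique(y):
        Xk = X[y == c]
        mk = (Xk.mean(axis=0) - X.mean(axis=0))[:, None]
        S_b += Xk.shape[0] * (mk @ mk.T)
    L = knn_graph_laplacian(X, k)          # local structure: fixed k-NN graph here
    M = S_t + alpha * S_b - beta * (X.T @ L @ X)
    vals, vecs = np.linalg.eigh((M + M.T) / 2)
    return vecs[:, np.argsort(vals)[::-1][:d]]   # columns are the projection directions


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 20))         # 100 samples, 20 features (synthetic data)
    y = rng.integers(0, 3, size=100)       # 3 classes
    W = sketch_projection(X, y, d=2)
    print((X @ W).shape)                   # (100, 2) low-dimensional embedding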
Pages: 8