Discriminating Classes Collapsing for Globality and Locality Preserving Projections

Cited by: 0
Authors
Wang, Wei [1 ]
Hu, Baogang [2 ]
Wang, Zengfu [1 ]
Affiliations
[1] Univ Sci & Technol China, Dept Automat, Hefei 230026, Peoples R China
[2] Chinese Acad Sci, Natl Lab Pattern Recognit, Inst Automat, Beijing 100864, Peoples R China
Source
2012 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2012
Keywords
Dimensionality reduction; Maximally Collapsing Metric Learning (MCML); manifold learning; GPU; Visualization
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, a novel approach, Globality and Locality Preserving Projections (GLPP), is proposed for dimensionality reduction. The method combines the ideas behind Locality Preserving (LP) projections, Discriminating Power (DP) and Maximally Collapsing Metric Learning (MCML) into a unified model. Several distinctive features follow from this integrated design. First, the method takes into account both the global and the local structure of the data set: a new formula for computing the conditional probabilities removes the locality distortions present in MCML. Second, discriminative information is used to form a projection matrix that collapses data points of the same class together while pushing points of different classes apart. Third, the resulting supervised algorithm is convex, a critical property for data processing. Furthermore, the implementation of GLPP is mapped onto a Graphics Processing Unit (GPU) architecture so that it scales to large data sets. Numerical studies on a variety of data sets confirm that GLPP consistently outperforms the most up-to-date methods, achieving high classification accuracy, good visualization and sharply reduced computation time.
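For concreteness, the class-collapsing term that GLPP inherits from MCML can be sketched as follows. The abstract does not give GLPP's modified conditional-probability formula, so the Python/NumPy sketch below only reproduces the standard MCML objective of Globerson and Roweis: pairwise conditional probabilities induced by a Mahalanobis metric A are pushed, via a KL divergence, toward an ideal distribution that puts all of its mass on same-class points. The function names and the assumption that A is symmetric positive semidefinite are illustrative, not taken from the paper.

import numpy as np

def pairwise_mahalanobis_sq(X, A):
    # Squared Mahalanobis distances d_ij = (x_i - x_j)^T A (x_i - x_j),
    # assuming A is symmetric positive semidefinite.
    XA = X @ A
    sq = np.sum(XA * X, axis=1)
    return sq[:, None] + sq[None, :] - 2.0 * XA @ X.T

def mcml_collapsing_loss(A, X, y):
    # Sum over points i of KL(p0(.|i) || p_A(.|i)), the MCML objective.
    # X: (n, d) data, y: (n,) labels; every class is assumed to have >= 2 samples.
    D = pairwise_mahalanobis_sq(X, A)
    np.fill_diagonal(D, np.inf)                    # exclude j == i from the softmax
    logits = -D - (-D).max(axis=1, keepdims=True)  # stabilize the softmax
    P = np.exp(logits)
    P /= P.sum(axis=1, keepdims=True)              # p_A(j | i)

    same = (y[:, None] == y[None, :]).astype(float)
    np.fill_diagonal(same, 0.0)
    P0 = same / same.sum(axis=1, keepdims=True)    # ideal "collapsed" distribution

    eps = 1e-12
    return float(np.sum(P0 * (np.log(P0 + eps) - np.log(P + eps))))

Minimizing this loss over A (for example by projected gradient descent onto the PSD cone) collapses each class toward a single point, which is the "classes collapsing" behaviour named in the title. GLPP additionally constrains the metric through a projection matrix and reweights the conditional probabilities to preserve locality; those details are not recoverable from the abstract alone.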
Pages: 8