Manifold-based discriminant analysis

Cited by: 0
Authors
Affiliations
[1] School of Electronics and Computer Science Technology, North University of China
[2] School of Information, Business College of Shanxi University
Source
Liu, Z.-B. (liu_zhongbao@hotmail.com) | 2013 / Science Press / Vol. 35
Keywords
Dimensionality reduction; Fisher criterion; Global features; Manifold learning; Pattern recognition;
DOI
10.3724/SP.J.1146.2012.01552
CLC number
Subject classification code
Abstract
Research on current Dimensionality Reduction (DR) methods follows two main lines. One seeks to preserve the global features of high-dimensional samples, while the other tries to keep the local manifold structure of the data as invariant as possible before and after dimension reduction. Because neither line fully exploits the available information, the effectiveness of existing DR methods is limited. Based on this analysis, Manifold-based Discriminant Analysis (MDA) is proposed, built on the Fisher criterion and manifold preservation, so that global features and local structure are both taken into account. MDA defines two scatters, the Manifold-based Within-Class Scatter (MWCS) and the Manifold-based Between-Class Scatter (MBCS); following the Fisher criterion, the optimal projection maximizes the ratio of MBCS to MWCS. MDA thus inherits the strengths of current DR methods while further improving their effectiveness. Experiments on standard datasets verify the effectiveness of the proposed MDA.
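The abstract does not give the explicit scatter definitions. A minimal sketch of how such manifold-weighted scatters and the resulting Fisher-type criterion are commonly formulated is given below; the weight matrices W^w and W^b and the trace-ratio form are assumptions for illustration, not taken from the paper:

S_w^M = \sum_{i,j} W^{w}_{ij}\,(x_i - x_j)(x_i - x_j)^{\mathsf{T}}, \qquad S_b^M = \sum_{i,j} W^{b}_{ij}\,(x_i - x_j)(x_i - x_j)^{\mathsf{T}},

where W^{w}_{ij} is nonzero only for neighboring pairs from the same class and W^{b}_{ij} only for pairs from different classes. Under the Fisher criterion, the projection V is chosen as

V^{*} = \arg\max_{V} \dfrac{\operatorname{tr}\left(V^{\mathsf{T}} S_b^M V\right)}{\operatorname{tr}\left(V^{\mathsf{T}} S_w^M V\right)},

which in practice is typically solved via the generalized eigenvalue problem S_b^M v = \lambda\, S_w^M v. In this sketch, MWCS and MBCS play the roles of the within-class and between-class scatters of classical LDA, with the neighborhood weights carrying the local manifold information.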
Pages: 2047-2053
Page count: 6
References
19 references in total
[1]  
Martinez A.M., Kak A.C., PCA versus LDA, IEEE Transactions on Pattern Analysis and Machine Intelligence, 23, 2, pp. 228-233, (2001)
[2]  
Alibeigi M., Hashemi S., Hamzeh A., DBFS: an effective density based feature selection scheme for small sample size and high dimensional imbalanced data sets, Data & Knowledge Engineering, 81-82, pp. 67-103, (2012)
[3]  
Friedman J.H., Regularized discriminant analysis, Journal of the American Statistical Association, 84, 405, pp. 165-175, (1989)
[4]  
Li M., Yuan B., 2D-LDA: a novel statistical linear discriminant analysis for image matrix, Pattern Recognition Letters, 26, 5, pp. 527-532, (2005)
[5]  
Ye J.P., Xiong T., Computational and theoretical analysis of null space and orthogonal linear discriminant analysis, Journal of Machine Learning Research, 7, pp. 1183-1204, (2006)
[6]  
Yu H., Yang J., A direct LDA algorithm for high-dimensional data with application to face recognition, Pattern Recognition, 34, 11, pp. 2067-2070, (2001)
[7]  
Wan M.H., Lai Z.H., Jin Z., Feature extraction using two-dimensional local graph embedding based on maximum margin criterion, Applied Mathematics and Computation, 217, 23, pp. 9659-9668, (2011)
[8]  
Ji S.W., Ye J.P., Generalized linear discriminant analysis: a unified framework and efficient model selection, IEEE Transactions on Neural Networks, 19, 10, pp. 1768-1782, (2008)
[9]  
Chen L.F., Liao H.Y.M., Ko M.T., Et al., A new LDA-based face recognition system which can solve the small sample size problem, Pattern Recognition, 33, 10, pp. 1713-1726, (2000)
[10]  
Belhumeur P.N., Hespanha J.P., Kriegman D.J., Eigenfaces vs. Fisherfaces: recognition using class specific linear projection, IEEE Transactions on Pattern Analysis and Machine Intelligence, 19, 7, pp. 711-720, (1997)