Regularized least squares Fisher linear discriminant with applications to image recognition

Cited: 12
Authors
Chen, Xiaobo [1 ,2 ]
Yang, Jian [2 ]
Mao, Qirong [1 ]
Han, Fei [1 ]
Affiliations
[1] Jiangsu Univ, Sch Comp Sci & Telecommun Engn, Zhenjiang 212013, Peoples R China
[2] Nanjing Univ Sci & Technol, Sch Comp Sci & Technol, Nanjing 210094, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Linear discriminant analysis (LDA); Regularization technique; Concave-convex programming (CCP); 2-Norm loss function; SUPPORT VECTOR MACHINE; FACE RECOGNITION; PROJECTIONS; REDUCTION;
DOI
10.1016/j.neucom.2013.05.006
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recursive concave-convex Fisher Linear Discriminant (RPFLD) is a novel and efficient dimension reduction method that has been successfully applied to image recognition. However, RPFLD suffers from the singularity problem and may lose useful discriminant information when applied to high-dimensional data. Moreover, RPFLD is computationally expensive because it must solve a series of quadratic programming (QP) problems to obtain the optimal solution. To improve the generalization performance of RPFLD while reducing its training burden, we propose in this paper a novel method, termed regularized least squares Fisher linear discriminant (RLS-FLD). The central idea is to introduce regularization into RPFLD and simultaneously adopt the 2-norm loss function. As a result, the objective function of RLS-FLD becomes positive definite, which avoids the singularity problem. To solve RLS-FLD, the concave-convex programming (CCP) algorithm is employed to convert the original nonconvex problem into a series of equality-constrained convex QP problems, each of which admits a closed-form solution in its primal formulation via the classic Lagrangian method. RLS-FLD therefore trains much faster and requires no external optimization packages. A theoretical analysis is also provided that uncovers the connections between RLS-FLD and regularized linear discriminant analysis (RLDA), giving further insight into the principle of RLS-FLD. The effectiveness of the proposed RLS-FLD is demonstrated by experimental results on real-world handwritten digit, face and object recognition datasets. (C) 2013 Elsevier B.V. All rights reserved.
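The CCP strategy the abstract describes can be sketched in a generic form: write a nonconvex objective as a convex part minus a convex part, linearize the subtracted (concave-contributing) part at the current iterate, and solve the resulting regularized least-squares subproblem in closed form. The toy objective, function name, and parameter choices below are hypothetical illustrations of this pattern, not the paper's actual RLS-FLD formulation.

```python
# Minimal sketch of the concave-convex procedure (CCP) pattern with a
# ridge-regularized least-squares subproblem that has a closed-form solution.
# The objective ||Ax-b||^2 + lam*||x||^2 - mu*||Cx||^2 is a toy DC
# (difference-of-convex) problem chosen for illustration only.
import numpy as np

def ccp_regularized_ls(A, b, C, lam=1.0, mu=0.1, n_iter=50, tol=1e-10):
    """Minimize ||Ax-b||^2 + lam*||x||^2 - mu*||Cx||^2 via CCP.

    At each iteration the concave term -mu*||Cx||^2 is replaced by its
    linearization at the current iterate, so the subproblem is a ridge
    regression solved in closed form (no QP solver needed).
    """
    n = A.shape[1]
    x = np.zeros(n)
    # Regularization makes this matrix positive definite, so it is invertible.
    H_inv = np.linalg.inv(A.T @ A + lam * np.eye(n))
    objs = []
    for _ in range(n_iter):
        # Gradient of the subtracted convex term mu*||Cx||^2 at x.
        g = 2.0 * mu * (C.T @ (C @ x))
        # Closed-form minimizer of the convexified subproblem:
        # (A^T A + lam*I) x = A^T b + g/2.
        x_new = H_inv @ (A.T @ b + 0.5 * g)
        obj = (np.sum((A @ x_new - b) ** 2) + lam * (x_new @ x_new)
               - mu * np.sum((C @ x_new) ** 2))
        objs.append(obj)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x, objs

# Toy demonstration data (lam is chosen large enough that the DC
# objective stays bounded below for this C and mu).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
C = 0.3 * rng.standard_normal((5, 5))
x_opt, objs = ccp_regularized_ls(A, b, C)
```

The standard CCP guarantee applies here: because the linearization is a global under-estimator of the subtracted convex term, the objective value is nonincreasing across iterations, mirroring the convergence behavior the abstract claims for the series of convex subproblems.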
Pages: 521-534 (14 pages)