Ratio Sum Versus Sum Ratio for Linear Discriminant Analysis

Cited by: 10
Authors
Wang, Jingyu [1 ]
Wang, Hongmei [2 ]
Nie, Feiping [1 ,3 ]
Li, Xuelong [1 ,3 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Astronaut, Sch Artificial Intelligence Opt & Elect (iOPEN), Xian 710072, Shaanxi, Peoples R China
[2] Northwestern Polytech Univ, Sch Astronaut, Xian 710072, Shaanxi, Peoples R China
[3] Northwestern Polytech Univ, Minist Ind & Informat Technol, Key Lab Intelligent Interact & Applicat, Xian 710072, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Kernel; Principal component analysis; Feature extraction; Covariance matrices; Linear discriminant analysis; Eigenvalues and eigenfunctions; Sparse matrices; Supervised learning; dimensionality reduction; linear discriminant analysis; ratio sum problem; kernel technique; subspace learning; FEATURE-EXTRACTION; FEATURE-SELECTION; ROBUST; FRAMEWORK; EFFICIENT; TRACE
DOI
10.1109/TPAMI.2021.3133351
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Dimensionality reduction is a critical technique for high-dimensional data processing, and Linear Discriminant Analysis (LDA) and its variants are effective supervised methods for it. However, LDA tends to prefer features with smaller variance, which causes features with weak discriminative ability to be retained. In this paper, we propose a novel Ratio Sum formulation for Linear Discriminant Analysis (RSLDA), which aims to maximize the discriminative ability of each feature in the subspace. Specifically, it maximizes the sum, over the dimensions of the subspace, of the ratio of the between-class distance to the within-class distance in each dimension. Since a closed-form solution to the original RSLDA problem is difficult to obtain, an equivalent problem is developed that can be solved by an alternating optimization algorithm. To solve the equivalent problem, it is split into two sub-problems: one can be solved directly, while the other is transformed into a convex optimization problem in which singular value decomposition is employed in place of matrix inversion. Consequently, the performance of the algorithm is not affected by the singularity of the covariance matrix. Furthermore, Kernel RSLDA (KRSLDA) is presented to improve the robustness of RSLDA. The time complexities of RSLDA and KRSLDA are also analyzed. Extensive experiments show that RSLDA and KRSLDA outperform the comparison methods on toy datasets and multiple public datasets.
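To make the per-dimension ratio-sum criterion concrete, the following is a minimal NumPy sketch, not the authors' implementation. The helper names (scatter_matrices, ratio_sum_objective, lda_directions) are hypothetical, and the baseline shown is classic sum-ratio LDA via an SVD-based pseudo-inverse (mirroring the abstract's point about avoiding explicit matrix inversion); the paper's alternating optimization for RSLDA is not reproduced here.

import numpy as np

def scatter_matrices(X, y):
    """Between-class (Sb) and within-class (Sw) scatter matrices."""
    overall_mean = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - overall_mean)[:, None]
        Sb += Xc.shape[0] * (diff @ diff.T)
        Sw += (Xc - mc).T @ (Xc - mc)
    return Sb, Sw

def ratio_sum_objective(W, Sb, Sw, eps=1e-12):
    """Ratio-sum criterion: sum_k (w_k' Sb w_k) / (w_k' Sw w_k),
    i.e., one between/within ratio per projected dimension."""
    between = np.einsum('dk,de,ek->k', W, Sb, W)
    within = np.einsum('dk,de,ek->k', W, Sw, W)
    return float(np.sum(between / (within + eps)))

def lda_directions(Sb, Sw, m):
    """Classic (sum-ratio) LDA baseline: top-m eigenvectors of
    pinv(Sw) @ Sb. np.linalg.pinv is computed via SVD, so a
    singular Sw does not require an explicit matrix inverse."""
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-vals.real)[:m]
    return vecs[:, order].real

# Toy usage: three Gaussian classes in 5-D, projected to 2-D.
rng = np.random.default_rng(0)
means = [np.zeros(5), np.array([2., 0, 0, 0, 0]), np.array([0, 2., 0, 0, 0])]
X = np.vstack([rng.normal(mu, 1.0, size=(50, 5)) for mu in means])
y = np.repeat([0, 1, 2], 50)
Sb, Sw = scatter_matrices(X, y)
W = lda_directions(Sb, Sw, m=2)
print('ratio-sum value of the LDA baseline:', ratio_sum_objective(W, Sb, Sw))

The key contrast the title draws is that classic LDA maximizes a single ratio (or trace ratio) over the whole projection, whereas RSLDA maximizes the sum of one ratio per projected dimension, so no single dimension with a weak between/within ratio can hide behind the aggregate.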
Pages: 10171-10185
Page count: 15