Flexible sparse robust low-rank approximation of matrix for image feature selection and classification

Times cited: 2
Authors
Chen, Xiuhong [1,2]
Chen, Tong [1]
Affiliations
[1] Jiangnan Univ, Sch Artificial Intelligence & Comp Sci, Wuxi, Jiangsu, Peoples R China
[2] Jiangnan Univ, Jiangsu Key Lab Media Design & Software Technol, Wuxi, Jiangsu, Peoples R China
Keywords
Low-rank approximation of matrix; Projection and recovery matrix; Reconstruction error; Kronecker product; Sparsity; Feature selection; Classification; PRINCIPAL COMPONENT ANALYSIS; FACE REPRESENTATION; 2-DIMENSIONAL PCA; EIGENFACES
DOI
10.1007/s00500-023-09189-3
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In the traditional generalized low-rank approximation of matrices (GLRAM) models, the left/right projection matrices and the recovery matrices used in the reconstruction error are the same and orthogonal, which makes the models inflexible. To this end, we propose a flexible sparse robust low-rank approximation of matrices model that integrates feature selection into subspace learning and excludes redundant features. In the proposed model, two recovery matrices are introduced to jointly recover the original image data from the subspace spanned by the selected features, giving the model more freedom and flexibility to select useful features for the low-dimensional representation. Moreover, the L1-norm is imposed on the reconstruction error and the L2,1-norm on the Kronecker product of the left and right transformation matrices, which reduces the influence of noise on the error and performs feature selection while the optimal transformation and recovery matrices are learned. Building on theoretical analysis, an alternating iterative solution method is designed, and the convergence and time complexity of the algorithm are analyzed. Experimental results on several image datasets show that our method outperforms existing state-of-the-art methods.
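One plausible formulation of the objective sketched in the abstract, written in our own notation (inferred from the abstract, not taken from the paper): given N training image matrices X_i, a left/right projection pair (L, R), a separate recovery pair (P, Q), and a trade-off parameter lambda,

\min_{L,R,P,Q}\; \sum_{i=1}^{N} \bigl\| X_i - P \, (L^{\top} X_i R) \, Q^{\top} \bigr\|_{1} \;+\; \lambda \, \bigl\| L \otimes R \bigr\|_{2,1}

where \otimes denotes the Kronecker product: the L1-norm on the reconstruction error limits the influence of noisy pixels, while the L2,1-norm on L \otimes R drives whole rows of the combined transformation toward zero, which is what performs the joint feature selection. A minimal NumPy sketch of the projection/recovery structure follows; all dimensions, variable names, and the lambda value are assumptions for illustration, not the authors' settings.

import numpy as np

rng = np.random.default_rng(0)
m, n = 32, 32                       # original image size (assumed)
k1, k2 = 8, 8                       # reduced dimensions (assumed)

X = rng.standard_normal((m, n))     # one image matrix
L = rng.standard_normal((m, k1))    # left projection matrix
R = rng.standard_normal((n, k2))    # right projection matrix
P = rng.standard_normal((m, k1))    # left recovery matrix (distinct from L)
Q = rng.standard_normal((n, k2))    # right recovery matrix (distinct from R)

M = L.T @ X @ R                     # low-dimensional representation
X_hat = P @ M @ Q.T                 # recovery through the second matrix pair

l1_err = np.abs(X - X_hat).sum()    # L1-norm reconstruction error
row_sparsity = np.linalg.norm(np.kron(L, R), axis=1).sum()  # L2,1-norm of L (x) R
lam = 0.1                           # trade-off parameter (assumed value)
print(l1_err + lam * row_sparsity)  # objective contribution of this one image

Because the recovery pair (P, Q) is decoupled from the projection pair (L, R), the reconstruction is not forced back through the same orthogonal transformation, which is the flexibility the abstract contrasts with traditional GLRAM.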
Pages: 17603-17620 (18 pages)