Image classification based on weighted nonconvex low-rank and discriminant least squares regression

Authors
Kunyan Zhong
Jinglei Liu
Affiliation
[1] Yantai University, School of Computer and Control Engineering
Source
Applied Intelligence | 2023, Vol. 53
Keywords
Low-rank; Weighted nonconvex; Image classification; Least squares regression; Projection;
DOI
Not available
Abstract
Classifiers based on least squares regression (LSR) are effective in multi-class classification tasks, but two main problems limit their performance. First, most existing methods learn projections with limited flexibility and therefore lose much of the discriminative information, yet overusing relaxed labels to compensate can lead to overfitting. Second, the traditional nuclear norm assigns the same weight to every singular value and therefore cannot capture how different rank components contribute to the rank. To address these problems and improve classification performance, this paper proposes a multi-class image classification method based on weighted nonconvex low-rank and discriminative least squares regression (WNLRDLSR). Specifically, relaxed labels replace the strict zero-one labels, which widens the margins between samples of different classes while enhancing intra-class compactness and similarity, making the learned projections more discriminative. Furthermore, a weighted nonconvex low-rank constraint is introduced into the least squares regression model; the weighted nonconvex low-rank norm fully exploits the effect of different rank components on the label matrix while remaining close to the original low-rank assumption. Experiments show that this helps to learn more discriminative regression projections and thus achieve better classification performance. The classification accuracy on several face, object, and handwriting datasets is higher than that of the compared methods, and the experiments show that the proposed WNLRDLSR outperforms many state-of-the-art methods.
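The exact objective and solver of WNLRDLSR are not spelled out in this abstract, so the snippet below is only a minimal NumPy sketch of the weighted low-rank idea described above: instead of shrinking every singular value by the same amount, as the plain nuclear-norm proximal step does, each singular value receives its own threshold. The function name weighted_svt, the inverse-magnitude weighting rule, and the threshold tau are illustrative assumptions, not the authors' algorithm.

import numpy as np

def weighted_svt(A, weights, tau=1.0):
    # Shrink each singular value of A by its own threshold tau * weights[i].
    # The plain nuclear norm corresponds to equal weights for every component;
    # non-uniform weights let large (signal) components pass almost untouched
    # while small (noise) components are suppressed more aggressively.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_shrunk = np.maximum(s - tau * np.asarray(weights), 0.0)
    return U @ np.diag(s_shrunk) @ Vt

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))  # rank-5 signal
    A_noisy = A + 0.3 * rng.standard_normal(A.shape)                 # noisy observation

    s = np.linalg.svd(A_noisy, compute_uv=False)
    uniform_w = np.ones_like(s)      # uniform weights: behaves like the nuclear norm
    reweighted = 1.0 / (s + 1e-6)    # common heuristic: weight ~ 1 / singular value

    err = lambda B: np.linalg.norm(B - A) / np.linalg.norm(A)
    print("uniform shrinkage  :", err(weighted_svt(A_noisy, uniform_w, tau=10.0)))
    print("weighted shrinkage :", err(weighted_svt(A_noisy, reweighted, tau=10.0)))

With inverse-magnitude weights the large singular values are barely penalized while the small ones are suppressed, which mirrors the paper's motivation that treating all singular values equally, as the standard nuclear norm does, discards useful rank information.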
Pages: 20844-20862
Number of pages: 18