Nuclear-Norm-Based Jointly Sparse Regression for Two-Dimensional Image Regression

Cited by: 1
Authors
Sul, Haosheng [1 ]
Lai, Zhihui [1 ,2 ]
Liu, Ning [1 ]
Wang, Jun [1 ,2 ]
Affiliations
[1] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Peoples R China
[2] Shenzhen Inst Artificial Intelligence & Robot Soc, Shenzhen, Peoples R China
Source
PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2020, PT III | 2020, Vol. 12307
Keywords
Dimensionality reduction; 2DLPP; Nuclear norm; L-2,1-norm; Image feature extraction and recognition; NEIGHBORHOOD PRESERVING PROJECTION; DIMENSIONALITY REDUCTION; FACE REPRESENTATION; RECOGNITION; PCA; EXTENSION; FRAMEWORK; RECOVERY;
DOI
10.1007/978-3-030-60636-7_30
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
As a typical manifold learning method, two-dimensional Locality Preserving Projections (2DLPP) can preserve the intrinsic manifold structure of the data, and it has been widely used in dimensionality reduction. However, 2DLPP is sensitive to noise and outliers because it measures the reconstruction error with the Frobenius norm. To address this robustness problem, this paper proposes a novel framework called nuclear-norm-based jointly sparse regression (NJSR). NJSR measures the reconstruction error with the nuclear norm and uses the L-2,1-norm as the regularization term, which yields jointly sparse solutions for feature extraction and selection. Moreover, we propose a bilateral extension of NJSR, called Bilateral NJSR (BNJSR), which learns the projection matrices in the row and column directions simultaneously. Both NJSR and BNJSR can be solved by an iterative algorithm that computes a set of eigenfunctions. Experiments on two face databases verify the effectiveness of the proposed methods, and the results show that BNJSR outperforms the compared methods.
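The abstract's two key ingredients are the nuclear norm (used to measure reconstruction error robustly) and the L-2,1-norm (used to induce row-wise joint sparsity in the projection matrix). As a minimal sketch of these two measures, not the paper's actual optimization algorithm, the following NumPy snippet computes both norms and a toy NJSR-style objective; the function names and the form `njsr_objective` are illustrative assumptions, not the authors' code:

```python
import numpy as np

def nuclear_norm(M):
    # Nuclear norm: the sum of the singular values of M.
    # Compared with the (squared) Frobenius norm, it grows only
    # linearly with large errors, which is why it is more robust
    # to noise and outliers.
    return np.linalg.svd(M, compute_uv=False).sum()

def l21_norm(M):
    # L-2,1-norm: the sum of the L2 norms of the rows of M.
    # Penalizing it drives entire rows toward zero together,
    # producing the jointly sparse projections used for
    # simultaneous feature extraction and selection.
    return np.linalg.norm(M, axis=1).sum()

def njsr_objective(X, P, lam):
    # Toy objective in the spirit of NJSR for one image matrix X
    # and a column-direction projection P (illustrative only):
    # nuclear-norm reconstruction error + L-2,1 sparsity penalty.
    return nuclear_norm(X - X @ P @ P.T) + lam * l21_norm(P)
```

A zero row of `P` contributes nothing to `l21_norm(P)`, so minimizing this penalty selects a subset of original features while the nuclear-norm term keeps the reconstruction faithful.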
Pages: 355-368
Number of pages: 14