Subspace structural constraint-based discriminative feature learning via nonnegative low rank representation

Cited by: 10
Authors
Li, Ao [1 ]
Liu, Xin [1 ]
Wang, Yanbing [2 ]
Chen, Deyun [1 ]
Lin, Kezheng [1 ]
Sun, Guanglu [1 ]
Jiang, Hailong [3 ]
Affiliations
[1] Harbin Univ Sci & Technol, Sch Comp Sci & Technol, Postdoctoral Stn, Harbin, Heilongjiang, Peoples R China
[2] Harbin Univ Sci & Technol, Sch Measurement Control Technol & Commun Engn, Harbin, Heilongjiang, Peoples R China
[3] Kent State Univ, Dept Comp Sci, Kent, OH 44242 USA
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
FACE RECOGNITION; OPTIMIZATION; DICTIONARY;
DOI
10.1371/journal.pone.0215450
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline codes
07; 0710; 09;
Abstract
Feature subspace learning plays a significant role in pattern recognition, and many efforts have been made to build increasingly discriminative learning models. Recently, several discriminative feature learning methods based on a representation model have been proposed, which have attracted considerable attention and achieved success in practical applications. Nevertheless, these methods construct the learning model using only the class labels of the training instances and fail to exploit the essential subspace structural information hidden in the data. In this paper, we propose a robust feature subspace learning approach based on a low-rank representation. In our approach, the low-rank representation coefficients are used as weights to build the constraint term for feature learning, which introduces a subspace structural similarity constraint into the learning model and improves data adaptation and robustness. Moreover, by placing subspace learning and low-rank representation into a unified framework, the two can benefit each other during the iterations and reach an overall optimum. To gain additional discriminative power, linear regression is also incorporated into our model to pull the projected features close to their label-based centers. Furthermore, an iterative numerical scheme is designed to solve the proposed objective function and ensure convergence. Extensive experimental results on several public image datasets demonstrate the advantages and effectiveness of the proposed approach over existing methods.
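The abstract does not state the objective function explicitly. As a rough orientation only, a formulation consistent with the description (nonnegative low-rank representation coefficients reused as subspace-structural similarity weights, plus a label-based regression term) might take the following shape; the symbols, the weighting parameters lambda, alpha, beta, and the exact constraints are assumptions for illustration and need not match the paper's actual model.

\[
\min_{P,\,W,\,Z,\,E}\;\; \|Z\|_{*} \;+\; \lambda\,\|E\|_{2,1}
\;+\; \alpha \sum_{i,j} Z_{ij}\,\bigl\|P^{\top}x_i - P^{\top}x_j\bigr\|_2^2
\;+\; \beta\,\bigl\|H - W P^{\top}X\bigr\|_F^2
\quad \text{s.t.}\;\; X = XZ + E,\;\; Z \ge 0,
\]

where \(X\) stacks the training samples as columns, \(Z\) is the nonnegative low-rank representation whose entries act as similarity weights, \(E\) absorbs sparse corruption, \(P\) is the learned feature projection, and \(H\) encodes label-based targets (class centers) fitted through the regression matrix \(W\). Updating the blocks alternately, as is common for such coupled problems, would correspond to the "unified framework" and the "iterative numerical scheme" mentioned in the abstract.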
Pages: 19