Orthogonal least squares regression for feature extraction

Cited: 42
Authors
Zhao, Haifeng [1 ,2 ]
Wang, Zheng [1 ,2 ]
Nie, Feiping [3 ,4 ]
Affiliations
[1] Anhui Univ, MOE, Key Lab Intelligent Comp & Signal Proc, Hefei 230039, Peoples R China
[2] Anhui Univ, Sch Comp & Technol, Hefei 230039, Peoples R China
[3] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Shanxi, Peoples R China
[4] Northwestern Polytech Univ, Ctr OPT IMagery Anal & Learning OPTIMAL, Xian 710072, Shanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Least squares regression; Orthogonal constraint; Unbalanced orthogonal procrustes problem; DIMENSIONALITY REDUCTION; EFFICIENT;
DOI
10.1016/j.neucom.2016.07.037
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In many data mining applications, dimensionality reduction is a primary technique for mapping high-dimensional data to a lower-dimensional space. To preserve more local structure information, we propose a novel orthogonal least squares regression model for feature extraction. The main contributions of this paper are as follows. First, the new least squares regression method is formulated under an orthogonal constraint, which preserves more discriminant information in the subspace. Second, whereas the optimization problem of classical least squares regression is easy to solve, the proposed objective function is an unbalanced orthogonal Procrustes problem whose solution is difficult to obtain; we therefore present a novel iterative optimization algorithm to find the optimal solution. Third, we provide a proof of convergence for the iterative algorithm. Additionally, experimental results show that our iterative algorithm reaches a globally optimal solution even though the optimization problem is non-convex. Both theoretical analysis and empirical studies demonstrate that our method reduces data dimensionality more effectively than conventional methods. (C) 2016 Elsevier B.V. All rights reserved.
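The abstract describes a least squares regression with an orthogonality constraint on the projection, i.e. min over W, b of ||XW + 1b^T - Y||_F^2 subject to W^T W = I, which is an unbalanced orthogonal Procrustes problem. As a minimal sketch of how such a problem can be solved iteratively, the snippet below uses a generalized power iteration on the Stiefel manifold (a common scheme for this problem class from Nie's group, though not necessarily the exact algorithm of this paper); the function name `orthogonal_lsr` and its parameters are illustrative, not from the source.

```python
import numpy as np

def orthogonal_lsr(X, Y, n_iter=200, tol=1e-10, seed=0):
    """Sketch: min_{W,b} ||X W + 1 b^T - Y||_F^2  s.t.  W^T W = I.

    X: (n, d) data matrix, Y: (n, c) targets, d >= c.
    Solved by generalized power iteration (illustrative, hedged).
    """
    n, d = X.shape
    c = Y.shape[1]
    # Centering X and Y eliminates the bias b from the objective.
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Q = Xc.T @ Xc                       # quadratic term (d x d)
    P = Xc.T @ Yc                       # linear term (d x c)
    # alpha >= largest eigenvalue of Q makes alpha*I - Q positive semidefinite,
    # which guarantees monotone descent of the objective.
    alpha = np.linalg.eigvalsh(Q)[-1]
    # Initialize W with orthonormal columns.
    rng = np.random.default_rng(seed)
    W, _ = np.linalg.qr(rng.standard_normal((d, c)))
    prev = np.inf
    for _ in range(n_iter):
        M = (alpha * np.eye(d) - Q) @ W + P
        # Projection onto the Stiefel manifold: W = U V^T from the thin SVD of M.
        U, _, Vt = np.linalg.svd(M, full_matrices=False)
        W = U @ Vt
        obj = np.linalg.norm(Xc @ W - Yc) ** 2
        if abs(prev - obj) < tol:
            break
        prev = obj
    # Recover the optimal bias for the uncentered data.
    b = (Y - X @ W).mean(axis=0)
    return W, b
```

Each SVD projection keeps W exactly on the Stiefel manifold (W^T W = I), so the orthogonal constraint holds at every iteration rather than only at convergence.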
Pages: 200-207
Number of pages: 8
Related papers
35 records in total
[1]   Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection [J].
Belhumeur, PN ;
Hespanha, JP ;
Kriegman, DJ .
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 1997, 19 (07) :711-720
[2]   Laplacian eigenmaps for dimensionality reduction and data representation [J].
Belkin, M ;
Niyogi, P .
NEURAL COMPUTATION, 2003, 15 (06) :1373-1396
[3]  
Bishop C., 2006, Pattern recognition and machine learning, P423
[4]   Orthogonal laplacianfaces for face recognition [J].
Cai, Deng ;
He, Xiaofei ;
Han, Jiawei ;
Zhang, Hong-Jiang .
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2006, 15 (11) :3608-3614
[5]   A new LDA-based face recognition system which can solve the small sample size problem [J].
Chen, LF ;
Liao, HYM ;
Ko, MT ;
Lin, JC ;
Yu, GJ .
PATTERN RECOGNITION, 2000, 33 (10) :1713-1726
[6]  
Feiping Nie, 2014, Machine Learning and Knowledge Discovery in Databases. European Conference, ECML PKDD 2014. Proceedings: LNCS 8725, P485, DOI 10.1007/978-3-662-44851-9_31
[7]   The use of multiple measurements in taxonomic problems [J].
Fisher, RA .
ANNALS OF EUGENICS, 1936, 7 :179-188
[8]   REGULARIZED DISCRIMINANT-ANALYSIS [J].
FRIEDMAN, JH .
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 1989, 84 (405) :165-175
[9]   Discriminative sparsity preserving projections for image recognition [J].
Gao, Quanxue ;
Huang, Yunfang ;
Zhang, Hailin ;
Hong, Xin ;
Li, Kui ;
Wang, Yong .
PATTERN RECOGNITION, 2015, 48 (08) :2543-2553
[10]   Enhanced fisher discriminant criterion for image recognition [J].
Gao, Quanxue ;
Liu, Jingjing ;
Zhang, Haijun ;
Hou, Jun ;
Yang, Xiaojing .
PATTERN RECOGNITION, 2012, 45 (10) :3717-3724