Adaptive Graph Learning for Unsupervised Feature Selection

Cited by: 2
Authors
Zhang, Zhihong [1]
Bai, Lu [2]
Liang, Yuanheng [3]
Hancock, Edwin R. [4]
Affiliations
[1] Xiamen Univ, Software Sch, Xiamen, Fujian, Peoples R China
[2] Cent Univ Finance & Econ, Sch Informat, Beijing, Peoples R China
[3] Xiamen Univ, Sch Math Sci, Xiamen, Fujian, Peoples R China
[4] Univ York, Dept Comp Sci, York YO10 5DD, N Yorkshire, England
Source
COMPUTER ANALYSIS OF IMAGES AND PATTERNS, CAIP 2015, PT I | 2015 / Vol. 9256
Keywords
Graph learning; Laplacian; Unsupervised feature selection
DOI
10.1007/978-3-319-23192-1_66
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Most existing feature selection methods select features by evaluating a criterion that measures their ability to preserve the similarity structure of a data graph. However, these methods treat the construction or learning of the underlying data graph and the subsequent feature ranking as two separate steps. Once the graph has been determined to characterize the similarity structure of the data, it is left fixed in the subsequent ranking or regression steps. As a result, the performance of feature selection is largely determined by the effectiveness of the graph construction step, and the key to constructing an effective similarity graph is determining the data similarity matrix. In this paper we treat the estimation or learning of the data similarity matrix and the data regression as simultaneous tasks, and use them to perform unsupervised spectral feature selection. Our new method learns the data similarity matrix by optimally re-assigning the neighbors of each data point based on local distances or dissimilarities. Meanwhile, the l(2,1)-norm is imposed on the transformation matrix to achieve row sparsity, which leads to the selection of relevant features. We derive an efficient optimization method to solve the joint similarity graph learning and feature selection problem. Extensive experimental results on real-world benchmark data sets show that our method consistently outperforms alternative feature selection methods.
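The abstract names the two ingredients of the method (re-assigning neighbor weights from local distances, and an l(2,1)-norm on the transformation matrix for row sparsity) without stating the objective. The Python sketch below illustrates how these two ingredients are commonly combined in unsupervised spectral feature selection; the function names, the closed-form neighbor-weight assignment, the single-pass (rather than alternating) structure, and all parameter values are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from scipy.linalg import eigh

def adaptive_neighbor_graph(X, k=5):
    """Assign each point's similarity weights from its k nearest local
    distances (closed-form simplex solution used in adaptive-neighbor
    graph learning).  Returns a symmetrised similarity matrix S."""
    n = X.shape[0]
    sq = np.sum(X ** 2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # squared distances
    np.fill_diagonal(D, np.inf)                      # exclude self-similarity
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[:k + 1]               # k+1 nearest neighbours
        d_i = D[i, idx]
        # weight of the j-th neighbour ~ (d_{k+1} - d_j), normalised to sum to 1
        w = (d_i[k] - d_i[:k]) / (k * d_i[k] - d_i[:k].sum() + 1e-12)
        S[i, idx[:k]] = np.maximum(w, 0.0)
    return (S + S.T) / 2.0

def l21_regression(X, F, alpha=1.0, n_iter=30):
    """Row-sparse regression  min ||X W - F||_F^2 + alpha ||W||_{2,1},
    solved by iteratively re-weighted least squares.  Returns W."""
    d = X.shape[1]
    G = np.eye(d)                                    # diagonal re-weighting matrix
    for _ in range(n_iter):
        W = np.linalg.solve(X.T @ X + alpha * G, X.T @ F)
        row_norms = np.linalg.norm(W, axis=1) + 1e-12
        G = np.diag(1.0 / (2.0 * row_norms))
    return W

# Usage sketch on random data: select features by the row norms of W.
X = np.random.rand(100, 20)
S = adaptive_neighbor_graph(X, k=5)
L = np.diag(S.sum(axis=1)) - S                       # graph Laplacian
_, vecs = eigh(L)
F = vecs[:, 1:6]                                     # low-dimensional spectral embedding
W = l21_regression(X, F, alpha=0.5)
ranking = np.argsort(-np.linalg.norm(W, axis=1))     # most relevant features first
```

Features are ranked by the row norms of W; in the paper's joint formulation the graph update and the regression would presumably be alternated until convergence rather than run once as in this sketch.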
Pages: 790-800
Page count: 11