Joint adaptive manifold and embedding learning for unsupervised feature selection

Cited by: 38
Authors
Wu, Jian-Sheng [1 ]
Song, Meng-Xiao [1 ]
Min, Weidong [2 ,3 ]
Lai, Jian-Huang [4 ]
Zheng, Wei-Shi [4 ]
Affiliations
[1] Nanchang Univ, Sch Informat Engn, Nanchang 330031, Jiangxi, Peoples R China
[2] Nanchang Univ, Sch Software, Nanchang 330047, Jiangxi, Peoples R China
[3] Jiangxi Key Lab Smart City, Nanchang 330047, Jiangxi, Peoples R China
[4] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou 510006, Peoples R China
Keywords
Unsupervised feature selection; Manifold learning; Embedding learning; Sparse learning; STRUCTURE PRESERVATION;
DOI
10.1016/j.patcog.2020.107742
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
As high-dimensional data often lie near a lower-dimensional manifold, feature selection has become an important step in computer vision, machine learning and data mining. In the absence of class information, the performance of unsupervised feature selection depends on how well the manifold structure among the data is characterized and preserved. In this paper, we propose a novel unsupervised feature selection framework, named joint adaptive manifold and embedding learning for unsupervised feature selection (JAMEL). It iteratively and adaptively learns lower-dimensional embeddings that preserve the manifold structure among the data, regresses the data onto these embeddings to measure the importance of each feature, and learns the manifold structure according to the data density in the intrinsic space, in which redundant and noisy features have been eliminated. In addition, we present an efficient algorithm for solving the proposed problem, together with a convergence analysis. Finally, evaluation results on 12 datasets, using the selected features for k-means, spectral clustering and nearest-neighbor classification, show the effectiveness and efficiency of our approach. (C) 2020 Elsevier Ltd. All rights reserved.
Pages: 14