UNSUPERVISED FEATURE SELECTION BY NONNEGATIVE SPARSITY ADAPTIVE SUBSPACE LEARNING

Cited by: 0
Authors
Zhou, Nan [1 ,2 ]
Cheng, Hong [1 ]
Zheng, Ya-Li [1 ]
He, Liang-Tian [2 ]
Pedrycz, Witold [3 ]
Affiliations
[1] Univ Elect Sci & Technol China, Ctr Robot, Chengdu 611731, Sichuan, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Math Sci, Chengdu 611731, Sichuan, Peoples R China
[3] Univ Alberta, Dept Elect & Comp Engn, Edmonton, AB T6G 2R3, Canada
Source
PROCEEDINGS OF 2016 INTERNATIONAL CONFERENCE ON WAVELET ANALYSIS AND PATTERN RECOGNITION (ICWAPR) | 2016
Keywords
Machine learning; Feature selection; Nonnegative matrix factorization; Sparse subspace learning
DOI
Not available
Chinese Library Classification (CLC) number
TP39 [Computer applications]
Discipline classification codes
081203; 0835
Abstract
Given the high dimensionality of raw data, dimensionality reduction is a necessary step in data processing. In this study, a novel unsupervised feature selection model is proposed which casts unsupervised feature selection as nonnegative subspace learning. To make the learned subspace better indicate the selected features, a nonnegative sparsity adaptive subspace learning framework is proposed that adapts the sparsity through a weighted l2,1-norm model, where the weights are defined by multi-stage support detection. An approach to solving this weighted l2,1-constrained non-convex problem is then provided, leading to the Nonnegative Sparsity Adaptive Subspace Learning (NSASL) algorithm. Experiments conducted on real-world datasets verify the superiority of the proposed method over seven state-of-the-art unsupervised feature selection algorithms.
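The record itself carries no equations or pseudocode, so the sketch below is only a rough illustration of the kind of model the abstract describes: minimize ||X^T - W H||_F^2 + lambda * sum_i theta_i * ||w^i||_2 subject to W >= 0, score feature i by the row norm ||w^i||_2, and lower the weight theta_i on rows detected as support at each stage. The function name nsasl_sketch, the mean-based support threshold, and the projected-gradient/least-squares solver are all assumptions made for illustration, not the authors' NSASL algorithm.

import numpy as np

def nsasl_sketch(X, k, lam=0.1, stages=3, inner=200, seed=0):
    """Rough sketch (not the authors' algorithm) of weighted l2,1-regularized
    nonnegative subspace learning for unsupervised feature selection.
    X: (n_samples, d_features); k: subspace dimension.
    Returns one score per feature (row norms of W)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xt = X.T                                # (d, n): features x samples
    W = rng.random((d, k))                  # nonnegative map; row i <-> feature i
    theta = np.ones(d)                      # adaptive per-feature penalty weights
    for stage in range(stages):
        for _ in range(inner):
            # H-step: unconstrained least squares given the current W
            H = np.linalg.lstsq(W, Xt, rcond=None)[0]          # (k, n)
            # W-step: one projected-gradient step on the smoothed objective
            # ||Xt - W H||_F^2 + lam * sum_i theta_i * ||W_i||_2
            row_norm = np.linalg.norm(W, axis=1, keepdims=True) + 1e-8
            grad = 2.0 * (W @ H - Xt) @ H.T + lam * theta[:, None] * W / row_norm
            L = 2.0 * np.linalg.norm(H @ H.T, 2) + lam         # crude step-size bound
            W = np.maximum(W - grad / L, 0.0)                  # projection keeps W >= 0
        # Multi-stage support detection (hypothetical rule): stop penalizing
        # rows whose norms already stand out, so detected features grow freely.
        scores = np.linalg.norm(W, axis=1)
        theta = np.where(scores > scores.mean(), 0.0, 1.0)
    return np.linalg.norm(W, axis=1)

Under these assumptions, scores = nsasl_sketch(X, k=10) followed by np.argsort(scores)[::-1][:m] would select the m highest-scoring features.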
Pages: 18-24
Page count: 7