Sequential random k-nearest neighbor feature selection for high-dimensional data

Cited by: 74
Authors
Park, Chan Hee [1 ]
Kim, Seoung Bum [1 ]
Affiliations
[1] Korea Univ, Sch Ind Management Engn, Seoul, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Feature selection; High dimensionality; Ensemble; Wrapper; Random forest; k-NN; CLASSIFICATION; FOREST;
DOI
10.1016/j.eswa.2014.10.044
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Feature selection based on an ensemble classifier has been recognized as a crucial technique for modeling high-dimensional data. Feature selection based on the random forests model, which is constructed by aggregating multiple decision tree classifiers, has been widely used. However, a lack of stability and balance in decision trees decreases the robustness of random forests. This limitation motivated us to propose a feature selection method based on newly designed nearest-neighbor ensemble classifiers. The proposed method finds significant features by using an iterative procedure. We performed experiments with 20 microarray gene expression datasets to examine the properties of the proposed method and compared it with random forests. The results demonstrated the effectiveness and robustness of the proposed method, especially when the number of features exceeds the number of observations. (C) 2014 Elsevier Ltd. All rights reserved.
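The ensemble k-NN idea described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' exact sequential procedure: it builds many k-NN classifiers on random feature subsets and credits features belonging to subsets that classify well. All function names and parameter values here are assumptions for the sketch.

```python
import numpy as np

def knn_accuracy(X_tr, y_tr, X_te, y_te, k=3):
    """Hold-out accuracy of a plain Euclidean k-NN classifier."""
    correct = 0
    for x, y in zip(X_te, y_te):
        d = np.linalg.norm(X_tr - x, axis=1)          # distances to training points
        nn = y_tr[np.argsort(d)[:k]]                  # labels of the k nearest
        vals, counts = np.unique(nn, return_counts=True)
        correct += vals[np.argmax(counts)] == y       # majority vote
    return correct / len(y_te)

def random_knn_importance(X, y, n_models=200, subset_size=5, seed=0):
    """Score each feature by the mean accuracy of random-subset k-NN models it joins."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    scores = np.zeros(p)
    counts = np.zeros(p)
    for _ in range(n_models):
        feats = rng.choice(p, size=subset_size, replace=False)  # random feature subset
        idx = rng.permutation(n)
        tr, te = idx[: n // 2], idx[n // 2:]                    # random 50/50 split
        acc = knn_accuracy(X[tr][:, feats], y[tr], X[te][:, feats], y[te])
        scores[feats] += acc
        counts[feats] += 1
    return scores / np.maximum(counts, 1)  # mean accuracy per feature

# Toy high-dimensional data: only feature 0 carries the class signal.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=60)
X = rng.normal(size=(60, 30))
X[:, 0] += 3 * y
imp = random_knn_importance(X, y)
print(int(np.argmax(imp)))  # the informative feature should score highest
```

In a sequential variant, the lowest-scoring features would be dropped and the scoring repeated on the survivors, which is the kind of iterative refinement the abstract alludes to.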
Pages: 2336-2342
Page count: 7
Related papers
29 records in total
[11] Hall, M. A. Proceedings of the Twelfth International Florida AI Research Society Conference, 1999, p. 235.
[12] Huang, Cheng-Lung; Chen, Mu-Chen; Wang, Chieh-Jen. Credit scoring with a data mining approach based on support vector machines. Expert Systems with Applications, 2007, 33(4): 847-856.
[13] Huang, Cheng-Lung; Wang, Chieh-Jen. A GA-based feature selection and parameters optimization for support vector machines. Expert Systems with Applications, 2006, 31(2): 231-240.
[14] Yang, Jing. Bioinformatics Research and Applications: 10th International Symposium, ISBRA 2014, Proceedings, LNCS 8492, 2014, p. 1. DOI 10.1007/978-3-319-08171-7_1.
[15] Kalousis, A.; Prados, J.; Hilario, M. Stability of feature selection algorithms. Fifth IEEE International Conference on Data Mining, Proceedings, 2005: 218-225.
[16] Kononenko, I. European Conference on Machine Learning, 1994, p. 171. DOI 10.1007/3-540-57868-4_57.
[17] Li, Shengqiao; Harner, E. James; Adjeroh, Donald A. Random KNN feature selection - a fast and stable alternative to Random Forests. BMC Bioinformatics, 2011, 12.
[18] Liao, S. H. Knowledge management technologies and applications - literature review from 1995 to 2002. Expert Systems with Applications, 2003, 25(2): 155-164.
[19] Liu, H., 1998, FEATURE SELECTION KN.
[20] McInerney, Daniel O.; Nieuwenhuis, Maarten. A comparative analysis of kNN and decision tree methods for the Irish National Forest Inventory. International Journal of Remote Sensing, 2009, 30(19): 4937-4955.