Iteratively local Fisher score for feature selection

Cited by: 32
Authors
Gan, Min [1 ,2 ]
Zhang, Li [1 ,2 ,3 ]
Affiliations
[1] Soochow Univ, Sch Comp Sci & Technol, Suzhou 215006, Jiangsu, Peoples R China
[2] Soochow Univ, Joint Int Res Lab Machine Learning & Neuromorph C, Suzhou 215006, Jiangsu, Peoples R China
[3] Soochow Univ, Prov Key Lab Comp Informat Proc Technol, Suzhou 215006, Jiangsu, Peoples R China
Keywords
Feature selection; Fisher score; Neighbourhood; Iterative; Mutual information; Relevance
DOI
10.1007/s10489-020-02141-0
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
In machine learning, feature selection is an important dimensionality reduction technique that aims to choose the features with the best discriminant ability, so as to avoid the curse of dimensionality in subsequent processing. As a supervised feature selection method, the Fisher score (FS) provides a widely used feature evaluation criterion. However, FS ignores the association between features because it assesses each feature independently, and it loses local information because it fully connects all within-class samples. To address these issues, this paper proposes a novel feature evaluation criterion based on FS, named the iteratively local Fisher score (ILFS). Compared with FS, the new criterion pays more attention to the local structure of the data by using the K nearest neighbours, rather than all samples, when computing the within-class and between-class scatters. To account for the relationship between features, local Fisher scores are computed for feature subsets rather than for single features, and the currently optimal feature is selected iteratively, in the manner of sequential forward selection (SFS). Experimental results on UCI and TEP data sets show that the improved algorithm performs well on classification tasks compared with other state-of-the-art methods.
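
The following is a minimal Python sketch of the selection scheme the abstract describes: a subset-level score whose within-class and between-class scatters are restricted to each sample's K nearest neighbours, wrapped in an SFS-style greedy loop. The scatter definitions, function names (local_fisher_score, ilfs) and parameter choices are illustrative assumptions, not the authors' published formulation.

# Illustrative sketch only; the exact local scatter definitions are assumptions.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_fisher_score(X_sub, y, k=5):
    # Ratio of local between-class to local within-class scatter on a feature subset.
    sw, sb = 0.0, 0.0
    for c in np.unique(y):
        Xc, Xo = X_sub[y == c], X_sub[y != c]
        # Within-class scatter: squared distances to the k nearest same-class neighbours.
        nn_same = NearestNeighbors(n_neighbors=min(k + 1, len(Xc))).fit(Xc)
        d_same, _ = nn_same.kneighbors(Xc)
        sw += np.sum(d_same[:, 1:] ** 2)   # skip column 0 (the sample itself)
        # Between-class scatter: squared distances to the k nearest other-class neighbours.
        nn_other = NearestNeighbors(n_neighbors=min(k, len(Xo))).fit(Xo)
        d_other, _ = nn_other.kneighbors(Xc)
        sb += np.sum(d_other ** 2)
    return sb / (sw + 1e-12)

def ilfs(X, y, n_select, k=5):
    # SFS-like greedy loop: at each step add the feature that maximizes the subset score.
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_select and remaining:
        scores = [(local_fisher_score(X[:, selected + [f]], y, k), f) for f in remaining]
        _, best_f = max(scores)
        selected.append(best_f)
        remaining.remove(best_f)
    return selected

# Hypothetical usage on a toy data set:
# from sklearn.datasets import load_iris
# X, y = load_iris(return_X_y=True)
# print(ilfs(X, y, n_select=2, k=5))

Because the score is recomputed for every candidate subset at every step, the greedy loop performs on the order of (number of features) x (number of selected features) neighbour searches; this reflects the iterative, subset-based evaluation described in the abstract rather than the per-feature scoring of plain FS.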
Pages: 6167-6181
Number of pages: 15