A new feature selection method for handling redundant information in text classification

Cited by: 9
Authors
Wang, You-wei [1]
Feng, Li-zhou [2]
Affiliations
[1] Cent Univ Finance & Econ, Sch Informat, Beijing 100081, Peoples R China
[2] Tianjin Univ Finance & Econ, Sch Sci & Engn, Tianjin 300222, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
Feature selection; Dimensionality reduction; Text classification; Redundant features; Support vector machine; Naive Bayes; Mutual information; MUTUAL INFORMATION; HARMONY SEARCH; CATEGORIZATION; ALGORITHM;
DOI
10.1631/FITEE.1601761
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Feature selection is an important approach to dimensionality reduction in text classification. To address the problem that selected features often contain redundant information, we propose a new, simple feature selection method that effectively filters out redundant features. First, word-frequency-based relevance and correlative redundancy are defined to measure the relationship between two words. Then, an optimal feature selection (OFS) method is applied to obtain a feature subset FS1. Finally, to improve execution speed, the redundant features in FS1 are filtered using a predetermined threshold, and the filtered features are stored in linked lists. Experiments are carried out on three datasets (WebKB, 20-Newsgroups, and Reuters-21578) using support vector machine and naive Bayes classifiers. The results show that the classification accuracy of the proposed method is generally higher than that of typical traditional methods (information gain, improved Gini index, and improved comprehensively measured feature selection) and OFS methods. Moreover, the proposed method runs faster than typical mutual-information-based methods (improved and normalized mutual-information-based feature selection, and multilabel feature selection based on maximum dependency and minimum redundancy) while maintaining classification accuracy. Statistical results validate the effectiveness of the proposed method in handling redundant information in text classification.
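The abstract describes a two-stage pipeline: rank features by a relevance score, then prune candidates that are too redundant with features already selected. The Python sketch below illustrates that general structure only; the relevance and redundancy functions, the threshold value, and all names are illustrative stand-ins (simple correlation-based scores), not the word-frequency-based relevance and correlative-redundancy formulas proposed in the paper.

# Minimal sketch of a relevance-ranking + redundancy-filtering pipeline.
# NOTE: scoring functions are illustrative stand-ins, not the paper's formulas.
import numpy as np

def relevance(X, y, j):
    # Toy relevance of feature j: absolute correlation with the label.
    fj = X[:, j]
    if fj.std() == 0:
        return 0.0
    return abs(np.corrcoef(fj, y)[0, 1])

def redundancy(X, j, k):
    # Toy redundancy between features j and k: absolute correlation.
    fj, fk = X[:, j], X[:, k]
    if fj.std() == 0 or fk.std() == 0:
        return 0.0
    return abs(np.corrcoef(fj, fk)[0, 1])

def select_features(X, y, n_select, redundancy_threshold=0.8):
    # Rank features by relevance (the "FS1" step), then keep a candidate only
    # if its redundancy with every already selected feature is below the threshold.
    ranked = sorted(range(X.shape[1]), key=lambda j: relevance(X, y, j), reverse=True)
    selected = []
    for j in ranked:
        if all(redundancy(X, j, k) < redundancy_threshold for k in selected):
            selected.append(j)
        if len(selected) == n_select:
            break
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((100, 20))
    y = (X[:, 0] + X[:, 1] > 1).astype(int)
    X[:, 2] = 0.98 * X[:, 0] + 0.02 * rng.random(100)  # near-duplicate of feature 0
    print(select_features(X, y, n_select=5))

In this toy run, feature 2 is constructed as a near-copy of feature 0, so the redundancy filter should exclude it even though its relevance score is high; this mirrors the behavior the paper targets for redundant terms in text classification.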
Pages: 221-234
Page count: 14