Feature Selection for Text Classification Using Mutual Information

Cited by: 2
Authors
Sel, Ilhami [1 ]
Karci, Ali [1 ]
Hanbay, Davut [1 ]
Affiliations
[1] Inonu Univ, Bilgisayar Muhendisligi Bolumu, Malatya, Turkey
Source
2019 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND DATA PROCESSING (IDAP 2019) | 2019
Keywords
Natural Language Processing; Doc2Vec; Mutual Information; Maximum Entropy;
DOI
10.1109/idap.2019.8875927
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Feature selection can be defined as choosing the best subset of features to represent a data set, that is, removing unnecessary data that does not affect the result. In classification applications, reducing the dimensionality through feature selection can increase both the efficiency and the accuracy of the system. In this study, text classification was applied to the "20 Newsgroups" data set. The pre-processed news data were converted into vectors using the Doc2Vec method to create a data set, which was then classified with the Maximum Entropy classification method. Afterwards, subsets of the data set were created using the Mutual Information method for feature selection. Classification was repeated on the resulting subsets and the results were compared by performance rate. Before feature selection, the system with 600 features achieved an accuracy of 0.9285; after selection, the models with 200, 100, 50, and 20 features achieved 0.9454, 0.9426, 0.9407, and 0.9123, respectively. Examining these results, the 50-feature model was more successful than the 600-feature model initially created.
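The pipeline the abstract describes can be sketched as follows: rank features by their mutual information with the class label, keep only the top-k, and train a maximum-entropy classifier (multinomial logistic regression) on the reduced set. This is a minimal sketch using scikit-learn; the paper uses 600-dimensional Doc2Vec vectors of the 20 Newsgroups corpus, for which a synthetic data set stands in here, so the exact feature counts and accuracies are illustrative rather than a reproduction of the reported results.

```python
# Sketch of MI-based feature selection followed by a maximum-entropy
# classifier. Synthetic data stands in for the Doc2Vec vectors used in
# the paper (an assumption for illustration only).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy stand-in for the 600-dimensional Doc2Vec document vectors.
X, y = make_classification(n_samples=1000, n_features=600,
                           n_informative=40, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Keep the 50 features with the highest estimated mutual information
# with the class label (one of the subset sizes compared in the paper).
selector = SelectKBest(mutual_info_classif, k=50).fit(X_tr, y_tr)
X_tr_sel = selector.transform(X_tr)
X_te_sel = selector.transform(X_te)

# Multinomial logistic regression is the standard realization of a
# maximum-entropy classifier.
clf = LogisticRegression(max_iter=1000).fit(X_tr_sel, y_tr)
print(f"accuracy with 50 of 600 features: {clf.score(X_te_sel, y_te):.3f}")
```

The same loop over k = 200, 100, 50, 20 would reproduce the comparison in the abstract: because mutual information discards features that carry little information about the label, a much smaller subset can match or exceed the accuracy of the full 600-feature model.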
Pages: 4