An Improved Feature Selection Algorithm with Conditional Mutual Information for Classification Problems

Times Cited: 0
Authors
Palanichamy, Jaganathan [1 ]
Ramasamy, Kuppuchamy [1 ]
Affiliations
[1] PSNA Coll Engn & Technol, Dept Comp Applicat, Dindigul, Tamil Nadu, India
Source
2013 INTERNATIONAL CONFERENCE ON HUMAN COMPUTER INTERACTIONS (ICHCI) | 2013
Keywords
Mutual Information; Conditional Mutual Information; Feature Selection; Classification;
DOI
N/A
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The purpose of feature selection is to eliminate insignificant features from the dataset while preserving the class-discriminatory information needed for classification problems. Many feature selection algorithms have been proposed to measure the relevance and redundancy between features and class variables. In this paper, we propose an improved feature selection algorithm based on the maximum-relevance, minimum-redundancy criterion. The relevance of a feature to the class variable is evaluated with mutual information, and conditional mutual information is used to calculate the redundancy between the selected and candidate features with respect to each class variable. The experiments were carried out on five benchmark datasets from the UCI Machine Learning Repository. The results show that the proposed algorithm compares quite well with some existing algorithms.
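The abstract's idea, mutual information (MI) for relevance and conditional mutual information (CMI) for redundancy, can be illustrated with a CMIM-style greedy criterion in the spirit of Fleuret (2004). This is only a minimal sketch on discrete data, not the authors' exact algorithm; the scoring rule, feature names, and toy dataset below are assumptions for illustration.

```python
# Sketch (NOT the paper's exact method) of greedy MI/CMI feature selection:
# first pick maximizes relevance I(f; C); later picks maximize the relevance
# of f to C that is not already carried by any selected feature s,
# min over s of I(f; C | s), so redundant copies score near zero.
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of a list of hashable symbols."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def cond_mutual_info(x, y, z):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)."""
    return (entropy(list(zip(x, z))) + entropy(list(zip(y, z)))
            - entropy(list(zip(x, y, z))) - entropy(z))

def select_features(features, labels, k):
    """features: dict name -> list of discrete values (one per sample)."""
    remaining = list(features)
    selected = []
    while remaining and len(selected) < k:
        def score(f):
            if not selected:  # first pick: pure relevance to the class
                return mutual_info(features[f], labels)
            return min(cond_mutual_info(features[f], labels, features[s])
                       for s in selected)
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data (hypothetical): f2 is a redundant copy of f1; f3 carries no
# information about the class on its own but adds a little given f1.
labels = [0, 0, 1, 1, 0, 0, 1, 1]
features = {
    "f1": [0, 0, 1, 1, 0, 0, 1, 0],
    "f2": [0, 0, 1, 1, 0, 0, 1, 0],
    "f3": [0, 1, 0, 1, 0, 1, 0, 1],
}
print(select_features(features, labels, 2))  # -> ['f1', 'f3']
```

Note how the redundant copy f2 is skipped: once f1 is selected, I(f2; C | f1) = 0, while f3 still contributes a small amount of conditional information.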
Pages: 5
References
14 in total
  • [1] USING MUTUAL INFORMATION FOR SELECTING FEATURES IN SUPERVISED NEURAL-NET LEARNING
    BATTITI, R
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 1994, 5 (04): : 537 - 550
  • [2] Cover T. M., 1999, Elements of information theory
  • [3] Normalized Mutual Information Feature Selection
    Estevez, Pablo A.
    Tesmer, Michel
    Perez, Claudio A.
    Zurada, Jacek M.
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, 20 (02): : 189 - 201
  • [4] Frank A., 2010, UCI machine learning repository, V213
  • [5] Haykin S., 1999, Neural Networks: A Comprehensive Foundation, DOI DOI 10.1017/S0269888998214044
  • [6] A hybrid genetic algorithm for feature selection wrapper based on mutual information
    Huang, Jinjie
    Cai, Yunze
    Xu, Xiaoming
    [J]. PATTERN RECOGNITION LETTERS, 2007, 28 (13) : 1825 - 1844
  • [7] Huang Jinjie, 2008, Acta Automatica Sinica, V34, P383
  • [8] Wrappers for feature subset selection
    Kohavi, R
    John, GH
    [J]. ARTIFICIAL INTELLIGENCE, 1997, 97 (1-2) : 273 - 324
  • [9] Input feature selection for classification problems
    Kwak, N
    Choi, CH
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2002, 13 (01): : 143 - 159
  • [10] Information-theoretic algorithm for feature selection
    Last, M
    Kandel, A
    Maimon, O
    [J]. PATTERN RECOGNITION LETTERS, 2001, 22 (6-7) : 799 - 811