Tree-Structured Feature Extraction Using Mutual Information

Cited by: 20
Authors
Oveisi, Farid [1 ]
Oveisi, Shahrzad [2 ]
Efranian, Abbas [3 ]
Patras, Ioannis [1 ]
Affiliations
[1] Queen Mary Univ London, Sch Elect Engn & Comp Sci, London E1 4NS, England
[2] Azad Univ, Dept Comp Engn, Tehran 1969633651, Iran
[3] Iran Univ Sci & Technol, Iran Neural Technol Ctr, Dept Biomed Engn, Tehran 1684613114, Iran
Keywords
Classification; dimensionality reduction; feature extraction; mutual information; CLASSIFICATION; SELECTION; ICA
DOI
10.1109/TNNLS.2011.2178447
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
One of the most informative measures for feature extraction (FE) is mutual information (MI). In terms of MI, the optimal FE creates new features that jointly have the largest dependency on the target class. However, obtaining an accurate estimate of a high-dimensional MI as well as optimizing with respect to it is not always easy, especially when only small training sets are available. In this paper, we propose an efficient tree-based method for FE in which at each step a new feature is created by selecting and linearly combining two features such that the MI between the new feature and the class is maximized. Both the selection of the features to be combined and the estimation of the coefficients of the linear transform rely on estimating 2-D MIs. The estimation of the latter is computationally very efficient and robust. The effectiveness of our method is evaluated on several real-world data sets. The results show that the classification accuracy obtained by the proposed method is higher than that achieved by other FE methods.
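The greedy step the abstract describes — selecting two features and a linear combination of them that maximizes the MI between the new feature and the class — can be sketched as follows. This is a hypothetical illustration, assuming a simple 2-D histogram plug-in MI estimator and an exhaustive grid search over mixing angles; it is not the authors' exact estimator or optimization procedure:

```python
import numpy as np

def mi_with_class(x, y, bins=16):
    """Plug-in estimate of I(x; y) for a continuous feature x and a
    discrete class label y, via a 2-D histogram (an assumption here;
    the paper uses its own efficient 2-D MI estimator)."""
    edges = np.histogram_bin_edges(x, bins=bins)
    x_binned = np.digitize(x, edges)          # bin indices in 0..bins+1
    classes = np.unique(y)
    joint = np.zeros((bins + 2, len(classes)))
    for j, c in enumerate(classes):
        joint[:, j] = np.bincount(x_binned[y == c], minlength=bins + 2)
    p = joint / joint.sum()                   # joint distribution p(x, y)
    px = p.sum(axis=1, keepdims=True)         # marginal p(x)
    py = p.sum(axis=0, keepdims=True)         # marginal p(y)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

def combine_best_pair(X, y, n_angles=21):
    """One tree-growing step: over all feature pairs (i, j) and mixing
    angles a, pick z = cos(a)*x_i + sin(a)*x_j maximizing I(z; y)."""
    n_feat = X.shape[1]
    best_mi, best = -np.inf, None
    angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    for i in range(n_feat):
        for j in range(i + 1, n_feat):
            for a in angles:
                z = np.cos(a) * X[:, i] + np.sin(a) * X[:, j]
                mi = mi_with_class(z, y)
                if mi > best_mi:
                    best_mi, best = mi, (i, j, a, z)
    return best_mi, best
```

Because the angle grid includes a = 0 (i.e., z = x_i unchanged), the combined feature's estimated MI can never fall below that of the best single feature in the searched pairs; repeating this step on the growing feature set yields the tree structure the abstract refers to.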
Pages: 127-137
Page count: 11