Feature selection by independent component analysis and mutual information maximization in EEG signal classification

Cited by: 0
Authors
Lan, T [1]
Erdogmus, D [1]
Adami, A [1]
Pavel, M [1]
Affiliations
[1] Oregon Hlth & Sci Univ, OGI Sch Sci & Engn, Dept Biomed Engn, Beaverton, OR 97006 USA
Source
PROCEEDINGS OF THE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), VOLS 1-5 | 2005
Keywords
feature selection; independent component analysis; mutual information; entropy estimation; EEG; brain-computer interface;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Feature selection and dimensionality reduction are important steps in pattern recognition. In this paper, we propose a feature selection scheme based on linear independent component analysis and mutual information maximization. The method is theoretically motivated by the fact that the classification error rate is related to the mutual information between the feature vectors and the class labels. The feasibility of the principle is illustrated on a synthetic dataset, and its performance is demonstrated on EEG signal classification. Experimental results show that the method performs well for feature selection.
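To make the abstract's pipeline concrete, the following is a minimal Python sketch of feature selection by a linear ICA transform followed by mutual-information ranking of the resulting components. It is only an illustration of the general idea, not the authors' implementation: the paper relies on its own entropy-based mutual information estimator, whereas this sketch substitutes scikit-learn's FastICA and mutual_info_classif; the helper ica_mi_feature_selection and the synthetic data are assumptions introduced here for demonstration.

```python
# Sketch only: linear ICA + mutual-information ranking for feature selection.
# FastICA and mutual_info_classif stand in for the paper's own ICA variant and
# entropy-based MI estimator; names below are illustrative assumptions.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_classif

def ica_mi_feature_selection(X, y, n_keep=3, random_state=0):
    """Project X onto independent components, then keep the n_keep components
    with the highest estimated mutual information with the labels y."""
    ica = FastICA(n_components=X.shape[1], random_state=random_state)
    S = ica.fit_transform(X)                       # independent components
    mi = mutual_info_classif(S, y, random_state=random_state)
    keep = np.argsort(mi)[::-1][:n_keep]           # rank components by MI
    return S[:, keep], keep, mi

if __name__ == "__main__":
    # Synthetic demo (placeholder for EEG feature vectors): non-Gaussian
    # sources mixed linearly, with labels driven by a single source.
    rng = np.random.default_rng(0)
    S_true = rng.laplace(size=(300, 10))           # non-Gaussian sources
    A = rng.standard_normal((10, 10))              # random mixing matrix
    X = S_true @ A.T                               # observed (mixed) features
    y = (S_true[:, 0] > 0).astype(int)
    X_sel, kept, mi = ica_mi_feature_selection(X, y, n_keep=3)
    print("kept components:", kept, "MI:", np.round(mi[kept], 3))
```

The design point the sketch tries to capture is that ranking is done on independent components rather than on raw features: after the linear ICA the components are (approximately) statistically independent, so a per-component mutual information score is a reasonable proxy for the joint information the selected subset carries about the class label.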
Pages: 3011 - 3016
Number of pages: 6
Related Papers
50 entries in total
  • [41] On the Feature Selection Criterion Based on an Approximation of Multidimensional Mutual Information
    Balagani, Kiran S.
    Phoha, Vir V.
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2010, 32 (07) : 1342 - 1343
  • [42] EEG classification using generative independent component analysis
    Chiappa, S
    Barber, D
    NEUROCOMPUTING, 2006, 69 (7-9) : 769 - 777
  • [43] Is mutual information adequate for feature selection in regression?
    Frenay, Benoit
    Doquire, Gauthier
    Verleysen, Michel
    NEURAL NETWORKS, 2013, 48 : 1 - 7
  • [44] Genetic algorithm for feature selection with mutual information
    Ge, Hong
    Hu, Tianliang
    2014 SEVENTH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID 2014), VOL 1, 2014, : 116 - 119
  • [45] Feature Selection by Maximizing Part Mutual Information
    Gao, Wanfu
    Hu, Liang
    Zhang, Ping
    2018 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND MACHINE LEARNING (SPML 2018), 2018, : 120 - 127
  • [46] Mutual information for feature selection: estimation or counting?
    Nguyen, H. B.
    Xue, B.
    Andreae, P.
    Evolutionary Intelligence, 2016, 9 (3) : 95 - 110
  • [47] Feature Selection with Mutual Information for Regression Problems
    Sulaiman, Muhammad Aliyu
    Labadin, Jane
    2015 9TH INTERNATIONAL CONFERENCE ON IT IN ASIA (CITA), 2015,
  • [48] Discriminant Mutual Information for Text Feature Selection
    Wang, Jiaqi
    Zhang, Li
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2021), PT II, 2021, 12682 : 136 - 151
  • [49] Signal estimation based on mutual information maximization
    Rohde, G. K.
    Nichols, J.
    Bucholtz, F.
    Michalowicz, J. V.
    CONFERENCE RECORD OF THE FORTY-FIRST ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, VOLS 1-5, 2007, : 597 - +
  • [50] Gender Classification using One Half Face and Feature Selection based on Mutual Information
    Tapia, Juan E.
    Perez, Claudio A.
    2013 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC 2013), 2013, : 3282 - 3287