Sparse optimization in feature selection: application in neuroimaging

Cited by: 0
Authors
K. Kampa
S. Mehta
C. A. Chou
W. A. Chaovalitwongse
T. J. Grabowski
Affiliations
[1] University of Washington, Department of Industrial and Systems Engineering
[2] University of Washington Medical Center, Integrated Brain Imaging Center
[3] University of Washington, Department of Radiology
[4] University of Washington, Department of Psychology
[5] Binghamton University, Department of Systems Science and Industrial Engineering
[6] State University of New York, Department of Radiology
[7] University of Washington, Department of Neurology
[8] University of Washington
Source
Journal of Global Optimization | 2014 / Vol. 59
Keywords
Sparse optimization; Feature selection; Machine learning; fMRI; Cognitive neuroscience; Regularization; Pattern classification
DOI: not available
Abstract
Feature selection plays an important role in the successful application of machine learning techniques to large real-world datasets. Avoiding model overfitting, especially when the number of features far exceeds the number of observations, requires selecting informative features and/or eliminating irrelevant ones. Searching for an optimal subset of features can be computationally expensive. Functional magnetic resonance imaging (fMRI) produces datasets with exactly these characteristics, creating challenges for applying machine learning techniques to classify cognitive states from fMRI data. In this study, we present an embedded feature selection framework that integrates sparse optimization for regularization (or sparse regularization) with classification. The optimization approach attempts to maximize training accuracy while simultaneously enforcing sparsity by penalizing the objective function for the coefficients of the features. This process drives many coefficients to zero, effectively eliminating the corresponding features from the classification model. To demonstrate the utility of the approach, we apply our framework to three different real-world fMRI datasets. The results show that regularized classifiers yield better classification accuracy, especially when the number of initial features is large. They further show that sparse regularization is key to achieving scientifically relevant generalizability and functional localization of classifier features. The approach is thus highly suited to the analysis of fMRI data.
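The embedded feature selection idea described in the abstract, penalizing coefficient magnitudes so that irrelevant features are driven exactly to zero, can be sketched with a minimal L1-regularized logistic regression fitted by proximal gradient descent (ISTA). This is a generic illustration of sparse regularization, not the paper's actual method or experiments; the synthetic data, dimensions, and hyperparameters below are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(w, t):
    # Proximal operator of the L1 norm: shrinks coefficients toward zero,
    # setting small ones exactly to zero (this is what removes features).
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def l1_logistic(X, y, lam=0.05, lr=0.1, iters=2000):
    """L1-regularized logistic regression via proximal gradient (ISTA)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))             # predicted probabilities
        grad = X.T @ (p - y) / n                     # gradient of logistic loss
        w = soft_threshold(w - lr * grad, lr * lam)  # gradient step + L1 shrinkage
    return w

# Synthetic "many features, few informative" data (hypothetical sizes):
# 200 samples, 50 features, only the first 3 carry signal.
rng = np.random.default_rng(0)
n, d, k = 200, 50, 3
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:k] = 2.0
y = (X @ w_true + 0.1 * rng.standard_normal(n) > 0).astype(float)

w = l1_logistic(X, y)
selected = np.flatnonzero(w)   # features surviving with nonzero coefficients
print("selected features:", selected)
```

Under this setup the informative features receive the largest coefficients while many irrelevant ones are zeroed out, mirroring the embedded selection behavior the abstract describes.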
Pages: 439-457
Number of pages: 18
Related papers
50 items in total
  • [21] Sparse Mutual Granularity-Based Feature Selection and its Application of Schizophrenia Patients
    Ju, Hengrong
    Yin, Tao
    Huang, Jiashuang
    Ding, Weiping
    Yang, Xibei
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8(1): 604-614
  • [22] Autoweighted Multiview Feature Selection With Graph Optimization
    Wang, Qi
    Jiang, Xu
    Chen, Mulin
    Li, Xuelong
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52(12): 12966-12977
  • [23] Online Feature Selection Using Sparse Gradient
    Banu, Nasrin N.
    Kumar, Radha Senthil
    INTERNATIONAL JOURNAL ON ARTIFICIAL INTELLIGENCE TOOLS, 2022, 31(8)
  • [24] Feature selection via kernel sparse representation
    Lv, Zhizheng
    Li, Yangding
    Li, Jieye
    2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019: 2637-2644
  • [25] Sparse structural feature selection for multitarget regression
    Yuan, Haoliang
    Zheng, Junjie
    Lai, Loi Lei
    Tang, Yuan Yan
    KNOWLEDGE-BASED SYSTEMS, 2018, 160: 200-209
  • [26] Sparse and Flexible Projections for Unsupervised Feature Selection
    Wang, Rong
    Zhang, Canyu
    Bian, Jintang
    Wang, Zheng
    Nie, Feiping
    Li, Xuelong
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35(6): 6362-6375
  • [27] BAYESIAN FEATURE SELECTION FOR SPARSE TOPIC MODEL
    Chang, Ying-Lan
    Lee, Kuen-Feng
    Chien, Jen-Tzung
    2011 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2011
  • [28] A Robust Sparse Feature Selection for Hyperspectral Images
    Kiyamousavi, S. Ensiye
    Rezghi, Mansoor
    2016 2ND INTERNATIONAL CONFERENCE OF SIGNAL PROCESSING AND INTELLIGENT SYSTEMS (ICSPIS), 2016: 154-158
  • [29] CARDINAL SPARSE PARTIAL LEAST SQUARE FEATURE SELECTION AND ITS APPLICATION IN FACE RECOGNITION
    Zhang, Honglei
    Kiranyaz, Serkan
    Gabbouj, Moncef
    2014 PROCEEDINGS OF THE 22ND EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2014: 785-789
  • [30] Binary Horse Optimization Algorithm for Feature Selection
    Moldovan, Dorin
    ALGORITHMS, 2022, 15(5)