Interpretation of linear classifiers by means of feature relevance bounds

Cited by: 7
Authors
Goepfert, Christina [1 ]
Pfannschmidt, Lukas [1 ]
Goepfert, Jan Philip [1 ]
Hammer, Barbara [1 ]
Affiliations
[1] Cognitive Interaction Technology (CITEC), Inspiration 1, D-33619 Bielefeld, Germany
Keywords
Feature relevance; Feature selection; Interpretability; All-relevant; Linear classification
DOI
10.1016/j.neucom.2017.11.074
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Research on feature relevance and feature selection problems goes back several decades, but the importance of these areas continues to grow as more and more data becomes available, and machine learning methods are used to gain insight and interpret, rather than solely to solve classification or regression problems. Despite the fact that feature relevance is often discussed, it is frequently poorly defined, and the feature selection problems studied are subtly different. Furthermore, the problem of finding all features relevant for a classification problem has only recently started to gain traction, despite its importance for interpretability and integrating expert knowledge. In this paper, we attempt to unify commonly used concepts and to give an overview of the main questions and results. We formalize two interpretations of the all-relevant problem and propose a polynomial method to approximate one of them for the important hypothesis class of linear classifiers, which also enables a distinction between strongly and weakly relevant features. (C) 2018 Elsevier B.V. All rights reserved.
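The distinction the abstract draws can be illustrated with a small sketch in the spirit of relevance intervals for linear classifiers: for each feature, bound the weight it can carry across all (near-)optimal L1-minimal separating hyperplanes, solved as a series of linear programs. A lower bound above zero indicates a strongly relevant feature; a zero lower bound with a positive upper bound indicates a weakly relevant one. This is not the authors' exact formulation; the toy data, the `delta` tolerance, and all names here are assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Toy linearly separable data: feature 0 is informative, feature 1 is an
# exact copy of it (each alone is dispensable: weakly relevant), and
# feature 2 is pure noise (irrelevant).
rng = np.random.default_rng(0)
x0 = rng.normal(size=40)
y = np.where(x0 > 0, 1.0, -1.0)
x0 = x0 + y                      # push classes apart to guarantee a margin
X = np.column_stack([x0, x0, rng.normal(size=40)])
n, d = X.shape

def solve(c, extra_A=None, extra_b=None):
    """LP over z = (w, b, u): margin constraints plus |w_j| <= u_j."""
    # y_i (w . x_i + b) >= 1   <=>   -y_i x_i . w - y_i b <= -1
    A = np.hstack([-y[:, None] * X, -y[:, None], np.zeros((n, d))])
    rhs = -np.ones(n)
    I = np.eye(d)
    A = np.vstack([A,
                   np.hstack([I, np.zeros((d, 1)), -I]),    #  w_j - u_j <= 0
                   np.hstack([-I, np.zeros((d, 1)), -I])])  # -w_j - u_j <= 0
    rhs = np.concatenate([rhs, np.zeros(2 * d)])
    if extra_A is not None:
        A, rhs = np.vstack([A, extra_A]), np.concatenate([rhs, extra_b])
    return linprog(c, A_ub=A, b_ub=rhs,
                   bounds=[(None, None)] * (d + 1) + [(0, None)] * d)

# Step 1: minimal L1 norm over all separating hyperplanes.
c_norm = np.concatenate([np.zeros(d + 1), np.ones(d)])
opt = solve(c_norm).fun

# Step 2: for each feature j, bound |w_j| over all hyperplanes whose
# L1 norm stays within a factor (1 + delta) of the optimum.
delta = 1e-6
budget_A = c_norm[None, :]
budget_b = np.array([(1 + delta) * opt])
intervals = []
for j in range(d):
    c_min = np.zeros(2 * d + 1)
    c_min[d + 1 + j] = 1.0                    # minimize u_j >= |w_j|
    lo = solve(c_min, budget_A, budget_b).fun
    hi = 0.0
    for sign in (1.0, -1.0):                  # maximize w_j and -w_j
        c_max = np.zeros(2 * d + 1)
        c_max[j] = -sign
        hi = max(hi, -solve(c_max, budget_A, budget_b).fun)
    intervals.append((lo, hi))
    print(f"feature {j}: relevance interval [{lo:.3f}, {hi:.3f}]")
```

Each relevance interval costs a constant number of LPs, so the whole procedure is polynomial in the number of features and samples, matching the complexity claim in the abstract. On this toy data, features 0 and 1 get intervals with a zero lower bound (either copy can absorb the other's weight) while the noise feature's upper bound collapses toward zero.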
Pages: 69-79 (11 pages)
Related papers (50 records)
  • [1] Frenay, Benoit; Hofmann, Daniela; Schulz, Alexander; Biehl, Michael; Hammer, Barbara. Valid Interpretation of Feature Relevance for Linear Data Mappings. 2014 IEEE Symposium on Computational Intelligence and Data Mining (CIDM), 2014: 149-156.
  • [2] Levy, Tomer; Abramovich, Felix. Generalization Error Bounds for Multiclass Sparse Linear Classifiers. Journal of Machine Learning Research, 2023, 24.
  • [3] Duangsoithong, Rakkrit; Windeatt, Terry. Relevance and Redundancy Analysis for Ensemble Classifiers. Machine Learning and Data Mining in Pattern Recognition, 2009, 5632: 206-220.
  • [4] Siddiqi, Imran; Khurshid, Khurram; Vincent, Nicole. Feature Relevance Analysis for Writer Identification. Document Recognition and Retrieval XVIII, 2011, 7874.
  • [5] Camargo, Jonathan; Young, Aaron. Feature Selection and Non-Linear Classifiers: Effects on Simultaneous Motion Recognition in Upper Limb. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2019, 27 (04): 743-750.
  • [6] Lashkia, V; Aleshin, S. Test Feature Classifiers: Performance and Applications. IEEE Transactions on Systems Man and Cybernetics Part B-Cybernetics, 2001, 31 (04): 643-650.
  • [7] Duangsoithong, Rakkrit; Windeatt, Terry. Bootstrap Feature Selection for Ensemble Classifiers. Advances in Data Mining: Applications and Theoretical Aspects, 2010, 6171: 28-41.
  • [8] Ma, Jianbin; Gao, Xiaoying. Designing Genetic Programming Classifiers with Feature Selection and Feature Construction. Applied Soft Computing, 2020, 97.
  • [9] Pfannschmidt, Lukas; Jakob, Jonathan; Hinder, Fabian; Biehl, Michael; Tino, Peter; Hammer, Barbara. Feature Relevance Determination for Ordinal Regression in the Context of Feature Redundancies and Privileged Information. Neurocomputing, 2020, 416: 266-279.
  • [10] Zhang, Feng; Zhao, Ya-Jun; Chen, Jun-Fen. Unsupervised Feature Selection Based on Feature Relevance. Proceedings of 2009 International Conference on Machine Learning and Cybernetics, Vols 1-6, 2009: 487+.