Feature Selection for Support Vector Regression via Kernel Penalization

Cited by: 0
Authors
Maldonado, Sebastian [1 ]
Weber, Richard [1 ]
Affiliations
[1] Univ Chile, Dept Ind Engn, Santiago, Chile
Source
2010 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN 2010), 2010
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a novel feature selection approach (KP-SVR) that determines a non-linear regression function with minimal error while simultaneously minimizing the number of features by penalizing their use in the dual formulation of SVR. The approach optimizes the widths of an anisotropic RBF kernel using an iterative algorithm based on gradient descent, eliminating features with low relevance for the regression model. It also provides an explicit stopping criterion, indicating clearly when eliminating further features begins to degrade the model's performance. Experiments on two real-world benchmark problems demonstrate that the approach achieves the best performance compared with well-known feature selection methods while consistently using a small number of features.
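As a rough illustration of the idea described above, the following Python sketch pairs an anisotropic RBF kernel, in which each feature has its own width and a zero width removes that feature, with a greedy backward-elimination loop that stops once validation error degrades. It is a simplified stand-in, not the authors' KP-SVR algorithm: the paper tunes the kernel widths by gradient descent on the dual formulation, whereas here the widths are assumed given, and the names anisotropic_rbf, fit_svr, eliminate_features, sigma0 and tol are hypothetical.

import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error

def anisotropic_rbf(X1, X2, sigma):
    # Anisotropic RBF kernel with one width per feature; a zero width
    # removes the corresponding feature from the kernel.
    diff = X1[:, None, :] - X2[None, :, :]              # shape (n1, n2, d)
    return np.exp(-np.sum((sigma ** 2) * diff ** 2, axis=2))

def fit_svr(X_train, y_train, sigma, C=1.0, eps=0.1):
    # Train an epsilon-SVR on the precomputed anisotropic kernel.
    model = SVR(kernel="precomputed", C=C, epsilon=eps)
    model.fit(anisotropic_rbf(X_train, X_train, sigma), y_train)
    return model

def eliminate_features(X_tr, y_tr, X_val, y_val, sigma0, tol=0.05):
    # Greedy backward elimination driven by the kernel widths: repeatedly
    # zero the smallest remaining width and stop once validation error grows
    # by more than tol (a stand-in for the paper's explicit stopping criterion).
    sigma = np.asarray(sigma0, dtype=float).copy()
    model = fit_svr(X_tr, y_tr, sigma)
    best_err = mean_absolute_error(
        y_val, model.predict(anisotropic_rbf(X_val, X_tr, sigma)))
    while np.count_nonzero(sigma) > 1:
        candidate = sigma.copy()
        # Pick the active feature with the smallest width (lowest relevance).
        j = np.argmin(np.where(candidate == 0, np.inf, np.abs(candidate)))
        candidate[j] = 0.0
        model = fit_svr(X_tr, y_tr, candidate)
        err = mean_absolute_error(
            y_val, model.predict(anisotropic_rbf(X_val, X_tr, candidate)))
        if err > best_err * (1.0 + tol):   # further elimination hurts: stop
            break
        sigma, best_err = candidate, err
    return sigma                            # nonzero entries = kept features

With features scaled to comparable ranges, sigma0 = np.ones(n_features) is a reasonable starting point; the nonzero entries of the returned width vector index the selected features.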
Pages: 7
Related Papers
50 records in total
  • [1] Linear penalization support vector machines for feature selection
    Miranda, J
    Montoya, R
    Weber, R
    PATTERN RECOGNITION AND MACHINE INTELLIGENCE, PROCEEDINGS, 2005, 3776 : 188 - 192
  • [2] Flexible Kernel Selection in Multitask Support Vector Regression
    Ruiz, Carlos
    Alaiz, Carlos M.
    Catalina, Alejandro
    Dorronsoro, Jose R.
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [3] Feature selection for support vector machines with RBF kernel
    Liu, Quanzhong
    Chen, Chihau
    Zhang, Yang
    Hu, Zhengguo
    ARTIFICIAL INTELLIGENCE REVIEW, 2011, 36 (02) : 99 - 115
  • [4] Nonlinear feature selection for support vector quantile regression
    Ye, Ya-Fen
    Wang, Jie
    Chen, Wei-Jie
    NEURAL NETWORKS, 2025, 185
  • [5] Evolutionary feature and parameter selection in support vector regression
    Mejia-Guevara, Ivan
    Kuri-Morales, Angel
    MICAI 2007: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2007, 4827 : 399 - +
  • [6] Feature Selection Based on Twin Support Vector Regression
    Wu, Qing
    Zhang, Haoyi
    Jing, Rongrong
    Li, Yiran
    2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2903 - 2907
  • [7] A Novel Hybrid Genetic Algorithm and Simulated Annealing for Feature Selection and Kernel Optimization in Support Vector Regression
    Wu, Jiansheng
    Lu, Zusong
    Jin, Long
    2012 IEEE 13TH INTERNATIONAL CONFERENCE ON INFORMATION REUSE AND INTEGRATION (IRI), 2012, : 401 - 406
  • [8] A Novel Hybrid Genetic Algorithm and Simulated Annealing for Feature Selection and Kernel Optimization in Support Vector Regression
    Wu, Jiansheng
    Lu, Zusong
    2012 IEEE FIFTH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2012, : 999 - 1003
  • [9] Sparse kernel feature extraction via support vector learning
    Wang, Kunzhe
    Xiao, Huaitie
    PATTERN RECOGNITION LETTERS, 2018, 101 : 67 - 73