Feature selection via Least Squares Support Feature Machine

Cited by: 28
Authors
Li, Jianping [1 ]
Chen, Zhenyu [1 ,2 ]
Wei, Liwei [1 ,2 ]
Xu, Weixuan [1 ]
Kou, Gang [3 ]
Affiliations
[1] Chinese Acad Sci, Inst Policy & Management, Beijing 100080, Peoples R China
[2] Chinese Acad Sci, Grad Univ, Beijing 100039, Peoples R China
[3] Thomson Corp, St Paul, MN 55123 USA
Funding
National Natural Science Foundation of China;
Keywords
feature selection; Support Vector Machine; credit assessment;
DOI
10.1142/S0219622007002733
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
In many applications, such as credit risk management, data are represented as high-dimensional feature vectors. This makes feature selection necessary to reduce computational complexity and to improve generalization ability and interpretability. In this paper, we present a novel feature selection method, the "Least Squares Support Feature Machine" (LS-SFM). The proposed method has two advantages compared with the conventional Support Vector Machine (SVM) and LS-SVM. First, a convex combination of basic kernels is used as the kernel, where each basic kernel makes use of a single feature. This transforms the feature selection problem, which cannot be solved directly in the context of SVM, into an ordinary multiple-parameter learning problem. Second, all parameters are learned by a two-stage iterative algorithm. A 1-norm based regularized cost function is used to enforce sparseness of the feature parameters, and the "support features" are the features with nonzero feature parameters. An experimental study on several UCI datasets and a commercial credit card dataset demonstrates the effectiveness and efficiency of the proposed approach.
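The abstract outlines the mechanics: a kernel built as a convex combination of single-feature basic kernels, an LS-SVM solved for fixed combination weights, and a 1-norm penalty that drives most feature weights to zero so that the remaining nonzero weights identify the "support features". The sketch below is a minimal, hypothetical illustration of that idea, not the authors' published algorithm; the per-feature RBF kernels, the alternation scheme, the nonnegative coordinate-descent refit, and all parameter names (gamma, C, lam) are assumptions introduced for illustration only.

```python
# Hypothetical sketch of the LS-SFM idea, not the paper's exact algorithm.
import numpy as np

def per_feature_kernels(X, gamma=1.0):
    """Stack of single-feature RBF kernels K_j(x, z) = exp(-gamma*(x_j - z_j)^2), shape (d, n, n)."""
    diffs = X[:, None, :] - X[None, :, :]            # (n, n, d)
    return np.exp(-gamma * diffs ** 2).transpose(2, 0, 1)

def solve_ls_svm(K, y, C=10.0):
    """Solve an LS-SVM linear system (regression-style form) for bias b and multipliers alpha."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                           # b, alpha

def nonneg_lasso(Phi, r, lam, iters=50):
    """Coordinate descent for min_{beta >= 0} 0.5*||r - Phi @ beta||^2 + lam*||beta||_1."""
    d = Phi.shape[1]
    beta = np.zeros(d)
    col_sq = (Phi ** 2).sum(axis=0)
    resid = r.astype(float).copy()
    for _ in range(iters):
        for j in range(d):
            if col_sq[j] < 1e-12:
                continue
            resid += Phi[:, j] * beta[j]             # add back feature j's contribution
            beta[j] = max(0.0, Phi[:, j] @ resid - lam) / col_sq[j]   # soft threshold at zero
            resid -= Phi[:, j] * beta[j]
    return beta

def ls_sfm_sketch(X, y, lam=0.5, gamma=1.0, C=10.0, outer_iters=10):
    """Alternate (1) an LS-SVM solve with the combined kernel and (2) a sparse,
    nonnegative refit of the per-feature kernel weights beta (the 1-norm stage)."""
    Ks = per_feature_kernels(X, gamma)               # (d, n, n)
    beta = np.full(X.shape[1], 1.0 / X.shape[1])     # start from a uniform convex combination
    for _ in range(outer_iters):
        K = np.tensordot(beta, Ks, axes=1)           # combined kernel sum_j beta_j K_j
        b, alpha = solve_ls_svm(K, y, C)
        Phi = np.stack([Kj @ alpha for Kj in Ks], axis=1)   # per-feature kernel scores, (n, d)
        beta = nonneg_lasso(Phi, y - b, lam)         # 1-norm penalty drives most beta_j to zero
    support_features = np.flatnonzero(beta > 1e-8)   # "support features": nonzero weights
    return beta, support_features

# Toy usage: only the first two of ten features carry signal.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 10))
    y = np.sign(X[:, 0] + X[:, 1])
    beta, support = ls_sfm_sketch(X, y)
    print("feature weights:", np.round(beta, 3))
    print("selected feature indices:", support)
```

Under these assumptions, the sparsity of beta (and hence the number of selected features) is controlled by the penalty weight lam, playing the role the 1-norm regularizer plays in the paper.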
Pages: 671-686
Number of pages: 16
Related Papers
50 records in total
  • [31] The Improved Particle Swarm Optimization for Feature Selection of Support Vector Machine
    Wang, Sipeng
    Ding, Sheng
    PROCEEDINGS OF 2017 2ND INTERNATIONAL CONFERENCE ON COMMUNICATION AND INFORMATION SYSTEMS (ICCIS 2017), 2017, : 314 - 317
  • [32] The research on the method of feature selection in support vector Machine based Entropy
    Zhu, Xiaoyan
    Tian, Xi
    Zhu, Xiaoxun
    PROGRESS IN POWER AND ELECTRICAL ENGINEERING, PTS 1 AND 2, 2012, 354-355 : 1192 - +
  • [33] Research on Feature Selection Algorithm Based on the Margin of Support Vector Machine
    Hu, Linfang
    Qiao, Lei
    Huang, Minde
    MEASUREMENT TECHNOLOGY AND ENGINEERING RESEARCHES IN INDUSTRY, PTS 1-3, 2013, 333-335 : 1430 - 1434
  • [34] A Hybrid Kernel Support Vector Machine with Feature Selection for the Diagnosis of Diseases
    Tania, Farjana Akter
    Shill, Pintu Chandra
    2019 4TH INTERNATIONAL CONFERENCE ON ELECTRICAL INFORMATION AND COMMUNICATION TECHNOLOGY (EICT), 2019,
  • [35] Mixed integer linear programming for feature selection in support vector machine
    Labbe, Martine
    Martinez-Merino, Luisa I.
    Rodriguez-Chia, Antonio M.
    DISCRETE APPLIED MATHEMATICS, 2019, 261 : 276 - 304
  • [36] Feature Selection Method Based on Mutual Information and Support Vector Machine
    Liu, Gang
    Yang, Chunlei
    Liu, Sen
    Xiao, Chunbao
    Song, Bin
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2021, 35 (06)
  • [37] Sparse Support Vector Machine with Lp Penalty for Feature Selection
    Yao, Lan
    Zeng, Feng
    Li, Dong-Hui
    Chen, Zhi-Gang
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2017, 32 (01) : 68 - 77
  • [38] Regularization feature selection projection twin support vector machine via exterior penalty
    Yi, Ping
    Song, Aiguo
    Guo, Jianhui
    Wang, Ruili
    NEURAL COMPUTING & APPLICATIONS, 2017, 28 : S683 - S697
  • [39] Regularization feature selection projection twin support vector machine via exterior penalty
    Ping Yi
    Aiguo Song
    Jianhui Guo
    Ruili Wang
    Neural Computing and Applications, 2017, 28 : 683 - 697
  • [40] A new approach to history matching based on feature selection and optimized least square support vector machine
    Karimi, Mojtaba
    JOURNAL OF GEOPHYSICS AND ENGINEERING, 2018, 15 (06) : 2378 - 2387