Information gain directed genetic algorithm wrapper feature selection for credit rating

Cited by: 216
Authors
Jadhav, Swati [1 ]
He, Hongmei [1 ]
Jenkins, Karl [1 ]
Affiliations
[1] Cranfield Univ, Sch Aerosp Transport & Mfg, Cranfield MK43 0AL, Beds, England
Keywords
Feature selection; Genetic algorithm in wrapper; Support vector machine; K nearest neighbour clustering; Naive Bayes classifier; Information gain; Credit scoring; Accuracy; ROC curve; SUPPORT VECTOR MACHINES; SWARM OPTIMIZATION; CLASSIFICATION; HYBRID; COMBINATION; MODEL; SVM; SET;
DOI
10.1016/j.asoc.2018.04.033
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Financial credit scoring is one of the crucial processes in the finance industry for assessing the creditworthiness of individuals and enterprises. Various statistics-based machine learning techniques have been employed for this task, but the "curse of dimensionality" remains a significant challenge. Some research has applied Feature Selection (FS) with a genetic algorithm (GA) as a wrapper to improve the performance of credit scoring models. However, the challenges lie in finding an overall best method for credit scoring problems and in speeding up the time-consuming feature selection process. In this study, the credit scoring problem is investigated through feature selection to improve classification performance. This work proposes a novel approach to feature selection in credit scoring applications, called the Information Gain Directed Feature Selection (IGDFS) algorithm, which ranks features by information gain and propagates the top-ranked features through the GA wrapper (GAW) algorithm using three classical machine learning algorithms: K Nearest Neighbour (KNN), Naive Bayes (NB) and Support Vector Machine (SVM). The first stage of information-gain-guided feature selection reduces the computational complexity of the GA wrapper, and the information gain of the features selected by IGDFS indicates their importance to decision making. Regarding classification accuracy, SVM consistently outperforms KNN and NB across the baseline, GAW and IGDFS configurations. IGDFS also achieves better performance than the generic GAW, and GAW outperforms the corresponding single classifiers (baseline) in almost all cases; the exception is the German Credit dataset, where IGDFS + KNN performs worse than the generic GAW and the single classifier KNN. Removing features with low information gain can conflict with the original data structure for KNN, and thus degrade the performance of IGDFS + KNN. Regarding ROC performance, on the German Credit dataset the three classic machine learning algorithms, SVM, KNN and Naive Bayes, within the IGDFS GA wrapper achieved almost identical performance. On the Australian Credit dataset and the Taiwan Credit dataset, IGDFS + Naive Bayes achieved the largest area under the ROC curve. (C) 2018 Elsevier B.V. All rights reserved.
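The abstract describes a two-stage pipeline: rank features by information gain, keep the top-ranked ones, then search subsets of them with a GA whose fitness is a classifier's accuracy. The following is a minimal sketch of that idea, not the authors' implementation: the toy dataset, the leave-one-out 1-NN fitness, the elitist GA settings, and all parameter values are illustrative assumptions.

```python
import random
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(column, labels):
    """IG(Y; X) = H(Y) - H(Y|X) for one discrete feature column."""
    n = len(labels)
    cond = 0.0
    for v in set(column):
        sub = [y for x, y in zip(column, labels) if x == v]
        cond += len(sub) / n * entropy(sub)
    return entropy(labels) - cond

def top_features(X, y, m):
    """Stage 1: rank feature indices by information gain, keep the top m."""
    gains = [information_gain([row[j] for row in X], y) for j in range(len(X[0]))]
    return sorted(range(len(gains)), key=gains.__getitem__, reverse=True)[:m]

def fitness(mask, X, y, cols):
    """Wrapper fitness: leave-one-out 1-NN accuracy (Hamming distance)
    on the feature subset encoded by the binary mask."""
    sel = [c for c, bit in zip(cols, mask) if bit]
    if not sel:
        return 0.0
    hits = 0
    for i in range(len(X)):
        nearest = min((j for j in range(len(X)) if j != i),
                      key=lambda j: sum(X[i][c] != X[j][c] for c in sel))
        hits += y[nearest] == y[i]
    return hits / len(X)

def igdfs(X, y, m=2, pop=12, gens=10, seed=0):
    """Stage 2: elitist GA over masks of the top-m features, with one-point
    crossover and bit-flip mutation. Returns the selected feature indices."""
    rng = random.Random(seed)
    cols = top_features(X, y, m)
    # Seed the search with the full top-m subset, then random masks.
    population = [[1] * m] + [[rng.randint(0, 1) for _ in range(m)]
                              for _ in range(pop - 1)]
    for _ in range(gens):
        population.sort(key=lambda ind: fitness(ind, X, y, cols), reverse=True)
        parents = population[: pop // 2]          # elitism: keep the best half
        children = []
        while len(parents) + len(children) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, m) if m > 1 else 1
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < 0.2:                # bit-flip mutation
                k = rng.randrange(m)
                child[k] ^= 1
            children.append(child)
        population = parents + children
    best = max(population, key=lambda ind: fitness(ind, X, y, cols))
    return [c for c, bit in zip(cols, best) if bit]

# Toy data: feature 0 determines the class, feature 1 is pure noise.
X = [(0, 0), (0, 0), (0, 1), (0, 1), (1, 0), (1, 0), (1, 1), (1, 1)]
y = [0, 0, 0, 0, 1, 1, 1, 1]
print(top_features(X, y, 2))   # feature 0 ranks first (IG 1.0 vs 0.0)
print(igdfs(X, y))             # the informative feature 0 is selected
```

The point of the first stage is exactly the one the abstract makes: on a real credit dataset with many features, pre-filtering by information gain shrinks the GA's search space from 2^n to 2^m masks, which is where the reduction in wrapper cost comes from.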
Pages: 541 - 553 (13 pages)
Related Papers (50 in total)
  • [31] Gabor Feature Selection Based on Information Gain
    Lefkovits, Szidonia
    Lefkovits, Laszlo
    10TH INTERNATIONAL CONFERENCE INTERDISCIPLINARITY IN ENGINEERING, INTER-ENG 2016, 2017, 181 : 892 - 898
  • [32] Feature selection based on rough set approach, wrapper approach, and binary whale optimization algorithm
    Tawhid, Mohamed A.
    Ibrahim, Abdelmonem M.
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2020, 11 (03) : 573 - 602
  • [33] A hybrid system with filter approach and multiple population genetic algorithm for feature selection in credit scoring
    Wang, Di
    Zhang, Zuoquan
    Bai, Rongquan
    Mao, Yanan
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2018, 329 : 307 - 321
  • [34] An Improved Information Gain Feature Selection Algorithm for SVM Text Classifier
    Xu, Jiamin
    Jiang, Hong
    2015 INTERNATIONAL CONFERENCE ON CYBER-ENABLED DISTRIBUTED COMPUTING AND KNOWLEDGE DISCOVERY, 2015, : 273 - 276
  • [35] Wrapper Feature Subset Selection for Dimension Reduction Based on Ensemble Learning Algorithm
    Panthong, Rattanawadee
    Srivihok, Anongnart
    THIRD INFORMATION SYSTEMS INTERNATIONAL CONFERENCE 2015, 2015, 72 : 162 - 169
  • [36] Embedded chaotic whale survival algorithm for filter-wrapper feature selection
    Guha, Ritam
    Ghosh, Manosij
    Mutsuddi, Shyok
    Sarkar, Ram
    Mirjalili, Seyedali
    SOFT COMPUTING, 2020, 24 (17) : 12821 - 12843
  • [37] Information Gain Based Feature Selection for Improved Textual Sentiment Analysis
    Ramasamy, Madhumathi
    Kowshalya, A. Meena
    WIRELESS PERSONAL COMMUNICATIONS, 2022, 125 (02) : 1203 - 1219
  • [38] A novel hybrid wrapper-filter approach based on genetic algorithm, particle swarm optimization for feature subset selection
    Moslehi, Fateme
    Haeri, Abdorrahman
    JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING, 2020, 11 (03) : 1105 - 1127
  • [39] Information Gain Based Feature Selection for Improved Textual Sentiment Analysis
    Ramasamy, Madhumathi
    Kowshalya, A. Meena
    WIRELESS PERSONAL COMMUNICATIONS, 2022, 125 (02) : 1203 - 1219
  • [40] A novel multi-objective forest optimization algorithm for wrapper feature selection
    Nouri-Moghaddam, Babak
    Ghazanfari, Mehdi
    Fathian, Mohammad
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 175