GMDH-based semi-supervised feature selection for customer classification

Times Cited: 44
Authors
Xiao, Jin [1 ,2 ]
Cao, Hanwen [1 ]
Jiang, Xiaoyi [3 ]
Gu, Xin [1 ,2 ]
Xie, Ling [1 ,2 ]
Affiliations
[1] Sichuan Univ, Business Sch, Chengdu 610064, Sichuan, Peoples R China
[2] Sichuan Univ, Soft Sci Inst, Chengdu 610064, Sichuan, Peoples R China
[3] Univ Munster, Dept Math & Comp Sci, Einsteinstr 62, D-48149 Munster, Germany
Funding
National Natural Science Foundation of China;
Keywords
Feature selection; Group method of data handling (GMDH); Customer classification; Semi-supervised learning; CHURN PREDICTION; OBJECT DETECTION; NEURAL-NETWORKS; ALGORITHMS; CONSTRAINT; RELEVANCE; SYSTEM;
DOI
10.1016/j.knosys.2017.06.018
Chinese Library Classification (CLC) Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Data dimension reduction is an important step in customer classification modeling, and feature selection is a research focus in this field. This study introduces the group method of data handling (GMDH), proposes a GMDH-based semi-supervised feature selection (GMDH-SSFS) algorithm, and applies it to customer feature selection. The algorithm exploits a small set of labeled samples L and a large set of unlabeled samples U simultaneously, and during feature selection it considers both the relationship between features and class labels and the relationships among the features themselves. The GMDH-SSFS model consists of three stages: 1) train N basic classification models on the labeled dataset L; 2) selectively label samples in the unlabeled dataset U and add them to L; 3) train a GMDH neural network on the enlarged training set L and select the optimal feature subset Fs. An empirical analysis of four customer classification datasets suggests that the features selected by GMDH-SSFS have good interpretability. Moreover, the classification model trained on the selected feature subset outperforms models trained on subsets chosen by the commonly used Laplacian score (an unsupervised feature selection algorithm), the Fisher score (a supervised feature selection algorithm), and FW-SemiFS and S3VM-FS (two semi-supervised feature selection algorithms). (C) 2017 Elsevier B.V. All rights reserved.
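The three-stage procedure summarized above lends itself to a compact illustration. The following Python sketch mimics the data flow with scikit-learn-style estimators; the choice of base classifiers, the confidence threshold for selective labeling, and the final ranking step are assumptions made for illustration only. In particular, the paper's stage-3 GMDH neural network is stood in for here by a simple mutual-information ranking, so this is not the authors' implementation.

```python
# Minimal sketch of the GMDH-SSFS workflow described in the abstract (assumed
# interfaces; stage 3's GMDH network is replaced by a mutual-information proxy).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.feature_selection import mutual_info_classif

def gmdh_ssfs_sketch(X_l, y_l, X_u, n_select=5, confidence=0.9):
    """X_l, y_l: labeled set L; X_u: unlabeled set U. Returns indices of Fs."""
    # Stage 1: train N basic classification models on the labeled set L.
    base_models = [LogisticRegression(max_iter=1000),
                   DecisionTreeClassifier(max_depth=5),
                   GaussianNB()]
    for m in base_models:
        m.fit(X_l, y_l)

    # Stage 2: selectively label samples in U on which the base models agree
    # with high average confidence, and add them to L.
    probs = np.mean([m.predict_proba(X_u) for m in base_models], axis=0)
    classes = base_models[0].classes_
    pseudo_labels = classes[probs.argmax(axis=1)]
    keep = probs.max(axis=1) >= confidence
    X_new = np.vstack([X_l, X_u[keep]])
    y_new = np.concatenate([y_l, pseudo_labels[keep]])

    # Stage 3 (stand-in): score features on the enlarged training set and
    # return the indices of the top-scoring subset Fs.
    scores = mutual_info_classif(X_new, y_new, random_state=0)
    return np.argsort(scores)[::-1][:n_select]
```

The point of the sketch is the ordering of steps: pseudo-labeled samples drawn from U enlarge L before any feature scoring takes place, which is what distinguishes the semi-supervised setting from purely supervised selection on L alone.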
Pages: 236-248
Number of Pages: 13