The Hybrid Dynamic Prototype Construction and Parameter Optimization with Genetic Algorithm for Support Vector Machine

Cited by: 0
Authors
Lu, Chun-Liang [1,2]
Chung, I-Fang [1]
Lin, Tsun-Chen [3]
Affiliations
[1] Natl Yang Ming Univ, Inst Biomed Informat, Taipei, Taiwan
[2] Ching Kuo Inst Management & Hlth, Dept Appl Informat & Multimedia, Keelung, Keelung County, Taiwan
[3] Dahan Inst Technol, Dept Comp & Commun Engn, Hualien, Hualien County, Taiwan
Keywords
Genetic Algorithm (GA); Dynamic Condensed Nearest Neighbor (DCNN); Support Vector Machine (SVM);
DOI
Not available
Chinese Library Classification
T [Industrial Technology];
Discipline Classification Code
08
Abstract
The optimized hybrid artificial intelligence model is a potential tool for construction engineering and management problems. The support vector machine (SVM) has achieved excellent performance in a wide variety of applications. Nevertheless, effectively reducing the training complexity of the SVM remains a serious challenge. In this paper, a novel order-independent approach to instance selection, called the dynamic condensed nearest neighbor (DCNN) rule, is proposed to adaptively construct prototypes from the training dataset and to remove redundant or noisy instances during classification with the SVM. Furthermore, a hybrid model based on the genetic algorithm (GA) is proposed to simultaneously optimize the prototype construction and the SVM kernel parameter settings to enhance classification accuracy. Several UCI benchmark datasets are used to compare the proposed hybrid GA-DCNN-SVM approach with a previously published GA-based method. The experimental results show that the proposed hybrid model outperforms the existing method and effectively improves the classification performance of the SVM.
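The scheme described in the abstract — a GA that jointly evolves a prototype-selection mask over the training instances together with the SVM kernel parameters, scored by held-out accuracy — can be sketched as follows. This is a minimal illustrative sketch, not the authors' GA-DCNN-SVM implementation: the dataset, fitness function, parameter ranges, and GA operators are all assumptions made for the example.

```python
# Illustrative sketch of GA-based joint prototype selection and SVM
# parameter tuning (NOT the paper's DCNN rule). Each individual encodes
# a binary mask over training instances plus two genes decoded into the
# RBF-SVM parameters (C, gamma). Fitness is accuracy on a held-out split
# of an SVM trained only on the selected prototypes.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

n = len(X_tr)
POP, GENS = 20, 15

def decode(ind):
    mask = ind[:n] > 0.5               # which training instances serve as prototypes
    C = 10 ** (ind[n] * 4 - 2)         # C in [1e-2, 1e2] (assumed range)
    gamma = 10 ** (ind[n + 1] * 4 - 3) # gamma in [1e-3, 1e1] (assumed range)
    return mask, C, gamma

def fitness(ind):
    mask, C, gamma = decode(ind)
    if mask.sum() < 5 or len(np.unique(y_tr[mask])) < 2:
        return 0.0                     # invalid: too few prototypes or classes
    clf = SVC(C=C, gamma=gamma).fit(X_tr[mask], y_tr[mask])
    return clf.score(X_te, y_te)

pop = rng.random((POP, n + 2))
for _ in range(GENS):
    fit = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(fit)[::-1][:POP // 2]]  # truncation selection
    children = []
    for _ in range(POP - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n + 1)                 # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        mut = rng.random(n + 2) < 0.05               # uniform mutation
        child[mut] = rng.random(mut.sum())
        children.append(child)
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
mask, C, gamma = decode(best)
print(f"prototypes kept: {mask.sum()}/{n}, C={C:.3g}, "
      f"gamma={gamma:.3g}, accuracy={fitness(best):.3f}")
```

The point of the sketch is the joint encoding: shrinking the prototype set reduces the SVM training cost while the kernel parameters are tuned for that reduced set, rather than tuning the two separately.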
Pages: 220-232
Page count: 13