Parameter determination and feature selection for back-propagation network by particle swarm optimization

Cited by: 35
Authors
Lin, Shih-Wei [1 ]
Chen, Shih-Chieh [2 ]
Wu, Wen-Jie [1 ]
Chen, Chih-Hsien [3 ]
Affiliations
[1] Chang Gung Univ, Dept Informat Management, Tao Yuan 333, Taiwan
[2] Natl Taiwan Univ Sci & Technol, Dept Ind Engn, Taipei 106, Taiwan
[3] Fo Guang Univ, Dept Management, Jiaosi Shiang 26247, Yilan, Taiwan
Keywords
Back-propagation network; Particle swarm optimization; Feature selection; Parameter determination; ARTIFICIAL NEURAL-NETWORKS; SUPPORT VECTOR MACHINE; GENETIC ALGORITHM; GLOBAL OPTIMIZATION; DESIGN; CLASSIFICATION; IDENTIFICATION; CLASSIFIERS; SYSTEMS; MODEL
DOI
10.1007/s10115-009-0242-y
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The back-propagation network (BPN) is a popular tool with applications in a variety of fields. Nevertheless, different problems may require different parameter settings for a given network architecture, and a dataset may contain many features of which not all are beneficial for classification by the BPN. Therefore, a particle-swarm-optimization-based approach, denoted PSOBPN, is proposed to obtain suitable parameter settings for the BPN and to select a beneficial subset of features that yields a better classification accuracy rate. A set of 23 problems with a range of examples and features, drawn from the UCI (University of California, Irvine) machine learning repository, is adopted to test the performance of the proposed algorithm. The results are compared with several well-known published algorithms. The comparative study shows that the proposed approach improves the classification accuracy rate on most test problems. Furthermore, when feature selection is taken into consideration, the classification accuracy rates of most datasets increase. The proposed algorithm should thus be useful to both practitioners and researchers.
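The search the abstract describes can be sketched with a standard PSO in which each particle encodes a binary feature mask plus a continuous BPN parameter gene (e.g. the learning rate). The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the toy `fitness` function merely stands in for BPN classification accuracy, and the binary-PSO sigmoid decoding rule and all coefficient values are assumptions.

```python
import math
import random

random.seed(0)

N_FEATURES = 5               # length of the binary feature-mask segment
N_PARTICLES = 10
N_ITER = 30
W, C1, C2 = 0.7, 1.5, 1.5    # inertia weight and acceleration coefficients

def sigmoid(x):
    x = max(-60.0, min(60.0, x))      # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-x))

def decode(pos):
    """Decode a particle: first N_FEATURES genes -> binary mask via the
    usual binary-PSO sigmoid rule; last gene -> a learning rate in (0, 1)."""
    mask = [1 if sigmoid(x) > 0.5 else 0 for x in pos[:N_FEATURES]]
    lr = sigmoid(pos[-1])
    return mask, lr

def fitness(mask, lr):
    """Toy stand-in for BPN accuracy (an assumption, not the paper's model):
    rewards selecting features 0 and 2, penalises extra features and a
    learning rate far from 0.1."""
    informative = mask[0] + mask[2]
    noisy = mask[1] + mask[3] + mask[4]
    return informative - 0.3 * noisy - abs(lr - 0.1)

# Initialise the swarm with random positions and zero velocities.
particles = [{
    "pos": [random.uniform(-1, 1) for _ in range(N_FEATURES + 1)],
    "vel": [0.0] * (N_FEATURES + 1),
} for _ in range(N_PARTICLES)]

for p in particles:
    p["best_pos"] = p["pos"][:]
    p["best_fit"] = fitness(*decode(p["pos"]))

best = max(particles, key=lambda p: p["best_fit"])
gbest_pos, gbest_fit = best["best_pos"][:], best["best_fit"]

for _ in range(N_ITER):
    for p in particles:
        for d in range(N_FEATURES + 1):
            r1, r2 = random.random(), random.random()
            # Canonical PSO velocity update: inertia + cognitive + social.
            p["vel"][d] = (W * p["vel"][d]
                           + C1 * r1 * (p["best_pos"][d] - p["pos"][d])
                           + C2 * r2 * (gbest_pos[d] - p["pos"][d]))
            p["pos"][d] += p["vel"][d]
        fit = fitness(*decode(p["pos"]))
        if fit > p["best_fit"]:
            p["best_fit"], p["best_pos"] = fit, p["pos"][:]
            if fit > gbest_fit:
                gbest_fit, gbest_pos = fit, p["pos"][:]

mask, lr = decode(gbest_pos)
print("selected mask:", mask, " learning rate:", round(lr, 3))
```

In the paper's setting, evaluating a particle would mean training a BPN on the masked feature subset with the decoded parameters and using its classification accuracy rate as the fitness; the toy surrogate above only keeps the sketch self-contained and fast.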
Pages: 249-266 (18 pages)