Artificial neural network regression as a local search heuristic for ensemble strategies in differential evolution

Cited by: 59
Authors
Fister, Iztok [1 ]
Suganthan, Ponnuthurai Nagaratnam [2 ]
Fister, Iztok, Jr. [1 ]
Kamal, Salahuddin M. [3 ]
Al-Marzouki, Fahad M. [3 ]
Perc, Matjaz [3 ,4 ]
Strnad, Damjan [1 ]
Affiliations
[1] Univ Maribor, Fac Elect Engn & Comp Sci, Smetanova 17, SLO-2000 Maribor, Slovenia
[2] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
[3] King Abdulaziz Univ, Fac Sci, Dept Phys, Jeddah 21413, Saudi Arabia
[4] Univ Maribor, Fac Nat Sci & Math, Koroska Cesta 160, SLO-2000 Maribor, Slovenia
Keywords
Nonlinear dynamics; Artificial neural network; Differential evolution; Regression; Local search; Ensemble strategies; Neighborhood; Optimization; Algorithm; Parameters
DOI
10.1007/s11071-015-2537-8
Chinese Library Classification
TH [Machinery and Instrument Industry];
Discipline classification code
0802;
Abstract
Nature frequently serves as an inspiration for developing new algorithms to solve challenging real-world problems. Mathematical modeling has led to the development of artificial neural networks (ANNs), which have proven especially useful for solving problems such as classification and regression. Moreover, evolutionary algorithms (EAs), inspired by Darwinian natural evolution, have been applied successfully to optimization, modeling, and simulation problems. Differential evolution (DE) is a particularly well-known EA that offers a multitude of strategies for generating an offspring solution, where the best strategy is not known in advance. In this paper, ANN regression is applied as a local search heuristic within the DE algorithm that tries to predict the best strategy, or to generate a better offspring, from an ensemble of DE strategies. This local search heuristic is applied to the population of solutions according to a control parameter that regulates the trade-off between computation time and solution quality. Experiments on the CEC 2014 test suite of 30 benchmark functions reveal the potential of this idea.
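The following is a minimal sketch of the general idea described in the abstract, not the authors' implementation: classic DE with an ensemble of mutation strategies, where an ANN regressor (here scikit-learn's MLPRegressor, an assumed library choice) acts as a surrogate-based local search that predicts which strategy's trial vector is most promising before a real fitness evaluation is spent on it. The names p_ls, de_ann_ensemble, and the strategy set are hypothetical illustrations; the control parameter p_ls stands in for the parameter that regulates the time/quality trade-off.

```python
# Sketch only: DE with an ensemble of strategies and an ANN-regression
# local search (assumed design, not the paper's exact algorithm).
import numpy as np
from sklearn.neural_network import MLPRegressor


def sphere(x):
    """Toy objective function (minimization)."""
    return float(np.sum(x ** 2))


def de_ann_ensemble(fobj, dim=10, pop_size=30, gens=100, F=0.5, CR=0.9,
                    p_ls=0.2, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([fobj(x) for x in pop])
    # Archive of all evaluated points, used to (re)train the regression surrogate.
    X_arch, y_arch = list(pop), list(fit)
    ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=300, random_state=seed)
    strategies = ('rand/1', 'best/1', 'current-to-best/1')

    def mutate(i, strategy):
        a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                 3, replace=False)]
        best = pop[np.argmin(fit)]
        if strategy == 'rand/1':
            return a + F * (b - c)
        if strategy == 'best/1':
            return best + F * (a - b)
        return pop[i] + F * (best - pop[i]) + F * (a - b)   # current-to-best/1

    def crossover(target, donor):
        mask = rng.random(dim) < CR
        mask[rng.integers(dim)] = True                       # binomial crossover
        return np.where(mask, np.clip(donor, lo, hi), target)

    for _ in range(gens):
        ann.fit(np.array(X_arch), np.array(y_arch))          # refresh surrogate
        for i in range(pop_size):
            if rng.random() < p_ls:
                # Local search heuristic: one trial per strategy; the ANN
                # regressor picks the most promising one to evaluate for real.
                trials = np.array([crossover(pop[i], mutate(i, s))
                                   for s in strategies])
                u = trials[np.argmin(ann.predict(trials))]
            else:
                u = crossover(pop[i], mutate(i, 'rand/1'))   # plain DE/rand/1/bin
            fu = fobj(u)
            X_arch.append(u)
            y_arch.append(fu)
            if fu <= fit[i]:                                 # greedy DE selection
                pop[i], fit[i] = u, fu
    best_idx = int(np.argmin(fit))
    return pop[best_idx], float(fit[best_idx])


if __name__ == '__main__':
    x_best, f_best = de_ann_ensemble(sphere)
    print('best fitness found:', f_best)
```

In this sketch the surrogate only filters candidate offspring, so each local-search step still costs exactly one true evaluation; raising p_ls increases the number of surrogate queries and retraining cost, which mirrors the time-versus-quality role of the control parameter described in the abstract.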
Pages: 895-914
Number of pages: 20