Evaluating methods for constant optimization of symbolic regression benchmark problems

Cited by: 10
Authors
de Melo, Vinicius Veloso [1 ]
Fowler, Benjamin [2 ]
Banzhaf, Wolfgang [2 ]
Affiliations
[1] Univ Fed Sao Paulo, Inst Sci & Technol, Sao Jose Dos Campos, SP, Brazil
[2] Mem Univ Newfoundland, Dept Comp Sci, St John's, NL A1B 3X5, Canada
Source
2015 BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS 2015) | 2015
Keywords
Symbolic regression; Genetic programming; Curve fitting; Least-squares; Nonlinear regression
DOI
10.1109/BRACIS.2015.55
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Constant optimization in symbolic regression is an important task addressed by several researchers. It has been demonstrated that continuous optimization techniques are adequate for finding good values for the constants by minimizing the prediction error. In this paper, we evaluate several continuous optimization methods that can be used to perform constant optimization in symbolic regression. We selected 14 well-known benchmark problems and tested the performance of diverse optimization methods in finding the expected constant values, assuming that the correct formula has already been found. The results show that Levenberg-Marquardt (LM) presented the highest success rate among the evaluated methods, followed by Powell's method and the Nelder-Mead simplex. However, two benchmark problems were not solved, and on two other problems LM was largely outperformed by the Nelder-Mead simplex in terms of success rate. We conclude that even when a symbolic regression technique finds the correct formula, constant optimization may fail; since this can also happen during the search for a formula, it may guide the method towards the wrong solution. Moreover, the efficiency of LM in finding high-quality solutions with only a few function evaluations could serve as inspiration for the development of better symbolic regression methods.
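The setup the abstract describes, fitting only the numeric constants of a formula whose structure is already known, can be illustrated with a short sketch. This is a minimal illustration, not the paper's code: the toy formula f(x) = c0*sin(c1*x), the true constant values, the starting point, and the use of SciPy's optimizers are all assumptions for demonstration; the paper's 14 benchmarks and experimental protocol differ.

```python
import numpy as np
from scipy.optimize import least_squares, minimize

# Hypothetical benchmark formula with two unknown constants c0, c1.
def model(c, x):
    return c[0] * np.sin(c[1] * x)

# Noiseless target data generated from assumed "true" constants.
x = np.linspace(0.0, 4.0, 50)
y = model([1.5, 0.5], x)

def residuals(c):
    # Prediction error vector, minimized by the least-squares solver.
    return model(c, x) - y

def sse(c):
    # Scalar objective (sum of squared errors) for derivative-free methods.
    return np.sum(residuals(c) ** 2)

c_start = np.array([0.1, 2.0])  # arbitrary starting guess

# Levenberg-Marquardt on the residual vector.
fit_lm = least_squares(residuals, c_start, method='lm')

# Derivative-free alternatives evaluated in the paper.
fit_nm = minimize(sse, c_start, method='Nelder-Mead')
fit_pw = minimize(sse, c_start, method='Powell')

print('LM:', fit_lm.x, 'Nelder-Mead:', fit_nm.x, 'Powell:', fit_pw.x)
```

Note that SciPy's `method='lm'` wraps MINPACK's Levenberg-Marquardt, which operates on the residual vector directly (and requires at least as many residuals as parameters), whereas Nelder-Mead and Powell only see the scalar error; this mirrors the distinction between the least-squares and general-purpose optimizers compared in the paper.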
Pages: 25-30
Page count: 6