SymFormer: End-to-End Symbolic Regression Using Transformer-Based Architecture

Times Cited: 11
Authors
Vastl, Martin [1 ,2 ]
Kulhanek, Jonas [1 ,3 ]
Kubalik, Jiri [1 ]
Derner, Erik [1 ]
Babuska, Robert [1 ,4 ]
Affiliations
[1] Czech Tech Univ, Czech Inst Informat Robot & Cybernet, Prague 16000, Czech Republic
[2] Charles Univ Prague, Fac Math & Phys, Prague 12116, Czech Republic
[3] Czech Tech Univ, Fac Elect Engn, Prague 16000, Czech Republic
[4] Delft Univ Technol, Dept Cognit Robot, NL-2628 CD Delft, Netherlands
Keywords
Transformers; Mathematical models; Vectors; Symbols; Decoding; Optimization; Predictive models; Neural networks; Genetic programming; Computational complexity; Benchmark testing; Regression analysis; Symbolic regression; neural networks; transformers
DOI
10.1109/ACCESS.2024.3374649
CLC Number
TP [automation technology, computer technology]
Subject Category Code
0812
Abstract
Many real-world systems can be naturally described by mathematical formulas. The task of automatically constructing formulas to fit observed data is called symbolic regression. Evolutionary methods such as genetic programming have commonly been used to solve symbolic regression tasks, but they have significant drawbacks, such as high computational complexity. Recently, neural networks have been applied to symbolic regression, among which the transformer-based methods seem to be the most promising. After training a transformer on a large number of formulas, the actual inference, i.e., finding a formula for new, unseen data, is very fast (on the order of seconds). This is considerably faster than state-of-the-art evolutionary methods. The main drawback of transformers is that they generate formulas without numerical constants, which have to be optimized separately, yielding suboptimal results. We propose a transformer-based approach called SymFormer, which predicts the formula by outputting the symbols and the constants simultaneously. This helps to generate formulas that fit the data more accurately. In addition, the constants provided by SymFormer serve as a good starting point for subsequent tuning via gradient descent to further improve the model accuracy. We show on several benchmarks that SymFormer outperforms state-of-the-art methods while having faster inference.
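The two-stage idea the abstract describes (predicted constants used as an initialization, then refined by gradient descent) can be illustrated with a minimal sketch. The formula skeleton, the "predicted" constant values, and all variable names below are hypothetical examples chosen for illustration, not SymFormer's actual architecture or training pipeline:

```python
import numpy as np

# Observed data generated from a "true" formula y = 1.5 * sin(2.0 * x) + 0.5.
x = np.linspace(-2.0, 2.0, 200)
y = 1.5 * np.sin(2.0 * x) + 0.5

# Hypothetical constants as a transformer might predict them together with the
# symbols of the skeleton f(x) = c0 * sin(c1 * x) + c2 -- close, but not exact.
c0, c1, c2 = 1.2, 1.8, 0.3

def mse(c0, c1, c2):
    return np.mean((c0 * np.sin(c1 * x) + c2 - y) ** 2)

loss_initial = mse(c0, c1, c2)

# Refine the constants by plain gradient descent on the mean squared error,
# using analytic gradients of the skeleton with respect to c0, c1, c2.
lr = 0.02
for _ in range(2000):
    e = c0 * np.sin(c1 * x) + c2 - y                  # residuals
    g0 = np.mean(2.0 * e * np.sin(c1 * x))            # dL/dc0
    g1 = np.mean(2.0 * e * c0 * x * np.cos(c1 * x))   # dL/dc1
    g2 = np.mean(2.0 * e)                             # dL/dc2
    c0, c1, c2 = c0 - lr * g0, c1 - lr * g1, c2 - lr * g2

loss_final = mse(c0, c1, c2)
```

Because the starting constants are already near the data-generating values, simple first-order refinement converges quickly; in practice a quasi-Newton optimizer such as BFGS is a common alternative for this local-tuning step.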
Pages: 37840-37849
Page Count: 10
Related Papers
40 records
[1] Arnaldo, Ignacio; O'Reilly, Una-May; Veeramachaneni, Kalyan. Building Predictive Models via Feature Synthesis. GECCO'15: Proceedings of the 2015 Genetic and Evolutionary Computation Conference, 2015, pp. 983-990.
[2] Ba, J. 2014, ACS SYM SER.
[3] Ba, J. L. 2016, arXiv preprint arXiv:1607.06450, DOI 10.48550/ARXIV.1607.06450.
[4] Biggio, L. T. 2021, arXiv.
[5] Bladek, Iwo; Krawiec, Krzysztof. Solving Symbolic Regression Problems with Formal Constraints. Proceedings of the 2019 Genetic and Evolutionary Computation Conference (GECCO'19), 2019, pp. 977-984.
[6] d'Ascoli, S. 2022, PMLR, p. 4520.
[7] Dorgo, Gyula; Kulcsar, Tibor; Abonyi, Janos. Genetic programming-based symbolic regression for goal-oriented dimension reduction. Chemical Engineering Science, 2021, vol. 244.
[8] Fletcher, R. 1987, Practical Methods of Optimization.
[9] Glantz, S. 2000, Primer of Applied Regression and Analysis of Variance, 2nd ed.
[10] Hein, D. 2018, arXiv, DOI arXiv:1712.04170.