Transformer-Assisted Genetic Programming for Symbolic Regression [Research Frontier]
Created by W.Langdon from
gp-bibliography.bib Revision:1.8344
@Article{Han:2025:CIM,
author = "Xiaoxu Han and Jinghui Zhong and Zhitong Ma and
Xin Mu and Nikola Gligorovski",
title = "Transformer-Assisted Genetic Programming for Symbolic
Regression [Research Frontier]",
journal = "IEEE Computational Intelligence Magazine",
year = "2025",
volume = "20",
number = "2",
pages = "58--79",
month = may,
keywords = "genetic algorithms, genetic programming, Training,
Accuracy, Computational modelling, Transformers,
Mathematical models, Complexity theory,
Problem-solving, Optimisation",
ISSN = "1556-6048",
  DOI = "doi:10.1109/MCI.2025.3540742",
  abstract = "Symbolic Regression (SR) is a powerful technique for
               uncovering hidden mathematical expressions from observed
               data and has broad applications in scientific discovery
               and automatic programming. Genetic Programming (GP) has
               traditionally been the dominant technique for solving SR,
               benefiting from a robust global search capability that
               enables the discovery of solutions with high fitting
               accuracy. However, GP suffers from low search efficiency
               and may not fully exploit accumulated knowledge to
               accelerate convergence. Conversely, deep learning-based
               methods, particularly those employing Transformer
               backbones, are trained offline on large-scale datasets.
               These methods exhibit strong generalisation capabilities
               on unseen tasks without additional training, but the lack
               of refinement mechanisms for specific tasks renders them
               inferior to GP methods in accuracy. This study aims to
               combine the specific problem-solving capabilities of GP
               with the generalisation strengths of pretrained
               Transformer models. Specifically, we propose a
               pretrained-model-guided GP (PGGP) method, a GP-based
               approach that incorporates a pretrained Transformer model
               to enhance SR problem-solving. New initialization and
               mutation operators are proposed based on the
               well-structured equations obtained from the pretrained
               model. Extensive experiments show that our method not
               only surpasses comparative methods in accuracy but also
               reduces the complexity of the generated solutions,
               potentially enhancing interpretability.",
notes = "Also known as \cite{10976464}
South China University of Technology, Guangzhou, China
Pengcheng Laboratory, Shenzhen, China",
}
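The abstract describes seeding GP with equation skeletons from a pretrained Transformer, via new initialization and mutation operators. The toy sketch below illustrates that idea only in outline, under stated assumptions: `ModelStub`, its `SKELETONS` pool, `guided_mutate`, and `pggp` are all hypothetical names invented here, and the stub simply returns fixed skeletons in place of a real pretrained model. It is not the authors' implementation.

```python
# Toy sketch of a pretrained-model-guided GP loop for symbolic regression.
# ASSUMPTION: ModelStub is a hypothetical stand-in for the paper's
# pretrained Transformer; it just returns fixed equation skeletons.
import math
import random

# Expression trees: ('x',), ('const', c), or (op, left, right).
OPS = {'+': lambda a, b: a + b,
       '-': lambda a, b: a - b,
       '*': lambda a, b: a * b}

def evaluate(tree, x):
    """Recursively evaluate an expression tree at input x."""
    if tree[0] == 'x':
        return x
    if tree[0] == 'const':
        return tree[1]
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

class ModelStub:
    """Hypothetical placeholder for the pretrained model: proposes
    well-structured equation skeletons for the observed data."""
    SKELETONS = [
        ('+', ('*', ('x',), ('x',)), ('x',)),   # x*x + x
        ('*', ('x',), ('x',)),                  # x*x
        ('+', ('x',), ('const', 1.0)),          # x + 1
    ]
    def propose(self):
        return random.choice(self.SKELETONS)

def random_tree(depth=2):
    """Ordinary GP initialization: a random small expression tree."""
    if depth == 0 or random.random() < 0.3:
        return ('x',) if random.random() < 0.5 else ('const', random.uniform(-2, 2))
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def guided_mutate(tree, model, p_model=0.5):
    """Guided mutation: with probability p_model, graft a model-proposed
    fragment in place of the right branch; otherwise use a random one."""
    frag = model.propose() if random.random() < p_model else random_tree(1)
    if tree[0] in OPS:
        return (tree[0], tree[1], frag)
    return frag

def mse(tree, xs, ys):
    """Fitness: mean squared error of the tree on the data."""
    try:
        return sum((evaluate(tree, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
    except OverflowError:
        return math.inf

def pggp(xs, ys, pop_size=30, gens=40, seed=0):
    random.seed(seed)
    model = ModelStub()
    # Guided initialization: half the population comes from the model.
    pop = [model.propose() for _ in range(pop_size // 2)]
    pop += [random_tree() for _ in range(pop_size - len(pop))]
    for _ in range(gens):
        pop.sort(key=lambda t: mse(t, xs, ys))
        survivors = pop[:pop_size // 2]
        pop = survivors + [guided_mutate(random.choice(survivors), model)
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda t: mse(t, xs, ys))

xs = [i / 5 for i in range(-10, 11)]
ys = [x * x + x for x in xs]
best = pggp(xs, ys)
print(mse(best, xs, ys))
```

Because the stub's skeleton pool happens to contain the target structure, the guided initialization recovers it immediately; the point of the sketch is only where the model's proposals enter the GP loop, not the search dynamics of the actual PGGP operators.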