Evolving Form and Function: Dual-Objective Optimization in Neural Symbolic Regression Networks
@InProceedings{bertschinger:2024:GECCO,
  author =       "Amanda Bertschinger and James Bagrow and
                 Joshua Bongard",
  title =        "Evolving Form and Function: Dual-Objective
                 Optimization in Neural Symbolic Regression Networks",
  booktitle =    "Proceedings of the 2024 Genetic and Evolutionary
                 Computation Conference",
  year =         "2024",
editor = "Jean-Baptiste Mouret and Kai Qin and Julia Handl and
Xiaodong Li and Markus Wagner and Mario Garza-Fabre and
Kate Smith-Miles and Richard Allmendinger and
Ying Bi and Grant Dick and Amir H Gandomi and
Marcella Scoczynski Ribeiro Martins and Hirad Assimi and
Nadarajen Veerapen and Yuan Sun and
Mario Andres Munyoz and Ahmed Kheiri and Nguyen Su and
Dhananjay Thiruvady and Andy Song and Frank Neumann and Carla Silva",
  pages =        "277--285",
  address =      "Melbourne, Australia",
  series =       "GECCO '24",
  month =        "14-18 " # jul,
  organization = "SIGEVO",
  publisher =    "Association for Computing Machinery",
  publisher_address = "New York, NY, USA",
  keywords =     "genetic algorithms, genetic programming, symbolic
                 regression, neuroevolution, multi-objective
                 optimization, Evolutionary Machine Learning",
  isbn13 =       "979-8-4007-0494-9",
DOI = "doi:10.1145/3638529.3654030",
  size =         "9 pages",
  abstract =     "Data increasingly abounds, but distilling their
                 underlying relationships down to something
                 interpretable remains challenging. One approach is
                 genetic programming, which 'symbolically regresses' a
                 data set down into an equation. However, symbolic
                 regression (SR) faces the issue of requiring training
                 from scratch for each new dataset. To generalize across
                 all datasets, deep learning techniques have been
                 applied to SR. These networks, however, are only able
                 to be trained using a symbolic objective: NN-generated
                 and target equations are symbolically compared. But
                 this does not consider the predictive power of these
                 equations, which could be measured by a behavioral
                 objective that compares the generated equation's
                 predictions to actual data. Here we introduce a method
                 that combines gradient descent and evolutionary
                 computation to yield neural networks that minimize the
                 symbolic and behavioral errors of the equations they
                 generate from data. As a result, these evolved networks
                 are shown to generate more symbolically and
                 behaviorally accurate equations than those generated by
                 networks trained by state-of-the-art gradient based
                 neural symbolic regression methods. We hope this method
                 suggests that evolutionary algorithms, combined with
                 gradient descent, can improve SR results by yielding
                 equations with more accurate form and function.",
  notes =        "GECCO-2024 EML (Evolutionary Machine Learning track).
                 A Recombination of the 33rd International Conference
                 on Genetic Algorithms (ICGA) and the 29th Annual
                 Genetic Programming Conference (GP)",
}
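The abstract describes scoring each generated equation on two objectives: a symbolic error (how far the generated equation's form is from the target equation) and a behavioral error (how far its predictions are from the data). The following minimal Python sketch illustrates that dual-objective evaluation only; it is not the authors' implementation. The target equation, the candidate token lists, the helper names behavioral_error and symbolic_error, and the use of token edit distance as the symbolic comparison are all assumptions made for illustration.

    import math

    # Hypothetical target equation y = x^2 + sin(x) and data sampled from it.
    def target(x):
        return x ** 2 + math.sin(x)

    xs = [i / 10 for i in range(-30, 31)]
    ys = [target(x) for x in xs]

    # Candidates pair a token list (standing in for the symbolic form a
    # network decodes) with a callable (standing in for its behavior).
    candidates = {
        "right form, wrong constant": (["x", "^", "2", "+", "sin", "x"],
                                       lambda x: x ** 2 + 1.5 * math.sin(x)),
        "wrong form, close fit":      (["x", "^", "2", "+", "x"],
                                       lambda x: x ** 2 + 0.1 * x),
    }
    target_tokens = ["x", "^", "2", "+", "sin", "x"]

    def behavioral_error(f):
        # Mean squared error of the candidate's predictions on the data.
        return sum((f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

    def symbolic_error(tokens):
        # Token-level edit (Levenshtein) distance to the target equation,
        # one simple stand-in for a symbolic comparison of equations.
        m, n = len(tokens), len(target_tokens)
        d = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            for j in range(n + 1):
                if i == 0 or j == 0:
                    d[i][j] = i + j
                else:
                    cost = tokens[i - 1] != target_tokens[j - 1]
                    d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                                  d[i - 1][j - 1] + cost)
        return d[m][n]

    for name, (tokens, f) in candidates.items():
        print(f"{name}: symbolic={symbolic_error(tokens)}, "
              f"behavioral={behavioral_error(f):.3f}")

Running this shows why the two objectives can disagree: the first candidate has zero symbolic error but a nonzero behavioral error, while the second fits the data fairly well despite a symbolically wrong form. The paper's contribution, per the abstract, is using evolutionary computation together with gradient descent so the trained networks score well on both objectives at once.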