Program Synthesis with Generative Pre-trained Transformers and Grammar-Guided Genetic Programming Grammar
@InProceedings{Tao:2023:LA-CCI,
  author =       "Ning Tao and Anthony Ventresque and Takfarinas Saber",
  booktitle =    "2023 IEEE Latin American Conference on Computational
                 Intelligence (LA-CCI)",
  title =        "Program Synthesis with Generative Pre-trained Transformers
                 and Grammar-Guided Genetic Programming Grammar",
  year =         "2023",
abstract = "Grammar-Guided Genetic Programming (G3P) is widely
recognised as one of the most successful approaches to
program synthesis. Using a set of input/output tests,
G3P evolves programs that fit a defined BNF grammar and
that are capable of solving a wide range of program
synthesis problems. However, G3P's inability to scale
to more complex problems has limited its applicability.
Recently, Generative Pre-trained Transformers (GPTs)
have shown promise in revolutionizing program synthesis
by generating code based on natural language prompts.
However, challenges such as ensuring correctness and
safety still need to be addressed as some GPT-generated
programs might not work while others might include
security vulnerabilities or blacklisted library calls.
In this work, we proposed to combine GPT (in our case
ChatGPT) with a G3P system, forcing any synthesised
program to fit the BNF grammar-thus offering an
opportunity to evolve/fix incorrect programs and
reducing security threats. In our work, we leverage
GPT-generated programs in G3P's initial population.
However, since GPT-generated programs have an arbitrary
structure, the initial work that we undertake is to
devise a technique that maps such programs to a
predefined BNF grammar before seeding the code into
G3P's initial population. By seeding the grammar-mapped
code into the population of our G3P system, we were
able to successfully improve some of the desired
programs using a well-known program synthesis
benchmark. However, in its default configuration, G3P
is not successful in fixing some incorrect
GPT-generated programs-even when they are close to a
correct program. We analysed the performance of our
approach in depth and discussed its limitations and
possible future improvements.",
  keywords =     "genetic algorithms, genetic programming, ANN, Codes,
                 Sociology, Transformers, Grammar, Security, Statistics,
                 Program Synthesis, Grammar Guided Genetic Programming,
                 Generative Pre-trained Transformers, Large Language Models,
                 Grammar",
  DOI =          "doi:10.1109/LA-CCI58595.2023.10409384",
  ISSN =         "2769-7622",
  month =        oct,
  notes =        "Also known as \cite{10409384}",
}
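
The abstract above describes mapping GPT-generated code onto a predefined BNF grammar and then seeding the mapped code into G3P's initial population. The Python sketch below illustrates that general idea only, not the authors' implementation: the toy grammar, the tokeniser, the gpt_generated_program placeholder (standing in for a real ChatGPT call) and the seeding routine are all assumptions made for illustration.

import random
import re

# Toy BNF grammar for simple integer expressions (an illustrative stand-in for
# the much larger program-synthesis grammars used by G3P systems).
GRAMMAR = {
    "<expr>":  [["<expr>", "<op>", "<expr>"], ["<var>"], ["<const>"]],
    "<op>":    [["+"], ["-"], ["*"]],
    "<var>":   [["x"], ["y"]],
    "<const>": [["1"], ["2"], ["3"]],
}

def random_derivation(symbol="<expr>", depth=0, max_depth=4):
    # Standard grammar-guided initialisation: grow a random sentence from the grammar.
    if symbol not in GRAMMAR:
        return symbol
    options = GRAMMAR[symbol]
    if depth >= max_depth:
        # Near the depth limit, avoid directly recursive productions.
        non_recursive = [p for p in options if symbol not in p]
        options = non_recursive or options
    production = random.choice(options)
    return " ".join(random_derivation(s, depth + 1, max_depth) for s in production)

def derives(tokens, symbol):
    # True if the token tuple can be derived from `symbol` under GRAMMAR.
    if symbol not in GRAMMAR:                        # terminal symbol
        return len(tokens) == 1 and tokens[0] == symbol
    return any(matches(tokens, tuple(p)) for p in GRAMMAR[symbol])

def matches(tokens, production):
    # True if `tokens` can be split so each piece derives one production symbol.
    if not production:
        return not tokens
    head, rest = production[0], production[1:]
    for i in range(1, len(tokens) - len(rest) + 1):  # non-empty prefix for `head`
        if derives(tokens[:i], head) and matches(tokens[i:], rest):
            return True
    return False

def gpt_generated_program():
    # Placeholder for a real call to a GPT model; the returned string is made up.
    return "x + y * 2"

def seed_population(pop_size=10):
    # Seed the grammar-mapped GPT program (if it fits the grammar) alongside the
    # usual randomly initialised individuals.
    population = []
    candidate = gpt_generated_program()
    tokens = tuple(re.findall(r"[A-Za-z_]\w*|\d+|\S", candidate))
    if derives(tokens, "<expr>"):
        population.append(candidate)
    while len(population) < pop_size:
        population.append(random_derivation())
    return population

if __name__ == "__main__":
    for individual in seed_population():
        print(individual)

A full G3P system would map the GPT output onto its complete program-synthesis grammar (recovering a derivation tree for the genotype) rather than performing a simple membership check, but the seeding step would follow the same pattern.
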
Genetic Programming entries for
Ning Tao
Anthony Ventresque
Takfarinas Saber