NeuroLGP-SM: Scalable Surrogate-Assisted Neuroevolution for Deep Neural Networks
Created by W.Langdon from
gp-bibliography.bib Revision:1.7975
@InProceedings{stapleton:2024:CEC,
author = "Fergal Stapleton and Edgar Galvan",
title = "{NeuroLGP-SM:} Scalable Surrogate-Assisted
Neuroevolution for Deep Neural Networks",
booktitle = "2024 IEEE Congress on Evolutionary Computation (CEC)",
year = "2024",
editor = "Bing Xue",
address = "Yokohama, Japan",
month = "30 " # jun # " - 5 " # jul,
publisher = "IEEE",
keywords = "genetic algorithms, genetic programming, Training,
Support vector machines, Sociology, Semantics,
Evolutionary computation, Computer architecture,
Neuroevolution, Linear Genetic Programming,
Surrogate-assisted Evolutionary Algorithms",
isbn13 = "979-8-3503-0837-2",
DOI = "10.1109/CEC60901.2024.10612039",
abstract = "Evolutionary Algorithms (EAs) play a crucial role in
the architectural configuration and training of
Artificial Deep Neural Networks (DNNs), a process known
as neuroevolution. However, neuroevolution is hindered
by its inherent computational expense, requiring
multiple generations, a large population, and numerous
epochs. The most computationally intensive aspect lies
in evaluating the fitness function of a single
candidate solution. To address this challenge, we
employ Surrogate-assisted EAs (SAEAs). While a few
SAEA approaches have been proposed in neuroevolution,
none have been applied to truly large DNNs due to
issues like intractable information usage. In this
work, drawing inspiration from Genetic Programming
semantics, we use phenotypic distance vectors,
outputted from DNNs, alongside Kriging Partial Least
Squares (KPLS), an approach that is effective in
handling these large vectors, making them suitable for
search. Our proposed approach, named Neuro-Linear
Genetic Programming surrogate model (NeuroLGP-SM),
efficiently and accurately estimates DNN fitness
without the need for complete evaluations. NeuroLGP-SM
demonstrates competitive or superior results compared
to 12 other methods, including NeuroLGP without SM,
convolutional neural networks, support vector machines,
and autoencoders. Additionally, it is worth noting that
NeuroLGP-SM is 25\% more energy-efficient than its
NeuroLGP counterpart. This efficiency advantage adds to
the overall appeal of our proposed NeuroLGP-SM in
optimising the configuration of large DNNs.",
notes = "also known as \cite{10612039}
WCCI 2024",
}