Modular Grammatical Evolution for the Generation of Artificial Neural Networks
@Article{Soltanian:2022:EC,
  author =   "Khabat Soltanian and Ali Ebnenasir and Mohsen Afsharchi",
  title =    "Modular Grammatical Evolution for the Generation of
              Artificial Neural Networks",
  journal =  "Evolutionary Computation",
  year =     "2022",
  volume =   "30",
  number =   "2",
  pages =    "291--327",
  month =    "Summer",
  keywords = "genetic algorithms, genetic programming, Grammatical
              Evolution, Modular Representation, ANN, NeuroEvolution",
  ISSN =     "1063-6560",
  DOI =      "doi:10.1162/evco_a_00302",
  size =     "37 pages",
  abstract = "This paper presents a novel method, called Modular
              Grammatical Evolution (MGE), towards validating the
              hypothesis that restricting the solution space of
              NeuroEvolution to modular and simple neural networks
              enables the efficient generation of smaller and more
              structured neural networks while providing acceptable
              (and in some cases superior) accuracy on large data
              sets. MGE also enhances the state-of-the-art
              Grammatical Evolution (GE) methods in two directions.
              First, MGE's representation is modular in that each
              individual has a set of genes, and each gene is mapped
              to a neuron by grammatical rules. Second, the proposed
              representation mitigates two important drawbacks of GE,
              namely the low scalability and weak locality of
              representation, towards generating modular and
              multi-layer networks with a high number of neurons. We
              define and evaluate five different forms of structures
              with and without modularity using MGE and find
              single-layer modules with no coupling more productive.
              Our experiments demonstrate that modularity helps in
              finding better neural networks faster. We have
              validated the proposed method using ten well-known
              classification benchmarks with different sizes, feature
              counts, and output class counts. Our experimental
              results indicate that MGE provides superior accuracy
              with respect to existing NeuroEvolution methods and
              returns classifiers that are significantly simpler than
              other machine learning generated classifiers. Finally,
              we empirically demonstrate that MGE outperforms other
              GE methods in terms of locality and scalability
              properties.",
  notes =    "Department of Electrical and Computer Engineering,
              University of Zanjan, Zanjan 45371-38791, Iran",
}
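The abstract describes MGE's core representational idea: an individual carries a set of genes, and each gene is decoded into one neuron via grammatical rules, with single-layer uncoupled modules found most productive. The sketch below is only a minimal illustration of that genotype-to-neuron decoding style in plain Python; the toy grammar, codon ranges, and function names are hypothetical and are not the paper's actual grammar or implementation.

    # Minimal, illustrative sketch (not the authors' implementation) of the idea
    # described in the abstract: each individual is a set of genes, and each
    # gene is decoded into one neuron by grammar-like production choices.
    import math
    import random

    def decode_neuron(gene, n_features):
        """Decode one gene (a list of integer codons) into a single neuron.

        Codons are consumed modulo the number of available choices, in the
        usual Grammatical Evolution fashion: the first codon picks the
        neuron's fan-in, and subsequent codon pairs pick a feature index
        and a weight for each input connection.
        """
        codons = iter(gene)
        n_inputs = 1 + next(codons) % min(3, n_features)   # fan-in: 1..3
        connections = []
        for _ in range(n_inputs):
            feature = next(codons) % n_features             # which input feature
            weight = (next(codons) % 201 - 100) / 100.0     # weight in [-1, 1]
            connections.append((feature, weight))
        bias = (next(codons) % 201 - 100) / 100.0
        return connections, bias

    def network_output(genes, x):
        """Evaluate a single-layer 'modular' network: one neuron per gene,
        neuron outputs summed into a scalar score (no coupling between modules)."""
        score = 0.0
        for gene in genes:
            connections, bias = decode_neuron(gene, n_features=len(x))
            net = bias + sum(w * x[f] for f, w in connections)
            score += math.tanh(net)                          # neuron activation
        return score

    if __name__ == "__main__":
        random.seed(0)
        # An individual with 4 genes (modules); each gene holds 8 random codons.
        individual = [[random.randrange(256) for _ in range(8)] for _ in range(4)]
        sample = [0.2, -1.3, 0.7, 0.05]                      # one 4-feature input
        print("network output:", network_output(individual, sample))

Because each gene decodes independently of the others, a mutation in one gene only alters its own neuron, which is one plausible reading of the locality and scalability benefits the abstract attributes to the modular representation.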