Size/Accuracy Trade-Off in Convolutional Neural Networks: An Evolutionary Approach
Created by W. Langdon from gp-bibliography.bib Revision: 1.8051
@InProceedings{DBLP:conf/inns/CettoBXM19,
  author    = "Tomaso Cetto and Jonathan Byrne and Xiaofan Xu and
               David Moloney",
  title     = "Size/Accuracy Trade-Off in Convolutional Neural
               Networks: An Evolutionary Approach",
  booktitle = "Recent Advances in Big Data and Deep Learning,
               Proceedings of the {INNS} Big Data and Deep Learning
               Conference {INNSBDDL} 2019",
  year      = "2019",
  editor    = "Luca Oneto and Nicolo Navarin and
               Alessandro Sperduti and Davide Anguita",
  pages     = "17--26",
  address   = "Sestri Levante, Genova, Italy",
  month     = "16-18 " # apr,
  publisher = "Springer",
  keywords  = "genetic algorithms, genetic programming, grammatical
               evolution, ANN, CNN",
  DOI       = "10.1007/978-3-030-16841-4_3",
  timestamp = "Wed, 08 May 2019 11:33:25 +0200",
  biburl    = "https://dblp.org/rec/conf/inns/CettoBXM19.bib",
  bibsource = "dblp computer science bibliography, https://dblp.org",
abstract = "In recent years, the shift from hand-crafted design of
Convolutional Neural Networks (CNN) to an automatic
approach (AutoML) has garnered much attention. However,
most of this work has been concentrated on generating
state of the art (SOTA) architectures that set new
standards of accuracy. In this paper, we use the
NSGA-II algorithm for multi-objective optimization to
optimize the size/accuracy trade-off in CNNs. This
approach is inspired by the need for simple, effective,
and mobile-sized architectures which can easily be
re-trained on any datasets. This optimization is
carried out using a Grammatical Evolution approach,
which, implemented alongside NSGA-II, automatically
generates valid network topologies which can best
optimize the size/accuracy trade-off. Furthermore, we
investigate how the algorithm responds to an increase
in the size of the search space, moving from strictly
topology optimization (number of layers, size of
filter, number of kernels,etc.) and then expanding the
search space to include possible variations in other
hyper-parameters such as the type of optimizer, dropout
rate, batch size, or learning rate, amongst others.",
- }
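
The abstract couples two mechanisms that are easy to illustrate: a grammatical-evolution mapping that decodes an integer genome into a valid CNN topology, and the Pareto-dominance comparison NSGA-II uses to rank candidates on the two objectives (maximise accuracy, minimise size). The sketch below is a minimal, self-contained Python illustration of both ideas, not the authors' implementation: the grammar, codon ranges, and stand-in fitness values are assumptions made up for the example.

import random

# Toy BNF-style grammar (illustrative, not the paper's search space). A
# production for a nonterminal is chosen by (codon % number-of-choices),
# the standard grammatical-evolution rule.
GRAMMAR = {
    "<net>":   [["<layer>", " -> ", "<net>"], ["<layer>"]],
    "<layer>": [["conv(filters=", "<n>", ", kernel=", "<k>", ")"]],
    "<n>":     [["8"], ["16"], ["32"], ["64"]],
    "<k>":     [["3"], ["5"]],
}

def ge_map(genome, start="<net>", max_steps=200):
    """Left-most derivation from a list of integer codons to a topology string."""
    stack, out, i = [start], [], 0
    for _ in range(max_steps):
        if not stack:
            break
        sym = stack.pop(0)
        if sym in GRAMMAR:
            choices = GRAMMAR[sym]
            stack = choices[genome[i % len(genome)] % len(choices)] + stack
            i += 1                      # consume one codon, wrapping the genome
        else:
            out.append(sym)             # terminal symbol
    return "".join(out) if not stack else None  # None: derivation did not finish

def dominates(a, b):
    """Pareto dominance on (accuracy, size): maximise accuracy, minimise size."""
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

if __name__ == "__main__":
    random.seed(0)
    population = [[random.randint(0, 255) for _ in range(10)] for _ in range(8)]
    scored = []
    for genome in population:
        topology = ge_map(genome)
        if topology is None:
            continue
        # Stand-in objectives: the paper trains each decoded CNN to measure
        # real accuracy and counts its parameters; proxy values used here.
        accuracy = random.uniform(0.5, 0.99)
        size = topology.count("conv")
        scored.append((topology, (accuracy, size)))
    # Non-dominated (rank-0) front, as NSGA-II's first sorting step produces.
    front = [s for s in scored
             if not any(dominates(t[1], s[1]) for t in scored if t is not s)]
    for topology, (acc, size) in front:
        print(f"acc={acc:.2f}  layers={size}  {topology}")

Expanding the search space as the paper describes amounts to adding productions to the grammar (for example, hypothetical <optimizer> or <dropout> nonterminals), so the same codon-to-phenotype mapping covers hyper-parameters as well as topology.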