Creating deep neural networks for text classification tasks using grammar genetic programming
Created by W.Langdon from gp-bibliography.bib Revision:1.8081
@Article{MAGALHAES:2023:asoc,
  author =       "Dimmy Magalhaes and Ricardo H. R. Lima and Aurora Pozo",
  title =        "Creating deep neural networks for text classification
                  tasks using grammar genetic programming",
  journal =      "Applied Soft Computing",
  year =         "2023",
  volume =       "135",
  pages =        "110009",
  month =        mar,
  keywords =     "genetic algorithms, genetic programming, grammatical
                  evolution, ANN, Text classification, Evolutionary
                  algorithms, Automatic design, Deep neural networks",
  ISSN =         "1568-4946",
  URL =          "https://www.sciencedirect.com/science/article/pii/S1568494623000273",
  DOI =          "doi:10.1016/j.asoc.2023.110009",
  code_url =     "https://doi.org/10.24433/CO.5469683.v1",
abstract = "Text classification is one of the Natural Language
Processing (NLP) tasks. Its objective is to label
textual elements, such as phrases, queries, paragraphs,
and documents. In NLP, several approaches have achieved
promising results regarding this task. Deep
Learning-based approaches have been widely used in this
context, with deep neural networks (DNNs) adding the
ability to generate a representation for the data and a
learning model. The increasing scale and complexity of
DNN architectures was expected, creating new challenges
to design and configure the models. we present a study
on the application of a grammar-based evolutionary
approach to the design of DNNs, using models based on
Convolutional Neural Networks (CNNs), Long Short-Term
Memory (LSTM), and Graph Neural Networks (GNNs). We
propose different grammars, which were defined to
capture the features of each type of network, also
proposing some combinations, verifying their impact on
the produced designs and performance of the generated
models. We create a grammar that is able to generate
different networks specialized on text classification,
by modification of Grammatical Evolution (GE), and it
is composed of three main components: the grammar,
mapping, and search engine. Our results offer promising
future research directions as they show that the
projected architectures have a performance comparable
to that of their counterparts but can still be further
improved. We were able to improve the results of a
manually structured neural network in 8.18percent in
the best case",
  notes =        "also known as \cite{MAGALHAES2023110009}",
}
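The abstract describes a Grammatical Evolution (GE) pipeline built from three components: a grammar, a genotype-to-phenotype mapping, and a search engine. The sketch below illustrates only the mapping step on a toy grammar of convolutional and dense layers; the grammar rules, layer options, and parameter values are illustrative assumptions, not the grammars defined in the paper (see the code_url above for the authors' implementation).

# Minimal sketch of GE-style genotype-to-phenotype mapping, assuming a toy
# grammar of convolutional/dense layers. The rules and layer options below
# are illustrative only; they are not the grammars defined by Magalhaes et al.

# Toy BNF-style grammar: each non-terminal maps to a list of alternative
# productions; strings without angle brackets are terminal layer descriptions.
GRAMMAR = {
    "<network>":    [["<features>", "<classifier>"]],
    "<features>":   [["<conv_block>"], ["<conv_block>", "<features>"]],
    "<conv_block>": [["conv(filters=<filters>, kernel=<kernel>)", "relu", "maxpool"]],
    "<classifier>": [["dense(units=<units>)", "softmax"]],
    "<filters>":    [["32"], ["64"], ["128"]],
    "<kernel>":     [["3"], ["5"]],
    "<units>":      [["64"], ["128"], ["256"]],
}


def next_codon(genotype, index, max_wraps=2):
    """Return the codon at index, wrapping around the genotype; fail if the
    mapping consumes the genotype more than max_wraps times."""
    if index >= len(genotype) * (max_wraps + 1):
        raise ValueError("mapping failed: genotype exhausted")
    return genotype[index % len(genotype)], index + 1


def expand_parameters(symbol, genotype, index):
    """Replace embedded non-terminals such as <filters> inside a terminal
    string, consuming one codon per choice."""
    for nonterminal, options in GRAMMAR.items():
        while nonterminal in symbol:
            codon, index = next_codon(genotype, index)
            symbol = symbol.replace(nonterminal, options[codon % len(options)][0], 1)
    return symbol, index


def map_genotype(genotype, start="<network>"):
    """Standard GE mapping: each codon selects one production of the leftmost
    non-terminal (modulo the number of alternatives) until only terminals
    remain. Returns the resulting list of layer descriptions."""
    layers, stack, index = [], [start], 0
    while stack:
        symbol = stack.pop(0)
        if symbol in GRAMMAR:                      # plain non-terminal
            options = GRAMMAR[symbol]
            codon, index = next_codon(genotype, index)
            stack = list(options[codon % len(options)]) + stack
        elif "<" in symbol:                        # terminal with parameters
            symbol, index = expand_parameters(symbol, genotype, index)
            layers.append(symbol)
        else:                                      # plain terminal
            layers.append(symbol)
    return layers


if __name__ == "__main__":
    # A fixed genotype (list of codons) decoded into one candidate architecture:
    # two conv blocks followed by a dense softmax classifier.
    genotype = [0, 1, 0, 0, 1, 0, 0, 0, 2, 0]
    for layer in map_genotype(genotype):
        print(layer)

In a full GE system, the search engine (e.g. a genetic algorithm) would evolve such codon lists, and each decoded layer list would be instantiated and trained as a text classifier to obtain its fitness; those steps are omitted here.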
Genetic Programming entries for Dimmy Magalhaes, Ricardo Henrique Remes de Lima, Aurora Trinidad Ramirez Pozo