Dynamic Function Generation for Text Classification
Created by W.Langdon from
gp-bibliography.bib Revision:1.7975
@InProceedings{gerber:2024:CEC,
author = "Mia Gerber and Nelishia Pillay",
title = "Dynamic Function Generation for Text Classification",
booktitle = "2024 IEEE Congress on Evolutionary Computation (CEC)",
year = "2024",
editor = "Bing Xue",
address = "Yokohama, Japan",
month = "30 " # jun # " - 5 " # jul,
publisher = "IEEE",
keywords = "genetic algorithms, genetic programming, Machine
learning algorithms, Reviews, Heuristic algorithms,
Neural networks, Text categorization, Subspace
constraints, dynamic function generation,
text classification, grammatical evolution",
isbn13 = "979-8-3503-0837-2",
DOI = "doi:10.1109/CEC60901.2024.10611777",
abstract = "Genetic programming and its variants, such as
grammatical evolution, have been predominantly used for
generating functions for machine learning techniques,
such as loss or activation functions for neural
networks and choice functions for the Fuzzy ART
algorithm. These functions are evolved offline prior to
the execution of the neural network and remain the
same during the execution of the learning algorithm. We
refer to this as static function generation (SFG). This
study examines generating these functions in real-time
at different points during the execution of the machine
learning algorithm. We refer to this as dynamic
function generation (DFG). Grammatical evolution (GE)
is used to generate the function. Each function is
generated every $m$ epochs of the learning
algorithm. Furthermore, the grammar used by GE also
changes every $g$ generations of the GE algorithm. A
selection perturbative hyper-heuristic is used to
determine the options to include in the grammar.
Previous work has shown the effectiveness of using GE
to evolve the choice function for the Fuzzy ART
algorithm; given that success, we use this algorithm as
a case study to investigate DFG. However, DFG can be
used with any neural network learning algorithm.
Static and dynamic function generation are evaluated for
text classification using the Enron, SMS Spam, Chat GPT
Tweets, IMDB movie reviews and Amazon product reviews
datasets. DFG improved on the performance of SFG for
all datasets. Additionally, DFG was found to be
competitive with the state of the art and improved on
the best known results for the SMS Spam and Chat GPT
Tweets datasets.",
notes = "also known as \cite{10611777}
WCCI 2024",
}
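
To make the DFG scheme in the abstract concrete, the following short Python sketch re-evolves a function by grammatical evolution every m epochs of a learner and perturbs the grammar every g GE generations. The toy grammar, genome-to-expression mapping, fitness measure, mutation scheme, and the grammar-perturbation step standing in for the selection perturbative hyper-heuristic are all illustrative assumptions, not the authors' implementation.

import random

# Toy grammar: production options for a unary choice function f(x).
GRAMMAR = {"expr": ["x", "(expr + expr)", "(expr * expr)", "abs(expr)"]}
# Pool the hyper-heuristic stand-in draws from when perturbing the grammar.
OPTIONAL_PRODUCTIONS = ["min(expr, 1.0)", "(expr - expr)"]

def decode(genome, grammar, max_steps=30):
    """Map a GE genome (list of ints) onto the grammar, yielding an expression."""
    expr = "expr"
    for i in range(max_steps):
        if "expr" not in expr:
            return expr
        options = grammar["expr"]
        expr = expr.replace("expr", options[genome[i % len(genome)] % len(options)], 1)
    return expr.replace("expr", "x")  # resolve any leftover non-terminals

def fitness(expr):
    """Toy fitness: negative squared error against a target shape (here abs(x))."""
    try:
        pts = [i / 10.0 - 1.0 for i in range(21)]
        return -sum((eval(expr, {"abs": abs, "min": min}, {"x": x}) - abs(x)) ** 2
                    for x in pts)
    except Exception:
        return float("-inf")

def evolve_function(grammar, generations=20, pop_size=30, g=5):
    """One GE run; the grammar itself is perturbed every g generations."""
    pop = [[random.randrange(256) for _ in range(10)] for _ in range(pop_size)]
    for gen in range(generations):
        if gen > 0 and gen % g == 0:
            # Stand-in for the selection perturbative hyper-heuristic:
            # swap one optional production into the grammar.
            grammar["expr"] = GRAMMAR["expr"] + random.sample(OPTIONAL_PRODUCTIONS, 1)
        pop.sort(key=lambda gm: fitness(decode(gm, grammar)), reverse=True)
        parents = pop[:pop_size // 2]
        # Refill the population with mutated copies of random parents.
        pop = parents + [[gene if random.random() > 0.1 else random.randrange(256)
                          for gene in random.choice(parents)]
                         for _ in range(pop_size - len(parents))]
    return decode(pop[0], grammar)

# DFG driver: the function is re-evolved every m epochs of the learner.
m, epochs = 3, 9
current_fn = None
for epoch in range(epochs):
    if epoch % m == 0:
        current_fn = evolve_function(dict(GRAMMAR))
        print("epoch %d: evolved function -> %s" % (epoch, current_fn))
    # ... the learning algorithm (e.g. Fuzzy ART) would apply current_fn here ...

In the paper's setting the fitness of a candidate function would be its effect on the learner's classification performance; the symbolic-regression-style fitness above is only a placeholder that keeps the sketch self-contained.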
Genetic Programming entries for
Mia Gerber
Nelishia Pillay