Auto-adaptive Grammar-Guided Genetic Programming algorithm to build Ensembles of Multi-Label Classifiers
Created by W. Langdon from gp-bibliography.bib Revision: 1.8051
@Article{MOYANO:2022:IF,
  author =   "Jose M. Moyano and Sebastian Ventura",
  title =    "Auto-adaptive Grammar-Guided Genetic Programming
             Algorithm to build Ensembles of Multi-Label
             Classifiers",
  journal =  "Information Fusion",
  volume =   "78",
  pages =    "1--19",
  year =     "2022",
  ISSN =     "1566-2535",
  DOI =      "10.1016/j.inffus.2021.07.005",
  URL =      "https://www.sciencedirect.com/science/article/pii/S1566253521001469",
  keywords = "genetic algorithms, genetic programming, Multi-label
             classification, Ensemble learning, Evolutionary
             algorithm, Grammar-guided genetic programming",
  abstract = "Multi-label classification has been used to solve a
             wide range of problems where each example in the
             dataset may be related either to one class (as in
             traditional classification problems) or to several
             class labels at the same time. Many ensemble-based
             approaches have been proposed in the literature,
             aiming to improve the performance of traditional
             multi-label classification algorithms. However, most
             of them do not consider the data characteristics when
             building the ensemble, and those that do need to tune
             many parameters to maximize their performance. In this
             paper, we propose an Auto-adaptive algorithm based on
             Grammar-Guided Genetic Programming to generate
             Ensembles of Multi-Label Classifiers based on
             projections of k labels (AG3P-kEMLC). It creates a
             tree-shaped ensemble, where each leaf is a multi-label
             classifier focused on a subset of k labels. Unlike
             other methods in the literature, our proposal can deal
             with different values of k in the same ensemble,
             instead of fixing one specific value. It also includes
             an auto-adaptive process to reduce the number of
             hyper-parameters to tune, prevent overfitting, and
             reduce the runtime required to execute it. Three
             versions of the algorithm are proposed. The first,
             fixed, uses the same value of k for all multi-label
             classifiers in the ensemble. The remaining two deal
             with different k values in the ensemble: uniform gives
             the same probability to each available value of k,
             while gaussian favors the selection of smaller values
             of k. An experimental study over twenty reference
             datasets and five evaluation metrics, comparing
             against eleven ensemble methods, demonstrates that our
             proposal performs significantly better than the
             state-of-the-art methods.",
}
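A note on the sampling schemes named in the abstract: the uniform variant picks each candidate label-subset size k with equal probability, while the gaussian variant biases the draw toward smaller k. The sketch below is an illustrative reading of that description only; the function name, the half-gaussian weighting, and the `sigma` default are assumptions, not the paper's actual implementation.

```python
import math
import random


def sample_k(k_values, mode="uniform", sigma=None):
    """Pick a label-subset size k for one leaf classifier.

    mode="uniform": every candidate k is equally likely.
    mode="gaussian": smaller k values get larger weights via a
    half-gaussian decay over the sorted candidates (an illustrative
    choice; the paper's exact distribution may differ).
    """
    ks = sorted(k_values)
    if mode == "uniform":
        return random.choice(ks)
    # Index 0 (smallest k) gets the largest weight; weights decay
    # with the square of the index.
    sigma = sigma if sigma is not None else len(ks) / 2
    weights = [math.exp(-(i ** 2) / (2 * sigma ** 2)) for i in range(len(ks))]
    return random.choices(ks, weights=weights, k=1)[0]


if __name__ == "__main__":
    random.seed(0)
    ks = [2, 3, 4, 5]
    gau = [sample_k(ks, mode="gaussian") for _ in range(2000)]
    uni = [sample_k(ks, mode="uniform") for _ in range(2000)]
    # The gaussian scheme should yield a smaller average k.
    print(sum(gau) / len(gau), sum(uni) / len(uni))
```

With a few candidate sizes, the gaussian scheme concentrates leaves on small label projections (cheaper base classifiers), while the uniform scheme spreads them evenly across all available k.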