Designing genetic programming classifiers with feature selection and feature construction
Created by W.Langdon from gp-bibliography.bib Revision:1.8051
@Article{MA:2020:ASC,
  author =   "Jianbin Ma and Xiaoying Gao",
  title =    "Designing genetic programming classifiers with feature
              selection and feature construction",
  journal =  "Applied Soft Computing",
  volume =   "97",
  pages =    "106826",
  year =     "2020",
  ISSN =     "1568-4946",
  DOI =      "doi:10.1016/j.asoc.2020.106826",
  URL =      "https://www.sciencedirect.com/science/article/pii/S156849462030764X",
  keywords = "genetic algorithms, genetic programming, Feature
              construction, Feature selection, Classifier,
              Classification",
abstract = "Due to the flexibility of Genetic Programming (GP), GP
has been used for feature construction, feature
selection and classifier construction. In this paper,
GP classifiers with feature selection and feature
construction are investigated to obtain simple and
effective classification rules. During the construction
of a GP classifier, irrelevant and redundant features
affect the search ability of GP, and make GP easily
fall into local optimum. This paper proposes two new GP
classifier construction methods to restrict bad impact
of irrelevant and redundant features on GP classifier.
The first is to use a multiple-objective fitness
function that decreases both classification error rate
and the number of selected features, which is named as
GPMO. The second is to first use a feature selection
method, i.e., linear forward selection (LFS) to remove
irrelevant and redundant features and then use GPMO to
construct classifiers, which is named as FSGPMO.
Experiments on twelve datasets show that GPMO and
FSGPMO have advantages over GP classifiers with a
single-objective fitness function named GPSO in term of
classification performance, the number of selected
features, time cost and function complexity. The
proposed FSGPMO can achieve better classification
performance than GPMO on higher dimension datasets,
however, FSGPMO may remove potential effective features
for GP classifier and achieve much lower classification
performance than GPMO on some datasets. Compared with
two other GP-based classifiers, GPMO can significantly
improve the classification performance. Comparisons
with other classification algorithms show that GPMO can
achieve better or comparable classification performance
on most selected datasets. Our proposed GPMO can
achieve better performance than wrapper-based feature
construction methods using GP on applications with
insufficient instances. Further investigations show
that bloat phenomena exists in the process of GP
evolution and overfitting phenomena is not obvious.
Moreover, the benefits of GP over other machine
learning algorithms are discussed",
}
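The abstract does not spell out how GPMO combines its two objectives, so the following is a minimal Python sketch of the kind of weighted-sum fitness it describes: an evolved GP expression is scored by its classification error rate plus a penalty on the number of features it actually uses. The function name, the trade-off weight alpha and the zero-threshold classification rule are illustrative assumptions, not the authors' implementation; FSGPMO would run a feature-selection pass such as linear forward selection first and then evolve with the same fitness on the reduced feature set.

from typing import Callable, Sequence

def gpmo_fitness(
    program: Callable[[Sequence[float]], float],  # evolved GP expression (hypothetical interface)
    used_features: set,                           # indices of features appearing in the GP tree
    X: Sequence[Sequence[float]],                 # training instances
    y: Sequence[int],                             # class labels (0/1)
    alpha: float = 0.9,                           # assumed trade-off weight between the two objectives
) -> float:
    """Lower is better: alpha * error rate + (1 - alpha) * fraction of features used."""
    # Classify by thresholding the program output at zero, a common GP convention.
    errors = sum(1 for xi, yi in zip(X, y) if (program(xi) > 0) != (yi == 1))
    error_rate = errors / len(y)
    feature_ratio = len(used_features) / len(X[0])
    return alpha * error_rate + (1 - alpha) * feature_ratio

# Toy usage: a hand-written "program" that uses only feature 0.
if __name__ == "__main__":
    X = [[1.0, 5.0], [2.0, 1.0], [-1.0, 4.0], [-2.0, 2.0]]
    y = [1, 1, 0, 0]
    program = lambda x: x[0]                      # predicts class 1 when x[0] > 0
    print(gpmo_fitness(program, {0}, X, y))       # 0.9 * 0.0 + 0.1 * 0.5 = 0.05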
Genetic Programming entries for Jianbin Ma, Xiaoying (Sharon) Gao