Controlling Overfitting in Symbolic Regression Based on a Bias/Variance Error Decomposition
@InProceedings{conf/ppsn/Agapitos12,
  author =       "Alexandros Agapitos and Anthony Brabazon and
                  Michael O'Neill",
  title =        "Controlling Overfitting in Symbolic Regression Based
                  on a Bias/Variance Error Decomposition",
  booktitle =    "Parallel Problem Solving from Nature, PPSN XII (part 1)",
  year =         "2012",
  editor =       "Carlos A. {Coello Coello} and Vincenzo Cutello and
                  Kalyanmoy Deb and Stephanie Forrest and
                  Giuseppe Nicosia and Mario Pavone",
  volume =       "7491",
  series =       "Lecture Notes in Computer Science",
  pages =        "438--447",
  address =      "Taormina, Italy",
  month =        sep # " 1-5",
  publisher =    "Springer",
  keywords =     "genetic algorithms, genetic programming",
  isbn13 =       "978-3-642-32936-4",
  DOI =          "doi:10.1007/978-3-642-32937-1_44",
  size =         "10 pages",
abstract = "We consider the fundamental property of generalisation
of data-driven models evolved by means of Genetic
Programming (GP). The statistical treatment of
decomposing the regression error into bias and variance
terms provides insight into the generalisation
capability of this modelling method. The error
decomposition is used as a source of inspiration to
design a fitness function that relaxes the sensitivity
of an evolved model to a particular training dataset.
Results on eight symbolic regression problems show that
new method is capable on inducing better-generalising
models than standard GP for most of the problems.",
  affiliation =  "Natural Computing Research and Applications Group,
                  University College Dublin, Ireland",
}
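For reference, the decomposition the abstract alludes to is the textbook bias/variance split of the expected squared regression error over training sets D (this is the standard form, not necessarily the exact formulation used in the paper):

\mathbb{E}_{D}\big[(f(x) - \hat{f}_{D}(x))^{2}\big]
  = \underbrace{\big(f(x) - \mathbb{E}_{D}[\hat{f}_{D}(x)]\big)^{2}}_{\text{bias}^{2}}
  + \underbrace{\mathbb{E}_{D}\big[(\hat{f}_{D}(x) - \mathbb{E}_{D}[\hat{f}_{D}(x)])^{2}\big]}_{\text{variance}}

plus an irreducible noise term when the targets are noisy; \hat{f}_{D} is the model evolved from training set D.

As a rough illustration only, and not the authors' actual fitness function, one way to make a fitness measure less sensitive to a single training set is to score each candidate model on several bootstrap resamples and penalise the spread of its errors. The names below (variance_aware_fitness, n_resamples, penalty) are invented for this sketch.

import random
import statistics

def variance_aware_fitness(model, xs, ys, n_resamples=10, penalty=1.0, seed=0):
    # Hypothetical sketch: mean bootstrap MSE plus a penalty on how much
    # that MSE varies across resamples (lower is better).
    rng = random.Random(seed)
    n = len(xs)
    errors = []
    for _ in range(n_resamples):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap resample
        mse = sum((model(xs[i]) - ys[i]) ** 2 for i in idx) / n
        errors.append(mse)
    return statistics.mean(errors) + penalty * statistics.pvariance(errors)

# Toy usage: a model that fits the data exactly scores zero.
xs = [float(i) for i in range(20)]
ys = [2.0 * x + 1.0 for x in xs]
print(variance_aware_fitness(lambda x: 2.0 * x + 1.0, xs, ys))

Averaging over resamples targets the variance component of the error, at the cost of extra fitness evaluations per individual.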