Inconsistency - Friend or Foe
@InProceedings{Johansson:2007:IJCNN,
author = "Ulf Johansson and Rikard Konig and Lars Niklasson",
title = "Inconsistency - Friend or Foe",
booktitle = "International Joint Conference on Neural Networks,
IJCNN 2007",
year = "2007",
pages = "1383--1388",
address = "Orlando, USA",
month = "12-17 " # aug,
keywords = "genetic algorithms, genetic programming, G-REX tree,
consistency criterion, evolutionary algorithms,
inconsistency criterion, neural network ensembles,
probability estimation, publicly available data sets,
regression trees, rule extraction algorithms, data
integrity, data mining, estimation theory, evolutionary
computation, learning (artificial intelligence),
probability, regression analysis",
ISSN = "1098-7576",
ISBN = "1-4244-1380-X",
DOI = "10.1109/IJCNN.2007.4371160",
abstract = "One way of obtaining accurate yet comprehensible
models is to extract rules from opaque predictive
models. When evaluating rule extraction algorithms, one
frequently used criterion is consistency; i.e. the
algorithm must produce similar rules every time it is
applied to the same problem. Rule extraction algorithms
based on evolutionary algorithms are, however,
inherently inconsistent, something that is regarded as
their main drawback. In this paper, we argue that
consistency is an overvalued criterion, and that
inconsistency can even be beneficial in some
situations. The study contains two experiments, both
using publicly available data sets, where rules are
extracted from neural network ensembles. In the first
experiment, it is shown that it is normally possible to
extract several different rule sets from an opaque
model, all having high and similar accuracy. The
implication is that consistency in that perspective is
useless; why should one specific rule set be considered
superior? Clearly, it should instead be regarded as an
advantage to obtain several accurate and comprehensible
descriptions of the relationship. In the second
experiment, rule extraction is used for probability
estimation. More specifically, an ensemble of extracted
trees is used in order to obtain probability estimates.
Here, it is exactly the inconsistency of the rule
extraction algorithm that makes the suggested approach
possible.",
notes = "Also known as \cite{4371160}",
}
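The second experiment described in the abstract turns the extractor's inconsistency into an asset: several different trees, all extracted from the same opaque model, are pooled so that their vote fractions serve as probability estimates. The minimal Python sketch below illustrates that idea only; it is not the authors' G-REX implementation, and the random forest used as the opaque model, the scikit-learn decision trees standing in for GP-extracted rule sets, and all parameter values are illustrative assumptions.

# Sketch of "ensemble of extracted trees" probability estimation.
# Assumptions: a random forest plays the opaque model (a neural network
# ensemble in the paper), and randomized decision trees fitted to the
# opaque model's predictions play the inconsistently extracted rule sets.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The opaque predictive model.
opaque = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
opaque_labels = opaque.predict(X_train)

# "Extract" 25 different comprehensible trees from the same opaque model by
# fitting small trees to its predictions with different random seeds; the
# run-to-run variation mimics the inherent inconsistency of an evolutionary
# rule extractor.
trees = [
    DecisionTreeClassifier(max_depth=3, splitter="random", random_state=seed).fit(
        X_train, opaque_labels
    )
    for seed in range(25)
]

# Probability estimate for class 1 = fraction of extracted trees voting for it.
votes = np.stack([t.predict(X_test) for t in trees])  # shape (n_trees, n_test)
p_class1 = votes.mean(axis=0)

accuracy = ((p_class1 >= 0.5).astype(int) == y_test).mean()
print(f"ensemble-of-extracted-trees accuracy: {accuracy:.3f}")

In this sketch the graded values of p_class1 come entirely from disagreement among the independently induced trees; with a single, perfectly consistent extractor every estimate would collapse to 0 or 1, which is exactly the point the paper makes about inconsistency being useful here.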