abstract = "The evolution of explicitly represented topologies
such as graphs involves devising methods for mutating,
comparing and combining structures in meaningful ways
and identifying and maintaining the necessary
topological diversity. Research has been conducted on
the evolution of trees in genetic programming and on
the evolution of neural networks, and some of these
problems have been addressed independently by the
different research communities. In the domain of neural
networks, NEAT (Neuroevolution of Augmenting
Topologies) has been shown to be a successful method
for evolving increasingly complex networks. This
system's success is based on three interrelated
elements: speciation, marking of historical information
in topologies, and initializing the search in a space
of small structures. These elements provide the
dynamics necessary to explore diverse areas of the
solution space simultaneously and a way to discriminate between
different structures. Although different
representations have emerged in the area of genetic
programming, the study of the tree representation has
remained of interest in large part because of its
mapping to programming languages and also because of
the observed phenomenon of unnecessary code growth, or
bloat, which hinders performance. The structural
similarity between trees and neural networks poses an
interesting question: Is it possible to apply the
techniques from NEAT to the evolution of trees, and if
so, how does this affect performance and the dynamics
of code growth? In this work we address these questions
and present techniques for genetic programming that are
analogous to those in NEAT.",
notes = "p50 'size of trees was consistently proportional to
their fitness'