abstract = "In this paper we propose a new algorithm called
HyperGPEFS (HyperGP with Explicit Fitness Sharing). It
is based on HyperNEAT, a well-established
evolutionary method employing indirect encoding of
artificial neural networks. Indirect encoding in
HyperNEAT is realized via a special function called a
Compositional Pattern Producing Network (CPPN), which
is able to describe a neural network of arbitrary size.
CPPNs are represented by network structures, which are
evolved by means of a slightly modified version of
another well-known algorithm, NEAT (NeuroEvolution of
Augmenting Topologies). HyperGP is a variant of
HyperNEAT, where the CPPNs are optimized by Genetic
Programming (GP). Published results reported a
promising improvement in convergence speed.
Our approach further extends HyperGP by using fitness
sharing to promote population diversity. Here,
we thoroughly compare all three algorithms on six
different tasks. Fitness sharing requires a definition
of a tree distance measure. In addition to five other
measures, we propose a generalized distance measure
which, in conjunction with HyperGPEFS, significantly
outperforms HyperNEAT and HyperGP on all but one of
the test problems.
Although this paper focuses on indirect encoding, the
proposed distance measures are generally applicable.",