Features of EC are:
* A representation scheme for solution candidates, together with variation operators for them.
* A fitness function that indicates the quality of a solution candidate.
* A selection method that selects prospective solution candidates from a set of them.
Usually, these features are defined independently. Although it would be interesting to rethink this model completely, we reconsider only the representation scheme for GP.
We use GP to generate programs automatically. A program is represented by an RTN. An RTN can represent any algorithm; in other words, it is Turing-complete. Thus, a user of RTN need not worry about whether a solution to a given problem can be described by an RTN. On the other hand, the expressiveness of solution candidates in standard GP, which is the most popular form of GP, is strongly restricted. A solution candidate in standard GP is represented by a single parse tree consisting of terminals and non-terminals. If all non-terminals are pure functions and we treat the evaluated value of the parse tree as the output (or behaviour) of the solution candidate, the repertoire of this representation is smaller than that of a finite state machine.
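As an illustration of such a candidate, the following is a minimal Python sketch of a single parse tree of pure functions, evaluated once per input to a single output value. The function set (add, mul), the variable name x, and the nested-tuple encoding are assumptions for illustration only, not taken from the paper.

# Minimal sketch of a standard-GP solution candidate: one parse tree of
# pure functions, evaluated once to a single output value.
import operator

FUNCTIONS = {'add': operator.add, 'mul': operator.mul}

def evaluate(tree, bindings):
    """Recursively evaluate a nested-tuple parse tree.

    tree     -- ('add', left, right) for non-terminals, or a terminal:
                a constant (number) or a variable name found in bindings.
    bindings -- dict mapping variable names to input values.
    """
    if isinstance(tree, tuple):                 # non-terminal: pure function
        op, *args = tree
        return FUNCTIONS[op](*(evaluate(a, bindings) for a in args))
    if isinstance(tree, str):                   # terminal: variable
        return bindings[tree]
    return tree                                 # terminal: constant

# (x + 2) * x, evaluated for x = 3 -> 15; one evaluation, one output value.
candidate = ('mul', ('add', 'x', 2), 'x')
print(evaluate(candidate, {'x': 3}))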
One conceivable approach is to introduce an (ideally infinite) indexed memory and non-terminals that access it. It has been proved that if the solution candidate is represented by a parse tree consisting of these non-terminals, and the evaluation of the parse tree can be repeated until the data stored in the memory meets a halting condition, then the expressiveness is equivalent to that of a Turing machine, i.e. the representation is Turing-complete.
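The following is a hedged sketch of that indexed-memory idea. The read/write non-terminals and the halting condition (memory cell 0 reaching 5) are illustrative assumptions, not the definitions used in the cited proof.

# Sketch: one parse tree evaluated repeatedly over a shared indexed memory
# until the (assumed) halting condition on the memory is met.
def run_with_memory(tree, memory, max_steps=1000):
    def ev(node):
        if isinstance(node, tuple):
            op, *args = node
            if op == 'read':                    # read(i): return memory[i]
                return memory[ev(args[0]) % len(memory)]
            if op == 'write':                   # write(i, v): side effect, returns v
                i, v = ev(args[0]) % len(memory), ev(args[1])
                memory[i] = v
                return v
            if op == 'add':
                return ev(args[0]) + ev(args[1])
        return node                             # constant terminal
    for step in range(max_steps):
        ev(tree)
        if memory[0] >= 5:                      # assumed halting condition
            return step + 1, memory
    return max_steps, memory

# Each evaluation increments cell 0; the loop halts after five evaluations.
prog = ('write', 0, ('add', ('read', 0), 1))
print(run_with_memory(prog, [0] * 8))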
We consider the following requirements for a GP representation:
* The user can specify the input-output scheme.
* Useful components can be introduced easily.
* The program is modularised and hierarchically structured.
We discuss the necessity of these requirements in detail in this paper.
We propose another representation scheme, RTN. It satisfies the above requirements and is a natural extension of standard GP. Standard GP uses a single parse tree to represent a solution candidate, whereas an RTN is a recurrent network consisting of multiple nodes. Each node consists of a value and a pure function represented by a parse tree. The parse tree consists of non-terminals and terminals; no special non-terminals are needed. The terminal set consists of four variables and constants. In standard GP, the input data are bound to the variables; in an RTN, they are bound to the values of the nodes.
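To make this structure concrete, the following is a minimal sketch of an RTN-style candidate based on the description above. The variable naming (v0, v1, ...), the synchronous update rule, and the fixed step count are assumptions for illustration only, not the paper's definitions.

# Each node holds a value and a pure function given as a parse tree whose
# terminals are variables (bound to current node values) and constants.
import operator

FUNCTIONS = {'add': operator.add, 'mul': operator.mul}

def eval_tree(tree, node_values):
    """Evaluate one node's parse tree against the current node values."""
    if isinstance(tree, tuple):
        op, *args = tree
        return FUNCTIONS[op](*(eval_tree(a, node_values) for a in args))
    if isinstance(tree, str):                   # variable 'vN' -> value of node N
        return node_values[int(tree[1:])]
    return tree                                 # constant

def step(trees, node_values):
    """One recurrent step: every node recomputes its value from the
    previous values of all nodes (synchronous update)."""
    return [eval_tree(t, node_values) for t in trees]

# Two-node RTN: node 0 accumulates node 1, node 1 counts up by 1.
trees = [('add', 'v0', 'v1'), ('add', 'v1', 1)]
values = [0, 0]                                 # the inputs bind the node values
for _ in range(5):
    values = step(trees, values)
print(values)                                   # -> [10, 5] after five steps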
It is straightforward to prove that an RTN can simulate any Turing machine; in other words, an RTN can represent any algorithm. We give the proof in this paper.
Various representations for GP have been proposed so far. We discuss the differences between RTN and other representation schemes, as well as the criteria used to compare the various approaches. For example, the No Free Lunch Theorem is not relevant to this comparison.
Email, Thu, 01 Sep 2005 16:45:44 +0900: No English translation is available, but the essential part, i.e. the details of the proposed representation scheme and the results of its application, is described in \cite{yabuki04genetic}. Many topics not included in \cite{yabuki04genetic} are discussed in the thesis:
- In \cite{yabuki04genetic}, we said that the repertoire of a single S-expression is the same as that of a finite state automaton (FSA). In the thesis, I showed that the repertoire of a single S-expression is usually smaller than that of an FSA.
- Drawbacks of describing strategies for the iterated prisoner's dilemma by means of tables or GA strings.
- Concrete requirements for a representation for GP.
- Drawbacks of automatically defined functions and recursive non-terminals.
- How to simulate a multi-tape Turing machine directly by means of our representation scheme.
- A successful result of our method applied to a problem that cannot be solved by an FSA.