ABSTRACT
When search operators in genetic programming (GP) insert new instructions into programs, they usually draw them uniformly from the available instruction set. Preferring some instructions over others would require additional domain knowledge, which is typically unavailable. However, it has recently been demonstrated that the likelihood of an instruction occurring in a program can be estimated reasonably well from the program's input-output behavior using a neural network. We exploit this idea to bias the choice of instructions used by search operators in GP. Given a large sample of programs and their input-output behaviors, a neural network is trained to predict the presence of individual instructions. When applied to a new program synthesis task, the network is first queried on the set of examples that define the task, and the obtained probabilities determine the frequencies with which instructions are used in initialization and mutation operators. This priming leads to significant improvements in the odds of successful synthesis on a range of benchmarks.
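The core mechanism described above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes a hypothetical instruction set and takes the network-predicted presence probabilities as given (the trained network itself is out of scope here), then uses them as sampling weights in initialization and point mutation.

```python
import random

# Hypothetical instruction set for a toy linear program representation.
INSTRUCTIONS = ["add", "sub", "mul", "dup", "swap", "pop"]

def biased_instruction_sampler(probabilities, floor=0.01):
    """Build a sampler that draws instructions in proportion to the
    network-predicted presence probabilities. A small floor keeps every
    instruction reachable even when the network assigns it near-zero
    probability."""
    weights = [max(p, floor) for p in probabilities]
    total = sum(weights)
    weights = [w / total for w in weights]
    return lambda: random.choices(INSTRUCTIONS, weights=weights, k=1)[0]

def init_program(sample, length):
    """Initialization operator: a random program of the given length,
    drawn with the biased sampler instead of uniform choice."""
    return [sample() for _ in range(length)]

def mutate(program, sample, rate=0.1):
    """Point mutation: each instruction is replaced by a biased draw
    with probability `rate`."""
    return [sample() if random.random() < rate else ins for ins in program]
```

In an actual run, the probabilities would come from querying the trained network on the input-output examples that define the synthesis task; the floor parameter is an assumption here, added so that instructions the network deems unlikely are demoted rather than excluded.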
Index Terms
- Neuro-guided genetic programming: prioritizing evolutionary search with neural networks