ABSTRACT
This work uses genetic programming to explore the design space of local optimisation algorithms. Optimisers are expressed in the Push programming language, a stack-based language with a wide range of typed primitive instructions. The evolutionary framework provides the evolving optimisers with an outer loop and with information about whether a solution has improved, but otherwise leaves them relatively unconstrained in how they explore optimisation landscapes. To test the utility of this approach, optimisers were evolved on four different types of continuous landscape, and the search behaviours of the evolved optimisers were analysed. The evolved optimisers were often able to reach the optima using relatively short paths, both by making use of mathematical functions such as tangents and logarithms to explore different neighbourhoods and by learning features of the landscapes.
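The framework described above wraps each evolved optimiser in an outer loop that repeatedly invokes it and feeds back whether the previous move improved the solution. A minimal sketch of such an outer loop is given below; the function names (`run_outer_loop`, `optimiser_step`) and the greedy acceptance rule are illustrative assumptions, not the paper's actual implementation, in which the step function would be an evolved Push program rather than a hand-written one.

```python
import random


def run_outer_loop(optimiser_step, evaluate, x0, iterations=100):
    """Repeatedly apply an optimiser step to a solution (minimisation).

    optimiser_step(x, improved) proposes a new candidate; the `improved`
    flag tells it whether its previous proposal was accepted, which is
    the only landscape feedback the evolved optimisers receive.
    """
    x, best = list(x0), evaluate(x0)
    improved = False
    for _ in range(iterations):
        candidate = optimiser_step(x, improved)
        f = evaluate(candidate)
        improved = f < best
        if improved:          # keep only improving moves
            x, best = candidate, f
    return x, best


# Illustrative usage: a random-perturbation step on the sphere function.
random.seed(0)

def random_step(x, improved):
    return [xi + random.uniform(-0.1, 0.1) for xi in x]

def sphere(x):
    return sum(xi * xi for xi in x)

x, best = run_outer_loop(random_step, sphere, [1.0, 1.0], iterations=200)
```

Because the loop only accepts improving moves, `best` is monotonically non-increasing; an evolved Push program would replace `random_step` with instruction sequences that exploit the improvement signal more cleverly.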
Index Terms
- Instruction-level design of local optimisers using Push GP