DOI: 10.1145/3319619.3326806
research-article

Instruction-level design of local optimisers using push GP

Published: 13 July 2019

ABSTRACT

This work uses genetic programming to explore the design space of local optimisation algorithms. Optimisers are expressed in the Push programming language, a stack-based language with a wide range of typed primitive instructions. The evolutionary framework provides the evolving optimisers with an outer loop and information about whether a solution has improved, but otherwise they are relatively unconstrained in how they explore optimisation landscapes. To test the utility of this approach, optimisers were evolved on four different types of continuous landscape, and the search behaviours of the evolved optimisers were analysed. The evolved optimisers were often able to reach the optima along relatively short paths, both by making use of mathematical functions such as tangents and logarithms to explore different neighbourhoods and by learning features of the landscapes.
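The setup the abstract describes — an evolved step program wrapped in a fixed outer loop that feeds back whether the last move improved the current best — can be sketched roughly as follows. This is an illustrative assumption, not the paper's implementation: the names `outer_loop` and `example_step` are invented here, and the actual evolved optimisers are Push programs operating on typed stacks, not Python functions.

```python
import random

def outer_loop(step_program, objective, x0, budget=1000):
    """Fixed outer loop provided by the framework: repeatedly asks the
    evolved program for a candidate, evaluates it, and passes back an
    'improved' flag describing the outcome of the previous move."""
    x, fx = x0, objective(x0)
    improved = False
    for _ in range(budget):
        x_new = step_program(x, improved)  # evolved optimiser proposes a move
        f_new = objective(x_new)
        improved = f_new < fx
        if improved:                       # greedy acceptance of improvements
            x, fx = x_new, f_new
    return x, fx

# A hand-written stand-in for an evolved program: a Gaussian perturbation
# whose step size shrinks after an improvement. Real evolved programs
# could instead use instructions such as tangents and logarithms to
# shape their neighbourhoods, as the abstract notes.
def example_step(x, improved):
    sigma = 0.05 if improved else 0.5
    return [xi + random.gauss(0.0, sigma) for xi in x]

best, f_best = outer_loop(example_step,
                          lambda v: sum(vi * vi for vi in v),  # sphere landscape
                          [3.0, -2.0])
```

Because only improving moves are accepted, the returned objective value never exceeds that of the starting point; the interesting question the paper studies is how an evolved program exploits the improvement signal and its instruction set to shorten the search path.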


Published in

GECCO '19: Proceedings of the Genetic and Evolutionary Computation Conference Companion
July 2019, 2161 pages
ISBN: 9781450367486
DOI: 10.1145/3319619
Copyright © 2019 ACM
Publisher: Association for Computing Machinery, New York, NY, United States