Abstract
Hyper-heuristic methodologies have been used extensively and successfully to generate combinatorial optimization heuristics. In contrast, there have been almost no attempts to build a hyper-heuristic that evolves an algorithm for solving real-valued optimization problems. In our previous research, we succeeded in evolving a Nelder–Mead-like real function minimization heuristic using genetic programming and primitives extracted from the original Nelder–Mead algorithm. The resulting heuristic solved more test problems than the original Nelder–Mead method, but it was slower in that it needed considerably more cost function evaluations on the problems that the original method also solved. In this paper we exploit grammatical evolution as a hyper-heuristic to evolve heuristics that outperform the original Nelder–Mead method in all respects. However, the main goal of the paper is not to build yet another real function optimization algorithm but to shed some light on how different factors influence the behavior of the evolution process and the quality of the obtained heuristics. In particular, through extensive evolution runs we investigate the influence of the shape and dimensionality of the training function, and the impact of the size limit imposed on the evolving algorithms. By the end of this research we had evolved a number of heuristics that solved more test problems, and in fewer cost function evaluations, than the original Nelder–Mead method. Our solvers are also highly competitive with improvements to the original method found in the literature that are based on rigorous mathematical convergence proofs. Even more importantly, we identified some directions in which to continue the work in order to construct a productive hyper-heuristic capable of evolving real function optimization heuristics that outperform a human designer in all respects.
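For readers unfamiliar with the simplex method the abstract builds on, the following is a minimal Nelder–Mead sketch in Python. It is illustrative only, not the paper's implementation: the coefficients alpha/gamma/rho/sigma are the textbook reflection/expansion/contraction/shrink defaults, and the simplified contraction step does not distinguish inside from outside contraction as some variants do.

```python
import numpy as np

def nelder_mead(f, x0, max_evals=500, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimal Nelder-Mead sketch (illustrative; not the paper's solver)."""
    n = len(x0)
    # Initial simplex: x0 plus a unit perturbation along each axis.
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):
        p = np.asarray(x0, dtype=float)
        p[i] += 1.0
        simplex.append(p)
    evals = [f(p) for p in simplex]
    n_evals = n + 1
    while n_evals < max_evals:
        order = np.argsort(evals)
        simplex = [simplex[i] for i in order]
        evals = [evals[i] for i in order]
        centroid = np.mean(simplex[:-1], axis=0)  # centroid of all but the worst
        xr = centroid + alpha * (centroid - simplex[-1])  # reflection
        fr = f(xr); n_evals += 1
        if evals[0] <= fr < evals[-2]:
            simplex[-1], evals[-1] = xr, fr
        elif fr < evals[0]:
            xe = centroid + gamma * (xr - centroid)  # expansion
            fe = f(xe); n_evals += 1
            if fe < fr:
                simplex[-1], evals[-1] = xe, fe
            else:
                simplex[-1], evals[-1] = xr, fr
        else:
            xc = centroid + rho * (simplex[-1] - centroid)  # contraction
            fc = f(xc); n_evals += 1
            if fc < evals[-1]:
                simplex[-1], evals[-1] = xc, fc
            else:
                # Shrink all points toward the best one.
                for i in range(1, n + 1):
                    simplex[i] = simplex[0] + sigma * (simplex[i] - simplex[0])
                    evals[i] = f(simplex[i]); n_evals += 1
    order = np.argsort(evals)
    return simplex[order[0]], evals[order[0]]
```

The hyper-heuristic described in the paper recombines primitives of exactly this kind (reflection, expansion, contraction, shrink, ordering) rather than hand-tuning the coefficients.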
Notes
Note that we performed experiments using 4-, 8-, 16-, and 24-dimensional quadratic functions as training problems, but the 24-dimensional training functions yielded no useful solvers, and the 4-dimensional training functions yielded solvers that succeeded only on low-dimensional problems. We therefore give a detailed account only of the results obtained with 8- and 16-dimensional training functions.
If a solver solves \(70\%\) or more of the problems, a second number appears in the parentheses in the first column, denoting the number of simplex evaluations needed to solve the first \(70\%\) of the problems (i.e., the inverse of Eq. (7)). We included these numbers to enable speed comparisons between solvers that solved different numbers of problems.
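Since Eq. (7) is not reproduced in this excerpt, the following sketch shows one plausible reading of its inverse: given the number of simplex evaluations each test problem required (infinite for unsolved problems), it returns the per-problem evaluation budget at which the given fraction of all problems is solved. The function name and this exact formulation are assumptions for illustration, not the paper's definition.

```python
import math

def evals_to_solve_fraction(eval_counts, fraction=0.7):
    """Hypothetical inverse of the paper's Eq. (7): the evaluation budget
    at which `fraction` of all problems is solved, given per-problem
    simplex-evaluation counts (math.inf marks an unsolved problem)."""
    k = math.ceil(fraction * len(eval_counts))  # problems needed for the fraction
    solved = sorted(c for c in eval_counts if math.isfinite(c))
    if len(solved) < k:
        return math.inf  # the solver never reaches the required fraction
    return solved[k - 1]  # budget at which the k-th easiest problem is solved
```

Under this reading, a smaller returned value means a faster solver, and the number stays comparable across solvers that solved different total numbers of problems, which is exactly the purpose stated in the note.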
Acknowledgements
This work was supported by the Ministry of Education, Science and Sport of the Republic of Slovenia under Research Program P2-0246—Algorithms and optimization methods in telecommunications.
Cite this article
Fajfar, I., Bűrmen, Á. & Puhan, J. Grammatical evolution as a hyper-heuristic to evolve deterministic real-valued optimization algorithms. Genet Program Evolvable Mach 19, 473–504 (2018). https://doi.org/10.1007/s10710-018-9324-5