
MATE: A Model-Based Algorithm Tuning Engine

A Proof of Concept Towards Transparent Feature-Dependent Parameter Tuning Using Symbolic Regression

Conference paper in: Evolutionary Computation in Combinatorial Optimization (EvoCOP 2021)

Abstract

In this paper, we introduce a Model-based Algorithm Tuning Engine, namely MATE, where the parameters of an algorithm are represented as expressions of the features of a target optimisation problem. In contrast to most static (feature-independent) algorithm tuning engines such as irace and SPOT, our approach aims to derive the best parameter configuration of a given algorithm for a specific problem by exploiting the relationships between the algorithm parameters and the features of the problem. We formulate the problem of finding these relationships as a symbolic regression problem and use genetic programming to extract the expressions in a human-readable form. For the evaluation, we apply our approach to the configuration of the (1 + 1) EA and RLS algorithms for the OneMax, LeadingOnes, BinValue and Jump optimisation problems, for which the theoretically optimal algorithm parameters are available as functions of the problem features. Our study shows that the relationships found typically comply with known theoretical results – this demonstrates (1) the potential of model-based parameter tuning as an alternative to existing static algorithm tuning engines, and (2) its potential to discover relationships between algorithm performance and instance features in a human-readable form.
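To make the idea concrete, the following is a minimal, hypothetical Python sketch (not the MATE implementation) of the scoring step behind feature-dependent tuning: candidate parameter expressions p(n) over the instance-size feature n are scored against the well-known theory-optimal 1/n mutation rate of the (1 + 1) EA on OneMax. The fixed candidate set here merely stands in for the expression trees that genetic programming would evolve; all names are illustrative.

```python
sizes = [50, 100, 200, 400]            # problem feature: instance size n
target = {n: 1.0 / n for n in sizes}   # theory-optimal mutation rate on OneMax

# Human-readable candidate expressions p(n), as a GP run would produce.
candidates = {
    "1/n": lambda n: 1.0 / n,
    "2/n": lambda n: 2.0 / n,
    "0.01": lambda n: 0.01,
    "1/n^2": lambda n: 1.0 / (n * n),
}

def mse(expr):
    """Mean squared error of a candidate rule against the target rates."""
    return sum((expr(n) - target[n]) ** 2 for n in sizes) / len(sizes)

# The GP fitness function would rank expressions by this error; here the
# correct feature-dependent rule wins exactly.
best = min(candidates, key=lambda name: mse(candidates[name]))
print(best)  # prints "1/n"
```

In MATE the target values are not given by theory but measured empirically per instance, and the candidate set is searched by genetic programming rather than enumerated; the ranking step, however, has this shape.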


Notes

  1. The current MATE implementation is publicly available at https://gitlab.com/yafrani/mate.

References

  1. Agrawal, A., Menzies, T., Minku, L.L., Wagner, M., Yu, Z.: Better software analytics via “duo”: data mining algorithms using/used-by optimizers. Empirical Softw. Eng. 25(3), 2099–2136 (2020)

  2. Ansótegui, C., Sellmann, M., Tierney, K.: A gender-based genetic algorithm for the automatic configuration of algorithms. In: Gent, I.P. (ed.) CP 2009. LNCS, vol. 5732, pp. 142–157. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-642-04244-7_14

  3. Bartz-Beielstein, T., Flasch, O., Koch, P., Konen, W., et al.: SPOT: a toolbox for interactive and automatic tuning in the R environment. In: Proceedings, vol. 20, pp. 264–273 (2010)

  4. Belkhir, N., Dréo, J., Savéant, P., Schoenauer, M.: Feature based algorithm configuration: a case study with differential evolution. In: Handl, J., Hart, E., Lewis, P.R., López-Ibáñez, M., Ochoa, G., Paechter, B. (eds.) PPSN 2016. LNCS, vol. 9921, pp. 156–166. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-45823-6_15

  5. Belkhir, N., Dréo, J., Savéant, P., Schoenauer, M.: Per instance algorithm configuration of CMA-ES with limited budget. In: Genetic and Evolutionary Computation Conference. GECCO 2017, pp. 681–688. ACM (2017)

  6. Böttcher, S., Doerr, B., Neumann, F.: Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In: Schaefer, R., Cotta, C., Kołodziej, J., Rudolph, G. (eds.) PPSN 2010. LNCS, vol. 6238, pp. 1–10. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-15844-5_1

  7. Buskulic, N., Doerr, C.: Maximizing drift is not optimal for solving OneMax. In: Genetic and Evolutionary Computation Conference, GECCO 2019, pp. 425–426. ACM (2019). http://arxiv.org/abs/1904.07818

  8. Chicano, F., Sutton, A.M., Whitley, L.D., Alba, E.: Fitness probability distribution of bit-flip mutation. Evol. Comput. 23(2), 217–248 (2015)

  9. Doerr, B.: Analyzing randomized search heuristics via stochastic domination. Theor. Comput. Sci. 773, 115–137 (2019)

  10. Doerr, B., Doerr, C., Lengler, J.: Self-adjusting mutation rates with provably optimal success rules. In: Proceeding of Genetic and Evolutionary Computation Conference (GECCO 2019), pp. 1479–1487. ACM (2019). https://doi.org/10.1145/3321707.3321733, https://arxiv.org/abs/1902.02588

  11. Doerr, B., Doerr, C., Yang, J.: Optimal parameter choices via precise black-box analysis. Theor. Comput. Sci. 801, 1–34 (2020)

  12. Doerr, B., Le, H.P., Makhmara, R., Nguyen, T.D.: Fast genetic algorithms. In: Genetic and Evolutionary Computation Conference, GECCO 2017, pp. 777–784. ACM (2017)

  13. Doerr, B., Neumann, F. (eds.): Theory of Evolutionary Computation: Recent Developments in Discrete Optimization. Natural Computing Series. Springer, Cham (2020)

  14. Doerr, C., Wagner, M.: Simple on-the-fly parameter selection mechanisms for two classical discrete black-box optimization benchmark problems. In: Proceeding of Genetic and Evolutionary Computation Conference (GECCO 2018), pp. 943–950. ACM (2018). https://doi.org/10.1145/3205455.3205560

  15. El Yafrani, M., Ahiod, B.: Efficiently solving the traveling thief problem using hill climbing and simulated annealing. Inf. Sci. 432, 231–244 (2018)

  16. Fawcett, C., Helmert, M., Hoos, H., Karpas, E., Röger, G., Seipp, J.: FD-Autotune: domain-specific configuration using Fast Downward. In: ICAPS 2011 Workshop on Planning and Learning, pp. 13–17 (2011)

  17. Friedrich, T., Göbel, A., Quinzan, F., Wagner, M.: Heavy-tailed mutation operators in single-objective combinatorial optimization. In: Auger, A., Fonseca, C.M., Lourenço, N., Machado, P., Paquete, L., Whitley, D. (eds.) PPSN 2018. LNCS, vol. 11101, pp. 134–145. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-99253-2_11

  18. Friedrich, T., Quinzan, F., Wagner, M.: Escaping large deceptive basins of attraction with heavy-tailed mutation operators. In: Genetic and Evolutionary Computation Conference. GECCO 2018, pp. 293–300. ACM (2018)

  19. Hoos, H.H.: Programming by optimization. Commun. ACM 55(2), 70–80 (2012)

  20. Hutter, F., Hamadi, Y., Hoos, H.H., Leyton-Brown, K.: Performance prediction and automated tuning of randomized and parametric algorithms. In: Benhamou, F. (ed.) CP 2006. LNCS, vol. 4204, pp. 213–228. Springer, Heidelberg (2006). https://doi.org/10.1007/11889205_17

  21. Hutter, F., Hoos, H.H., Leyton-Brown, K.: Automated configuration of mixed integer programming solvers. In: Lodi, A., Milano, M., Toth, P. (eds.) CPAIOR 2010. LNCS, vol. 6140, pp. 186–202. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-13520-0_23

  22. Hutter, F., Hoos, H.H., Leyton-Brown, K.: Sequential model-based optimization for general algorithm configuration. In: Coello, C.A.C. (ed.) LION 2011. LNCS, vol. 6683, pp. 507–523. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-25566-3_40

  23. Hutter, F., Hoos, H.H., Leyton-Brown, K., Stützle, T.: ParamILS: an automatic algorithm configuration framework. J. Artif. Intell. Res. 36, 267–306 (2009)

  24. Hutter, F., Lindauer, M., Balint, A., Bayless, S., Hoos, H., Leyton-Brown, K.: The configurable SAT solver challenge (CSSC). Artif. Intell. 243, 1–25 (2017)

  25. Hutter, F., Xu, L., Hoos, H.H., Leyton-Brown, K.: Algorithm runtime prediction: Methods & evaluation. Artif. Intell. 206, 79–111 (2014)

  26. Jansen, T.: Analysing stochastic search heuristics operating on a fixed budget. In: Theory of Evolutionary Computation. NCS, pp. 249–270. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-29414-4_5

  27. Lengler, J., Spooner, N.: Fixed budget performance of the (1+1) EA on linear functions. In: ACM Conference on Foundations of Genetic Algorithms, FOGA 2015, pp. 52–61. ACM (2015)

  28. Leyton-Brown, K., Nudelman, E., Shoham, Y.: Learning the empirical hardness of optimization problems: the case of combinatorial auctions. In: Van Hentenryck, P. (ed.) CP 2002. LNCS, vol. 2470, pp. 556–572. Springer, Heidelberg (2002). https://doi.org/10.1007/3-540-46135-3_37

  29. Liefooghe, A., Derbel, B., Verel, S., Aguirre, H., Tanaka, K.: Towards landscape-aware automatic algorithm configuration: preliminary experiments on neutral and rugged landscapes. In: Hu, B., López-Ibáñez, M. (eds.) EvoCOP 2017. LNCS, vol. 10197, pp. 215–232. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-55453-2_15

  30. López-Ibáñez, M., Dubois-Lacoste, J., Cáceres, L.P., Birattari, M., Stützle, T.: The irace package: iterated racing for automatic algorithm configuration. Oper. Res. Perspect. 3, 43–58 (2016)

  31. Mascia, F., Birattari, M., Stützle, T.: Tuning algorithms for tackling large instances: an experimental protocol. In: Nicosia, G., Pardalos, P. (eds.) LION 2013. LNCS, vol. 7997, pp. 410–422. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-44973-4_44

  32. Rai, A.: Explainable AI: from black box to glass box. J. Acad. Market. Sci. 48(1), 137–141 (2020)

  33. Snoek, J., Larochelle, H., Adams, R.P.: Practical Bayesian optimization of machine learning algorithms. In: Advances in Neural Information Processing Systems, pp. 2951–2959 (2012)

  34. Treude, C., Wagner, M.: Predicting good configurations for GitHub and Stack Overflow topic models. In: 16th International Conference on Mining Software Repositories, MSR 2019, pp. 84–95. IEEE (2019)

  35. Witt, C.: Tight bounds on the optimization time of a randomized search heuristic on linear functions. Comb. Probab. Comput. 22, 294–318 (2013)

  36. Xu, L., Hutter, F., Hoos, H.H., Leyton-Brown, K.: SATzilla: portfolio-based algorithm selection for SAT. J. Artif. Intell. Res. 32, 565–606 (2008)


Acknowledgements

M. Martins acknowledges CNPq (Brazil Government). M. Wagner acknowledges the ARC Discovery Early Career Researcher Award DE160100850. C. Doerr acknowledges support from the Paris Ile-de-France Region. Experiments were performed on the AAU’s CLAUDIA compute cloud platform.

Author information

Correspondence to Mohamed El Yafrani.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

El Yafrani, M., Scoczynski, M., Sung, I., Wagner, M., Doerr, C., Nielsen, P. (2021). MATE: A Model-Based Algorithm Tuning Engine. In: Zarges, C., Verel, S. (eds.) Evolutionary Computation in Combinatorial Optimization. EvoCOP 2021. Lecture Notes in Computer Science, vol. 12692. Springer, Cham. https://doi.org/10.1007/978-3-030-72904-2_4

  • DOI: https://doi.org/10.1007/978-3-030-72904-2_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-72903-5

  • Online ISBN: 978-3-030-72904-2
