Abstract
The No Free Lunch (NFL) theorem for search and optimisation states that averaged across all possible objective functions on a fixed search space, all search algorithms perform equally well. Several refined versions of the theorem find a similar outcome when averaging across smaller sets of functions. This paper argues that NFL results continue to be misunderstood by many researchers, and addresses this issue in several ways. Existing arguments against real-world implications of NFL results are collected and re-stated for accessibility and new ones are added. Specific misunderstandings extant in the literature are identified, with speculation as to how they may have arisen. This paper presents an argument against a common paraphrase of NFL findings—that algorithms must be specialised to problem domains to do well—after problematising the usually undefined term “domain”. It provides novel concrete counter-examples illustrating cases where NFL theorems do not apply. In conclusion, it offers a novel view of the real meaning of NFL, incorporating the anthropic principle and justifying the position that in many common situations researchers can ignore NFL.
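The NFL claim summarised above can be checked exhaustively on a toy search space. The sketch below (illustrative only; the helper name `avg_best_trace` is ad hoc, not from the paper) enumerates every objective function f: {0, 1, 2} → {0, 1, 2} and shows that two different non-revisiting search orders achieve identical performance, at every step, when averaged over all 27 functions.

```python
from itertools import product

X = [0, 1, 2]  # tiny search space
Y = [0, 1, 2]  # possible objective values

def avg_best_trace(order):
    """Average of 'best value seen after m evaluations' over ALL
    functions f: X -> Y, for a fixed non-revisiting visit order."""
    traces = []
    for values in product(Y, repeat=len(X)):  # all 27 functions
        f = dict(zip(X, values))
        best, trace = -1, []
        for x in order:
            best = max(best, f[x])
            trace.append(best)
        traces.append(trace)
    n = len(traces)
    return [sum(t[m] for t in traces) / n for m in range(len(X))]

# Two different deterministic, non-revisiting "search algorithms":
# averaged over all functions, their performance traces coincide.
print(avg_best_trace([0, 1, 2]))
print(avg_best_trace([2, 0, 1]))
```

Any other non-revisiting order gives the same averaged trace, since averaging over all functions makes the values seen at each step exchangeable; this is the NFL symmetry in miniature.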
Change history
28 September 2023
A Correction to this paper has been published: https://doi.org/10.1007/s42979-023-02168-3
Notes
http://scholar.google.com/scholar?q=’no+free+lunch’+wolpert+macready. Accessed 20 February 2018.
A statement of practical consequences of more recent NFL variants; several corrections of NFL misunderstandings; problematising the term “problem domain”; argument that existing generic algorithms are already specialised; fitness distance correlation and modularity as escapes from NFL; concrete NFL counter-examples in several domains; and introduction of the anthropic principle as a justification for the position that in many common situations researchers can ignore NFL.
Due to an anonymous reviewer.
References
Wolpert DH, Macready WG. No free lunch theorems for search. Technical report SFI-TR-95-02-010, Santa Fe Institute. 1995.
Wolpert DH, Macready WG. No free lunch theorems for optimization. Trans Evol Comput. 1997;1(1):67–82.
Ho YC, Pepyne DL. Simple explanation of the no-free-lunch theorem and its implications. J Optim Theory Appl. 2002;115(3):549–70. https://doi.org/10.1023/A:1021251113462 (ISSN 0022-3239).
Häggström O. Uniform distribution is a model assumption. Unpublished. 2007. http://www.math.chalmers.se/~olleh/papers.html. Accessed 11 Mar 2018.
Aaronson S. Quickies. In: Shtetl-optimized: the blog of Scott Aaronson. 2017. https://www.scottaaronson.com/blog/?p=3553. Accessed Dec 2017.
Hutter M. A complete theory of everything (will be subjective). Algorithms. 2010;3(4):329–50.
Wolpert DH. The lack of a priori distinctions between learning algorithms. Neural Comput. 1996;8(7):1341–90.
Corne DW, Knowles JD. No free lunch and free leftovers theorems for multiobjective optimisation problems. In: Evolutionary multi-criterion optimization. Springer; 2003. p. 327–341.
Whitley D, Watson JP. Complexity theory and the no free lunch theorem. In: Search methodologies, chapter 10. Springer; 2005. p. 317–339.
Joyce T, Herrmann JM. A review of no free lunch theorems, and their implications for metaheuristic optimisation. In: Nature-inspired algorithms and applied optimization. Springer; 2018. p. 27–51.
Poli R, Graff M. There is a free lunch for hyper-heuristics, genetic programming and computer scientists. In: Genetic programming. Springer; 2009. p. 195–207.
Droste S, Jansen T, Wegener I. Optimization with randomized search heuristics-the (A)NFL theorem, realistic scenarios, and difficult functions. Theor Comput Sci. 2002a;287(1):131–44.
Köppen M, Wolpert DH, Macready WG. Remarks on a recent paper on the “no free lunch” theorems. Trans Evol Comput. 2001;5(3):295–6.
Oltean M. Searching for a practical evidence of the no free lunch theorems. In: Biologically inspired approaches to advanced information technology. Springer; 2004. p. 472–483.
Radcliffe NJ, Surry PD. Fundamental limitations on search algorithms: evolutionary computing in perspective. In: Computer science today. Springer; 1995. p. 275–291.
Wolpert DH. What the no free lunch theorems really mean; how to improve search algorithms. In: Ubiquity symposium. 2012. http://www.santafe.edu/media/workingpapers/12-10-017.pdf. http://ubiquity.acm.org/symposia.cfm
Droste S, Jansen T, Wegener I. Perhaps not a free lunch but at least a free appetizer. In: Proceedings of the genetic and evolutionary computation conference (GECCO’99). Morgan Kaufmann; 1999. p. 833–839.
Bengio Y, LeCun Y. Scaling learning algorithms towards AI. In: Bottou L, Chapelle O, DeCoste D, Weston J, editors. Large-scale kernel machines. Cambridge: MIT Press; 2007.
Schumacher C, Vose MD, Whitley LD. The no free lunch and problem description length. In: Proceedings of the genetic and evolutionary computation conference (GECCO). 2001. p. 565–570.
Igel C, Toussaint M. A no-free-lunch theorem for non-uniform distributions of target functions. J Math Model Algorithm. 2004;3(4):313–22.
Rowe JE, Vose MD, Wright AH. Reinterpreting no free lunch. Evol Comput. 2009;17(1):117–29.
Igel C, Toussaint M. On classes of functions for which no free lunch results hold. 2001. arXiv preprint arXiv:cs/0108011.
Koehler GJ. Conditions that obviate the no-free-lunch theorems for optimization. Inform J Comput. 2007;19(2):273–9.
Wegener I. Computational complexity and evolutionary computation. GECCO tutorial. 2004.
Streeter MJ. Two broad classes of functions for which a no free lunch result does not hold. In: Genetic and evolutionary computation (GECCO). Springer; 2003. p. 1418–1430.
English T. On the structure of sequential search: beyond “no free lunch”. In: European conference on evolutionary computation in combinatorial optimization. Springer; 2004a. p. 95–103.
English T. No more lunch: analysis of sequential search. In: Evolutionary computation, 2004. CEC2004. Congress on, vol. 1. IEEE; 2004b. p. 227–234.
Neil J, Woodward J. The universal distribution and a free lunch for program induction. Unpublished manuscript, date unknown; cited by English (2004).
Whitley D, Rowe J. Focused no free lunch theorems. In: Proceedings of the 10th annual conference on genetic and evolutionary computation. ACM; 2008. p. 811–818.
Auger A, Teytaud O. Continuous lunches are free plus the design of optimal optimization algorithms. Algorithmica. 2010;57(1):121–46.
Alabert A, Berti A, Caballero R, Ferrante M. No-free-lunch theorems in the continuum. Theor Comput Sci. 2015;600:98–106.
Culberson JC. On the futility of blind search: an algorithmic view of “no free lunch”. Evol Comput. 1998;6(2):109–27.
Serafino L. No free lunch theorem and Bayesian probability theory: two sides of the same coin. Some implications for black-box optimization and metaheuristics. 2013. arXiv preprint arXiv:1311.6041.
Woodward JR, Neil JR. No free lunch, program induction and combinatorial problems. In: European conference on genetic programming. Springer; 2003. p. 475–484.
Duéñez Guzmán EA, Vose MD. No free lunch and benchmarks. Evol Comput. 2013;21(2):293–312. https://doi.org/10.1162/EVCO_a_00077.
Wolpert DH, Macready WG. Coevolutionary free lunches. IEEE Trans Evol Comput. 2005;9(6):721–35.
Goldberg D. Genetic algorithms in search, optimization and machine learning. Reading: Addison-Wesley; 1989.
Poli R, Graff M, McPhee NF. Free lunches for function and program induction. In: Proceedings of FOGA. ACM; 2009. p. 183–194. http://lsc.fie.umich.mx/~mgraffg/pdf/foga2009.pdf
Sewell M, Shawe-Taylor J. Forecasting foreign exchange rates using kernel methods. Expert Syst Appl. 2012;39(9):7652–62.
Lipson H. How does Eureqa compare to other machine learning methods? http://blog.nutonian.com/bid-330675-how-does-eureqa-compare-to-other-machine-learning-methods/, August 2013. Accessed 1 Mar 2018.
Watson J-P, Barbulescu L, Howe AE, Whitley LD. Algorithm performance and problem structure for flow-shop scheduling. In: AAAI/IAAI; 1999. p. 688–695.
Lam AYS, Li VOK. Chemical-reaction-inspired metaheuristic for optimization. IEEE Trans Evol Comput. 2010;14(3):381–99.
Sörensen K. Metaheuristics-the metaphor exposed. Int Trans Oper Res. 2015;22(1):3–18.
Weyland D. A critical analysis of the harmony search algorithm-how not to solve sudoku. Oper Res Perspect. 2015;2:97–105.
Al-Betar MA, Khader AT. A harmony search algorithm for university course timetabling. Ann Oper Res. 2012;194(1):3–31.
Ouaarab A, Ahiod B, Yang X-S. Discrete cuckoo search algorithm for the travelling salesman problem. Neural Comput Appl. 2014;24(7–8):1659–69.
Chawda BV, Patel JM. Investigating performance of various natural computing algorithms. Int J Intell Syst Appl. 2017;9(1):46–59.
Beheshti Z, Shamsuddin SMH. CAPSO: centripetal accelerated particle swarm optimization. Inform Sci. 2014;258:54–79.
Koza JR. Genetic programming: on the programming of computers by means of natural selection. Cambridge: MIT Press; 1992.
Ciuffo B, Punzo V. “No free lunch” theorems applied to the calibration of traffic simulation models. IEEE Trans Intell Transp Syst. 2014;15(2):553–62.
Vrugt JA, Robinson BA. Improved evolutionary optimization from genetically adaptive multimethod search. PNAS. 2007;104(3):708–11. https://doi.org/10.1073/pnas.0610471104 (ISSN 0027-8424).
English TM. Evaluation of evolutionary and genetic optimizers: no free lunch. In: Fogel LJ, Angeline PJ, Bäck T, editors. Evolutionary programming V. Cambridge: MIT Press; 1996. p. 163–9.
Smith-Miles KA. Cross-disciplinary perspectives on meta-learning for algorithm selection. ACM Comput Surv. (CSUR). 2009;41(1):6.
Rice JR. The algorithm selection problem. In: Advances in computers, vol. 15. Elsevier; 1976. p. 65–118.
Yuen SY, Zhang X. On composing an algorithm portfolio. Memetic Comput. 2015;7(3):203–14.
Loshchilov I, Glasmachers T. Doesn’t the NFL theorem show that black box optimization is flawed? https://bbcomp.ini.rub.de/faq.html#q20, 2015. Accessed 7 Mar 2018.
Krawiec K, Wieloch B. Analysis of semantic modularity for genetic programming. Found Comput Decis Sci. 2009;34(4):265.
Christensen S, Oppacher F. What can we learn from no free lunch? A first attempt to characterize the concept of a searchable function. In: Proceedings of the 3rd annual conference on genetic and evolutionary computation. Morgan Kaufmann Publishers Inc.; 2001. p. 1219–1226.
Droste S, Jansen T, Wegener I. On the analysis of the (1+1) evolutionary algorithm. Theor Comput Sci. 2002b;276(1):51–81.
Serafino L. Optimizing without derivatives: what does the no free lunch theorem actually say? Notices of the AMS. 2014;61(7).
Jiang P, Chen Y-P. Free lunches on the discrete Lipschitz class. Theor Comput Sci. 2011;412(17):1614–28.
Jones T, Forrest S. Fitness distance correlation as a measure of problem difficulty for genetic algorithms. In: Proceedings of the 6th international conference on genetic algorithms. San Francisco: Morgan Kaufmann Publishers Inc.; 1995. p. 184–192 (ISBN 1-55860-370-0).
Kimbrough SO, Koehler GJ, Lu M, Wood DH. On a feasible-infeasible two-population (FI-2Pop) genetic algorithm for constrained optimization: distance tracing and no free lunch. Eur J Oper Res. 2008;190(2):310–27.
Whitley D, Rowe J. A “no free lunch” tutorial: Sharpened and focused no free lunch. In: Theory of randomized search heuristics: foundations and recent developments. World Scientific; 2011. p. 255–287.
Schaffer C. A conservation law for generalization performance. In: Machine learning proceedings 1994. Elsevier; 1994. p. 259–265.
Domingos P. A few useful things to know about machine learning. Commun ACM. 2012;55(10):78–87.
Murphy K. Machine learning: a probabilistic perspective. Cambridge: MIT Press; 2012.
Hume D. A treatise of human nature. Oxford: Oxford University Press; 1973 (first published 1740).
Lin HW, Tegmark M, Rolnick D. Why does deep and cheap learning work so well? J Stat Phys. 2017;168(6):1223–1247. arxiv.org/abs/1608.08225
Carter B. Large number coincidences and the anthropic principle in cosmology. In: Symposium-International Astronomical Union, vol. 63. Cambridge University Press; 1974. p. 291–298.
Obolski U, Ram Y, Hadany L. Key issues review: evolution on rugged adaptive landscapes. Rep Prog Phys. 2017;81(1):012602.
Mendes R, Kennedy J, Neves J. The fully informed particle swarm: simpler, maybe better. IEEE Trans Evol Comput. 2004;8(3):204–10.
Solomonoff RJ. A formal theory of inductive inference. Part I. Inf Control. 1964;7(1):1–22.
Neri F, Cotta C. A primer on memetic algorithms. In: Handbook of memetic algorithms. Springer; 2012. p. 43–52.
Bonissone PP, Subbu R, Eklund N, Kiehl TR. Evolutionary algorithms + domain knowledge = real-world evolutionary computation. Trans Evol Comput. 2006;10(3):256–80.
Acknowledgements
The author thanks the reviewers for suggesting some significant improvements. This work was carried out while the author was at University College Dublin.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
“When and Why Metaheuristics Researchers can Ignore ‘No Free Lunch’ Theorems” was initially published in Metaheuristics, a journal that was closed without publishing an issue. Consequently, the article has been re-published in SN Computer Science.
About this article
Cite this article
McDermott, J. When and Why Metaheuristics Researchers can Ignore “No Free Lunch” Theorems. SN COMPUT. SCI. 1, 60 (2020). https://doi.org/10.1007/s42979-020-0063-3