ABSTRACT
Symbolic regression is a common problem in genetic programming (GP), but the syntactic search carried out by the standard GP algorithm often struggles to tune the learned expressions. Gradient-based optimizers, on the other hand, can efficiently tune parametric functions by exploring the search space locally. While there is a large body of research on combining evolutionary algorithms with local search (LS) strategies, few of these studies deal with GP. To get the best of both worlds, we propose embedding learnable parameters in GP programs and combining the standard GP evolutionary approach with a gradient-based refinement of the individuals using the Adam optimizer. We devise two different algorithms that differ in how these parameters are shared among the expression's operators, and report experimental results on a set of standard real-life application datasets. Our findings show that the proposed gradient-based LS approach can be effectively combined with GP to outperform the original algorithm.
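The refinement step described above can be sketched as follows. This is a minimal, illustrative example, not the authors' implementation: it assumes a single hypothetical GP individual f(x) = a·x + sin(b·x) whose embedded parameters (a, b) are tuned by a hand-rolled Adam update (Kingma & Ba, 2014) on the mean squared error; in the actual method every individual in the population would be refined this way between evolutionary steps.

```python
import math

# Hypothetical GP individual: f(x) = a * x + sin(b * x).
# (a, b) are the learnable parameters embedded in the expression tree.
def predict(params, x):
    a, b = params
    return a * x + math.sin(b * x)

def mse(params, xs, ys):
    return sum((predict(params, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def grad(params, xs, ys):
    # Analytic gradient of the MSE loss with respect to (a, b).
    a, b = params
    ga = gb = 0.0
    for x, y in zip(xs, ys):
        err = predict(params, x) - y
        ga += 2.0 * err * x / len(xs)
        gb += 2.0 * err * math.cos(b * x) * x / len(xs)
    return [ga, gb]

def adam_refine(params, xs, ys, steps=200, lr=0.1,
                beta1=0.9, beta2=0.999, eps=1e-8):
    # Standard Adam update applied to the tree's embedded parameters.
    m = [0.0] * len(params)  # first-moment estimates
    v = [0.0] * len(params)  # second-moment estimates
    for t in range(1, steps + 1):
        g = grad(params, xs, ys)
        for i in range(len(params)):
            m[i] = beta1 * m[i] + (1 - beta1) * g[i]
            v[i] = beta2 * v[i] + (1 - beta2) * g[i] ** 2
            m_hat = m[i] / (1 - beta1 ** t)   # bias correction
            v_hat = v[i] / (1 - beta2 ** t)
            params[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return params
```

For example, refining the individual from the initial guess (a, b) = (1.0, 1.0) against data generated with (2.5, 1.3) drives the training MSE down by orders of magnitude, while the syntactic structure of the expression is left untouched, which is exactly the division of labor between GP (structure) and gradient descent (parameters) exploited here.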
Supplemental material available for download.
Index Terms
- Parametrizing GP Trees for Better Symbolic Regression Performance through Gradient Descent.