Research Article
DOI: 10.1145/3583133.3596332

Pretraining Reduces Runtime in Denoising Autoencoder Genetic Programming by an Order of Magnitude

Published: 24 July 2023

ABSTRACT

Denoising autoencoder genetic programming (DAE-GP) is an estimation of distribution genetic programming (EDA-GP) algorithm. It uses denoising autoencoder long short-term memory (LSTM) networks as a probabilistic model to replace the standard mutation and recombination operators of genetic programming (GP). Recent work has shown several advantages of DAE-GP over GP regarding solution length and overall performance. However, training a neural network at each generation is computationally expensive, making model training the most time-consuming part of DAE-GP. In this work, we propose pretraining to reduce the runtime of DAE-GP: the neural network is trained before the evolutionary search starts. In experiments on eight real-world symbolic regression tasks, we find that DAE-GP with pretraining reduces overall runtime by an order of magnitude while generating individuals of similar or better fitness.
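To make the mechanism concrete, the following is a minimal sketch of the pretraining step, assuming a Keras-style denoising autoencoder LSTM trained on linearized GP trees. All names and sizes here (VOCAB_SIZE, SEQ_LEN, the corrupt function, the random stand-in data) are illustrative assumptions, not the authors' implementation.

```python
# Minimal, illustrative sketch (assumptions throughout): GP trees are assumed
# to be linearized into fixed-length token sequences; all sizes and names
# (VOCAB_SIZE, SEQ_LEN, corrupt, ...) are hypothetical placeholders.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

VOCAB_SIZE, SEQ_LEN, EMBED_DIM, HIDDEN = 32, 20, 16, 64

def corrupt(tokens, rate=0.2, seed=0):
    """Denoising corruption: randomly replace a fraction of tokens."""
    rng = np.random.default_rng(seed)
    noisy = tokens.copy()
    mask = rng.random(tokens.shape) < rate
    noisy[mask] = rng.integers(1, VOCAB_SIZE, size=int(mask.sum()))
    return noisy

# Encoder-decoder LSTM that reconstructs the clean sequence from a noisy one.
inputs = keras.Input(shape=(SEQ_LEN,), dtype="int32")
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM)(inputs)
x = layers.LSTM(HIDDEN, return_sequences=True)(x)  # encoder
x = layers.LSTM(HIDDEN, return_sequences=True)(x)  # decoder
outputs = layers.Dense(VOCAB_SIZE, activation="softmax")(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Pretraining: fit the model once, before the evolutionary search starts.
# Random token sequences stand in for linearized GP trees here.
pretrain_data = np.random.default_rng(1).integers(1, VOCAB_SIZE,
                                                  size=(1000, SEQ_LEN))
model.fit(corrupt(pretrain_data), pretrain_data,
          epochs=5, batch_size=64, verbose=0)

# During the evolutionary run, the pretrained weights are reused each
# generation rather than training a new network from scratch.
```

The key design point in this sketch is the placement of model.fit: training happens before the generational loop, so the pretrained weights can be reused across generations instead of training a fresh network at every generation, which the paper identifies as the dominant cost of DAE-GP.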


Published in:
GECCO '23 Companion: Proceedings of the Companion Conference on Genetic and Evolutionary Computation
July 2023, 2519 pages
ISBN: 9798400701207
DOI: 10.1145/3583133
Copyright © 2023 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


