GPAM: Genetic Programming with Associative Memory

  • Conference paper
Genetic Programming (EuroGP 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13986)


Abstract

We focus on the evolutionary design of programs capable of capturing more randomness and outliers in the input data set than the standard genetic programming (GP)-based methods typically allow. We propose Genetic Programming with Associative Memory (GPAM) – a GP-based system for symbolic regression which can utilize a small associative memory to store various data points to better approximate the original data set. The method is evaluated on five standard benchmarks in which a certain number of data points is replaced by randomly generated values. In another case study, GPAM is used as an on-chip generator capable of approximating the weights for a convolutional neural network (CNN) to reduce accesses to an external weight memory. Using Cartesian genetic programming (CGP), we evolved expression-memory pairs that can generate weights of a single CNN layer. If the associative memory contains 10% of the original weights, the weight generator evolved for a convolutional layer can approximate the original weights such that the CNN utilizing the generated weights shows less than a 1% drop in the classification accuracy on the MNIST data set.
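The abstract describes the mechanism only at a high level. The sketch below is a minimal, illustrative reading of the idea in plain Python: an evolved expression approximates the bulk of the data, while a small associative memory stores a limited number of exact input-output pairs (for example outliers, or selected original weights in the CNN case study) that override the expression's output. All names, the capacity, and the example expression are assumptions made for illustration and do not reproduce the authors' CGP-based implementation.

    # Illustrative sketch of the GPAM idea (not the authors' implementation).
    import math

    class AssociativeMemory:
        """Small key-value store mapping selected inputs to their exact outputs."""
        def __init__(self, capacity):
            self.capacity = capacity
            self.entries = {}

        def store(self, x, y):
            # Only a limited number of points can be memorized.
            if len(self.entries) < self.capacity:
                self.entries[x] = y

        def lookup(self, x):
            return self.entries.get(x)  # None if x was not memorized

    def evolved_expression(x):
        # Stand-in for a candidate program produced by (Cartesian) GP.
        return 0.5 * x * x + math.sin(x)

    def gpam_predict(x, memory):
        """Return the memorized value if available, otherwise evaluate the expression."""
        remembered = memory.lookup(x)
        return remembered if remembered is not None else evolved_expression(x)

    # Example: memorize a "hard" point (e.g. an outlier) and query the model.
    memory = AssociativeMemory(capacity=8)
    memory.store(2.0, 100.0)          # an outlier the expression cannot capture
    print(gpam_predict(2.0, memory))  # -> 100.0 (taken from the memory)
    print(gpam_predict(3.0, memory))  # -> value of the evolved expression

In the weight-generator case study, the same pattern would apply with the memory holding roughly 10% of a layer's original weights and the evolved expression generating the remaining ones.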

Notes

  1. Data sets are available at https://doi.org/10.5281/zenodo.7583555.

Acknowledgements

This work was supported by the Czech Science Foundation project 21-13001S and was partly carried out under the COST Action CA19135 (CERCIRAS).

Author information

Correspondence to Lukas Sekanina.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Juza, T., Sekanina, L. (2023). GPAM: Genetic Programming with Associative Memory. In: Pappa, G., Giacobini, M., Vasicek, Z. (eds) Genetic Programming. EuroGP 2023. Lecture Notes in Computer Science, vol 13986. Springer, Cham. https://doi.org/10.1007/978-3-031-29573-7_5

  • DOI: https://doi.org/10.1007/978-3-031-29573-7_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-29572-0

  • Online ISBN: 978-3-031-29573-7

  • eBook Packages: Computer Science (R0)
