
Incremental Evolution and Development of Deep Artificial Neural Networks

  • Conference paper
  • In: Genetic Programming (EuroGP 2020)

Abstract

NeuroEvolution (NE) methods apply Evolutionary Computation to the optimisation of Artificial Neural Networks (ANNs). Despite aiding non-expert users to design and train ANNs, the vast majority of NE approaches disregard the knowledge gathered when solving other tasks, i.e., evolution starts from scratch for each problem, ultimately delaying the evolutionary process. To overcome this drawback, we extend Fast Deep Evolutionary Network Structured Representation (Fast-DENSER) to incremental development. We hypothesise that by transferring the knowledge gained from previous tasks we can attain superior results and speed up evolution. The results show that the average performance of the models generated by incremental development is statistically superior to the non-incremental average performance. When incremental development performs fewer evaluations than non-incremental development, the attained results are similar in performance, which indicates that incremental development speeds up evolution. Lastly, the models generated using incremental development generalise better and thus, without further evolution, report superior performance on unseen problems.
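The abstract describes incremental development only at a high level. As a purely illustrative sketch (not Fast-DENSER's actual encoding, operators, or evaluation, and with all names hypothetical), the toy Python loop below shows the underlying idea: instead of restarting evolution from scratch for every task, the champion evolved on one task seeds the initial population for the next.

    import random

    # Toy illustration only: an "individual" is a list of layer widths and the
    # fitness is a cheap surrogate. A real NE system such as Fast-DENSER would
    # train and validate an ANN here; none of these names come from the paper.

    def random_individual(rng, max_layers=5):
        return [rng.randint(4, 128) for _ in range(rng.randint(1, max_layers))]

    def mutate(ind, rng):
        child = list(ind)
        i = rng.randrange(len(child))
        child[i] = max(4, child[i] + rng.randint(-16, 16))
        if rng.random() < 0.2:                      # occasionally add a layer
            child.append(rng.randint(4, 128))
        return child

    def fitness(ind, task_target):
        # Surrogate score: prefer architectures whose total width matches the
        # task's assumed "ideal" capacity (stands in for validation accuracy).
        return -abs(sum(ind) - task_target)

    def evolve(task_target, generations, pop_size, rng, seed_pop=None):
        # Non-incremental runs start from scratch; incremental runs reuse seeds.
        pop = list(seed_pop or [])
        while len(pop) < pop_size:
            pop.append(random_individual(rng))
        for _ in range(generations):
            pop.sort(key=lambda ind: fitness(ind, task_target), reverse=True)
            elite = pop[: pop_size // 4]            # keep the best quarter
            pop = elite + [mutate(rng.choice(elite), rng)
                           for _ in range(pop_size - len(elite))]
        return max(pop, key=lambda ind: fitness(ind, task_target))

    rng = random.Random(0)
    tasks = [200, 260, 320]                         # a sequence of related tasks

    # Incremental development: the champion of each task seeds the next one.
    seeds = []
    for target in tasks:
        best = evolve(target, generations=30, pop_size=20, rng=rng, seed_pop=seeds)
        seeds = [best]                              # transfer knowledge forward
        print(target, best, fitness(best, target))

A non-incremental run corresponds to calling evolve with an empty seed population for every task; the incremental variant carries the previous champion forward, which is the contrast the abstract draws.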



Acknowledgments

This work is partially funded by Fundação para a Ciência e Tecnologia (FCT), Portugal, under the PhD grant agreement SFRH/BD/114865/2016 and the project grant DSAIPA/DS/0022/2018 (GADgET), and is based upon work from COST Action CA15140: ImAppNIO, supported by COST (European Cooperation in Science and Technology): www.cost.eu.

Author information


Corresponding author

Correspondence to Filipe Assunção.



Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Assunção, F., Lourenço, N., Ribeiro, B., Machado, P. (2020). Incremental Evolution and Development of Deep Artificial Neural Networks. In: Hu, T., Lourenço, N., Medvet, E., Divina, F. (eds) Genetic Programming. EuroGP 2020. Lecture Notes in Computer Science, vol 12101. Springer, Cham. https://doi.org/10.1007/978-3-030-44094-7_3


  • DOI: https://doi.org/10.1007/978-3-030-44094-7_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-44093-0

  • Online ISBN: 978-3-030-44094-7

  • eBook Packages: Computer Science (R0)
