
Towards Physical Plausibility in Neuroevolution Systems

  • Conference paper
  • Applications of Evolutionary Computation (EvoApplications 2024)

Abstract

The increasing use of Artificial Intelligence (AI) models, especially Deep Neural Networks (DNNs), is driving up power consumption during training and inference, raising environmental concerns and motivating more energy-efficient algorithms and hardware. This work addresses the growing energy consumption of Machine Learning (ML), particularly during the inference phase, where even a slight reduction in power usage can lead to significant energy savings for users, companies, and the environment. Our approach uses a neuroevolutionary framework to maximize the accuracy of Artificial Neural Network (ANN) models while minimizing their power consumption, which is incorporated into the fitness function. We introduce a new mutation strategy that stochastically reintroduces modules of layers, giving power-efficient modules a higher chance of being chosen. We also introduce a novel technique that trains two separate models in a single training step, promoting one of them to be more power efficient than the other while maintaining similar accuracy. The results demonstrate a reduction in the power consumption of ANN models of up to 29.2% without a significant decrease in predictive performance.
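The two core ideas of the abstract can be sketched in a few lines: a fitness function that rewards accuracy while penalizing measured power draw, and a roulette-wheel choice of layer modules in which power-efficient modules are proportionally more likely to be reintroduced by mutation. This is a minimal illustration, not the paper's exact formulation; the function names, the linear penalty, and the `power_weight` parameter are assumptions.

```python
import random

def fitness(accuracy, power_watts, power_weight=0.5):
    """Hypothetical power-aware fitness: higher accuracy is better,
    higher measured power draw is penalized."""
    return accuracy - power_weight * power_watts

def choose_module(modules):
    """Stochastically pick a layer module to reintroduce.

    `modules` is a list of (name, power_watts) pairs; selection
    probability is proportional to the inverse of each module's
    power draw, so efficient modules are chosen more often.
    """
    names = [name for name, _ in modules]
    weights = [1.0 / watts for _, watts in modules]
    return random.choices(names, weights=weights, k=1)[0]
```

For example, with a 1 W module and a 10 W module, the cheaper one is selected roughly ten times as often over many mutation events, which biases the evolved architectures toward lower inference power without hard-excluding costly modules.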



Acknowledgments

This work was supported by the Portuguese Recovery and Resilience Plan (PRR) through project C645008882-00000055, Center for Responsible AI, by the FCT, I.P./MCTES through national funds (PIDDAC), by Project No. 7059 - Neuraspace - AI fights Space Debris, reference C644877546-00000020, supported by the RRP - Recovery and Resilience Plan and the European Next Generation EU Funds, following Notice No. 02/C05-i01/2022, Component 5 - Capitalization and Business Innovation - Mobilizing Agendas for Business Innovation, and within the scope of CISUC R&D Unit - UIDB/00326/2020.

Author information

Correspondence to Gabriel Cortês.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Cortês, G., Lourenço, N., Machado, P. (2024). Towards Physical Plausibility in Neuroevolution Systems. In: Smith, S., Correia, J., Cintrano, C. (eds) Applications of Evolutionary Computation. EvoApplications 2024. Lecture Notes in Computer Science, vol 14635. Springer, Cham. https://doi.org/10.1007/978-3-031-56855-8_5


  • DOI: https://doi.org/10.1007/978-3-031-56855-8_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-56854-1

  • Online ISBN: 978-3-031-56855-8

  • eBook Packages: Computer Science, Computer Science (R0)
