DOI: 10.1145/3583133.3596399
GECCO Conference Proceedings · Research Article · Open Access

SPENSER: Towards a NeuroEvolutionary Approach for Convolutional Spiking Neural Networks

Published: 24 July 2023

ABSTRACT

Spiking Neural Networks (SNNs) have attracted recent interest due to their energy efficiency and biological plausibility. However, the performance of SNNs still lags behind traditional Artificial Neural Networks (ANNs), as there is no consensus on the best learning algorithm for SNNs. The best-performing SNNs are based on ANN-to-SNN conversion or on spike-based backpropagation through surrogate gradients. The focus of recent research has been on developing and testing different learning strategies, with hand-tailored architectures and parameter tuning. Neuroevolution (NE) has proven successful as a way to automatically design ANNs and tune parameters, but its application to SNNs is still at an early stage. DENSER is a NE framework for the automatic design and parametrization of ANNs, based on the principles of Genetic Algorithms (GA) and Structured Grammatical Evolution (SGE). In this paper, we propose SPENSER, a NE framework for SNN generation based on DENSER, for image classification on the MNIST and Fashion-MNIST datasets. SPENSER generates competitively performing networks, reaching test accuracies of 99.42% on MNIST and 91.65% on Fashion-MNIST.
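To give a concrete feel for the SGE-based genotype-to-phenotype mapping that DENSER-style frameworks build on, the sketch below maps per-nonterminal lists of integer codons through a toy grammar into a layer sequence. The grammar, nonterminal names, and layer options here are illustrative assumptions for this sketch, not SPENSER's actual grammar.

```python
# Toy sketch of an SGE-style genotype-to-phenotype mapping.
# The grammar and codon values are illustrative assumptions,
# not the grammar used by DENSER or SPENSER.

GRAMMAR = {
    "<net>": [["<conv>", "<net>"], ["<dense>"]],   # recurse or terminate
    "<conv>": [["conv3x3"], ["conv5x5"], ["pool2x2"]],
    "<dense>": [["dense-softmax"]],
}

def expand(symbol, genotype, counters):
    """Recursively expand `symbol`; SGE keeps a separate codon list
    per nonterminal, consumed left-to-right as that symbol recurs."""
    if symbol not in GRAMMAR:                      # terminal: emit as-is
        return [symbol]
    options = GRAMMAR[symbol]
    idx = counters.setdefault(symbol, 0)           # next codon for this symbol
    codons = genotype[symbol]
    choice = codons[idx % len(codons)] % len(options)
    counters[symbol] = idx + 1
    layers = []
    for sym in options[choice]:
        layers.extend(expand(sym, genotype, counters))
    return layers

# One codon list per nonterminal (the "structured" part of SGE).
genotype = {"<net>": [0, 0, 1], "<conv>": [0, 2], "<dense>": [0]}
print(expand("<net>", genotype, {}))
```

In a full NE loop, a GA would mutate and recombine these codon lists, while the grammar guarantees that every genotype decodes to a syntactically valid architecture.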


Published in

GECCO '23 Companion: Proceedings of the Companion Conference on Genetic and Evolutionary Computation
July 2023, 2519 pages
ISBN: 9798400701207
DOI: 10.1145/3583133
Copyright © 2023 Owner/Author(s). This work is licensed under a Creative Commons Attribution International 4.0 License.
Publisher: Association for Computing Machinery, New York, NY, United States
Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%
