ABSTRACT
Spiking Neural Networks (SNNs) have attracted recent interest due to their energy efficiency and biological plausibility. However, the performance of SNNs still lags behind that of traditional Artificial Neural Networks (ANNs), as there is no consensus on the best learning algorithm for SNNs. The best-performing SNNs are based on ANN-to-SNN conversion or on spike-based backpropagation with surrogate gradients. Recent research has focused on developing and testing different learning strategies, with hand-tailored architectures and parameter tuning. Neuroevolution (NE) has proven successful as a way to automatically design ANNs and tune their parameters, but its application to SNNs is still at an early stage. DENSER is an NE framework for the automatic design and parametrization of ANNs, based on the principles of Genetic Algorithms (GA) and Structured Grammatical Evolution (SGE). In this paper, we propose SPENSER, an NE framework based on DENSER for the generation of SNNs for image classification on the MNIST and Fashion-MNIST datasets. SPENSER generates networks with competitive performance, reaching test accuracies of 99.42% on MNIST and 91.65% on Fashion-MNIST.
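To make the abstract's mention of spike-based backpropagation with surrogate gradients concrete, the following is a minimal illustrative sketch, not the authors' implementation: a discrete leaky integrate-and-fire (LIF) update in PyTorch where the non-differentiable spike is replaced by a fast-sigmoid surrogate gradient in the backward pass. The neuron model, the soft-reset rule, and the parameter values (slope, decay factor beta, threshold) are assumptions chosen for illustration.

```python
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; fast-sigmoid surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v):
        # v is the membrane potential minus the firing threshold.
        ctx.save_for_backward(v)
        return (v >= 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        slope = 25.0  # illustrative surrogate slope (an assumption)
        # Replace the Heaviside's zero/undefined derivative with a smooth approximation.
        return grad_output * slope / (slope * v.abs() + 1.0) ** 2


def lif_step(x, mem, beta=0.9, threshold=1.0):
    """One discrete leaky integrate-and-fire update with soft reset (illustrative values)."""
    mem = beta * mem + x                         # leak the previous potential and integrate the input
    spk = SurrogateSpike.apply(mem - threshold)  # emit a spike wherever the threshold is crossed
    mem = mem - spk * threshold                  # soft reset: subtract the threshold where a spike fired
    return spk, mem


if __name__ == "__main__":
    x = torch.randn(8, 10, requires_grad=True)   # toy input currents for a single time step
    spk, mem = lif_step(x, torch.zeros(8, 10))
    spk.sum().backward()                         # gradients flow through the surrogate, not the Heaviside
    print(spk.shape, x.grad.abs().mean())
```

Unrolling such a step over time and backpropagating through the sequence corresponds to the backpropagation-through-time training of SNNs referred to above; the evolved architectures in this paper are then trained with this kind of spike-based gradient descent.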
REFERENCES
- Filipe Assunção, Nuno Lourenço, Penousal Machado, and Bernardete Ribeiro. 2019. DENSER: deep evolutionary network structured representation. Genetic Programming and Evolvable Machines 20, 1 (2019), 5--35.
- Filipe Assunção, Nuno Lourenço, Bernardete Ribeiro, and Penousal Machado. 2021. Fast-DENSER: Fast Deep Evolutionary Network Structured Representation. SoftwareX 14 (2021), 100694.
- Thomas Bäck and Hans-Paul Schwefel. 1993. An overview of evolutionary algorithms for parameter optimization. Evolutionary Computation 1, 1 (1993), 1--23.
- Alejandro Baldominos, Yago Saez, and Pedro Isasi. 2020. On the automated, evolutionary design of neural networks: past, present, and future. Neural Computing and Applications 32 (2020), 519--545.
- Sander M Bohte, Joost N Kok, and Han La Poutre. 2002. Error-backpropagation in temporally encoded networks of spiking neurons. Neurocomputing 48, 1--4 (2002), 17--37.
- Xiang Cheng, Yunzhe Hao, Jiaming Xu, and Bo Xu. 2021. LISNN: improving spiking neural networks with lateral interactions for robust object recognition. In Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence (IJCAI). 1519--1525.
- Shikuang Deng and Shi Gu. 2021. Optimal Conversion of Conventional Artificial Neural Networks to Spiking Neural Networks. In International Conference on Learning Representations.
- Peter U Diehl and Matthew Cook. 2015. Unsupervised learning of digit recognition using spike-timing-dependent plasticity. Frontiers in Computational Neuroscience 9 (2015), 99.
- Sangya Dutta, Vinay Kumar, Aditya Shukla, Nihar R Mohapatra, and Udayan Ganguly. 2017. Leaky integrate and fire neuron by charge-discharge dynamics in floating-body MOSFET. Scientific Reports 7, 1 (2017), 8257.
- Daniel Elbrecht and Catherine Schuman. 2020. Neuroevolution of Spiking Neural Networks Using Compositional Pattern Producing Networks. In International Conference on Neuromorphic Systems 2020 (ICONS 2020). Association for Computing Machinery, New York, NY, USA, 1--5.
- Jason K. Eshraghian, Max Ward, Emre Neftci, Xinxin Wang, Gregor Lenz, Girish Dwivedi, Mohammed Bennamoun, Doo Seok Jeong, and Wei D. Lu. 2022. Training Spiking Neural Networks Using Lessons From Deep Learning. arXiv:2109.12894 [cs].
- Wei Fang, Zhaofei Yu, Yanqi Chen, Timothée Masquelier, Tiejun Huang, and Yonghong Tian. 2021. Incorporating learnable membrane time constant to enhance learning of spiking neural networks. In Proceedings of the IEEE/CVF International Conference on Computer Vision. 2661--2671.
- A. L. Hodgkin and A. F. Huxley. 1952. A quantitative description of membrane current and its application to conduction and excitation in nerve. The Journal of Physiology 117, 4 (Aug. 1952), 500--544.
- David H Hubel and Torsten N Wiesel. 1962. Receptive fields, binocular interaction and functional architecture in the cat's visual cortex. The Journal of Physiology 160, 1 (1962), 106.
- Dongsung Huh and Terrence J Sejnowski. 2018. Gradient descent for spiking neural networks. Advances in Neural Information Processing Systems 31 (2018).
- Eric Hunsberger and Chris Eliasmith. 2015. Spiking deep networks with LIF neurons. arXiv preprint arXiv:1510.08829 (2015).
- Eugene M Izhikevich. 2003. Simple model of spiking neurons. IEEE Transactions on Neural Networks 14, 6 (2003), 1569--1572.
- Chunming Jiang and Yilei Zhang. 2023. KLIF: An optimized spiking neuron unit for tuning surrogate gradient slope and membrane potential. arXiv preprint arXiv:2302.09238 (2023).
- Youngeun Kim, Yuhang Li, Hyoungseob Park, Yeshwanth Venkatesha, and Priyadarshini Panda. 2022. Neural Architecture Search for Spiking Neural Networks. arXiv:2201.10355 [cs, eess].
- Katarzyna Kozdon and Peter Bentley. 2018. The Evolution of Training Parameters for Spiking Neural Networks with Hebbian Learning. In ALIFE 2018: The 2018 Conference on Artificial Life. MIT Press, 276--283.
- Alex Krizhevsky, Geoffrey Hinton, et al. 2009. Learning multiple layers of features from tiny images. (2009).
- Louis Lapicque. 1907. Recherches quantitatives sur l'excitation électrique des nerfs traitée comme une polarisation. Journal de Physiologie et de Pathologie Générale 9 (1907), 620--635.
- Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. 2015. Deep learning. Nature 521, 7553 (2015), 436--444.
- Yann LeCun, Léon Bottou, Yoshua Bengio, and Patrick Haffner. 1998. Gradient-based learning applied to document recognition. Proc. IEEE 86, 11 (1998), 2278--2324.
- Eimantas Ledinauskas, Julius Ruseckas, Alfonsas Juršėnas, and Giedrius Buračas. 2020. Training deep spiking neural networks. arXiv preprint arXiv:2006.04436 (2020).
- Lyle Long and Guoliang Fang. 2010. A Review of Biologically Plausible Neuron Models for Spiking Neural Networks. In AIAA Infotech@Aerospace 2010. American Institute of Aeronautics and Astronautics, Atlanta, Georgia.
- Nuno Lourenço, Filipe Assunção, Francisco B Pereira, Ernesto Costa, and Penousal Machado. 2018. Structured grammatical evolution: a dynamic approach. In Handbook of Grammatical Evolution. Springer, 137--161.
- Nuno Lourenço, Francisco B Pereira, and Ernesto Costa. 2015. SGE: a structured representation for grammatical evolution. In International Conference on Artificial Evolution (Evolution Artificielle). Springer, 136--148.
- Sen Lu and Abhronil Sengupta. 2022. Neuroevolution Guided Hybrid Spiking Neural Network Training. Frontiers in Neuroscience 16 (April 2022), 838523.
- G. López-Vázquez, M. Ornelas-Rodriguez, A. Espinal, J. A. Soria-Alcaraz, A. Rojas-Domínguez, H. J. Puga-Soberanes, J. M. Carpio, and H. Rostro-Gonzalez. 2019. Evolutionary Spiking Neural Networks for Solving Supervised Classification Problems. Computational Intelligence and Neuroscience 2019 (March 2019), e4182639.
- Wolfgang Maass. 1997. Networks of spiking neurons: The third generation of neural network models. Neural Networks 10, 9 (Dec. 1997), 1659--1671.
- Byunggook Na, Jisoo Mok, Seongsik Park, Dongjin Lee, Hyeokjun Choe, and Sungroh Yoon. 2022. AutoSNN: Towards Energy-Efficient Spiking Neural Networks. In Proceedings of the 39th International Conference on Machine Learning. PMLR, 16253--16269.
- Emre O Neftci, Hesham Mostafa, and Friedemann Zenke. 2019. Surrogate gradient learning in spiking neural networks: Bringing the power of gradient-based optimization to spiking neural networks. IEEE Signal Processing Magazine 36, 6 (2019), 51--63.
- Joao D Nunes, Marcelo Carvalho, Diogo Carneiro, and Jaime S Cardoso. 2022. Spiking neural networks: A survey. IEEE Access 10 (2022), 60738--60764.
- David Patterson, Joseph Gonzalez, Quoc Le, Chen Liang, Lluis-Miquel Munguia, Daniel Rothchild, David So, Maud Texier, and Jeff Dean. 2021. Carbon emissions and large neural network training. arXiv preprint arXiv:2104.10350 (2021).
- Bodo Rueckauer, Iulia-Alexandra Lungu, Yuhuang Hu, Michael Pfeiffer, and Shih-Chii Liu. 2017. Conversion of continuous-valued deep networks to efficient event-driven networks for image classification. Frontiers in Neuroscience 11 (2017), 682.
- Catherine D Schuman, J Parker Mitchell, Robert M Patton, Thomas E Potok, and James S Plank. 2020. Evolutionary optimization for neuromorphic systems. In Proceedings of the Neuro-inspired Computational Elements Workshop. 1--9.
- Sumit B Shrestha and Garrick Orchard. 2018. SLAYER: Spike layer error reassignment in time. Advances in Neural Information Processing Systems 31 (2018).
- S. N. Sivanandam and S. N. Deepa. 2007. Introduction to Genetic Algorithms.
- Kenneth O Stanley, David B D'Ambrosio, and Jason Gauci. 2009. A hypercube-based encoding for evolving large-scale neural networks. Artificial Life 15, 2 (2009), 185--212.
- Paul J Werbos. 1990. Backpropagation through time: what it does and how to do it. Proc. IEEE 78, 10 (1990), 1550--1560.
- Han Xiao, Kashif Rasul, and Roland Vollgraf. 2017. Fashion-MNIST: a novel image dataset for benchmarking machine learning algorithms. arXiv preprint arXiv:1708.07747 (2017).
- Wenrui Zhang and Peng Li. 2019. Spike-train level backpropagation for training deep recurrent spiking neural networks. Advances in Neural Information Processing Systems 32 (2019).