Data Driven Threshold and Potential Initialization for Spiking Neural Networks

Velibor Bojkovic, Srinivas Anumasa, Giulia De Masi, Bin Gu, Huan Xiong
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4771-4779, 2024.

Abstract

Spiking neural networks (SNNs) present an increasingly popular alternative to artificial neural networks (ANNs) due to their energy and time efficiency when deployed on neuromorphic hardware. However, because of their discrete and highly non-differentiable nature, training SNNs is a challenging task and remains an active area of research. Some of the most prominent ways to train SNNs are based on ANN-to-SNN conversion, where an SNN model is initialized with parameters from the corresponding pre-trained ANN model. SNN models trained through ANN-to-SNN conversion or hybrid training show state-of-the-art performance among SNNs on many machine learning tasks, comparable to that of ANNs. However, the top-performing models need high latency or tailored ANNs to perform well and, in general, do not use the full information available from ANNs. In this work, we propose a novel method to initialize an SNN's thresholds and initial membrane potentials after ANN-to-SNN conversion, using the distributions of the ANN's activation values. We provide a theoretical framework for feature-distribution-based conversion error, with theoretical results on the optimal membrane initialization and thresholds that minimize this error, as well as a practical algorithm for finding these optimal values. We test our method, both as a stand-alone ANN-to-SNN conversion and in combination with other methods, and show state-of-the-art results on high-dimensional datasets such as CIFAR10, CIFAR100, and ImageNet across various architectures. Our code is available at \url{https://github.com/srinuvaasu/data_driven_init}.
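As a concrete illustration of the idea, below is a minimal PyTorch sketch of data-driven calibration for one converted layer. The function name, the 99.9th-percentile rule, and the theta/2 potential initialization are common heuristics from the ANN-to-SNN conversion literature used here as illustrative assumptions; the paper instead derives the optimal threshold and initial potential from the per-layer activation distributions, so this should be read as a sketch, not the authors' algorithm.

import torch
import torch.nn as nn

@torch.no_grad()
def calibrate_layer(ann_layer: nn.Module,
                    calib_inputs: torch.Tensor,
                    percentile: float = 99.9):
    """Estimate a firing threshold and an initial membrane potential for
    the spiking layer that replaces `ann_layer`, from the empirical
    distribution of the ANN's post-ReLU activations on calibration data.
    Names and constants are illustrative assumptions, not the paper's."""
    acts = torch.relu(ann_layer(calib_inputs)).flatten()
    # Threshold: a high percentile of the activation distribution rather
    # than the maximum, which is sensitive to outliers (assumed heuristic).
    theta = torch.quantile(acts, percentile / 100.0).item()
    # Initial potential: half the threshold, which cancels the expected
    # quantization error when activations are roughly uniform within each
    # spike bin (a standard assumption, not the paper's derived optimum).
    v_init = theta / 2.0
    return theta, v_init

In practice one would run this layer by layer over a batch of calibration inputs and assign the resulting (theta, v_init) pair to the corresponding spiking layer before fine-tuning or inference.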

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-bojkovic24a,
  title     = {Data Driven Threshold and Potential Initialization for Spiking Neural Networks},
  author    = {Bojkovic, Velibor and Anumasa, Srinivas and De Masi, Giulia and Gu, Bin and Xiong, Huan},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {4771--4779},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/bojkovic24a/bojkovic24a.pdf},
  url       = {https://proceedings.mlr.press/v238/bojkovic24a.html}
}
Endnote
%0 Conference Paper
%T Data Driven Threshold and Potential Initialization for Spiking Neural Networks
%A Velibor Bojkovic
%A Srinivas Anumasa
%A Giulia De Masi
%A Bin Gu
%A Huan Xiong
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-bojkovic24a
%I PMLR
%P 4771--4779
%U https://proceedings.mlr.press/v238/bojkovic24a.html
%V 238
APA
Bojkovic, V., Anumasa, S., De Masi, G., Gu, B. & Xiong, H. (2024). Data Driven Threshold and Potential Initialization for Spiking Neural Networks. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:4771-4779. Available from https://proceedings.mlr.press/v238/bojkovic24a.html.