A Differentiable Point Process with Its Application to Spiking Neural Networks

Hiroshi Kajino
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:5226-5235, 2021.

Abstract

This paper is concerned with a learning algorithm for a probabilistic model of spiking neural networks (SNNs). Jimenez Rezende & Gerstner (2014) proposed a stochastic variational inference algorithm to train SNNs with hidden neurons. The algorithm updates the variational distribution using the score function gradient estimator, whose high variance often impedes the whole learning algorithm. This paper presents an alternative gradient estimator for SNNs based on the path-wise gradient estimator. The main technical difficulty is the lack of a general method to differentiate a realization of an arbitrary point process, which is necessary to derive the path-wise gradient estimator. We develop a differentiable point process, which is the technical highlight of this paper, and apply it to derive the path-wise gradient estimator for SNNs. We investigate the effectiveness of our gradient estimator through numerical simulation.
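The abstract contrasts the score function (REINFORCE) gradient estimator with the path-wise (reparameterization) estimator. The sketch below is not from the paper; it is a generic toy illustration of why the path-wise estimator typically has lower variance, using a Gaussian rather than a point process. The objective L(mu) = E_{x~N(mu,1)}[x^2] has exact gradient 2*mu, so both estimators can be checked against it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy objective: L(mu) = E_{x ~ N(mu, 1)}[x^2], with exact gradient 2 * mu.
mu = 1.5
n = 200_000
x = rng.normal(mu, 1.0, size=n)

# Score-function estimator: E[f(x) * d/dmu log N(x; mu, 1)] = E[x^2 * (x - mu)]
score_samples = x**2 * (x - mu)
score_grad = score_samples.mean()

# Path-wise estimator: reparameterize x = mu + eps with eps ~ N(0, 1),
# so d f(x) / d mu = 2 * x, averaged over the same noise draws.
eps = x - mu
pathwise_samples = 2.0 * (mu + eps)
pathwise_grad = pathwise_samples.mean()

print(score_grad, pathwise_grad)          # both near the exact value 3.0
print(score_samples.var(), pathwise_samples.var())  # per-sample variances
```

Both estimates converge to 2*mu = 3.0, but the per-sample variance of the score-function estimator is roughly an order of magnitude larger here, which is the gap the paper's differentiable point process aims to close for SNNs.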

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-kajino21a,
  title     = {A Differentiable Point Process with Its Application to Spiking Neural Networks},
  author    = {Kajino, Hiroshi},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {5226--5235},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/kajino21a/kajino21a.pdf},
  url       = {https://proceedings.mlr.press/v139/kajino21a.html},
  abstract  = {This paper is concerned about a learning algorithm for a probabilistic model of spiking neural networks (SNNs). Jimenez Rezende \& Gerstner (2014) proposed a stochastic variational inference algorithm to train SNNs with hidden neurons. The algorithm updates the variational distribution using the score function gradient estimator, whose high variance often impedes the whole learning algorithm. This paper presents an alternative gradient estimator for SNNs based on the path-wise gradient estimator. The main technical difficulty is a lack of a general method to differentiate a realization of an arbitrary point process, which is necessary to derive the path-wise gradient estimator. We develop a differentiable point process, which is the technical highlight of this paper, and apply it to derive the path-wise gradient estimator for SNNs. We investigate the effectiveness of our gradient estimator through numerical simulation.}
}
Endnote
%0 Conference Paper
%T A Differentiable Point Process with Its Application to Spiking Neural Networks
%A Hiroshi Kajino
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-kajino21a
%I PMLR
%P 5226--5235
%U https://proceedings.mlr.press/v139/kajino21a.html
%V 139
%X This paper is concerned about a learning algorithm for a probabilistic model of spiking neural networks (SNNs). Jimenez Rezende & Gerstner (2014) proposed a stochastic variational inference algorithm to train SNNs with hidden neurons. The algorithm updates the variational distribution using the score function gradient estimator, whose high variance often impedes the whole learning algorithm. This paper presents an alternative gradient estimator for SNNs based on the path-wise gradient estimator. The main technical difficulty is a lack of a general method to differentiate a realization of an arbitrary point process, which is necessary to derive the path-wise gradient estimator. We develop a differentiable point process, which is the technical highlight of this paper, and apply it to derive the path-wise gradient estimator for SNNs. We investigate the effectiveness of our gradient estimator through numerical simulation.
APA
Kajino, H. (2021). A Differentiable Point Process with Its Application to Spiking Neural Networks. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:5226-5235. Available from https://proceedings.mlr.press/v139/kajino21a.html.