Adversarially Robust Spiking Neural Networks with Sparse Connectivity

Mathias Schmolli, Maximilian Baronig, Robert Legenstein, Ozan Ozdenizci
Conference on Parsimony and Learning, PMLR 280:865-883, 2025.

Abstract

Deployment of deep neural networks in resource-constrained embedded systems requires innovative algorithmic solutions to facilitate their energy and memory efficiency. To further ensure the reliability of these systems against malicious actors, recent works have extensively studied adversarial robustness of existing architectures. Our work focuses on the intersection of adversarial robustness, memory- and energy-efficiency in neural networks. We introduce a neural network conversion algorithm designed to produce sparse and adversarially robust spiking neural networks (SNNs) by leveraging the sparse connectivity and weights from a robustly pretrained artificial neural network (ANN). Our approach combines the energy-efficient architecture of SNNs with a novel conversion algorithm, leading to state-of-the-art performance with enhanced energy and memory efficiency through sparse connectivity and activations. Our models are shown to achieve up to 100x reduction in the number of weights to be stored in memory, with an estimated 8.6x increase in energy efficiency compared to dense SNNs, while maintaining high performance and robustness against adversarial threats.

Cite this Paper


BibTeX
@InProceedings{pmlr-v280-schmolli25a,
  title     = {Adversarially Robust Spiking Neural Networks with Sparse Connectivity},
  author    = {Schmolli, Mathias and Baronig, Maximilian and Legenstein, Robert and Ozdenizci, Ozan},
  booktitle = {Conference on Parsimony and Learning},
  pages     = {865--883},
  year      = {2025},
  editor    = {Chen, Beidi and Liu, Shijia and Pilanci, Mert and Su, Weijie and Sulam, Jeremias and Wang, Yuxiang and Zhu, Zhihui},
  volume    = {280},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Mar},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v280/main/assets/schmolli25a/schmolli25a.pdf},
  url       = {https://proceedings.mlr.press/v280/schmolli25a.html},
  abstract  = {Deployment of deep neural networks in resource-constrained embedded systems requires innovative algorithmic solutions to facilitate their energy and memory efficiency. To further ensure the reliability of these systems against malicious actors, recent works have extensively studied adversarial robustness of existing architectures. Our work focuses on the intersection of adversarial robustness, memory- and energy-efficiency in neural networks. We introduce a neural network conversion algorithm designed to produce sparse and adversarially robust spiking neural networks (SNNs) by leveraging the sparse connectivity and weights from a robustly pretrained artificial neural network (ANN). Our approach combines the energy-efficient architecture of SNNs with a novel conversion algorithm, leading to state-of-the-art performance with enhanced energy and memory efficiency through sparse connectivity and activations. Our models are shown to achieve up to 100x reduction in the number of weights to be stored in memory, with an estimated 8.6x increase in energy efficiency compared to dense SNNs, while maintaining high performance and robustness against adversarial threats.}
}
Endnote
%0 Conference Paper
%T Adversarially Robust Spiking Neural Networks with Sparse Connectivity
%A Mathias Schmolli
%A Maximilian Baronig
%A Robert Legenstein
%A Ozan Ozdenizci
%B Conference on Parsimony and Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Beidi Chen
%E Shijia Liu
%E Mert Pilanci
%E Weijie Su
%E Jeremias Sulam
%E Yuxiang Wang
%E Zhihui Zhu
%F pmlr-v280-schmolli25a
%I PMLR
%P 865--883
%U https://proceedings.mlr.press/v280/schmolli25a.html
%V 280
%X Deployment of deep neural networks in resource-constrained embedded systems requires innovative algorithmic solutions to facilitate their energy and memory efficiency. To further ensure the reliability of these systems against malicious actors, recent works have extensively studied adversarial robustness of existing architectures. Our work focuses on the intersection of adversarial robustness, memory- and energy-efficiency in neural networks. We introduce a neural network conversion algorithm designed to produce sparse and adversarially robust spiking neural networks (SNNs) by leveraging the sparse connectivity and weights from a robustly pretrained artificial neural network (ANN). Our approach combines the energy-efficient architecture of SNNs with a novel conversion algorithm, leading to state-of-the-art performance with enhanced energy and memory efficiency through sparse connectivity and activations. Our models are shown to achieve up to 100x reduction in the number of weights to be stored in memory, with an estimated 8.6x increase in energy efficiency compared to dense SNNs, while maintaining high performance and robustness against adversarial threats.
APA
Schmolli, M., Baronig, M., Legenstein, R., & Ozdenizci, O. (2025). Adversarially Robust Spiking Neural Networks with Sparse Connectivity. Conference on Parsimony and Learning, in Proceedings of Machine Learning Research 280:865-883. Available from https://proceedings.mlr.press/v280/schmolli25a.html.