Dual Propagation: Accelerating Contrastive Hebbian Learning with Dyadic Neurons

Rasmus Høier, D. Staudt, Christopher Zach
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:13141-13156, 2023.

Abstract

Activity-difference-based learning algorithms, such as contrastive Hebbian learning and equilibrium propagation, have been proposed as biologically plausible alternatives to error back-propagation. However, on traditional digital chips these algorithms suffer from having to solve a costly inference problem twice, making them more than two orders of magnitude slower than back-propagation. In the analog realm, equilibrium propagation may be promising for fast and energy-efficient learning, but states still need to be inferred and stored twice. Inspired by lifted neural networks and compartmental neuron models, we propose a simple energy-based compartmental neuron model, termed dual propagation, in which each neuron is a dyad with two intrinsic states. At inference time, these intrinsic states encode the error/activity duality through their difference and their mean, respectively. The advantage of this method is that only a single inference phase is needed and that inference can be solved layerwise in closed form. Experimentally, we show on common computer vision datasets, including Imagenet32x32, that dual propagation performs equivalently to back-propagation in terms of both accuracy and runtime.
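
To make the idea concrete, below is a minimal NumPy sketch of the dyadic-neuron picture described in the abstract: each unit keeps two internal states whose mean acts as the activity and whose scaled difference acts as the error signal, so one forward sweep plus a single layerwise, closed-form downward sweep replaces the two inference phases of contrastive Hebbian learning. The variable names, the nudging strength beta, the squared loss, and the exact update order are illustrative assumptions, not the paper's definitive algorithm.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def dual_prop_step(Ws, x, y, beta=0.1, lr=0.01):
    """One illustrative training step for a ReLU MLP with squared loss (assumed)."""
    # Forward sweep: both internal states start at the feed-forward activity,
    # so the mean equals the usual activation and the difference (the error) is zero.
    a = [x]
    for W in Ws:
        a.append(relu(W @ a[-1]))
    z_pos = [v.copy() for v in a]   # state nudged towards the target
    z_neg = [v.copy() for v in a]   # state nudged away from the target

    # Nudge the output dyad with the loss gradient, then resolve the remaining
    # dyads layer by layer in closed form: one sweep, no second inference phase.
    g = a[-1] - y                   # dL/da at the output for the squared loss
    z_pos[-1], z_neg[-1] = a[-1] - beta * g, a[-1] + beta * g
    for k in range(len(Ws) - 1, 0, -1):
        gate = (a[k + 1] > 0).astype(a[k + 1].dtype)          # ReLU derivative
        g_above = (z_neg[k + 1] - z_pos[k + 1]) / (2 * beta)  # error read off the dyad above
        g = Ws[k].T @ (gate * g_above)                        # error signal for layer k
        z_pos[k], z_neg[k] = a[k] - beta * g, a[k] + beta * g

    # Hebbian-style update: difference of the dyad above times the mean of the dyad below.
    for k, W in enumerate(Ws):
        gate = (a[k + 1] > 0).astype(a[k + 1].dtype)
        delta = gate * (z_neg[k + 1] - z_pos[k + 1]) / (2 * beta)
        mean_below = 0.5 * (z_pos[k] + z_neg[k])
        Ws[k] = W - lr * np.outer(delta, mean_below)
    return Ws

# Example usage on a tiny 4-8-3 network with random data (hypothetical shapes):
rng = np.random.default_rng(0)
Ws = [0.1 * rng.standard_normal((8, 4)), 0.1 * rng.standard_normal((3, 8))]
Ws = dual_prop_step(Ws, rng.standard_normal(4), rng.standard_normal(3))

In this particular sketch the update (dyad difference times the mean activity of the layer below) coincides with the back-propagation gradient, which is consistent with the abstract's claim that dual propagation matches back-propagation in accuracy and runtime.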

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-hoier23a,
  title     = {Dual Propagation: Accelerating Contrastive Hebbian Learning with Dyadic Neurons},
  author    = {H{\o}ier, Rasmus and Staudt, D. and Zach, Christopher},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {13141--13156},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/hoier23a/hoier23a.pdf},
  url       = {https://proceedings.mlr.press/v202/hoier23a.html}
}
Endnote
%0 Conference Paper
%T Dual Propagation: Accelerating Contrastive Hebbian Learning with Dyadic Neurons
%A Rasmus Høier
%A D. Staudt
%A Christopher Zach
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-hoier23a
%I PMLR
%P 13141--13156
%U https://proceedings.mlr.press/v202/hoier23a.html
%V 202
APA
Høier, R., Staudt, D. & Zach, C. (2023). Dual Propagation: Accelerating Contrastive Hebbian Learning with Dyadic Neurons. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:13141-13156. Available from https://proceedings.mlr.press/v202/hoier23a.html.
