Dendritic Localized Learning: Toward Biologically Plausible Algorithm

Changze Lv, Jingwen Xu, Yiyang Lu, Xiaohua Wang, Zhenghua Wang, Zhibo Xu, Di Yu, Xin Du, Xiaoqing Zheng, Xuanjing Huang
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:41682-41700, 2025.

Abstract

Backpropagation is the foundational algorithm for training neural networks and a key driver of deep learning’s success. However, as highlighted in the existing literature, its biological plausibility has been challenged on three primary grounds: weight symmetry, reliance on global error signals, and the dual-phase nature of training. Although various alternative learning approaches have been proposed to address these issues, most either fail to satisfy all three criteria simultaneously or yield suboptimal results. Inspired by the dynamics and plasticity of pyramidal neurons, we propose Dendritic Localized Learning (DLL), a novel learning algorithm designed to overcome these challenges. Extensive empirical experiments demonstrate that DLL satisfies all three criteria of biological plausibility while achieving state-of-the-art performance among algorithms that meet these requirements. Furthermore, DLL exhibits strong generalization across a range of architectures, including MLPs, CNNs, and RNNs. These results, benchmarked against existing biologically plausible learning algorithms, offer valuable empirical insights for future research. We hope this study can inspire the development of new biologically plausible algorithms for training multilayer networks and advance progress in both neuroscience and machine learning. Our code is available at https://github.com/Lvchangze/Dendritic-Localized-Learning.
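
The abstract names the three plausibility criteria but does not spell out where standard backpropagation violates them. The sketch below is a minimal NumPy illustration, not the paper's DLL algorithm (whose dendritic update rules are defined in the full text): it marks the backward step of a two-layer network that reuses the transposed forward weights W2.T (weight symmetry) and routes a single global output error, and shows a feedback-alignment-style local variant that delivers the error through a fixed random matrix B instead. All names (W1, W2, B, layer sizes) are illustrative assumptions.

    # Minimal illustrative sketch (NOT the paper's DLL algorithm).
    # Contrasts backprop's error routing through W2.T (weight symmetry,
    # global error transport) with a feedback-alignment-style local rule
    # that uses a fixed random feedback matrix B. Names are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy two-layer network: x -> h = tanh(W1 x) -> y_hat = W2 h
    n_in, n_hid, n_out = 8, 16, 4
    W1 = rng.normal(scale=0.1, size=(n_hid, n_in))
    W2 = rng.normal(scale=0.1, size=(n_out, n_hid))
    B = rng.normal(scale=0.1, size=(n_hid, n_out))  # fixed random feedback, never trained

    def forward(x):
        h = np.tanh(W1 @ x)
        return h, W2 @ h

    def backprop_deltas(h, y_hat, y):
        e = y_hat - y                            # global output error
        # Weight symmetry: the hidden-layer error is routed through W2.T,
        # so the backward path must mirror the forward weights exactly.
        return e, (W2.T @ e) * (1.0 - h ** 2)

    def local_deltas(h, y_hat, y):
        e = y_hat - y
        # Local alternative: the error reaches the hidden layer through a
        # fixed random matrix B; no copy of the forward weights is needed.
        return e, (B @ e) * (1.0 - h ** 2)

    # One toy update with the local rule
    x, y = rng.normal(size=n_in), rng.normal(size=n_out)
    h, y_hat = forward(x)
    e_bp, dh_bp = backprop_deltas(h, y_hat, y)
    e_loc, dh_loc = local_deltas(h, y_hat, y)

    lr = 0.05
    W2 -= lr * np.outer(e_loc, h)                # layer-local weight updates
    W1 -= lr * np.outer(dh_loc, x)

    cos = dh_bp @ dh_loc / (np.linalg.norm(dh_bp) * np.linalg.norm(dh_loc))
    print(f"alignment of backprop vs. local error signals: {cos:.3f}")
    print(f"loss after one local step: {0.5 * np.sum((forward(x)[1] - y) ** 2):.4f}")

Feedback alignment is used here only as a familiar stand-in for "no weight symmetry, no global error transport"; DLL itself is derived from pyramidal-neuron dendritic dynamics and single-phase plasticity, as described in the paper.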

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-lv25c,
  title     = {Dendritic Localized Learning: Toward Biologically Plausible Algorithm},
  author    = {Lv, Changze and Xu, Jingwen and Lu, Yiyang and Wang, Xiaohua and Wang, Zhenghua and Xu, Zhibo and Yu, Di and Du, Xin and Zheng, Xiaoqing and Huang, Xuanjing},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {41682--41700},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/lv25c/lv25c.pdf},
  url       = {https://proceedings.mlr.press/v267/lv25c.html}
}
Endnote
%0 Conference Paper
%T Dendritic Localized Learning: Toward Biologically Plausible Algorithm
%A Changze Lv
%A Jingwen Xu
%A Yiyang Lu
%A Xiaohua Wang
%A Zhenghua Wang
%A Zhibo Xu
%A Di Yu
%A Xin Du
%A Xiaoqing Zheng
%A Xuanjing Huang
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-lv25c
%I PMLR
%P 41682--41700
%U https://proceedings.mlr.press/v267/lv25c.html
%V 267
APA
Lv, C., Xu, J., Lu, Y., Wang, X., Wang, Z., Xu, Z., Yu, D., Du, X., Zheng, X. & Huang, X. (2025). Dendritic Localized Learning: Toward Biologically Plausible Algorithm. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:41682-41700. Available from https://proceedings.mlr.press/v267/lv25c.html.