Finding trainable sparse networks through Neural Tangent Transfer

Tianlin Liu, Friedemann Zenke
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6336-6347, 2020.

Abstract

Deep neural networks have dramatically transformed machine learning, but their memory and energy demands are substantial. The requirements of real biological neural networks are rather modest in comparison, and one feature that might underlie this austerity is their sparse connectivity. In deep learning, trainable sparse networks that perform well on a specific task are usually constructed using label-dependent pruning criteria. In this article, we introduce Neural Tangent Transfer, a method that instead finds trainable sparse networks in a label-free manner. Specifically, we find sparse networks whose training dynamics, as characterized by the neural tangent kernel, mimic those of dense networks in function space. Finally, we evaluate our label-agnostic approach on several standard classification tasks and show that the resulting sparse networks achieve higher classification performance while converging faster.
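The core idea, matching both the outputs and the neural tangent kernel of a sparse network to those of its dense counterpart using unlabeled data only, can be illustrated in a toy setting. The sketch below is not the paper's implementation (which optimizes masks of deep networks with a straight-through estimator); it uses a hypothetical linear model, where the empirical NTK has a closed form, and a brute-force mask search, purely to make the label-free objective concrete.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Toy setting: a linear model f(x) = w . x. Its Jacobian w.r.t. w is x,
# so the empirical NTK is K(x, x') = x . x'. A binary mask m gives the
# sparse model f_m(x) = (m * w) . x with kernel K_m(x, x') = (m*x) . (m*x').
d, n = 8, 16                  # parameter dimension, number of inputs
w = rng.normal(size=d)        # dense weights
X = rng.normal(size=(n, d))   # label-free input batch (no targets needed)

def ntk(mask):
    J = X * mask              # per-example Jacobians of the masked model
    return J @ J.T            # empirical NTK: Gram matrix of Jacobians

def ntt_objective(mask):
    """Label-free NTT-style loss: match outputs and NTK of the dense model."""
    f_dense, f_sparse = X @ w, X @ (mask * w)
    k_dense, k_sparse = ntk(np.ones(d)), ntk(mask)
    return np.sum((f_sparse - f_dense) ** 2) + np.sum((k_sparse - k_dense) ** 2)

# Brute-force search: among all masks keeping k of d weights, pick the one
# whose function-space training dynamics best mimic the dense model.
k = 4
best_mask = min(
    (np.isin(np.arange(d), idx).astype(float) for idx in combinations(range(d), k)),
    key=ntt_objective,
)
print("kept weights:", np.flatnonzero(best_mask))
```

Note that no labels appear anywhere in the objective: the selected mask depends only on the inputs, the dense weights, and the kernel, which is what makes the criterion label-agnostic.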

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-liu20o,
  title     = {Finding trainable sparse networks through Neural Tangent Transfer},
  author    = {Liu, Tianlin and Zenke, Friedemann},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6336--6347},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/liu20o/liu20o.pdf},
  url       = {http://proceedings.mlr.press/v119/liu20o.html},
  abstract  = {Deep neural networks have dramatically transformed machine learning, but their memory and energy demands are substantial. The requirements of real biological neural networks are rather modest in comparison, and one feature that might underlie this austerity is their sparse connectivity. In deep learning, trainable sparse networks that perform well on a specific task are usually constructed using label-dependent pruning criteria. In this article, we introduce Neural Tangent Transfer, a method that instead finds trainable sparse networks in a label-free manner. Specifically, we find sparse networks whose training dynamics, as characterized by the neural tangent kernel, mimic those of dense networks in function space. Finally, we evaluate our label-agnostic approach on several standard classification tasks and show that the resulting sparse networks achieve higher classification performance while converging faster.}
}
Endnote
%0 Conference Paper
%T Finding trainable sparse networks through Neural Tangent Transfer
%A Tianlin Liu
%A Friedemann Zenke
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-liu20o
%I PMLR
%P 6336--6347
%U http://proceedings.mlr.press/v119/liu20o.html
%V 119
%X Deep neural networks have dramatically transformed machine learning, but their memory and energy demands are substantial. The requirements of real biological neural networks are rather modest in comparison, and one feature that might underlie this austerity is their sparse connectivity. In deep learning, trainable sparse networks that perform well on a specific task are usually constructed using label-dependent pruning criteria. In this article, we introduce Neural Tangent Transfer, a method that instead finds trainable sparse networks in a label-free manner. Specifically, we find sparse networks whose training dynamics, as characterized by the neural tangent kernel, mimic those of dense networks in function space. Finally, we evaluate our label-agnostic approach on several standard classification tasks and show that the resulting sparse networks achieve higher classification performance while converging faster.
APA
Liu, T. & Zenke, F. (2020). Finding trainable sparse networks through Neural Tangent Transfer. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6336-6347. Available from http://proceedings.mlr.press/v119/liu20o.html.