A Probabilistic Approach to Neural Network Pruning

Xin Qian, Diego Klabjan
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:8640-8649, 2021.

Abstract

Neural network pruning techniques reduce the number of parameters without compromising the predictive ability of a network. Many algorithms have been developed for pruning both over-parameterized fully-connected networks (FCN) and convolutional neural networks (CNN), but analytical studies of the capabilities and compression ratios of such pruned sub-networks are lacking. We theoretically study the performance of two pruning techniques (random and magnitude-based) on FCN and CNN. Given a target network, we provide a universal approach to bound the gap between a pruned network and the target network in a probabilistic sense, which is the first study of this nature. The results establish that there exist pruned networks with expressive power within any specified bound from the target network and with a significant compression ratio.
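The two pruning techniques the paper analyzes can be illustrated with a minimal sketch. The function names and the NumPy-based implementation below are illustrative assumptions, not the authors' code: magnitude-based pruning zeroes the smallest-magnitude weights, while random pruning zeroes a uniformly random subset.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries of a weight array.

    `sparsity` is the fraction of entries to remove (e.g. 0.9 removes 90%).
    Illustrative sketch only; not the paper's implementation.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # Threshold at the k-th smallest magnitude; entries at or below it are pruned.
    threshold = np.partition(flat, k - 1)[k - 1]
    return weights * (np.abs(weights) > threshold)

def random_prune(weights, sparsity, seed=None):
    """Zero out a uniformly random fraction `sparsity` of entries."""
    rng = np.random.default_rng(seed)
    mask = rng.random(weights.shape) >= sparsity
    return weights * mask
```

The paper's bounds concern how close such a pruned network's output can stay to the target network's output, in a probabilistic sense, at a given compression ratio.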

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-qian21a,
  title     = {A Probabilistic Approach to Neural Network Pruning},
  author    = {Qian, Xin and Klabjan, Diego},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {8640--8649},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/qian21a/qian21a.pdf},
  url       = {https://proceedings.mlr.press/v139/qian21a.html},
  abstract  = {Neural network pruning techniques reduce the number of parameters without compromising predicting ability of a network. Many algorithms have been developed for pruning both over-parameterized fully-connected networks (FCN) and convolutional neural networks (CNN), but analytical studies of capabilities and compression ratios of such pruned sub-networks are lacking. We theoretically study the performance of two pruning techniques (random and magnitude-based) on FCN and CNN. Given a target network, we provide a universal approach to bound the gap between a pruned and the target network in a probabilistic sense, which is the first study of this nature. The results establish that there exist pruned networks with expressive power within any specified bound from the target network and with a significant compression ratio.}
}
Endnote
%0 Conference Paper
%T A Probabilistic Approach to Neural Network Pruning
%A Xin Qian
%A Diego Klabjan
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-qian21a
%I PMLR
%P 8640--8649
%U https://proceedings.mlr.press/v139/qian21a.html
%V 139
%X Neural network pruning techniques reduce the number of parameters without compromising predicting ability of a network. Many algorithms have been developed for pruning both over-parameterized fully-connected networks (FCN) and convolutional neural networks (CNN), but analytical studies of capabilities and compression ratios of such pruned sub-networks are lacking. We theoretically study the performance of two pruning techniques (random and magnitude-based) on FCN and CNN. Given a target network, we provide a universal approach to bound the gap between a pruned and the target network in a probabilistic sense, which is the first study of this nature. The results establish that there exist pruned networks with expressive power within any specified bound from the target network and with a significant compression ratio.
APA
Qian, X. & Klabjan, D. (2021). A Probabilistic Approach to Neural Network Pruning. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:8640-8649. Available from https://proceedings.mlr.press/v139/qian21a.html.