An Information-Theoretic Justification for Model Pruning

Berivan Isik, Tsachy Weissman, Albert No
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:3821-3846, 2022.

Abstract

We study the neural network (NN) compression problem, viewing the tension between the compression ratio and NN performance through the lens of rate-distortion theory. We choose a distortion metric that reflects the effect of NN compression on the model output and then derive the tradeoff between rate (compression ratio) and distortion. In addition to characterizing theoretical limits of NN compression, this formulation shows that pruning, implicitly or explicitly, must be a part of a good compression algorithm. This observation bridges a gap between parts of the literature pertaining to NN and data compression, respectively, providing insight into the empirical success of pruning for NN compression. Finally, we propose a novel pruning strategy derived from our information-theoretic formulation and show that it outperforms the relevant baselines on CIFAR-10 and ImageNet datasets.
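For context, the rate-distortion tradeoff referenced above is the classical rate-distortion function from information theory. The sketch below states it using W for the network weights and Ŵ for their compressed reconstruction (notation introduced here for illustration, not taken from the abstract); the distortion metric d is left abstract, since the paper's particular choice of d, reflecting the effect of compression on the model output, is what distinguishes its formulation:

    R(D) = \min_{P_{\hat{W} \mid W} \,:\, \mathbb{E}[d(W, \hat{W})] \le D} I(W; \hat{W})

Here I(W; Ŵ) is the mutual information between the original and reconstructed weights, which serves as the operational measure of bits needed per weight. Roughly, the abstract's claim is that reconstructions Ŵ achieving this minimum at low rates set some coordinates exactly to zero, which is the sense in which pruning emerges from the formulation.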

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-isik22a,
  title     = {An Information-Theoretic Justification for Model Pruning},
  author    = {Isik, Berivan and Weissman, Tsachy and No, Albert},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {3821--3846},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/isik22a/isik22a.pdf},
  url       = {https://proceedings.mlr.press/v151/isik22a.html},
  abstract  = {We study the neural network (NN) compression problem, viewing the tension between the compression ratio and NN performance through the lens of rate-distortion theory. We choose a distortion metric that reflects the effect of NN compression on the model output and then derive the tradeoff between rate (compression ratio) and distortion. In addition to characterizing theoretical limits of NN compression, this formulation shows that pruning, implicitly or explicitly, must be a part of a good compression algorithm. This observation bridges a gap between parts of the literature pertaining to NN and data compression, respectively, providing insight into the empirical success of pruning for NN compression. Finally, we propose a novel pruning strategy derived from our information-theoretic formulation and show that it outperforms the relevant baselines on CIFAR-10 and ImageNet datasets.}
}
Endnote
%0 Conference Paper
%T An Information-Theoretic Justification for Model Pruning
%A Berivan Isik
%A Tsachy Weissman
%A Albert No
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-isik22a
%I PMLR
%P 3821--3846
%U https://proceedings.mlr.press/v151/isik22a.html
%V 151
%X We study the neural network (NN) compression problem, viewing the tension between the compression ratio and NN performance through the lens of rate-distortion theory. We choose a distortion metric that reflects the effect of NN compression on the model output and then derive the tradeoff between rate (compression ratio) and distortion. In addition to characterizing theoretical limits of NN compression, this formulation shows that pruning, implicitly or explicitly, must be a part of a good compression algorithm. This observation bridges a gap between parts of the literature pertaining to NN and data compression, respectively, providing insight into the empirical success of pruning for NN compression. Finally, we propose a novel pruning strategy derived from our information-theoretic formulation and show that it outperforms the relevant baselines on CIFAR-10 and ImageNet datasets.
APA
Isik, B., Weissman, T. & No, A. (2022). An Information-Theoretic Justification for Model Pruning. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:3821-3846. Available from https://proceedings.mlr.press/v151/isik22a.html.