EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis

Chaoqi Wang, Roger Grosse, Sanja Fidler, Guodong Zhang
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6566-6575, 2019.

Abstract

Reducing the test-time resource requirements of a neural network while preserving its test accuracy is crucial for running inference on resource-constrained devices. To achieve this goal, we introduce a novel network reparameterization based on the Kronecker-factored eigenbasis (KFE), and then apply Hessian-based structured pruning methods in this basis. As opposed to existing Hessian-based pruning algorithms, which prune in parameter coordinates, our method works in the KFE, where different weights are approximately independent, enabling accurate pruning and fast computation. We demonstrate the effectiveness of the proposed method empirically through extensive experiments. In particular, we highlight that the improvements are especially significant for more challenging datasets and networks. With negligible loss of accuracy, an iterative-pruning version gives a 10x reduction in model size and an 8x reduction in FLOPs on wide ResNet32.
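
To make the abstract's recipe concrete, the sketch below shows the core computation for a single fully connected layer: rotate the weights into the KFE using the eigenbases of K-FAC-style Kronecker factors, score each rotated weight with an OBD-style saliency (the Fisher is approximately diagonal in this basis, so saliencies reduce to cheap products of eigenvalues), and zero out the lowest-scoring entries. This is a minimal, illustrative NumPy sketch, not the authors' implementation; the factor estimates A and S and all function names are assumptions, and the paper's structured variant prunes whole rows and columns in the KFE rather than individual entries.

    import numpy as np

    # Illustrative sketch of OBD-style pruning in the Kronecker-factored
    # eigenbasis (KFE) for one fully connected layer. Assumes K-FAC-style
    # Kronecker factors have already been estimated:
    #   A: covariance of layer inputs       (in_dim  x in_dim)
    #   S: covariance of output gradients   (out_dim x out_dim)
    # All names are hypothetical, not the authors' code.

    def kfe_importance(W, A, S):
        """Rotate W into the KFE and score each rotated weight.

        In the KFE the Fisher is approximately diagonal, so the OBD
        saliency of W_kfe[i, j] is 0.5 * W_kfe[i, j]**2 * lam_S[i] * lam_A[j].
        """
        lam_A, Q_A = np.linalg.eigh(A)   # eigenbasis of the input factor
        lam_S, Q_S = np.linalg.eigh(S)   # eigenbasis of the gradient factor
        W_kfe = Q_S.T @ W @ Q_A          # reparameterize weights into the KFE
        saliency = 0.5 * W_kfe**2 * np.outer(lam_S, lam_A)
        return W_kfe, saliency, (Q_A, Q_S)

    def prune_in_kfe(W, A, S, sparsity=0.9):
        W_kfe, saliency, (Q_A, Q_S) = kfe_importance(W, A, S)
        thresh = np.quantile(saliency, sparsity)
        W_kfe[saliency < thresh] = 0.0   # damage the least important KFE weights
        return Q_S @ W_kfe @ Q_A.T       # rotate back to parameter coordinates

    # Toy usage with random stand-ins for a trained layer's statistics.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(32, 64))                    # out_dim x in_dim
    X = rng.normal(size=(1000, 64)); A = X.T @ X / 1000
    G = rng.normal(size=(1000, 32)); S = G.T @ G / 1000
    W_pruned = prune_in_kfe(W, A, S, sparsity=0.9)

Because the saliencies factor into per-row and per-column eigenvalue terms, scoring all weights costs only two eigendecompositions plus elementwise products, which is what makes pruning in this basis fast as well as accurate.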

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-wang19g,
  title     = {{E}igen{D}amage: Structured Pruning in the {K}ronecker-Factored Eigenbasis},
  author    = {Wang, Chaoqi and Grosse, Roger and Fidler, Sanja and Zhang, Guodong},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {6566--6575},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/wang19g/wang19g.pdf},
  url       = {https://proceedings.mlr.press/v97/wang19g.html},
  abstract  = {Reducing the test-time resource requirements of a neural network while preserving its test accuracy is crucial for running inference on resource-constrained devices. To achieve this goal, we introduce a novel network reparameterization based on the Kronecker-factored eigenbasis (KFE), and then apply Hessian-based structured pruning methods in this basis. As opposed to existing Hessian-based pruning algorithms, which prune in parameter coordinates, our method works in the KFE, where different weights are approximately independent, enabling accurate pruning and fast computation. We demonstrate the effectiveness of the proposed method empirically through extensive experiments. In particular, we highlight that the improvements are especially significant for more challenging datasets and networks. With negligible loss of accuracy, an iterative-pruning version gives a 10x reduction in model size and an 8x reduction in FLOPs on wide ResNet32.}
}
Endnote
%0 Conference Paper
%T EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis
%A Chaoqi Wang
%A Roger Grosse
%A Sanja Fidler
%A Guodong Zhang
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-wang19g
%I PMLR
%P 6566--6575
%U https://proceedings.mlr.press/v97/wang19g.html
%V 97
%X Reducing the test-time resource requirements of a neural network while preserving its test accuracy is crucial for running inference on resource-constrained devices. To achieve this goal, we introduce a novel network reparameterization based on the Kronecker-factored eigenbasis (KFE), and then apply Hessian-based structured pruning methods in this basis. As opposed to existing Hessian-based pruning algorithms, which prune in parameter coordinates, our method works in the KFE, where different weights are approximately independent, enabling accurate pruning and fast computation. We demonstrate the effectiveness of the proposed method empirically through extensive experiments. In particular, we highlight that the improvements are especially significant for more challenging datasets and networks. With negligible loss of accuracy, an iterative-pruning version gives a 10x reduction in model size and an 8x reduction in FLOPs on wide ResNet32.
APA
Wang, C., Grosse, R., Fidler, S. & Zhang, G. (2019). EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:6566-6575. Available from https://proceedings.mlr.press/v97/wang19g.html.