Spectral Pruning for Recurrent Neural Networks

Takashi Furuya, Kazuma Suetake, Koichi Taniguchi, Hiroyuki Kusumoto, Ryuji Saiin, Tomohiro Daimon
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:3458-3482, 2022.

Abstract

Recurrent neural networks (RNNs) are a class of neural networks used for sequential tasks. In general, however, RNNs have a large number of parameters, and their computational cost grows as the recurrent structure is unrolled over many time steps. RNN pruning has attracted increasing attention in recent years as a way to overcome this difficulty, and its benefit in reduced computational cost compounds as the number of time steps increases. However, most existing RNN pruning methods are heuristic. The purpose of this paper is to develop a theoretically grounded pruning scheme for RNNs. We propose a pruning algorithm for RNNs inspired by "spectral pruning" and provide generalization error bounds for the compressed RNNs. We also present numerical experiments that support our theoretical results and demonstrate the effectiveness of our pruning method compared with existing methods.
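To make the idea concrete, the sketch below illustrates the spectral-pruning recipe on a vanilla RNN layer h_t = tanh(W h_{t-1} + U x_t + b): hidden units are selected by greedily minimizing the residual of linearly reconstructing the full hidden state from the selected subset, using the empirical covariance of hidden states collected over time steps, and the recurrent and readout weights are then composed with the reconstruction matrix. This is a minimal NumPy sketch under those assumptions; the function name, the greedy selection heuristic, and the ridge term are illustrative simplifications, not the authors' exact algorithm.

import numpy as np

def spectral_prune_rnn(W, U, b, W_out, hidden_states, k, ridge=1e-6):
    """Prune a vanilla RNN layer h_t = tanh(W h_{t-1} + U x_t + b) to k units.

    hidden_states: array of shape (num_samples, m), the hidden states h_t
    collected over training sequences and time steps. Illustrative sketch,
    not the paper's exact procedure.
    """
    m = W.shape[0]
    # Empirical (uncentered) covariance of the hidden state.
    Sigma = hidden_states.T @ hidden_states / hidden_states.shape[0]

    # Greedy selection: add the unit that most reduces the residual
    # trace of Sigma - Sigma[:, J] Sigma[J, J]^{-1} Sigma[J, :].
    J = []
    for _ in range(k):
        best, best_err = None, np.inf
        for j in range(m):
            if j in J:
                continue
            trial = J + [j]
            S_JJ = Sigma[np.ix_(trial, trial)] + ridge * np.eye(len(trial))
            S_FJ = Sigma[:, trial]
            err = np.trace(Sigma - S_FJ @ np.linalg.solve(S_JJ, S_FJ.T))
            if err < best_err:
                best, best_err = j, err
        J.append(best)
    J = sorted(J)

    # Reconstruction matrix A maps the pruned hidden state back to the
    # full one: h_t is approximated by A @ h_t[J].
    S_JJ = Sigma[np.ix_(J, J)] + ridge * np.eye(k)
    A = np.linalg.solve(S_JJ, Sigma[J, :]).T  # shape (m, k)

    # Compressed parameters: keep rows J, and compose the recurrent and
    # readout weights with A so the pruned dynamics approximate the original.
    W_c = W[J, :] @ A        # (k, k) recurrent weights
    U_c = U[J, :]            # (k, d) input weights
    b_c = b[J]               # (k,)   bias
    W_out_c = W_out @ A      # (o, k) readout weights
    return W_c, U_c, b_c, W_out_c, J

Here hidden_states would be gathered by running the trained RNN over the training data and stacking h_t across sequences and time steps; k controls the compressed hidden dimension, trading accuracy against cost at every time step.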

Cite this Paper

BibTeX
@InProceedings{pmlr-v151-furuya22a,
  title     = {Spectral Pruning for Recurrent Neural Networks},
  author    = {Furuya, Takashi and Suetake, Kazuma and Taniguchi, Koichi and Kusumoto, Hiroyuki and Saiin, Ryuji and Daimon, Tomohiro},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {3458--3482},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/furuya22a/furuya22a.pdf},
  url       = {https://proceedings.mlr.press/v151/furuya22a.html},
  abstract  = {Recurrent neural networks (RNNs) are a class of neural networks used for sequential tasks. In general, however, RNNs have a large number of parameters, and their computational cost grows as the recurrent structure is unrolled over many time steps. RNN pruning has attracted increasing attention in recent years as a way to overcome this difficulty, and its benefit in reduced computational cost compounds as the number of time steps increases. However, most existing RNN pruning methods are heuristic. The purpose of this paper is to develop a theoretically grounded pruning scheme for RNNs. We propose a pruning algorithm for RNNs inspired by "spectral pruning" and provide generalization error bounds for the compressed RNNs. We also present numerical experiments that support our theoretical results and demonstrate the effectiveness of our pruning method compared with existing methods.}
}
Endnote
%0 Conference Paper
%T Spectral Pruning for Recurrent Neural Networks
%A Takashi Furuya
%A Kazuma Suetake
%A Koichi Taniguchi
%A Hiroyuki Kusumoto
%A Ryuji Saiin
%A Tomohiro Daimon
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-furuya22a
%I PMLR
%P 3458--3482
%U https://proceedings.mlr.press/v151/furuya22a.html
%V 151
%X Recurrent neural networks (RNNs) are a class of neural networks used for sequential tasks. In general, however, RNNs have a large number of parameters, and their computational cost grows as the recurrent structure is unrolled over many time steps. RNN pruning has attracted increasing attention in recent years as a way to overcome this difficulty, and its benefit in reduced computational cost compounds as the number of time steps increases. However, most existing RNN pruning methods are heuristic. The purpose of this paper is to develop a theoretically grounded pruning scheme for RNNs. We propose a pruning algorithm for RNNs inspired by "spectral pruning" and provide generalization error bounds for the compressed RNNs. We also present numerical experiments that support our theoretical results and demonstrate the effectiveness of our pruning method compared with existing methods.
APA
Furuya, T., Suetake, K., Taniguchi, K., Kusumoto, H., Saiin, R. & Daimon, T. (2022). Spectral Pruning for Recurrent Neural Networks. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:3458-3482. Available from https://proceedings.mlr.press/v151/furuya22a.html.
