Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank

Liang Zhao, Siyu Liao, Yanzhi Wang, Zhe Li, Jian Tang, Bo Yuan
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:4082-4090, 2017.

Abstract

Recently, low displacement rank (LDR) matrices, also called structured matrices, have been proposed to compress large-scale neural networks. Empirical results have shown that neural networks whose weight matrices are LDR matrices, referred to as LDR neural networks, can achieve a significant reduction in space and computational complexity while retaining high accuracy. This paper gives a theoretical study of LDR neural networks. First, we prove the universal approximation property of LDR neural networks under a mild condition on the displacement operators. We then show that the error bounds of LDR neural networks are as efficient as those of general neural networks, for both single-layer and multi-layer structures. Finally, we propose a back-propagation-based training algorithm for general LDR neural networks.
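To make the low-displacement-rank notion concrete, below is a minimal NumPy sketch (an illustrative example, not code from the paper; the matrix size and shift-operator convention are our assumptions). It verifies the classical fact that an n×n Toeplitz matrix T has Sylvester displacement L(T) = Z_1 T − T Z_{−1} of rank at most 2, where Z_f denotes the f-unit-circulant shift matrix. This low rank is what allows an LDR weight matrix to be stored and applied with far fewer than n² parameters.

```python
import numpy as np

def shift_matrix(n, f):
    """f-unit-circulant Z_f: ones on the subdiagonal, f in the top-right corner."""
    Z = np.diag(np.ones(n - 1), k=-1)
    Z[0, n - 1] = f
    return Z

n = 8
# Random Toeplitz matrix: entry (i, j) depends only on the diagonal i - j,
# so T is described by 2n - 1 numbers rather than n^2.
t = np.random.randn(2 * n - 1)
T = np.array([[t[i - j + n - 1] for j in range(n)] for i in range(n)])

A = shift_matrix(n, 1.0)    # Z_1
B = shift_matrix(n, -1.0)   # Z_{-1}
disp = A @ T - T @ B        # Sylvester displacement L(T) = A T - T B

# The displacement is nonzero only in the first row and last column,
# so its rank is at most 2 (generically exactly 2).
print(np.linalg.matrix_rank(disp))
```

A layer with such a weight matrix need only store the low-rank factors of the displacement (the operators A and B are fixed), which is the parameter saving that the paper's approximation and error-bound results justify.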

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-zhao17b,
  title     = {Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank},
  author    = {Liang Zhao and Siyu Liao and Yanzhi Wang and Zhe Li and Jian Tang and Bo Yuan},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {4082--4090},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/zhao17b/zhao17b.pdf},
  url       = {https://proceedings.mlr.press/v70/zhao17b.html},
  abstract  = {Recently, low displacement rank (LDR) matrices, also called structured matrices, have been proposed to compress large-scale neural networks. Empirical results have shown that neural networks whose weight matrices are LDR matrices, referred to as LDR neural networks, can achieve a significant reduction in space and computational complexity while retaining high accuracy. This paper gives a theoretical study of LDR neural networks. First, we prove the universal approximation property of LDR neural networks under a mild condition on the displacement operators. We then show that the error bounds of LDR neural networks are as efficient as those of general neural networks, for both single-layer and multi-layer structures. Finally, we propose a back-propagation-based training algorithm for general LDR neural networks.}
}
Endnote
%0 Conference Paper
%T Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank
%A Liang Zhao
%A Siyu Liao
%A Yanzhi Wang
%A Zhe Li
%A Jian Tang
%A Bo Yuan
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-zhao17b
%I PMLR
%P 4082--4090
%U https://proceedings.mlr.press/v70/zhao17b.html
%V 70
%X Recently, low displacement rank (LDR) matrices, also called structured matrices, have been proposed to compress large-scale neural networks. Empirical results have shown that neural networks whose weight matrices are LDR matrices, referred to as LDR neural networks, can achieve a significant reduction in space and computational complexity while retaining high accuracy. This paper gives a theoretical study of LDR neural networks. First, we prove the universal approximation property of LDR neural networks under a mild condition on the displacement operators. We then show that the error bounds of LDR neural networks are as efficient as those of general neural networks, for both single-layer and multi-layer structures. Finally, we propose a back-propagation-based training algorithm for general LDR neural networks.
APA
Zhao, L., Liao, S., Wang, Y., Li, Z., Tang, J. & Yuan, B. (2017). Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:4082-4090. Available from https://proceedings.mlr.press/v70/zhao17b.html.
