Theoretical Properties for Neural Networks with Weight Matrices of Low Displacement Rank
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:4082-4090, 2017.
Abstract
Recently, low displacement rank (LDR) matrices, also called structured matrices, have been proposed to compress large-scale neural networks. Empirical results have shown that neural networks with weight matrices of LDR matrices, referred to as LDR neural networks, can achieve significant reductions in space and computational complexity while retaining high accuracy. This paper gives a theoretical study of LDR neural networks. First, we prove the universal approximation property of LDR neural networks under a mild condition on the displacement operators. We then show that the error bounds of LDR neural networks are as efficient as those of general neural networks, for both single-layer and multiple-layer structures. Finally, we propose a backpropagation-based training algorithm for general LDR neural networks.
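To make the compression idea concrete, here is a minimal sketch (not code from the paper) using a circulant matrix, a classic example of an LDR matrix: an n-by-n circulant weight matrix is determined by its first column, so it stores n parameters instead of n², and its matrix-vector product costs O(n log n) via the FFT.

```python
import numpy as np

def circulant_matvec(c, x):
    """Multiply the circulant matrix with first column c by vector x.

    A circulant matrix has entries C[i, j] = c[(i - j) mod n], so C @ x is
    the circular convolution of c and x, computable via the FFT.
    """
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# Sanity check against the explicitly constructed dense circulant matrix.
n = 8
rng = np.random.default_rng(0)
c = rng.standard_normal(n)   # the n stored parameters of the "weight matrix"
x = rng.standard_normal(n)   # a layer input

C = np.column_stack([np.roll(c, k) for k in range(n)])  # dense n x n circulant
assert np.allclose(C @ x, circulant_matvec(c, x))
```

The same storage and fast-multiplication benefits hold for other LDR families (e.g. Toeplitz- or Vandermonde-like matrices), which is what motivates studying the approximation power of networks restricted to such weight matrices.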