Spectral Tensor Train Parameterization of Deep Learning Layers

Anton Obukhov, Maxim Rakhuba, Alexander Liniger, Zhiwu Huang, Stamatios Georgoulis, Dengxin Dai, Luc Van Gool
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:3547-3555, 2021.

Abstract

We study low-rank parameterizations of weight matrices with embedded spectral properties in the Deep Learning context. The low-rank property leads to parameter efficiency and permits taking computational shortcuts when computing mappings. Spectral properties are often subject to constraints in optimization problems, leading to better models and stability of optimization. We start by looking at the compact SVD parameterization of weight matrices and identifying redundancy sources in the parameterization. We further apply the Tensor Train (TT) decomposition to the compact SVD components, and propose a non-redundant differentiable parameterization of fixed TT-rank tensor manifolds, termed the Spectral Tensor Train Parameterization (STTP). We demonstrate the effects of neural network compression in the image classification setting, and both compression and improved training stability in the generative adversarial training setting. Project website: www.obukhov.ai/sttp
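
For intuition, here is a minimal PyTorch sketch of the compact SVD parameterization the abstract starts from: the weight is held as W = U diag(s) V^T with orthonormal factors U and V, so the singular values s are explicit and the rank-r structure yields the computational shortcut mentioned above. This is only an illustration of that starting point under our own naming, not the proposed STTP, which further compresses the factors with a Tensor Train decomposition.

import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import orthogonal

class CompactSVDLinear(nn.Module):
    """Rank-r linear map W = U diag(s) V^T (illustrative; not the paper's STTP)."""

    def __init__(self, in_features: int, out_features: int, rank: int):
        super().__init__()
        assert rank <= min(in_features, out_features)
        # The `orthogonal` parametrization keeps the columns of each
        # factor orthonormal throughout training.
        self.u = orthogonal(nn.Linear(rank, out_features, bias=False))  # U: (out, r)
        self.v = orthogonal(nn.Linear(rank, in_features, bias=False))   # V: (in, r)
        # Log-space singular values, so s = exp(log_s) stays positive.
        self.log_s = nn.Parameter(torch.zeros(rank))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Low-rank shortcut: never materialize the full (out, in) matrix;
        # cost is O((in + out) * r) per sample instead of O(in * out).
        z = x @ self.v.weight          # (batch, r)   = x V
        z = z * torch.exp(self.log_s)  # scale by singular values
        return z @ self.u.weight.t()   # (batch, out) = x V diag(s) U^T

layer = CompactSVDLinear(in_features=512, out_features=256, rank=16)
y = layer(torch.randn(8, 512))  # -> shape (8, 256)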

Cite this Paper

BibTeX
@InProceedings{pmlr-v130-obukhov21a,
  title     = {Spectral Tensor Train Parameterization of Deep Learning Layers},
  author    = {Obukhov, Anton and Rakhuba, Maxim and Liniger, Alexander and Huang, Zhiwu and Georgoulis, Stamatios and Dai, Dengxin and Van Gool, Luc},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {3547--3555},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/obukhov21a/obukhov21a.pdf},
  url       = {https://proceedings.mlr.press/v130/obukhov21a.html}
}
Endnote
%0 Conference Paper
%T Spectral Tensor Train Parameterization of Deep Learning Layers
%A Anton Obukhov
%A Maxim Rakhuba
%A Alexander Liniger
%A Zhiwu Huang
%A Stamatios Georgoulis
%A Dengxin Dai
%A Luc Van Gool
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-obukhov21a
%I PMLR
%P 3547--3555
%U https://proceedings.mlr.press/v130/obukhov21a.html
%V 130
APA
Obukhov, A., Rakhuba, M., Liniger, A., Huang, Z., Georgoulis, S., Dai, D. & Van Gool, L. (2021). Spectral Tensor Train Parameterization of Deep Learning Layers. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:3547-3555. Available from https://proceedings.mlr.press/v130/obukhov21a.html.
