Tensor Decomposition with Smoothness

Masaaki Imaizumi, Kohei Hayashi
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1597-1606, 2017.

Abstract

Real data tensors are usually high dimensional, but their intrinsic information is preserved in a low-dimensional space, which motivates the use of tensor decompositions such as the Tucker decomposition. Often, real data tensors are not only low dimensional but also smooth, meaning that adjacent elements are similar or change continuously, as is typical of spatial or temporal data. To incorporate this smoothness property, we propose the smoothed Tucker decomposition (STD). STD models the smoothness as a sum of a few basis functions, which reduces the number of parameters. The objective function is formulated as a convex problem and, to solve it, an algorithm based on the alternating direction method of multipliers is derived. We theoretically show that, under the smoothness assumption, STD achieves a better error bound. The theoretical result and the performance of STD are verified numerically.
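To make the basis-expansion idea concrete, below is a minimal NumPy sketch, not the authors' implementation: the Fourier basis, the dimensions, and helper names such as smooth_tucker are illustrative assumptions. It builds a Tucker-form tensor whose factor matrices lie in the span of a few smooth basis functions, so adjacent tensor elements vary smoothly.

    # Illustrative sketch (assumed setup, not the paper's STD algorithm):
    # a Tucker tensor X = core x_1 U1 x_2 U2 x_3 U3 where each factor
    # matrix U_k = B_k @ C_k is spanned by a few smooth basis functions.
    import numpy as np

    def fourier_basis(n, num_basis):
        """First num_basis Fourier basis functions sampled on n grid points."""
        t = np.linspace(0.0, 1.0, n)
        basis = [np.ones(n)]
        k = 1
        while len(basis) < num_basis:
            basis.append(np.sqrt(2) * np.cos(2 * np.pi * k * t))
            if len(basis) < num_basis:
                basis.append(np.sqrt(2) * np.sin(2 * np.pi * k * t))
            k += 1
        return np.stack(basis, axis=1)  # shape (n, num_basis)

    def smooth_tucker(core, coeffs, bases):
        """Reconstruct the tensor from a core and smooth factor matrices."""
        X = core
        for mode, (B, C) in enumerate(zip(bases, coeffs)):
            U = B @ C  # (n_k, r_k) factor matrix, smooth along the grid
            X = np.tensordot(U, X, axes=(1, mode))  # contract mode-th axis
            X = np.moveaxis(X, 0, mode)             # restore axis order
        return X

    rng = np.random.default_rng(0)
    shape, ranks, num_basis = (30, 40, 50), (2, 3, 2), 5
    bases = [fourier_basis(n, num_basis) for n in shape]
    coeffs = [rng.standard_normal((num_basis, r)) for r in ranks]
    core = rng.standard_normal(ranks)
    X = smooth_tucker(core, coeffs, bases)
    print(X.shape)  # (30, 40, 50); slices along each mode vary smoothly

This also illustrates the parameter reduction the abstract mentions: each factor matrix is parameterized by num_basis x rank coefficients instead of n x rank free entries, so the count drops whenever the number of basis functions is smaller than the mode dimension.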

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-imaizumi17a,
  title     = {Tensor Decomposition with Smoothness},
  author    = {Masaaki Imaizumi and Kohei Hayashi},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {1597--1606},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/imaizumi17a/imaizumi17a.pdf},
  url       = {https://proceedings.mlr.press/v70/imaizumi17a.html}
}
Endnote
%0 Conference Paper
%T Tensor Decomposition with Smoothness
%A Masaaki Imaizumi
%A Kohei Hayashi
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-imaizumi17a
%I PMLR
%P 1597--1606
%U https://proceedings.mlr.press/v70/imaizumi17a.html
%V 70
APA
Imaizumi, M. & Hayashi, K. (2017). Tensor Decomposition with Smoothness. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:1597-1606. Available from https://proceedings.mlr.press/v70/imaizumi17a.html.