Tensor Decomposition via Joint Matrix Schur Decomposition

Nicolo Colombo, Nikos Vlassis
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:2820-2828, 2016.

Abstract

We describe an approach to tensor decomposition that involves extracting a set of observable matrices from the tensor and applying an approximate joint Schur decomposition on those matrices, and we establish the corresponding first-order perturbation bounds. We develop a novel iterative Gauss-Newton algorithm for joint matrix Schur decomposition, which minimizes a nonconvex objective over the manifold of orthogonal matrices, and which is guaranteed to converge to a global optimum under certain conditions. We empirically demonstrate that our algorithm is faster than, and at least as accurate and robust as, state-of-the-art algorithms for this problem.
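
To make the optimization problem concrete, below is a minimal numerical sketch of joint matrix Schur decomposition, assuming the standard formulation in which one seeks an orthogonal Q that makes every Q^T M_k Q as close to upper triangular as possible. The snippet uses a generic BFGS solver over a skew-symmetric parametrization of the orthogonal group rather than the manifold Gauss-Newton method developed in the paper, and all variable names are illustrative.

```python
# Minimal sketch (not the paper's algorithm): minimize the joint Schur residual
#   f(Q) = sum_k || strictly-lower-triangular part of Q^T M_k Q ||_F^2
# over orthogonal Q, parametrized as Q = expm(S) with S skew-symmetric.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

rng = np.random.default_rng(0)
d, K = 6, 4

# Synthetic input: matrices sharing an (unknown) orthogonal Schur basis U,
# i.e. M_k = U T_k U^T with T_k upper triangular, plus small noise.
U, _ = np.linalg.qr(rng.standard_normal((d, d)))
Ms = [U @ np.triu(rng.standard_normal((d, d))) @ U.T
      + 1e-6 * rng.standard_normal((d, d)) for _ in range(K)]

def skew(v):
    # Map d*(d-1)/2 free parameters to a skew-symmetric matrix.
    S = np.zeros((d, d))
    S[np.triu_indices(d, 1)] = v
    return S - S.T

def objective(v):
    # Joint Schur residual: energy below the diagonal after rotating by Q.
    Q = expm(skew(v))  # orthogonal by construction
    return sum(np.linalg.norm(np.tril(Q.T @ M @ Q, -1)) ** 2 for M in Ms)

# Generic quasi-Newton solve; the objective is nonconvex, so a plain solver
# started from the identity may only reach a local optimum.
res = minimize(objective, np.zeros(d * (d - 1) // 2), method="BFGS")
Q_hat = expm(skew(res.x))
print("off-triangular residual:", objective(res.x))
```

Because the objective is nonconvex, a generic solver offers no optimality guarantee; the paper's contribution is a Gauss-Newton scheme on the orthogonal manifold together with conditions under which it reaches a global optimum.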

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-colombo16,
  title = {Tensor Decomposition via Joint Matrix Schur Decomposition},
  author = {Colombo, Nicolo and Vlassis, Nikos},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages = {2820--2828},
  year = {2016},
  editor = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume = {48},
  series = {Proceedings of Machine Learning Research},
  address = {New York, New York, USA},
  month = {20--22 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v48/colombo16.pdf},
  url = {https://proceedings.mlr.press/v48/colombo16.html},
  abstract = {We describe an approach to tensor decomposition that involves extracting a set of observable matrices from the tensor and applying an approximate joint Schur decomposition on those matrices, and we establish the corresponding first-order perturbation bounds. We develop a novel iterative Gauss-Newton algorithm for joint matrix Schur decomposition, which minimizes a nonconvex objective over the manifold of orthogonal matrices, and which is guaranteed to converge to a global optimum under certain conditions. We empirically demonstrate that our algorithm is faster than, and at least as accurate and robust as, state-of-the-art algorithms for this problem.}
}
Endnote
%0 Conference Paper
%T Tensor Decomposition via Joint Matrix Schur Decomposition
%A Nicolo Colombo
%A Nikos Vlassis
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-colombo16
%I PMLR
%P 2820--2828
%U https://proceedings.mlr.press/v48/colombo16.html
%V 48
%X We describe an approach to tensor decomposition that involves extracting a set of observable matrices from the tensor and applying an approximate joint Schur decomposition on those matrices, and we establish the corresponding first-order perturbation bounds. We develop a novel iterative Gauss-Newton algorithm for joint matrix Schur decomposition, which minimizes a nonconvex objective over the manifold of orthogonal matrices, and which is guaranteed to converge to a global optimum under certain conditions. We empirically demonstrate that our algorithm is faster than, and at least as accurate and robust as, state-of-the-art algorithms for this problem.
RIS
TY - CPAPER
TI - Tensor Decomposition via Joint Matrix Schur Decomposition
AU - Nicolo Colombo
AU - Nikos Vlassis
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-colombo16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 2820
EP - 2828
L1 - http://proceedings.mlr.press/v48/colombo16.pdf
UR - https://proceedings.mlr.press/v48/colombo16.html
AB - We describe an approach to tensor decomposition that involves extracting a set of observable matrices from the tensor and applying an approximate joint Schur decomposition on those matrices, and we establish the corresponding first-order perturbation bounds. We develop a novel iterative Gauss-Newton algorithm for joint matrix Schur decomposition, which minimizes a nonconvex objective over the manifold of orthogonal matrices, and which is guaranteed to converge to a global optimum under certain conditions. We empirically demonstrate that our algorithm is faster than, and at least as accurate and robust as, state-of-the-art algorithms for this problem.
ER -
APA
Colombo, N. & Vlassis, N.. (2016). Tensor Decomposition via Joint Matrix Schur Decomposition. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:2820-2828 Available from https://proceedings.mlr.press/v48/colombo16.html.
