Tensor vs. Matrix Methods: Robust Tensor Decomposition under Block Sparse Perturbations

Anima Anandkumar, Prateek Jain, Yang Shi, U. N. Niranjan
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:268-276, 2016.

Abstract

Robust tensor CP decomposition involves decomposing a tensor into low rank and sparse components. We propose a novel non-convex iterative algorithm with guaranteed recovery. It alternates between low-rank CP decomposition through gradient ascent (a variant of the tensor power method), and hard thresholding of the residual. We prove convergence to the globally optimal solution under natural incoherence conditions on the low rank component, and bounded level of sparse perturbations. We compare our method with natural baselines, which apply robust matrix PCA either to the flattened tensor, or to the matrix slices of the tensor. Our method can provably handle a far greater level of perturbation when the sparse tensor is block-structured. This naturally occurs in many applications such as the foreground-background separation task in videos. Our experiments validate these findings. Thus, we establish that tensor methods can tolerate a higher level of gross corruptions compared to matrix methods.
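
To make the alternating procedure described in the abstract concrete, below is a minimal Python/NumPy sketch of this style of robust tensor decomposition: the low-rank part L is estimated by greedy rank-1 tensor power iterations on T - S, and the sparse part S is re-estimated by hard thresholding of the residual T - L. This is an illustrative sketch only, not the authors' algorithm as analyzed in the paper; the function names, the greedy deflation, the fixed threshold zeta, and the iteration counts are assumptions made here for clarity (the paper's method uses specific initialization, threshold schedules, and incoherence-dependent parameters).

import numpy as np

def tensor_power_iteration(T, n_iter=50):
    # Fit a single rank-1 term lam * (u outer v outer w) by alternating power updates.
    d1, d2, d3 = T.shape
    rng = np.random.default_rng()
    u, v, w = rng.standard_normal(d1), rng.standard_normal(d2), rng.standard_normal(d3)
    u, v, w = u / np.linalg.norm(u), v / np.linalg.norm(v), w / np.linalg.norm(w)
    for _ in range(n_iter):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w

def low_rank_cp(T, rank, n_iter=50):
    # Greedy rank-r CP approximation: repeatedly fit a rank-1 term and deflate.
    L = np.zeros_like(T)
    residual = T.copy()
    for _ in range(rank):
        lam, u, v, w = tensor_power_iteration(residual, n_iter)
        term = lam * np.einsum('i,j,k->ijk', u, v, w)
        L += term
        residual -= term
    return L

def hard_threshold(R, zeta):
    # Keep entries with magnitude above zeta as the sparse estimate; zero the rest.
    return np.where(np.abs(R) > zeta, R, 0.0)

def robust_tensor_decomposition(T, rank, zeta, n_outer=20):
    # Alternate: low-rank CP fit on T - S, then hard-threshold T - L to update S.
    T = np.asarray(T, dtype=float)
    S = np.zeros_like(T)
    L = np.zeros_like(T)
    for _ in range(n_outer):
        L = low_rank_cp(T - S, rank)
        S = hard_threshold(T - L, zeta)
    return L, S

For a d1 x d2 x d3 video tensor T (frames stacked along one mode), robust_tensor_decomposition(T, rank, zeta) would return a low-rank estimate L (background) and a sparse estimate S (foreground), mirroring the foreground-background separation application mentioned in the abstract.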

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-anandkumar16,
  title = {Tensor vs. Matrix Methods: Robust Tensor Decomposition under Block Sparse Perturbations},
  author = {Anandkumar, Anima and Jain, Prateek and Shi, Yang and Niranjan, U. N.},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages = {268--276},
  year = {2016},
  editor = {Gretton, Arthur and Robert, Christian C.},
  volume = {51},
  series = {Proceedings of Machine Learning Research},
  address = {Cadiz, Spain},
  month = {09--11 May},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v51/anandkumar16.pdf},
  url = {https://proceedings.mlr.press/v51/anandkumar16.html},
  abstract = {Robust tensor CP decomposition involves decomposing a tensor into low rank and sparse components. We propose a novel non-convex iterative algorithm with guaranteed recovery. It alternates between low-rank CP decomposition through gradient ascent (a variant of the tensor power method), and hard thresholding of the residual. We prove convergence to the globally optimal solution under natural incoherence conditions on the low rank component, and bounded level of sparse perturbations. We compare our method with natural baselines, which apply robust matrix PCA either to the {\em flattened} tensor, or to the matrix slices of the tensor. Our method can provably handle a far greater level of perturbation when the sparse tensor is block-structured. This naturally occurs in many applications such as the foreground-background separation task in videos. Our experiments validate these findings. Thus, we establish that tensor methods can tolerate a higher level of gross corruptions compared to matrix methods.}
}
Endnote
%0 Conference Paper
%T Tensor vs. Matrix Methods: Robust Tensor Decomposition under Block Sparse Perturbations
%A Anima Anandkumar
%A Prateek Jain
%A Yang Shi
%A U. N. Niranjan
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-anandkumar16
%I PMLR
%P 268--276
%U https://proceedings.mlr.press/v51/anandkumar16.html
%V 51
%X Robust tensor CP decomposition involves decomposing a tensor into low rank and sparse components. We propose a novel non-convex iterative algorithm with guaranteed recovery. It alternates between low-rank CP decomposition through gradient ascent (a variant of the tensor power method), and hard thresholding of the residual. We prove convergence to the globally optimal solution under natural incoherence conditions on the low rank component, and bounded level of sparse perturbations. We compare our method with natural baselines, which apply robust matrix PCA either to the flattened tensor, or to the matrix slices of the tensor. Our method can provably handle a far greater level of perturbation when the sparse tensor is block-structured. This naturally occurs in many applications such as the foreground-background separation task in videos. Our experiments validate these findings. Thus, we establish that tensor methods can tolerate a higher level of gross corruptions compared to matrix methods.
RIS
TY - CPAPER
TI - Tensor vs. Matrix Methods: Robust Tensor Decomposition under Block Sparse Perturbations
AU - Anima Anandkumar
AU - Prateek Jain
AU - Yang Shi
AU - U. N. Niranjan
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-anandkumar16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 268
EP - 276
L1 - http://proceedings.mlr.press/v51/anandkumar16.pdf
UR - https://proceedings.mlr.press/v51/anandkumar16.html
AB - Robust tensor CP decomposition involves decomposing a tensor into low rank and sparse components. We propose a novel non-convex iterative algorithm with guaranteed recovery. It alternates between low-rank CP decomposition through gradient ascent (a variant of the tensor power method), and hard thresholding of the residual. We prove convergence to the globally optimal solution under natural incoherence conditions on the low rank component, and bounded level of sparse perturbations. We compare our method with natural baselines, which apply robust matrix PCA either to the flattened tensor, or to the matrix slices of the tensor. Our method can provably handle a far greater level of perturbation when the sparse tensor is block-structured. This naturally occurs in many applications such as the foreground-background separation task in videos. Our experiments validate these findings. Thus, we establish that tensor methods can tolerate a higher level of gross corruptions compared to matrix methods.
ER -
APA
Anandkumar, A., Jain, P., Shi, Y. & Niranjan, U. N. (2016). Tensor vs. Matrix Methods: Robust Tensor Decomposition under Block Sparse Perturbations. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:268-276. Available from https://proceedings.mlr.press/v51/anandkumar16.html.
