Homotopy Analysis for Tensor PCA

Anima Anandkumar, Yuan Deng, Rong Ge, Hossein Mobahi
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:79-104, 2017.

Abstract

Developing efficient and guaranteed nonconvex algorithms has been an important challenge in modern machine learning. Algorithms with good empirical performance, such as stochastic gradient descent, often lack theoretical guarantees. In this paper, we analyze the class of homotopy or continuation methods for global optimization of nonconvex functions. These methods start from an objective function that is efficient to optimize (e.g., convex) and progressively deform it into the required objective, passing solutions along the homotopy path. For the challenging problem of tensor PCA, we prove global convergence of the homotopy method in the “high noise” regime. The signal-to-noise requirement for our algorithm is tight in the sense that it matches the recovery guarantee of the best degree-4 sum-of-squares algorithm. In addition, we prove a phase transition along the homotopy path for tensor PCA. This allows us to simplify the homotopy method to a local search algorithm, viz., tensor power iterations, with a specific initialization and a noise injection procedure, while retaining the theoretical guarantees.
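The simplified algorithm the abstract describes is local search via tensor power iteration. The sketch below runs power iteration on the spiked tensor model T = tau * v⊗v⊗v + A; the random restarts, the noise-injection schedule (sigma), and the constants n and tau are illustrative assumptions for a toy demo, not the paper's exact initialization or analysis.

    import numpy as np

    def tensor_power_iteration(T, n_iters=100, sigma=0.0, rng=None):
        """One run of tensor power iteration on a 3rd-order tensor T.

        sigma > 0 adds Gaussian noise to each iterate; this stands in for
        the noise-injection step mentioned in the abstract, with an
        illustrative (not the paper's) schedule.
        """
        rng = np.random.default_rng() if rng is None else rng
        n = T.shape[0]
        x = rng.standard_normal(n)
        x /= np.linalg.norm(x)
        for _ in range(n_iters):
            # Contract T along two modes: y_i = sum_{j,k} T[i,j,k] x_j x_k
            y = np.einsum('ijk,j,k->i', T, x, x)
            y += sigma * rng.standard_normal(n)  # optional noise injection
            x = y / np.linalg.norm(y)
        return x

    def recover_spike(T, n_restarts=10, **kwargs):
        """Run several restarts; keep the iterate maximizing <T, x⊗x⊗x>."""
        best_x, best_val = None, -np.inf
        for _ in range(n_restarts):
            x = tensor_power_iteration(T, **kwargs)
            val = np.einsum('ijk,i,j,k->', T, x, x, x)
            if val > best_val:
                best_x, best_val = x, val
        return best_x

    # Spiked tensor model: T = tau * v⊗v⊗v + A, with i.i.d. N(0,1) noise
    # (noise left unsymmetrized for brevity).
    rng = np.random.default_rng(0)
    n = 50
    tau = n ** 1.5  # toy signal strength, well above the n^{3/4} scale
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    T = tau * np.einsum('i,j,k->ijk', v, v, v) + rng.standard_normal((n, n, n))
    x_hat = recover_spike(T, rng=rng)
    print("correlation with planted signal:", abs(x_hat @ v))

With random initialization, plain power iteration needs a much stronger signal than the n^{3/4} threshold discussed in the paper; the specific initialization and noise injection the authors derive from the homotopy analysis are what close that gap.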

Cite this Paper


BibTeX
@InProceedings{pmlr-v65-anandkumar17a,
  title     = {Homotopy Analysis for Tensor PCA},
  author    = {Anandkumar, Anima and Deng, Yuan and Ge, Rong and Mobahi, Hossein},
  booktitle = {Proceedings of the 2017 Conference on Learning Theory},
  pages     = {79--104},
  year      = {2017},
  editor    = {Kale, Satyen and Shamir, Ohad},
  volume    = {65},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--10 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v65/anandkumar17a/anandkumar17a.pdf},
  url       = {https://proceedings.mlr.press/v65/anandkumar17a.html}
}
Endnote
%0 Conference Paper
%T Homotopy Analysis for Tensor PCA
%A Anima Anandkumar
%A Yuan Deng
%A Rong Ge
%A Hossein Mobahi
%B Proceedings of the 2017 Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2017
%E Satyen Kale
%E Ohad Shamir
%F pmlr-v65-anandkumar17a
%I PMLR
%P 79--104
%U https://proceedings.mlr.press/v65/anandkumar17a.html
%V 65
APA
Anandkumar, A., Deng, Y., Ge, R. & Mobahi, H. (2017). Homotopy Analysis for Tensor PCA. Proceedings of the 2017 Conference on Learning Theory, in Proceedings of Machine Learning Research 65:79-104. Available from https://proceedings.mlr.press/v65/anandkumar17a.html.
