Majorization-Minimization for Manifold Embedding

Zhirong Yang, Jaakko Peltonen, Samuel Kaski
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:1088-1097, 2015.

Abstract

Nonlinear dimensionality reduction by manifold embedding has become a popular and powerful approach both for visualization and as preprocessing for predictive tasks, but more efficient optimization algorithms are still crucially needed. Majorization-Minimization (MM) is a promising approach that monotonically decreases the cost function, but it remains unknown how to tightly majorize the manifold embedding objective functions such that the resulting MM algorithms are efficient and robust. We propose a new MM procedure that yields fast MM algorithms for a wide variety of manifold embedding problems. In our majorization step, two parts of the cost function are respectively upper bounded by quadratic and Lipschitz surrogates, and the resulting upper bound can be minimized in closed form. For cost functions amenable to such QL-majorization, the MM yields monotonic improvement and is efficient: in experiments the newly developed MM algorithms outperform five state-of-the-art optimization approaches in manifold embedding tasks.
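
The abstract compresses the algorithmic idea into one sentence, so a schematic may help: at each iterate, a cost J(Y) = A(Y) + B(Y) is majorized by the sum of a quadratic surrogate for one part and a Lipschitz (gradient-plus-proximal) surrogate for the other, and the combined upper bound is minimized in closed form, guaranteeing monotone descent. The Python sketch below illustrates that generic QL-majorization pattern under our own assumptions; the function names (mm_embed, quad_majorize, lips_majorize) and the particular linear-solve update are illustrative choices for exposition, not the paper's actual derivation or API.

import numpy as np

def mm_embed(Y0, quad_majorize, lips_majorize, n_iters=100):
    """Schematic Majorization-Minimization loop (illustrative sketch).

    At the current iterate Y_t, the cost J(Y) = A(Y) + B(Y) is upper
    bounded by a surrogate G(Y | Y_t) >= J(Y) with G(Y_t | Y_t) = J(Y_t):
      - A(Y) gets a quadratic surrogate  (1/2) tr(Y^T L Y) + const,
      - B(Y) gets a Lipschitz surrogate  <g, Y - Y_t> + (rho/2) ||Y - Y_t||_F^2.
    Minimizing G in closed form gives the next iterate, and the
    majorization property yields monotone descent J(Y_{t+1}) <= J(Y_t).

    Assumed (hypothetical) callbacks:
      quad_majorize(Y) -> L         PSD matrix defining the quadratic bound
      lips_majorize(Y) -> (g, rho)  gradient of B at Y and a Lipschitz constant
    """
    Y = Y0.copy()
    n = Y.shape[0]
    for _ in range(n_iters):
        L = quad_majorize(Y)          # quadratic part of the surrogate
        g, rho = lips_majorize(Y)     # Lipschitz part around the current Y
        # Setting the surrogate's gradient to zero:
        #   L Y_new + g + rho (Y_new - Y) = 0
        #   => (L + rho I) Y_new = rho Y - g   (the closed-form update)
        Y = np.linalg.solve(L + rho * np.eye(n), rho * Y - g)
    return Y
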

Cite this Paper

BibTeX
@InProceedings{pmlr-v38-yang15a,
  title     = {{Majorization-Minimization for Manifold Embedding}},
  author    = {Yang, Zhirong and Peltonen, Jaakko and Kaski, Samuel},
  booktitle = {Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {1088--1097},
  year      = {2015},
  editor    = {Lebanon, Guy and Vishwanathan, S. V. N.},
  volume    = {38},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Diego, California, USA},
  month     = {09--12 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v38/yang15a.pdf},
  url       = {https://proceedings.mlr.press/v38/yang15a.html},
  abstract  = {Nonlinear dimensionality reduction by manifold embedding has become a popular and powerful approach both for visualization and as preprocessing for predictive tasks, but more efficient optimization algorithms are still crucially needed. Majorization-Minimization (MM) is a promising approach that monotonically decreases the cost function, but it remains unknown how to tightly majorize the manifold embedding objective functions such that the resulting MM algorithms are efficient and robust. We propose a new MM procedure that yields fast MM algorithms for a wide variety of manifold embedding problems. In our majorization step, two parts of the cost function are respectively upper bounded by quadratic and Lipschitz surrogates, and the resulting upper bound can be minimized in closed form. For cost functions amenable to such QL-majorization, the MM yields monotonic improvement and is efficient: in experiments the newly developed MM algorithms outperform five state-of-the-art optimization approaches in manifold embedding tasks.}
}
Endnote
%0 Conference Paper
%T Majorization-Minimization for Manifold Embedding
%A Zhirong Yang
%A Jaakko Peltonen
%A Samuel Kaski
%B Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2015
%E Guy Lebanon
%E S. V. N. Vishwanathan
%F pmlr-v38-yang15a
%I PMLR
%P 1088--1097
%U https://proceedings.mlr.press/v38/yang15a.html
%V 38
%X Nonlinear dimensionality reduction by manifold embedding has become a popular and powerful approach both for visualization and as preprocessing for predictive tasks, but more efficient optimization algorithms are still crucially needed. Majorization-Minimization (MM) is a promising approach that monotonically decreases the cost function, but it remains unknown how to tightly majorize the manifold embedding objective functions such that the resulting MM algorithms are efficient and robust. We propose a new MM procedure that yields fast MM algorithms for a wide variety of manifold embedding problems. In our majorization step, two parts of the cost function are respectively upper bounded by quadratic and Lipschitz surrogates, and the resulting upper bound can be minimized in closed form. For cost functions amenable to such QL-majorization, the MM yields monotonic improvement and is efficient: in experiments the newly developed MM algorithms outperform five state-of-the-art optimization approaches in manifold embedding tasks.
RIS
TY - CPAPER
TI - Majorization-Minimization for Manifold Embedding
AU - Zhirong Yang
AU - Jaakko Peltonen
AU - Samuel Kaski
BT - Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
DA - 2015/02/21
ED - Guy Lebanon
ED - S. V. N. Vishwanathan
ID - pmlr-v38-yang15a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 38
SP - 1088
EP - 1097
L1 - http://proceedings.mlr.press/v38/yang15a.pdf
UR - https://proceedings.mlr.press/v38/yang15a.html
AB - Nonlinear dimensionality reduction by manifold embedding has become a popular and powerful approach both for visualization and as preprocessing for predictive tasks, but more efficient optimization algorithms are still crucially needed. Majorization-Minimization (MM) is a promising approach that monotonically decreases the cost function, but it remains unknown how to tightly majorize the manifold embedding objective functions such that the resulting MM algorithms are efficient and robust. We propose a new MM procedure that yields fast MM algorithms for a wide variety of manifold embedding problems. In our majorization step, two parts of the cost function are respectively upper bounded by quadratic and Lipschitz surrogates, and the resulting upper bound can be minimized in closed form. For cost functions amenable to such QL-majorization, the MM yields monotonic improvement and is efficient: in experiments the newly developed MM algorithms outperform five state-of-the-art optimization approaches in manifold embedding tasks.
ER -
APA
Yang, Z., Peltonen, J. & Kaski, S. (2015). Majorization-Minimization for Manifold Embedding. Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 38:1088-1097. Available from https://proceedings.mlr.press/v38/yang15a.html.
