Fast Low-Rank Semidefinite Programming for Embedding and Clustering

Brian Kulis, Arun C. Surendran, John C. Platt
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:235-242, 2007.

Abstract

Many non-convex problems in machine learning, such as embedding and clustering, have been solved using convex semidefinite relaxations. These semidefinite programs (SDPs) are expensive to solve and are hence limited to very small data sets. In this paper we show how to improve both the quality and the speed of solving a number of these problems by casting them as low-rank SDPs and then directly solving them with a non-convex optimization algorithm. In particular, we show that problems such as k-means clustering and maximum variance unfolding (MVU) may be expressed exactly as low-rank SDPs and solved using our approach. We demonstrate that on the above problems our approach is significantly faster and far more scalable, and often produces better results, than traditional SDP relaxation techniques.

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-kulis07a,
  title     = {Fast Low-Rank Semidefinite Programming for Embedding and Clustering},
  author    = {Kulis, Brian and Surendran, Arun C. and Platt, John C.},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages     = {235--242},
  year      = {2007},
  editor    = {Meila, Marina and Shen, Xiaotong},
  volume    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Juan, Puerto Rico},
  month     = {21--24 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v2/kulis07a/kulis07a.pdf},
  url       = {https://proceedings.mlr.press/v2/kulis07a.html},
  abstract  = {Many non-convex problems in machine learning such as embedding and clustering have been solved using convex semidefinite relaxations. These semidefinite programs (SDPs) are expensive to solve and are hence limited to run on very small data sets. In this paper we show how we can improve the quality and speed of solving a number of these problems by casting them as low-rank SDPs and then directly solving them using a nonconvex optimization algorithm. In particular, we show that problems such as the k-means clustering and maximum variance unfolding (MVU) may be expressed exactly as low-rank SDPs and solved using our approach. We demonstrate that in the above problems our approach is significantly faster, far more scalable and often produces better results compared to traditional SDP relaxation techniques.}
}
Endnote
%0 Conference Paper
%T Fast Low-Rank Semidefinite Programming for Embedding and Clustering
%A Brian Kulis
%A Arun C. Surendran
%A John C. Platt
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-kulis07a
%I PMLR
%P 235--242
%U https://proceedings.mlr.press/v2/kulis07a.html
%V 2
%X Many non-convex problems in machine learning such as embedding and clustering have been solved using convex semidefinite relaxations. These semidefinite programs (SDPs) are expensive to solve and are hence limited to run on very small data sets. In this paper we show how we can improve the quality and speed of solving a number of these problems by casting them as low-rank SDPs and then directly solving them using a nonconvex optimization algorithm. In particular, we show that problems such as the k-means clustering and maximum variance unfolding (MVU) may be expressed exactly as low-rank SDPs and solved using our approach. We demonstrate that in the above problems our approach is significantly faster, far more scalable and often produces better results compared to traditional SDP relaxation techniques.
RIS
TY - CPAPER
TI - Fast Low-Rank Semidefinite Programming for Embedding and Clustering
AU - Brian Kulis
AU - Arun C. Surendran
AU - John C. Platt
BT - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
DA - 2007/03/11
ED - Marina Meila
ED - Xiaotong Shen
ID - pmlr-v2-kulis07a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 2
SP - 235
EP - 242
L1 - http://proceedings.mlr.press/v2/kulis07a/kulis07a.pdf
UR - https://proceedings.mlr.press/v2/kulis07a.html
AB - Many non-convex problems in machine learning such as embedding and clustering have been solved using convex semidefinite relaxations. These semidefinite programs (SDPs) are expensive to solve and are hence limited to run on very small data sets. In this paper we show how we can improve the quality and speed of solving a number of these problems by casting them as low-rank SDPs and then directly solving them using a nonconvex optimization algorithm. In particular, we show that problems such as the k-means clustering and maximum variance unfolding (MVU) may be expressed exactly as low-rank SDPs and solved using our approach. We demonstrate that in the above problems our approach is significantly faster, far more scalable and often produces better results compared to traditional SDP relaxation techniques.
ER -
APA
Kulis, B., Surendran, A. C., & Platt, J. C. (2007). Fast Low-Rank Semidefinite Programming for Embedding and Clustering. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 2:235-242. Available from https://proceedings.mlr.press/v2/kulis07a.html.