Multidimensional Scaling: Approximation and Complexity

Erik Demaine, Adam Hesterberg, Frederic Koehler, Jayson Lynch, John Urschel
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:2568-2578, 2021.

Abstract

Metric multidimensional scaling (MDS) is a classical method for generating meaningful (non-linear) low-dimensional embeddings of high-dimensional data. MDS has a long history in the statistics, machine learning, and graph drawing communities. In particular, the Kamada-Kawai force-directed graph drawing method is equivalent to MDS and is one of the most popular ways in practice to embed graphs into low dimensions. Despite its ubiquity, our theoretical understanding of MDS remains limited, as its objective function is highly non-convex. In this paper, we prove that minimizing the Kamada-Kawai objective is NP-hard and give a provable approximation algorithm for optimizing it, which in particular is a PTAS on low-diameter graphs. We supplement this result with experiments suggesting possible connections between our greedy approximation algorithm and gradient-based methods.
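
For concreteness, the Kamada-Kawai objective referenced above is the weighted stress

    E(x_1, ..., x_n) = sum over i < j of (||x_i - x_j|| - d_ij)^2 / d_ij^2,

where d_ij is the shortest-path distance between vertices i and j in the graph. The sketch below evaluates this stress and minimizes it with plain gradient descent, i.e., the kind of gradient-based method the abstract alludes to; it is not the paper's greedy approximation algorithm, and the function names, step size, and iteration count (kk_stress, kk_gradient_descent, lr, steps) are illustrative assumptions.

    import numpy as np

    def kk_stress(X, D):
        # Kamada-Kawai stress: sum over i < j of (||x_i - x_j|| - d_ij)^2 / d_ij^2,
        # where D holds pairwise graph distances and X holds the embedding.
        n = len(D)
        total = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                total += (np.linalg.norm(X[i] - X[j]) - D[i, j]) ** 2 / D[i, j] ** 2
        return total

    def kk_gradient_descent(D, dim=2, steps=500, lr=0.01, seed=0):
        # Plain gradient descent on the stress from a random start.
        # (Step size and iteration count are arbitrary illustrative choices.)
        rng = np.random.default_rng(seed)
        n = len(D)
        X = rng.standard_normal((n, dim))
        for _ in range(steps):
            G = np.zeros_like(X)
            for i in range(n):
                for j in range(n):
                    if i == j:
                        continue
                    diff = X[i] - X[j]
                    dist = np.linalg.norm(diff) + 1e-9  # guard against division by zero
                    # Gradient of (dist - d_ij)^2 / d_ij^2 with respect to x_i.
                    G[i] += 2.0 * (dist - D[i, j]) / D[i, j] ** 2 * (diff / dist)
            X -= lr * G
        return X

    # Example: embed a 4-cycle into the plane using its shortest-path distances.
    D = np.array([[0., 1., 2., 1.],
                  [1., 0., 1., 2.],
                  [2., 1., 0., 1.],
                  [1., 2., 1., 0.]])
    X = kk_gradient_descent(D)
    print(kk_stress(X, D))

On this example the stress decreases but does not reach zero: the 4-cycle's shortest-path metric cannot be realized exactly in the plane. The non-convexity noted in the abstract shows up as sensitivity of the final layout to the random initialization.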

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-demaine21a,
  title     = {Multidimensional Scaling: Approximation and Complexity},
  author    = {Demaine, Erik and Hesterberg, Adam and Koehler, Frederic and Lynch, Jayson and Urschel, John},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {2568--2578},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/demaine21a/demaine21a.pdf},
  url       = {https://proceedings.mlr.press/v139/demaine21a.html}
}
Endnote
%0 Conference Paper
%T Multidimensional Scaling: Approximation and Complexity
%A Erik Demaine
%A Adam Hesterberg
%A Frederic Koehler
%A Jayson Lynch
%A John Urschel
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-demaine21a
%I PMLR
%P 2568--2578
%U https://proceedings.mlr.press/v139/demaine21a.html
%V 139
APA
Demaine, E., Hesterberg, A., Koehler, F., Lynch, J., & Urschel, J. (2021). Multidimensional Scaling: Approximation and Complexity. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:2568-2578. Available from https://proceedings.mlr.press/v139/demaine21a.html.