StableMDS: A Novel Gradient Descent-Based Method for Stabilizing and Accelerating Weighted Multidimensional Scaling

Zhongxi Fang, Xun Su, Tomohisa Tabuchi, Jianming Huang, Hiroyuki Kasai
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:1981-1989, 2025.

Abstract

Multidimensional Scaling (MDS) is an essential technique in multivariate analysis, with Weighted MDS (WMDS) commonly employed for tasks such as dimensionality reduction and graph drawing. However, the optimization of WMDS poses significant challenges due to the highly non-convex nature of its objective function. Stress Majorization, an instance of the Majorization-Minimization framework, is among the most widely used solvers for this problem because it guarantees non-increasing loss values during optimization, even with a non-convex objective function. Despite this advantage, Stress Majorization suffers from high computational complexity, specifically $\mathcal{O}(\max(n^3, n^2 p))$ per iteration, where $n$ denotes the number of data points, and $p$ represents the projection dimension, rendering it impractical for large-scale data analysis. To mitigate this computational challenge, we introduce StableMDS, a novel gradient descent-based method that reduces the computational complexity to $\mathcal{O}(n^2 p)$ per iteration. StableMDS achieves this computational efficiency by applying gradient descent independently to each point, thereby eliminating the need for costly matrix operations inherent in Stress Majorization. Furthermore, we theoretically ensure non-increasing loss values and optimization stability akin to Stress Majorization. These advancements not only enhance computational efficiency but also maintain stability, thereby broadening the applicability of WMDS to larger datasets.
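For context, WMDS minimizes the weighted stress $\sigma(X) = \sum_{i<j} w_{ij} (\lVert x_i - x_j \rVert - \delta_{ij})^2$, where $\delta_{ij}$ are target dissimilarities, $w_{ij} \geq 0$ are weights, and $x_i \in \mathbb{R}^p$ are the embedded points; the $\mathcal{O}(n^3)$ term in Stress Majorization stems from the pseudoinverse of a weighted Laplacian in its standard (SMACOF-style) update. The sketch below is a minimal NumPy illustration of the per-point gradient-descent structure behind the $\mathcal{O}(n^2 p)$ per-iteration cost; it is not the authors' StableMDS update, which additionally derives step sizes that provably keep the loss non-increasing. This sketch instead uses an illustrative fixed learning rate, and all function and variable names are hypothetical.

import numpy as np

def wmds_gradient_step(X, D, W, lr=1e-3, eps=1e-12):
    # One gradient step on sigma(X) = sum_{i<j} w_ij (||x_i - x_j|| - delta_ij)^2.
    # X: (n, p) embedding; D: (n, n) target dissimilarities; W: (n, n) symmetric weights.
    # Cost is O(n^2 p): pairwise differences only, no O(n^3) matrix (pseudo)inversion.
    diff = X[:, None, :] - X[None, :, :]               # (n, n, p), diff[i, j] = x_i - x_j
    dist = np.linalg.norm(diff, axis=-1)               # (n, n) pairwise distances
    coef = W * (dist - D) / (dist + eps)               # w_ij (d_ij - delta_ij) / d_ij
    np.fill_diagonal(coef, 0.0)                        # drop i = j self-terms
    grad = 2.0 * np.einsum('ij,ijk->ik', coef, diff)   # gradient w.r.t. each point x_i
    return X - lr * grad

# Toy usage: embed 100 points from a 5-D cloud into p = 2 dimensions.
rng = np.random.default_rng(0)
P = rng.standard_normal((100, 5))
D = np.linalg.norm(P[:, None] - P[None, :], axis=-1)   # target dissimilarities
W = np.ones_like(D)                                    # uniform weights
X = rng.standard_normal((100, 2))                      # random initialization
for _ in range(500):
    X = wmds_gradient_step(X, D, W)

With a fixed learning rate, plain gradient descent of this kind can oscillate or diverge on the non-convex stress landscape; the paper's contribution is precisely a per-point step-size rule that recovers Stress Majorization's monotone-descent guarantee at this $\mathcal{O}(n^2 p)$ cost.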

Cite this Paper

BibTeX
@InProceedings{pmlr-v258-fang25a,
  title     = {StableMDS: A Novel Gradient Descent-Based Method for Stabilizing and Accelerating Weighted Multidimensional Scaling},
  author    = {Fang, Zhongxi and Su, Xun and Tabuchi, Tomohisa and Huang, Jianming and Kasai, Hiroyuki},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {1981--1989},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/fang25a/fang25a.pdf},
  url       = {https://proceedings.mlr.press/v258/fang25a.html}
}
Endnote
%0 Conference Paper
%T StableMDS: A Novel Gradient Descent-Based Method for Stabilizing and Accelerating Weighted Multidimensional Scaling
%A Zhongxi Fang
%A Xun Su
%A Tomohisa Tabuchi
%A Jianming Huang
%A Hiroyuki Kasai
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-fang25a
%I PMLR
%P 1981--1989
%U https://proceedings.mlr.press/v258/fang25a.html
%V 258
APA
Fang, Z., Su, X., Tabuchi, T., Huang, J. & Kasai, H. (2025). StableMDS: A Novel Gradient Descent-Based Method for Stabilizing and Accelerating Weighted Multidimensional Scaling. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:1981-1989. Available from https://proceedings.mlr.press/v258/fang25a.html.

Related Material

Download PDF: https://raw.githubusercontent.com/mlresearch/v258/main/assets/fang25a/fang25a.pdf