Fully Dynamic Embedding into $\ell_p$ Spaces

Kiarash Banihashem, Xiang Chen, Mohammadtaghi Hajiaghayi, Sungchul Kim, Kanak Mahadik, Ryan A. Rossi, Tong Yu
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:2894-2909, 2025.

Abstract

Metric embeddings are fundamental in machine learning, enabling similarity search, dimensionality reduction, and representation learning. They underpin modern architectures like transformers and large language models, facilitating scalable training and improved generalization. Theoretically, the classic problem in embedding design is mapping arbitrary metrics into $\ell_p$ spaces while approximately preserving pairwise distances. We study this problem in a fully dynamic setting, where the underlying metric is a graph metric subject to edge insertions and deletions. Our goal is to maintain an efficient embedding after each update. We present the first fully dynamic algorithm for this problem, achieving $O(\log(n))^{2q} O(\log(nW))^{q-1}$ expected distortion with $O(m^{1/q + o(1)})$ update time and $O(q \log(n) \log(nW))$ query time, where $q \ge 2$ is an integer parameter.
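The notion of distortion the abstract refers to, and the classic static route into $\ell_p$ spaces, can be illustrated with a small sketch. The following is a toy Bourgain-style randomized embedding of a graph metric into $\ell_2$, with an empirical distortion check; it is NOT the paper's dynamic algorithm (which supports edge insertions/deletions with sublinear update time), only a static illustration of what "expected distortion" measures. All function names and parameter choices here are illustrative assumptions.

```python
import math
import random

def shortest_paths(n, edges):
    """All-pairs shortest paths (Floyd-Warshall) on an unweighted graph."""
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v in edges:
        d[u][v] = d[v][u] = 1
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

def bourgain_embed(n, dist, rng):
    """Bourgain-style embedding: each coordinate is the distance to a
    random vertex subset, sampled at O(log n) density scales."""
    coords = []
    levels = int(math.log2(n)) + 1
    for scale in range(1, levels + 1):
        for _ in range(levels):  # repetitions per scale
            S = {v for v in range(n) if rng.random() < 2.0 ** (-scale)}
            if S:
                # Distance to a set is 1-Lipschitz, so no pair expands much.
                coords.append([min(dist[u][s] for s in S) for u in range(n)])
    # Transpose: one coordinate vector per vertex.
    return [[c[u] for c in coords] for u in range(n)]

def distortion(dist, emb):
    """Worst-case expansion times worst-case contraction over all pairs."""
    expand = contract = 0.0
    n = len(emb)
    for u in range(n):
        for v in range(u + 1, n):
            d_emb = sum((a - b) ** 2 for a, b in zip(emb[u], emb[v])) ** 0.5
            if d_emb == 0:
                return float("inf")  # two distinct vertices collapsed
            ratio = d_emb / dist[u][v]
            expand = max(expand, ratio)
            contract = max(contract, 1.0 / ratio)
    return expand * contract

rng = random.Random(0)
n = 8
edges = [(i, (i + 1) % n) for i in range(n)]  # an 8-cycle as the graph metric
dist = shortest_paths(n, edges)
emb = bourgain_embed(n, dist, rng)
print(f"empirical distortion: {distortion(dist, emb):.2f}")
```

In the fully dynamic setting studied in the paper, the difficulty is that a single edge update can change many shortest-path distances at once, so the embedding must be maintained without recomputing it from scratch; the static sketch above would cost $\Omega(n^3)$ per update.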

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-banihashem25a,
  title     = {Fully Dynamic Embedding into $\ell_p$ Spaces},
  author    = {Banihashem, Kiarash and Chen, Xiang and Hajiaghayi, Mohammadtaghi and Kim, Sungchul and Mahadik, Kanak and Rossi, Ryan A. and Yu, Tong},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {2894--2909},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/banihashem25a/banihashem25a.pdf},
  url       = {https://proceedings.mlr.press/v267/banihashem25a.html},
  abstract  = {Metric embeddings are fundamental in machine learning, enabling similarity search, dimensionality reduction, and representation learning. They underpin modern architectures like transformers and large language models, facilitating scalable training and improved generalization. Theoretically, the classic problem in embedding design is mapping arbitrary metrics into $\ell_p$ spaces while approximately preserving pairwise distances. We study this problem in a fully dynamic setting, where the underlying metric is a graph metric subject to edge insertions and deletions. Our goal is to maintain an efficient embedding after each update. We present the first fully dynamic algorithm for this problem, achieving $O(\log(n))^{2q} O(\log(nW))^{q-1}$ expected distortion with $O(m^{1/q + o(1)})$ update time and $O(q \log(n) \log(nW))$ query time, where $q \ge 2$ is an integer parameter.}
}
Endnote
%0 Conference Paper
%T Fully Dynamic Embedding into $\ell_p$ Spaces
%A Kiarash Banihashem
%A Xiang Chen
%A Mohammadtaghi Hajiaghayi
%A Sungchul Kim
%A Kanak Mahadik
%A Ryan A. Rossi
%A Tong Yu
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-banihashem25a
%I PMLR
%P 2894--2909
%U https://proceedings.mlr.press/v267/banihashem25a.html
%V 267
%X Metric embeddings are fundamental in machine learning, enabling similarity search, dimensionality reduction, and representation learning. They underpin modern architectures like transformers and large language models, facilitating scalable training and improved generalization. Theoretically, the classic problem in embedding design is mapping arbitrary metrics into $\ell_p$ spaces while approximately preserving pairwise distances. We study this problem in a fully dynamic setting, where the underlying metric is a graph metric subject to edge insertions and deletions. Our goal is to maintain an efficient embedding after each update. We present the first fully dynamic algorithm for this problem, achieving $O(\log(n))^{2q} O(\log(nW))^{q-1}$ expected distortion with $O(m^{1/q + o(1)})$ update time and $O(q \log(n) \log(nW))$ query time, where $q \ge 2$ is an integer parameter.
APA
Banihashem, K., Chen, X., Hajiaghayi, M., Kim, S., Mahadik, K., Rossi, R. A., & Yu, T. (2025). Fully Dynamic Embedding into $\ell_p$ Spaces. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:2894-2909. Available from https://proceedings.mlr.press/v267/banihashem25a.html.