Dimensionality Reduction on Complex Vector Spaces for Euclidean Distance with Dynamic Weights

Simone Moretti, Paolo Pellizzoni, Francesco Silvestri
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:44854-44873, 2025.

Abstract

The weighted Euclidean norm $\|x\|_w$ of a vector $x\in \mathbb{R}^d$ with weights $w\in \mathbb{R}^d$ is the Euclidean norm where the contribution of each dimension is scaled by a given weight. Approaches to dimensionality reduction that satisfy the Johnson–Lindenstrauss (JL) lemma can easily be adapted to the weighted Euclidean distance if the weights are known and fixed: it suffices to scale each dimension of the input vectors according to the weights and then apply any standard approach. However, this is not the case when the weights are unknown at the time of the dimensionality reduction or may change dynamically. In this paper, we address this issue by providing a linear function that maps vectors into a smaller complex vector space and allows one to retrieve a JL-like estimate of the weighted Euclidean distance once the weights are revealed. Our results are based on the decomposition of the complex dimensionality reduction into several Rademacher chaos random variables, which are studied using novel concentration inequalities for sums of independent Rademacher chaoses.
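To make the fixed-weights adaptation mentioned in the abstract concrete, the following minimal Python sketch illustrates it under the assumed convention $\|x\|_w^2 = \sum_i w_i x_i^2$ with nonnegative weights (the paper's exact convention may differ): each coordinate is rescaled by $\sqrt{w_i}$ and a standard Gaussian JL projection is applied. The function names and the Gaussian projection are illustrative choices; this is not the paper's complex-valued construction for unknown or dynamic weights.

import numpy as np

def weighted_norm(x, w):
    # Weighted Euclidean norm under the assumed convention
    # ||x||_w^2 = sum_i w_i * x_i^2, with nonnegative weights w.
    return np.sqrt(np.sum(w * x**2))

def jl_with_known_weights(X, w, k, seed=None):
    # Fixed-weights adaptation sketched in the abstract: rescale each
    # coordinate by sqrt(w_i), then apply a standard JL map (here a
    # Gaussian random projection, one of many valid choices).
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    P = rng.normal(0.0, 1.0, size=(d, k)) / np.sqrt(k)  # JL projection matrix
    return (X * np.sqrt(w)) @ P

# Toy usage: pairwise weighted distances are approximately preserved.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 1000))
w = rng.uniform(0.0, 1.0, size=1000)   # nonnegative weights, known in advance
Y = jl_with_known_weights(X, w, k=256, seed=1)
true_dist = weighted_norm(X[0] - X[1], w)
approx_dist = np.linalg.norm(Y[0] - Y[1])
print(true_dist, approx_dist)  # the two values should be close

This only works because the weights are available before the projection; the paper's contribution is precisely the setting in which this rescaling cannot be applied up front.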

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-moretti25a,
  title     = {Dimensionality Reduction on Complex Vector Spaces for {E}uclidean Distance with Dynamic Weights},
  author    = {Moretti, Simone and Pellizzoni, Paolo and Silvestri, Francesco},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {44854--44873},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/moretti25a/moretti25a.pdf},
  url       = {https://proceedings.mlr.press/v267/moretti25a.html},
  abstract  = {The weighted Euclidean norm $||x||_w$ of a vector $x\in \mathbb{R}^d$ with weights $w\in \mathbb{R}^d$ is the Euclidean norm where the contribution of each dimension is scaled by a given weight. Approaches to dimensionality reduction that satisfy the Johnson–Lindenstrauss (JL) lemma can be easily adapted to the weighted Euclidean distance if weights are known and fixed: it suffices to scale each dimension of the input vectors according to the weights, and then apply any standard approach. However, this is not the case when weights are unknown during the dimensionality reduction or might dynamically change. In this paper, we address this issue by providing a linear function that maps vectors into a smaller complex vector space and allows to retrieve a JL-like estimate for the weighted Euclidean distance once weights are revealed. Our results are based on the decomposition of the complex dimensionality reduction into several Rademacher chaos random variables, which are studied using novel concentration inequalities for sums of independent Rademacher chaoses.}
}
Endnote
%0 Conference Paper
%T Dimensionality Reduction on Complex Vector Spaces for Euclidean Distance with Dynamic Weights
%A Simone Moretti
%A Paolo Pellizzoni
%A Francesco Silvestri
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-moretti25a
%I PMLR
%P 44854--44873
%U https://proceedings.mlr.press/v267/moretti25a.html
%V 267
%X The weighted Euclidean norm $||x||_w$ of a vector $x\in \mathbb{R}^d$ with weights $w\in \mathbb{R}^d$ is the Euclidean norm where the contribution of each dimension is scaled by a given weight. Approaches to dimensionality reduction that satisfy the Johnson–Lindenstrauss (JL) lemma can be easily adapted to the weighted Euclidean distance if weights are known and fixed: it suffices to scale each dimension of the input vectors according to the weights, and then apply any standard approach. However, this is not the case when weights are unknown during the dimensionality reduction or might dynamically change. In this paper, we address this issue by providing a linear function that maps vectors into a smaller complex vector space and allows to retrieve a JL-like estimate for the weighted Euclidean distance once weights are revealed. Our results are based on the decomposition of the complex dimensionality reduction into several Rademacher chaos random variables, which are studied using novel concentration inequalities for sums of independent Rademacher chaoses.
APA
Moretti, S., Pellizzoni, P., & Silvestri, F. (2025). Dimensionality Reduction on Complex Vector Spaces for Euclidean Distance with Dynamic Weights. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:44854-44873. Available from https://proceedings.mlr.press/v267/moretti25a.html.