Metric Learning in an RKHS

Gokcan Tatli, Yi Chen, Blake Mason, Robert D Nowak, Ramya Korlakai Vinayak
Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, PMLR 286:4145-4164, 2025.

Abstract

This paper investigates metric learning in a Reproducing Kernel Hilbert Space (RKHS) based on a set of random triplet comparisons of the form *"Do you think item h is more similar to item i or item j?"*, which indicate similarities and differences between items. The goal is to learn a metric in the RKHS that is consistent with these comparisons. Nonlinear metric learning using kernel methods and neural networks has shown great empirical promise. However, while previous works have addressed certain aspects of this problem, there is little theoretical understanding of such methods, with one exception: the special (linear) case in which the RKHS is the standard $d$-dimensional Euclidean space, for which a comprehensive theory of metric learning exists. This paper develops a general RKHS framework for metric learning and provides novel generalization guarantees and sample complexity bounds. We validate our findings through simulations and experiments on real datasets. Our code is publicly available at https://github.com/RamyaLab/metric-learning-RKHS.
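To make the triplet setup concrete, below is a minimal, self-contained sketch of kernelized metric learning from triplet comparisons. It is not the authors' implementation (see the linked repository for that): the Gaussian kernel, the landmark-based empirical kernel map, the synthetic triplet labels, and the hinge loss trained by plain gradient descent are all illustrative assumptions, and the finite-dimensional feature map is only a rough stand-in for the RKHS setting analyzed in the paper.

```python
# A minimal sketch of triplet-based metric learning with a kernel feature map.
# This is NOT the paper's implementation; it only illustrates the setup in the
# abstract: given triplets (h, i, j) labeled "h is more similar to i than to
# j", learn a metric so that d(h, i) < d(h, j). All specifics below (Gaussian
# kernel, landmark features, hinge loss, gradient descent) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(X, Z, gamma=1.0):
    """Pairwise k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Toy data: n items in R^2; m landmarks define the empirical kernel map
# phi(x) = [k(x, z_1), ..., k(x, z_m)], a finite-dimensional stand-in for
# the RKHS feature map.
n, m = 60, 20
X = rng.normal(size=(n, 2))
Z = X[rng.choice(n, m, replace=False)]
Phi = gaussian_kernel(X, Z)                      # shape (n, m)

# Simulate triplet answers from a "true" nonlinear similarity (here, the
# first kernel feature) so there is a consistent signal to recover.
def sample_triplets(num):
    T = rng.choice(n, size=(num, 3))
    T = T[(T[:, 0] != T[:, 1]) & (T[:, 0] != T[:, 2]) & (T[:, 1] != T[:, 2])]
    closer_to_j = (np.abs(Phi[T[:, 0], 0] - Phi[T[:, 1], 0])
                   > np.abs(Phi[T[:, 0], 0] - Phi[T[:, 2], 0]))
    T[closer_to_j] = T[closer_to_j][:, [0, 2, 1]]  # column 1 = closer item
    return T

triplets = sample_triplets(500)

# Learn a Mahalanobis metric M = L^T L on the kernel features; writing M
# through L keeps it positive semidefinite by construction. Minimize the
# triplet hinge loss max(0, margin + d(h,i)^2 - d(h,j)^2) over L.
L = 0.1 * np.eye(m)
lr, margin = 0.05, 0.1
for _ in range(200):
    H, I, J = (Phi[triplets[:, c]] @ L.T for c in range(3))
    viol = margin + ((H - I) ** 2).sum(1) - ((H - J) ** 2).sum(1) > 0
    Dhi = Phi[triplets[viol, 0]] - Phi[triplets[viol, 1]]
    Dhj = Phi[triplets[viol, 0]] - Phi[triplets[viol, 2]]
    # d/dL ||L v||^2 = 2 L v v^T, summed over the violated triplets
    grad = 2 * (L @ (Dhi.T @ Dhi - Dhj.T @ Dhj))
    L -= lr * grad / max(viol.sum(), 1)

# Fraction of training triplets the learned metric orders correctly.
H, I, J = (Phi[triplets[:, c]] @ L.T for c in range(3))
acc = (((H - I) ** 2).sum(1) < ((H - J) ** 2).sum(1)).mean()
print(f"triplet accuracy under learned metric: {acc:.2f}")
```

Parametrizing the metric through L rather than M directly sidesteps the PSD-projection step used in many metric-learning solvers; the paper itself works with operators on the RKHS, for which this finite sketch is only a rough analogue.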

Cite this Paper

BibTeX
@InProceedings{pmlr-v286-tatli25a,
  title     = {Metric Learning in an RKHS},
  author    = {Tatli, Gokcan and Chen, Yi and Mason, Blake and Nowak, Robert D and Vinayak, Ramya Korlakai},
  booktitle = {Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence},
  pages     = {4145--4164},
  year      = {2025},
  editor    = {Chiappa, Silvia and Magliacane, Sara},
  volume    = {286},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--25 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v286/main/assets/tatli25a/tatli25a.pdf},
  url       = {https://proceedings.mlr.press/v286/tatli25a.html}
}
Endnote
%0 Conference Paper
%T Metric Learning in an RKHS
%A Gokcan Tatli
%A Yi Chen
%A Blake Mason
%A Robert D Nowak
%A Ramya Korlakai Vinayak
%B Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2025
%E Silvia Chiappa
%E Sara Magliacane
%F pmlr-v286-tatli25a
%I PMLR
%P 4145--4164
%U https://proceedings.mlr.press/v286/tatli25a.html
%V 286
APA
Tatli, G., Chen, Y., Mason, B., Nowak, R. D., & Vinayak, R. K. (2025). Metric Learning in an RKHS. Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 286:4145-4164. Available from https://proceedings.mlr.press/v286/tatli25a.html.
