Overcoming Saturation in Density Ratio Estimation by Iterated Regularization

Lukas Gruber, Markus Holzleitner, Johannes Lehner, Sepp Hochreiter, Werner Zellinger
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:16502-16529, 2024.

Abstract

Estimating the ratio of two probability densities from finitely many samples is a central task in machine learning and statistics. In this work, we show that a large class of kernel methods for density ratio estimation suffers from error saturation, which prevents algorithms from achieving fast error convergence rates on highly regular learning problems. To resolve saturation, we introduce iterated regularization in density ratio estimation to achieve fast error rates. Our methods outperform their non-iteratively regularized counterparts on benchmarks for density ratio estimation as well as on large-scale evaluations for importance-weighted ensembling of deep unsupervised domain adaptation models.
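To illustrate the idea, the Python sketch below applies iterated Tikhonov regularization to a uLSIF-style kernel least-squares density ratio estimator. This is a minimal illustration under assumed choices (Gaussian kernel, centers at the numerator samples, hand-picked hyperparameters sigma, lam, n_iter), not the authors' implementation; with n_iter = 1 it reduces to the usual singly regularized estimator, and larger n_iter gives an iterated scheme.

# Minimal sketch (not the authors' code): iterated Tikhonov regularization
# applied to a uLSIF-style kernel least-squares density ratio estimator.
# Assumptions: Gaussian kernel, basis centers at the numerator samples,
# hyperparameters chosen by hand for illustration only.
import numpy as np

def gaussian_kernel(X, C, sigma):
    """Pairwise Gaussian kernel matrix between rows of X and centers C."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def iterated_dre(x_nu, x_de, sigma=1.0, lam=1e-2, n_iter=3):
    """Estimate r = p_nu / p_de with n_iter-fold iterated Tikhonov regularization."""
    C = x_nu                                  # kernel centers
    K_de = gaussian_kernel(x_de, C, sigma)    # (n_de, n_centers)
    K_nu = gaussian_kernel(x_nu, C, sigma)    # (n_nu, n_centers)
    H = K_de.T @ K_de / x_de.shape[0]         # empirical second moment under the denominator density
    h = K_nu.mean(axis=0)                     # empirical first moment under the numerator density
    A = H + lam * np.eye(C.shape[0])
    theta = np.zeros(C.shape[0])
    for _ in range(n_iter):
        # iterated Tikhonov step: theta_t = (H + lam I)^{-1} (h + lam * theta_{t-1})
        theta = np.linalg.solve(A, h + lam * theta)
    return lambda x: gaussian_kernel(x, C, sigma) @ theta

# Toy usage: ratio of two 1-D Gaussians.
rng = np.random.default_rng(0)
x_nu = rng.normal(0.0, 1.0, size=(200, 1))   # samples from the numerator density
x_de = rng.normal(0.5, 1.2, size=(200, 1))   # samples from the denominator density
r_hat = iterated_dre(x_nu, x_de)
print(r_hat(np.zeros((1, 1))))               # estimated ratio at x = 0

Each pass re-solves the same regularized linear system with the previous solution fed back through the penalty term, which reduces the regularization bias on smooth (highly regular) ratios; this bias reduction is what allows error rates to improve beyond the saturation threshold of single-step regularization.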

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-gruber24b,
  title     = {Overcoming Saturation in Density Ratio Estimation by Iterated Regularization},
  author    = {Gruber, Lukas and Holzleitner, Markus and Lehner, Johannes and Hochreiter, Sepp and Zellinger, Werner},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {16502--16529},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/gruber24b/gruber24b.pdf},
  url       = {https://proceedings.mlr.press/v235/gruber24b.html},
  abstract  = {Estimating the ratio of two probability densities from finitely many samples, is a central task in machine learning and statistics. In this work, we show that a large class of kernel methods for density ratio estimation suffers from error saturation, which prevents algorithms from achieving fast error convergence rates on highly regular learning problems. To resolve saturation, we introduce iterated regularization in density ratio estimation to achieve fast error rates. Our methods outperform its non-iteratively regularized versions on benchmarks for density ratio estimation as well as on large-scale evaluations for importance-weighted ensembling of deep unsupervised domain adaptation models.}
}
Endnote
%0 Conference Paper
%T Overcoming Saturation in Density Ratio Estimation by Iterated Regularization
%A Lukas Gruber
%A Markus Holzleitner
%A Johannes Lehner
%A Sepp Hochreiter
%A Werner Zellinger
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-gruber24b
%I PMLR
%P 16502--16529
%U https://proceedings.mlr.press/v235/gruber24b.html
%V 235
%X Estimating the ratio of two probability densities from finitely many samples, is a central task in machine learning and statistics. In this work, we show that a large class of kernel methods for density ratio estimation suffers from error saturation, which prevents algorithms from achieving fast error convergence rates on highly regular learning problems. To resolve saturation, we introduce iterated regularization in density ratio estimation to achieve fast error rates. Our methods outperform its non-iteratively regularized versions on benchmarks for density ratio estimation as well as on large-scale evaluations for importance-weighted ensembling of deep unsupervised domain adaptation models.
APA
Gruber, L., Holzleitner, M., Lehner, J., Hochreiter, S. & Zellinger, W. (2024). Overcoming Saturation in Density Ratio Estimation by Iterated Regularization. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:16502-16529. Available from https://proceedings.mlr.press/v235/gruber24b.html.
