Density Ratio Estimation with Doubly Strong Robustness

Ryosuke Nagumo, Hironori Fujisawa
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:37260-37276, 2024.

Abstract

We develop two density ratio estimation (DRE) methods with robustness to outliers. These are based on a divergence with a weight function that weakens the adverse effects of outliers. One is based on the unnormalized Kullback-Leibler divergence, called Weighted DRE, and its optimization is a convex problem. The other is based on the γ-divergence, called γ-DRE, which addresses a normalizing-term problem of Weighted DRE. Its optimization is a DC (Difference of Convex functions) problem and needs more computation than a convex problem. These methods have doubly strong robustness, meaning robustness to heavy contamination of both the reference and target distributions. Numerical experiments show that our proposals are more robust than the previous methods.
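To make the idea concrete, here is a minimal illustrative sketch of weighted density ratio estimation in the general spirit described above, not the paper's exact estimator. It fits a kernel model of the ratio r(x) = p(x)/q(x) by minimizing an unnormalized-KL-type objective E_q[r] - E_p[w·log r], where the weight function `weight` is a hypothetical stand-in chosen here to downweight points far from the bulk of the reference sample; all names and choices (kernel centers, bandwidth, learning rate) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target sample from p = N(0, 1), contaminated with a few outliers near x = 8;
# reference sample from q = N(0.5, 1).
x_p = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(8.0, 0.5, 10)])
x_q = rng.normal(0.5, 1.0, 200)

# Kernel model for the ratio: r(x) = sum_j alpha_j * k(x, c_j), alpha_j >= 0.
centers = np.linspace(-3.0, 3.0, 20)

def features(x, sigma=0.7):
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * sigma ** 2))

Phi_p, Phi_q = features(x_p), features(x_q)

# Hypothetical weight function: downweights target points far from the bulk
# of the reference sample (a crude stand-in for a divergence-based weight).
def weight(x):
    z = (x - np.median(x_q)) / np.std(x_q)
    return 1.0 / (1.0 + z ** 2)

w_p = weight(x_p)

# Minimize E_q[r] - E_p[w * log r] over alpha >= 0 (convex in alpha)
# by projected gradient descent.
alpha = np.full(len(centers), 0.1)
lr = 0.05
for _ in range(500):
    r_p = np.clip(Phi_p @ alpha, 1e-8, None)
    grad = Phi_q.mean(axis=0) - ((w_p / r_p)[:, None] * Phi_p).mean(axis=0)
    alpha = np.clip(alpha - lr * grad, 1e-8, None)

# Estimated ratio at x = 0; the true unweighted ratio p(0)/q(0) is about 1.13.
r_hat = float(features(np.array([0.0])) @ alpha)
print(r_hat)
```

Because the outliers at x ≈ 8 receive small weights and lie outside the kernel support, they barely perturb the fitted ratio on the bulk of the data; without the weight, the objective would try to explain them and distort the estimate.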

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-nagumo24a,
  title     = {Density Ratio Estimation with Doubly Strong Robustness},
  author    = {Nagumo, Ryosuke and Fujisawa, Hironori},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {37260--37276},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/nagumo24a/nagumo24a.pdf},
  url       = {https://proceedings.mlr.press/v235/nagumo24a.html},
  abstract  = {We develop two density ratio estimation (DRE) methods with robustness to outliers. These are based on the divergence with a weight function to weaken the adverse effects of outliers. One is based on the Unnormalized Kullback-Leibler divergence, called Weighted DRE, and its optimization is a convex problem. The other is based on the γ-divergence, called γ-DRE, which improves a normalizing term problem of Weighted DRE. Its optimization is a DC (Difference of Convex functions) problem and needs more computation than a convex problem. These methods have doubly strong robustness, which means robustness to the heavy contamination of both the reference and target distributions. Numerical experiments show that our proposals are more robust than the previous methods.}
}
Endnote
%0 Conference Paper
%T Density Ratio Estimation with Doubly Strong Robustness
%A Ryosuke Nagumo
%A Hironori Fujisawa
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-nagumo24a
%I PMLR
%P 37260--37276
%U https://proceedings.mlr.press/v235/nagumo24a.html
%V 235
%X We develop two density ratio estimation (DRE) methods with robustness to outliers. These are based on the divergence with a weight function to weaken the adverse effects of outliers. One is based on the Unnormalized Kullback-Leibler divergence, called Weighted DRE, and its optimization is a convex problem. The other is based on the γ-divergence, called γ-DRE, which improves a normalizing term problem of Weighted DRE. Its optimization is a DC (Difference of Convex functions) problem and needs more computation than a convex problem. These methods have doubly strong robustness, which means robustness to the heavy contamination of both the reference and target distributions. Numerical experiments show that our proposals are more robust than the previous methods.
APA
Nagumo, R. & Fujisawa, H. (2024). Density Ratio Estimation with Doubly Strong Robustness. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:37260-37276. Available from https://proceedings.mlr.press/v235/nagumo24a.html.
