Precision-Recall Curves Using Information Divergence Frontiers

Josip Djolonga, Mario Lucic, Marco Cuturi, Olivier Bachem, Olivier Bousquet, Sylvain Gelly
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2550-2559, 2020.

Abstract

Despite the tremendous progress in the estimation of generative models, the development of tools for diagnosing their failures and assessing their performance has advanced at a much slower pace. Recent developments have investigated metrics that quantify which parts of the true distribution are modeled well and, conversely, which parts the model fails to capture, akin to precision and recall in information retrieval. In this paper, we present a general evaluation framework for generative models that measures the trade-off between precision and recall using Rényi divergences. Our framework provides a novel perspective on existing techniques and extends them to more general domains. As a key advantage, this formulation encompasses both continuous and discrete models and allows for the design of efficient algorithms that do not have to quantize the data. We further analyze the biases of the approximations used in practice.
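To make the evaluation idea concrete, the sketch below computes Rényi divergences between discrete distributions and traces an illustrative precision/recall-style trade-off curve by sweeping over mixtures of the true distribution P and the model Q. The function names and the mixture sweep are our own illustration, not the paper's method: the paper's divergence frontier is the Pareto front over all intermediate distributions, of which this interpolation is only one simple family.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p || q) for discrete distributions:
    D_alpha(p || q) = log(sum_i p_i^alpha * q_i^(1-alpha)) / (alpha - 1).
    Assumes alpha > 0, alpha != 1, and q_i > 0 wherever p_i > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing for alpha in (0, 1)
    return np.log(np.sum(p[mask] ** alpha * q[mask] ** (1 - alpha))) / (alpha - 1)

def mixture_curve(p, q, alpha=0.5, num=51):
    """Illustrative trade-off curve: for r = lam*p + (1-lam)*q, lam in [0, 1],
    return pairs (D_alpha(r || q), D_alpha(r || p)) -- a precision/recall-style
    sweep between 'close to the model' and 'close to the truth'."""
    lams = np.linspace(0.0, 1.0, num)
    return [(renyi_divergence(lam * p + (1 - lam) * q, q, alpha),
             renyi_divergence(lam * p + (1 - lam) * q, p, alpha))
            for lam in lams]

p = np.array([0.5, 0.3, 0.2])  # "true" distribution (toy example)
q = np.array([0.2, 0.3, 0.5])  # "model" distribution (toy example)
curve = mixture_curve(p, q)
```

At the endpoints of the sweep, one coordinate vanishes: at lam = 0 the mixture equals Q (zero divergence to the model), and at lam = 1 it equals P (zero divergence to the truth); intermediate mixtures trade one off against the other.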

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-djolonga20a,
  title     = {Precision-Recall Curves Using Information Divergence Frontiers},
  author    = {Djolonga, Josip and Lucic, Mario and Cuturi, Marco and Bachem, Olivier and Bousquet, Olivier and Gelly, Sylvain},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {2550--2559},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/djolonga20a/djolonga20a.pdf},
  url       = {https://proceedings.mlr.press/v108/djolonga20a.html},
  abstract  = {Despite the tremendous progress in the estimation of generative models, the development of tools for diagnosing their failures and assessing their performance has advanced at a much slower pace. Recent developments have investigated metrics that quantify which parts of the true distribution is modeled well, and, on the contrary, what the model fails to capture, akin to precision and recall in information retrieval. In this paper, we present a general evaluation framework for generative models that measures the trade-off between precision and recall using Renyi divergences. Our framework provides a novel perspective on existing techniques and extends them to more general domains. As a key advantage, this formulation encompasses both continuous and discrete models and allows for the design of efficient algorithms that do not have to quantize the data. We further analyze the biases of the approximations used in practice.}
}
Endnote
%0 Conference Paper
%T Precision-Recall Curves Using Information Divergence Frontiers
%A Josip Djolonga
%A Mario Lucic
%A Marco Cuturi
%A Olivier Bachem
%A Olivier Bousquet
%A Sylvain Gelly
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-djolonga20a
%I PMLR
%P 2550--2559
%U https://proceedings.mlr.press/v108/djolonga20a.html
%V 108
%X Despite the tremendous progress in the estimation of generative models, the development of tools for diagnosing their failures and assessing their performance has advanced at a much slower pace. Recent developments have investigated metrics that quantify which parts of the true distribution is modeled well, and, on the contrary, what the model fails to capture, akin to precision and recall in information retrieval. In this paper, we present a general evaluation framework for generative models that measures the trade-off between precision and recall using Renyi divergences. Our framework provides a novel perspective on existing techniques and extends them to more general domains. As a key advantage, this formulation encompasses both continuous and discrete models and allows for the design of efficient algorithms that do not have to quantize the data. We further analyze the biases of the approximations used in practice.
APA
Djolonga, J., Lucic, M., Cuturi, M., Bachem, O., Bousquet, O. & Gelly, S. (2020). Precision-Recall Curves Using Information Divergence Frontiers. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:2550-2559. Available from https://proceedings.mlr.press/v108/djolonga20a.html.

Related Material