Proper Losses for Discrete Generative Models

Dhamma Kimpara, Rafael Frongillo, Bo Waggoner
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:17015-17040, 2023.

Abstract

We initiate the study of proper losses for evaluating generative models in the discrete setting. Unlike traditional proper losses, we treat both the generative model and the target distribution as black-boxes, only assuming ability to draw i.i.d. samples. We define a loss to be black-box proper if the generative distribution that minimizes expected loss is equal to the target distribution. Using techniques from statistical estimation theory, we give a general construction and characterization of black-box proper losses: they must take a polynomial form, and the number of draws from the model and target distribution must exceed the degree of the polynomial. The characterization rules out a loss whose expectation is the cross-entropy between the target distribution and the model. By extending the construction to arbitrary sampling schemes such as Poisson sampling, however, we show that one can construct such a loss.
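To make the polynomial construction concrete, the sketch below implustrates one simple degree-2 instance: a sample-based squared (Brier-style) loss that uses two i.i.d. draws from the model and one from the target. This is an illustrative example consistent with the abstract's characterization, not the paper's construction verbatim; the distributions `p`, `q_good`, `q_bad` are made-up toy data.

```python
import numpy as np

def blackbox_loss(x1, x2, y):
    # Degree-2 polynomial loss on samples: x1, x2 are two i.i.d. draws
    # from the model q; y is one draw from the target p.  Its expectation
    # is sum_i q_i^2 - 2 sum_i p_i q_i = ||q - p||^2 - ||p||^2, which is
    # uniquely minimized over q at q = p, i.e. the loss is black-box proper.
    return float(x1 == x2) - 2.0 * float(x1 == y)

def expected_loss(p, q):
    # Exact expectation of blackbox_loss for target p and model q.
    return np.sum(q**2) - 2.0 * np.sum(p * q)

p = np.array([0.5, 0.3, 0.2])       # toy target distribution
q_good = p.copy()                    # model equal to the target
q_bad = np.array([0.7, 0.2, 0.1])   # mismatched model

# The matching model attains strictly lower expected loss.
assert expected_loss(p, q_good) < expected_loss(p, q_bad)

# Monte Carlo check that the sample-based loss is unbiased.
rng = np.random.default_rng(0)
n = 200_000
x = rng.choice(3, size=(n, 2), p=q_bad)   # two draws from the model
y = rng.choice(3, size=n, p=p)            # one draw from the target
mc = np.mean((x[:, 0] == x[:, 1]).astype(float) - 2.0 * (x[:, 0] == y))
assert abs(mc - expected_loss(p, q_bad)) < 0.01
```

Note that evaluating this degree-2 loss requires two draws from the model, matching the abstract's point that the number of draws must keep pace with the polynomial's degree; a cross-entropy-style loss has no such finite polynomial form under fixed i.i.d. sampling, which is why the paper turns to schemes such as Poisson sampling.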

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-kimpara23a,
  title     = {Proper Losses for Discrete Generative Models},
  author    = {Kimpara, Dhamma and Frongillo, Rafael and Waggoner, Bo},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {17015--17040},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/kimpara23a/kimpara23a.pdf},
  url       = {https://proceedings.mlr.press/v202/kimpara23a.html},
  abstract  = {We initiate the study of proper losses for evaluating generative models in the discrete setting. Unlike traditional proper losses, we treat both the generative model and the target distribution as black-boxes, only assuming ability to draw i.i.d. samples. We define a loss to be black-box proper if the generative distribution that minimizes expected loss is equal to the target distribution. Using techniques from statistical estimation theory, we give a general construction and characterization of black-box proper losses: they must take a polynomial form, and the number of draws from the model and target distribution must exceed the degree of the polynomial. The characterization rules out a loss whose expectation is the cross-entropy between the target distribution and the model. By extending the construction to arbitrary sampling schemes such as Poisson sampling, however, we show that one can construct such a loss.}
}
Endnote
%0 Conference Paper
%T Proper Losses for Discrete Generative Models
%A Dhamma Kimpara
%A Rafael Frongillo
%A Bo Waggoner
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-kimpara23a
%I PMLR
%P 17015--17040
%U https://proceedings.mlr.press/v202/kimpara23a.html
%V 202
%X We initiate the study of proper losses for evaluating generative models in the discrete setting. Unlike traditional proper losses, we treat both the generative model and the target distribution as black-boxes, only assuming ability to draw i.i.d. samples. We define a loss to be black-box proper if the generative distribution that minimizes expected loss is equal to the target distribution. Using techniques from statistical estimation theory, we give a general construction and characterization of black-box proper losses: they must take a polynomial form, and the number of draws from the model and target distribution must exceed the degree of the polynomial. The characterization rules out a loss whose expectation is the cross-entropy between the target distribution and the model. By extending the construction to arbitrary sampling schemes such as Poisson sampling, however, we show that one can construct such a loss.
APA
Kimpara, D., Frongillo, R. & Waggoner, B. (2023). Proper Losses for Discrete Generative Models. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:17015-17040. Available from https://proceedings.mlr.press/v202/kimpara23a.html.