Optimal Budgeted Rejection Sampling for Generative Models

Alexandre Verine, Muni Sreenivas Pydi, Benjamin Negrevergne, Yann Chevaleyre
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:3367-3375, 2024.

Abstract

Rejection sampling methods have recently been proposed to improve the performance of discriminator-based generative models. However, these methods are only optimal under an unlimited sampling budget, and are usually applied to a generator trained independently of the rejection procedure. We first propose an Optimal Budgeted Rejection Sampling (OBRS) scheme that is provably optimal with respect to \textit{any} $f$-divergence between the true distribution and the post-rejection distribution, for a given sampling budget. Second, we propose an end-to-end method that incorporates the sampling scheme into the training procedure to further enhance the model’s overall performance. Through experiments and supporting theory, we show that the proposed methods are effective in significantly improving the quality and diversity of the samples.

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-verine24a,
  title     = {Optimal Budgeted Rejection Sampling for Generative Models},
  author    = {Verine, Alexandre and Sreenivas Pydi, Muni and Negrevergne, Benjamin and Chevaleyre, Yann},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {3367--3375},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/verine24a/verine24a.pdf},
  url       = {https://proceedings.mlr.press/v238/verine24a.html},
  abstract  = {Rejection sampling methods have recently been proposed to improve the performance of discriminator-based generative models. However, these methods are only optimal under an unlimited sampling budget, and are usually applied to a generator trained independently of the rejection procedure. We first propose an Optimal Budgeted Rejection Sampling (OBRS) scheme that is provably optimal with respect to \textit{any} $f$-divergence between the true distribution and the post-rejection distribution, for a given sampling budget. Second, we propose an end-to-end method that incorporates the sampling scheme into the training procedure to further enhance the model’s overall performance. Through experiments and supporting theory, we show that the proposed methods are effective in significantly improving the quality and diversity of the samples.}
}
Endnote
%0 Conference Paper
%T Optimal Budgeted Rejection Sampling for Generative Models
%A Alexandre Verine
%A Muni Sreenivas Pydi
%A Benjamin Negrevergne
%A Yann Chevaleyre
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-verine24a
%I PMLR
%P 3367--3375
%U https://proceedings.mlr.press/v238/verine24a.html
%V 238
%X Rejection sampling methods have recently been proposed to improve the performance of discriminator-based generative models. However, these methods are only optimal under an unlimited sampling budget, and are usually applied to a generator trained independently of the rejection procedure. We first propose an Optimal Budgeted Rejection Sampling (OBRS) scheme that is provably optimal with respect to \textit{any} $f$-divergence between the true distribution and the post-rejection distribution, for a given sampling budget. Second, we propose an end-to-end method that incorporates the sampling scheme into the training procedure to further enhance the model’s overall performance. Through experiments and supporting theory, we show that the proposed methods are effective in significantly improving the quality and diversity of the samples.
APA
Verine, A., Sreenivas Pydi, M., Negrevergne, B. & Chevaleyre, Y. (2024). Optimal Budgeted Rejection Sampling for Generative Models. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:3367-3375. Available from https://proceedings.mlr.press/v238/verine24a.html.