Incremental Sampling Without Replacement for Sequence Models

Kensen Shi, David Bieber, Charles Sutton
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:8785-8795, 2020.

Abstract

Sampling is a fundamental technique, and sampling without replacement is often desirable when duplicate samples are not beneficial. Within machine learning, sampling is useful for generating diverse outputs from a trained model. We present an elegant procedure for sampling without replacement from a broad class of randomized programs, including generative neural models that construct outputs sequentially. Our procedure is efficient even for exponentially-large output spaces. Unlike prior work, our approach is incremental, i.e., samples can be drawn one at a time, allowing for increased flexibility. We also present a new estimator for computing expectations from samples drawn without replacement. We show that incremental sampling without replacement is applicable to many domains, e.g., program synthesis and combinatorial optimization.
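To make the setting concrete, the following is a minimal sketch of exact sampling without replacement from a toy sequence model. The model (next_token_probs), the binary vocabulary, and the fixed sequence length are hypothetical stand-ins, and the naive per-prefix bookkeeping of removed probability mass is only an illustration of the problem setup, not the incremental procedure introduced in the paper.

    import random
    from collections import defaultdict

    random.seed(0)
    SEQ_LEN = 3  # toy setting: binary sequences of fixed length

    def next_token_probs(prefix):
        """Probability of the next token (0 or 1) given the prefix.
        A hypothetical stand-in for a trained sequence model."""
        p1 = 0.3 + 0.1 * sum(prefix)  # arbitrary, just for illustration
        return [1.0 - p1, p1]

    # removed[prefix] = total probability mass of already-drawn sequences
    # that start with `prefix` (a trie of "used up" mass).
    removed = defaultdict(float)

    def sample_without_replacement():
        prefix, seq_prob = (), 1.0
        for _ in range(SEQ_LEN):
            probs = next_token_probs(prefix)
            # Remaining (unnormalized) mass in each child subtree; clamp at 0
            # to absorb floating-point slack when a subtree is exhausted.
            weights = [max(0.0, seq_prob * probs[t] - removed[prefix + (t,)])
                       for t in (0, 1)]
            total = sum(weights)
            if total <= 0.0:
                raise RuntimeError("all sequences have been drawn")
            r = random.uniform(0.0, total)
            token = 0 if r < weights[0] else 1
            prefix += (token,)
            seq_prob *= probs[token]
        # Mark the drawn sequence's mass as removed along its whole path.
        for i in range(SEQ_LEN + 1):
            removed[prefix[:i]] += seq_prob
        return prefix, seq_prob

    for _ in range(4):
        print(sample_without_replacement())

Each draw renormalizes over the probability mass not yet consumed, so duplicate sequences are impossible and samples can be requested one at a time. This sketch only illustrates the objective; the paper's contribution is an incremental procedure (and an accompanying estimator for expectations) that remains efficient even when the output space is exponentially large.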

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-shi20a,
  title     = {Incremental Sampling Without Replacement for Sequence Models},
  author    = {Shi, Kensen and Bieber, David and Sutton, Charles},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {8785--8795},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/shi20a/shi20a.pdf},
  url       = {https://proceedings.mlr.press/v119/shi20a.html},
  abstract  = {Sampling is a fundamental technique, and sampling without replacement is often desirable when duplicate samples are not beneficial. Within machine learning, sampling is useful for generating diverse outputs from a trained model. We present an elegant procedure for sampling without replacement from a broad class of randomized programs, including generative neural models that construct outputs sequentially. Our procedure is efficient even for exponentially-large output spaces. Unlike prior work, our approach is incremental, i.e., samples can be drawn one at a time, allowing for increased flexibility. We also present a new estimator for computing expectations from samples drawn without replacement. We show that incremental sampling without replacement is applicable to many domains, e.g., program synthesis and combinatorial optimization.}
}
Endnote
%0 Conference Paper
%T Incremental Sampling Without Replacement for Sequence Models
%A Kensen Shi
%A David Bieber
%A Charles Sutton
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-shi20a
%I PMLR
%P 8785--8795
%U https://proceedings.mlr.press/v119/shi20a.html
%V 119
%X Sampling is a fundamental technique, and sampling without replacement is often desirable when duplicate samples are not beneficial. Within machine learning, sampling is useful for generating diverse outputs from a trained model. We present an elegant procedure for sampling without replacement from a broad class of randomized programs, including generative neural models that construct outputs sequentially. Our procedure is efficient even for exponentially-large output spaces. Unlike prior work, our approach is incremental, i.e., samples can be drawn one at a time, allowing for increased flexibility. We also present a new estimator for computing expectations from samples drawn without replacement. We show that incremental sampling without replacement is applicable to many domains, e.g., program synthesis and combinatorial optimization.
APA
Shi, K., Bieber, D. & Sutton, C. (2020). Incremental Sampling Without Replacement for Sequence Models. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:8785-8795. Available from https://proceedings.mlr.press/v119/shi20a.html.