Efficient Informed Proposals for Discrete Distributions via Newton’s Series Approximation

Yue Xiang, Dongyao Zhu, Bowen Lei, Dongkuan Xu, Ruqi Zhang
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:7288-7310, 2023.

Abstract

Gradients have been exploited in proposal distributions to accelerate the convergence of Markov chain Monte Carlo algorithms on discrete distributions. However, these methods require a natural differentiable extension of the target discrete distribution, which often does not exist or does not provide effective guidance. In this paper, we develop a gradient-like proposal for any discrete distribution without this strong requirement. Built upon a locally-balanced proposal, our method efficiently approximates the discrete likelihood ratio via Newton’s series expansion to enable a large and efficient exploration in discrete spaces. We show that our method can also be viewed as a multilinear extension, thus inheriting the desired properties. We prove that our method has a guaranteed convergence rate with or without the Metropolis-Hastings step. Furthermore, our method outperforms a number of popular alternatives in several different experiments, including the facility location problem, extractive text summarization, and image retrieval.
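To make the idea concrete, below is a minimal, hypothetical sketch (in NumPy, not the authors' code) of the locally-balanced single-bit-flip framework the paper builds on, in which the likelihood ratio for each candidate flip is the first-order forward-difference term of a Newton's series, i.e., the discrete counterpart of the gradient. For clarity the differences are computed exactly with d extra target evaluations; the paper's contribution is approximating them efficiently without a differentiable extension. The quadratic toy target and all function names are illustrative assumptions.

    # Illustrative sketch (not the authors' implementation) of a
    # locally-balanced proposal on {0,1}^d with balancing function
    # g(t) = sqrt(t) and single-bit flips. The flip scores are the
    # first-order Newton (forward-difference) terms Delta_i f(x).
    import numpy as np

    d = 20
    rng = np.random.default_rng(0)
    A = rng.normal(size=(d, d))
    W = 0.5 * (A + A.T) / np.sqrt(d)  # hypothetical Ising-like target

    def log_prob(x):
        # Unnormalized log-probability of a binary vector x.
        return x @ W @ x

    def newton_first_order_diffs(f, x):
        # Forward differences Delta_i f(x) = f(flip_i(x)) - f(x): the
        # discrete analogue of the gradient in gradient-informed proposals.
        # Computed exactly here (d evaluations); the paper approximates them.
        fx = f(x)
        diffs = np.empty(len(x))
        for i in range(len(x)):
            y = x.copy()
            y[i] = 1 - y[i]
            diffs[i] = f(y) - fx
        return diffs

    def locally_balanced_step(f, x, rng):
        # Propose flipping bit i with probability proportional to
        # g(pi(x')/pi(x)) = exp(diffs[i] / 2), then apply a
        # Metropolis-Hastings correction so the chain targets pi exactly.
        diffs = newton_first_order_diffs(f, x)
        w = np.exp(diffs / 2.0)
        probs = w / w.sum()
        i = rng.choice(len(x), p=probs)
        y = x.copy()
        y[i] = 1 - y[i]
        # Reverse-move proposal probability for the MH ratio.
        w_y = np.exp(newton_first_order_diffs(f, y) / 2.0)
        log_alpha = f(y) - f(x) + np.log(w_y[i] / w_y.sum()) - np.log(probs[i])
        return y if np.log(rng.uniform()) < log_alpha else x

    x = rng.integers(0, 2, size=d).astype(float)
    for _ in range(100):
        x = locally_balanced_step(log_prob, x, rng)

With g(t) = sqrt(t), the flip weights reduce to exp(Delta_i f(x) / 2), which is why the first-order Newton-series term can play the same role as the gradient in Langevin-style discrete proposals, while requiring only black-box evaluations of the target.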

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-xiang23a,
  title     = {Efficient Informed Proposals for Discrete Distributions via Newton's Series Approximation},
  author    = {Xiang, Yue and Zhu, Dongyao and Lei, Bowen and Xu, Dongkuan and Zhang, Ruqi},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {7288--7310},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/xiang23a/xiang23a.pdf},
  url       = {https://proceedings.mlr.press/v206/xiang23a.html},
  abstract  = {Gradients have been exploited in proposal distributions to accelerate the convergence of Markov chain Monte Carlo algorithms on discrete distributions. However, these methods require a natural differentiable extension of the target discrete distribution, which often does not exist or does not provide effective guidance. In this paper, we develop a gradient-like proposal for any discrete distribution without this strong requirement. Built upon a locally-balanced proposal, our method efficiently approximates the discrete likelihood ratio via Newton's series expansion to enable a large and efficient exploration in discrete spaces. We show that our method can also be viewed as a multilinear extension, thus inheriting the desired properties. We prove that our method has a guaranteed convergence rate with or without the Metropolis-Hastings step. Furthermore, our method outperforms a number of popular alternatives in several different experiments, including the facility location problem, extractive text summarization, and image retrieval.}
}
Endnote
%0 Conference Paper
%T Efficient Informed Proposals for Discrete Distributions via Newton's Series Approximation
%A Yue Xiang
%A Dongyao Zhu
%A Bowen Lei
%A Dongkuan Xu
%A Ruqi Zhang
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-xiang23a
%I PMLR
%P 7288--7310
%U https://proceedings.mlr.press/v206/xiang23a.html
%V 206
%X Gradients have been exploited in proposal distributions to accelerate the convergence of Markov chain Monte Carlo algorithms on discrete distributions. However, these methods require a natural differentiable extension of the target discrete distribution, which often does not exist or does not provide effective guidance. In this paper, we develop a gradient-like proposal for any discrete distribution without this strong requirement. Built upon a locally-balanced proposal, our method efficiently approximates the discrete likelihood ratio via Newton's series expansion to enable a large and efficient exploration in discrete spaces. We show that our method can also be viewed as a multilinear extension, thus inheriting the desired properties. We prove that our method has a guaranteed convergence rate with or without the Metropolis-Hastings step. Furthermore, our method outperforms a number of popular alternatives in several different experiments, including the facility location problem, extractive text summarization, and image retrieval.
APA
Xiang, Y., Zhu, D., Lei, B., Xu, D., & Zhang, R. (2023). Efficient Informed Proposals for Discrete Distributions via Newton's Series Approximation. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:7288-7310. Available from https://proceedings.mlr.press/v206/xiang23a.html.
