A Faster Approximation Algorithm for the Gibbs Partition Function

Vladimir Kolmogorov
Proceedings of the 31st Conference On Learning Theory, PMLR 75:228-249, 2018.

Abstract

We consider the problem of estimating the partition function $Z(\beta)=\sum_x \exp(-\beta H(x))$ of a Gibbs distribution with a Hamiltonian $H(\cdot)$, or more precisely the logarithm of the ratio $q=\ln \frac{Z(0)}{Z(\beta)}$. It has been recently shown how to approximate $q$ with high probability assuming the existence of an oracle that produces samples from the Gibbs distribution for a given parameter value in $[0,\beta]$. The current best known approach due to Huber (2015) uses $O(q\ln n\cdot[\ln q + \ln \ln n+\varepsilon^{-2}])$ oracle calls on average, where $\varepsilon$ is the desired accuracy of approximation and $H(\cdot)$ is assumed to lie in $\{0\}\cup[1,n]$. We improve the complexity to $O(q\ln n\cdot\varepsilon^{-2})$ oracle calls. We also show that the same complexity can be achieved if exact oracles are replaced with approximate sampling oracles that are within $O(\frac{\varepsilon^2}{q\ln n})$ variation distance from exact oracles. Finally, we prove a lower bound of $\Omega(q\cdot \varepsilon^{-2})$ oracle calls under a natural model of computation.
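To make the oracle model concrete, here is a minimal sketch (in Python, not from the paper) of the classical telescoping-product estimator that the sampling-oracle setting supports: pick a schedule $0=\beta_0<\dots<\beta_k=\beta$, estimate each ratio $Z(\beta_{i+1})/Z(\beta_i)$ from samples drawn at $\beta_i$, and sum the negative log-ratios to estimate $q$. The toy Hamiltonian, the exact enumeration oracle, and the names gibbs_oracle and estimate_q are illustrative assumptions; the paper's own algorithm achieves the stated $O(q\ln n\cdot\varepsilon^{-2})$ complexity with a different, more careful scheme.

# Minimal sketch (assumption-laden illustration, not the paper's algorithm):
# estimate q = ln(Z(0)/Z(beta)) by a telescoping product of Monte Carlo
# ratio estimates, using an exact sampling oracle for a toy Hamiltonian
# on a small finite state space.
import numpy as np

rng = np.random.default_rng(0)

# Toy Hamiltonian H(x) on states x = 0..n-1, with values in {0} U [1, n]
# as assumed in the abstract.
n = 50
H = np.concatenate(([0.0], rng.uniform(1.0, n, size=n - 1)))

def gibbs_oracle(beta, size):
    """Exact sampling oracle: draws x with probability exp(-beta*H(x)) / Z(beta)."""
    w = np.exp(-beta * H)
    return rng.choice(len(H), size=size, p=w / w.sum())

def estimate_q(beta, num_stages=20, samples_per_stage=2000):
    """Telescoping estimate of q = ln Z(0) - ln Z(beta).

    For consecutive schedule points b_i < b_{i+1},
    Z(b_{i+1}) / Z(b_i) = E_{x ~ Gibbs(b_i)}[exp(-(b_{i+1} - b_i) * H(x))],
    so summing the negative logs of the Monte Carlo ratios estimates q.
    """
    schedule = np.linspace(0.0, beta, num_stages + 1)
    q_hat = 0.0
    for b_lo, b_hi in zip(schedule[:-1], schedule[1:]):
        xs = gibbs_oracle(b_lo, samples_per_stage)
        ratio = np.mean(np.exp(-(b_hi - b_lo) * H[xs]))  # estimates Z(b_hi)/Z(b_lo)
        q_hat -= np.log(ratio)
    return q_hat

beta = 2.0
exact_q = np.log(len(H)) - np.log(np.exp(-beta * H).sum())  # Z(0) = number of states
print(f"estimated q = {estimate_q(beta):.3f}, exact q = {exact_q:.3f}")

This naive scheme uses far more oracle calls than necessary; the point of the paper is precisely to reduce the number of calls to $O(q\ln n\cdot\varepsilon^{-2})$ and to show a nearly matching lower bound.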

Cite this Paper


BibTeX
@InProceedings{pmlr-v75-kolmogorov18a,
  title     = {A Faster Approximation Algorithm for the {G}ibbs Partition Function},
  author    = {Kolmogorov, Vladimir},
  booktitle = {Proceedings of the 31st Conference On Learning Theory},
  pages     = {228--249},
  year      = {2018},
  editor    = {Bubeck, Sébastien and Perchet, Vianney and Rigollet, Philippe},
  volume    = {75},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v75/kolmogorov18a/kolmogorov18a.pdf},
  url       = {https://proceedings.mlr.press/v75/kolmogorov18a.html},
  abstract  = {We consider the problem of estimating the partition function $Z(\beta)=\sum_x \exp(-\beta H(x))$ of a Gibbs distribution with a Hamiltonian $H(\cdot)$, or more precisely the logarithm of the ratio $q=\ln Z(0)/Z(\beta)$. It has been recently shown how to approximate $q$ with high probability assuming the existence of an oracle that produces samples from the Gibbs distribution for a given parameter value in $[0,\beta]$. The current best known approach due to Huber (2015) uses $O(q\ln n\cdot[\ln q + \ln \ln n+\varepsilon^{-2}])$ oracle calls on average where $\varepsilon$ is the desired accuracy of approximation and $H(\cdot)$ is assumed to lie in $\{0\}\cup[1,n]$. We improve the complexity to $O(q\ln n\cdot\varepsilon^{-2})$ oracle calls. We also show that the same complexity can be achieved if exact oracles are replaced with approximate sampling oracles that are within $O(\frac{\varepsilon^2}{q\ln n})$ variation distance from exact oracles. Finally, we prove a lower bound of $\Omega(q\cdot \varepsilon^{-2})$ oracle calls under a natural model of computation.}
}
Endnote
%0 Conference Paper
%T A Faster Approximation Algorithm for the Gibbs Partition Function
%A Vladimir Kolmogorov
%B Proceedings of the 31st Conference On Learning Theory
%C Proceedings of Machine Learning Research
%D 2018
%E Sébastien Bubeck
%E Vianney Perchet
%E Philippe Rigollet
%F pmlr-v75-kolmogorov18a
%I PMLR
%P 228--249
%U https://proceedings.mlr.press/v75/kolmogorov18a.html
%V 75
%X We consider the problem of estimating the partition function $Z(\beta)=\sum_x \exp(-\beta H(x))$ of a Gibbs distribution with a Hamiltonian $H(\cdot)$, or more precisely the logarithm of the ratio $q=\ln Z(0)/Z(\beta)$. It has been recently shown how to approximate $q$ with high probability assuming the existence of an oracle that produces samples from the Gibbs distribution for a given parameter value in $[0,\beta]$. The current best known approach due to Huber (2015) uses $O(q\ln n\cdot[\ln q + \ln \ln n+\varepsilon^{-2}])$ oracle calls on average where $\varepsilon$ is the desired accuracy of approximation and $H(\cdot)$ is assumed to lie in $\{0\}\cup[1,n]$. We improve the complexity to $O(q\ln n\cdot\varepsilon^{-2})$ oracle calls. We also show that the same complexity can be achieved if exact oracles are replaced with approximate sampling oracles that are within $O(\frac{\varepsilon^2}{q\ln n})$ variation distance from exact oracles. Finally, we prove a lower bound of $\Omega(q\cdot \varepsilon^{-2})$ oracle calls under a natural model of computation.
APA
Kolmogorov, V. (2018). A Faster Approximation Algorithm for the Gibbs Partition Function. Proceedings of the 31st Conference On Learning Theory, in Proceedings of Machine Learning Research 75:228-249. Available from https://proceedings.mlr.press/v75/kolmogorov18a.html.
