Taming the Curse of Dimensionality: Discrete Integration by Hashing and Optimization

Stefano Ermon, Carla Gomes, Ashish Sabharwal, Bart Selman
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):334-342, 2013.

Abstract

Integration is affected by the curse of dimensionality and quickly becomes intractable as the dimensionality of the problem grows. We propose a randomized algorithm that, with high probability, gives a constant-factor approximation of a general discrete integral defined over an exponentially large set. This algorithm relies on solving only a small number of instances of a discrete combinatorial optimization problem subject to randomly generated parity constraints used as a hash function. As an application, we demonstrate that with a small number of MAP queries we can efficiently approximate the partition function of discrete graphical models, which can in turn be used, for instance, for marginal computation or model selection.
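The abstract's core idea, estimating a sum over an exponentially large set by repeatedly maximizing the weight function subject to randomly generated parity (XOR) constraints and combining the median optima across constraint levels, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the MAP oracle below is brute force (so it only runs for tiny n, whereas the paper assumes a combinatorial optimizer), and the function names and toy weight function are my own.

```python
import math
import random

def wish_estimate(weight, n, T=5, seed=0):
    """Hedged sketch of discrete integration by hashing and optimization.

    weight : function from a tuple of n bits to a nonnegative weight
    n      : number of binary variables
    T      : repetitions per constraint level (median is taken)

    Estimates Z = sum over all 2^n assignments of weight(x).
    """
    rng = random.Random(seed)
    # Brute-force enumeration of the state space; a placeholder for the
    # combinatorial optimizer the paper relies on.
    space = [tuple((v >> k) & 1 for k in range(n)) for v in range(2 ** n)]

    def map_with_parity(A, b):
        # Max weight over assignments satisfying A x = b (mod 2);
        # returns 0 if the random parity system is unsatisfiable.
        best = 0.0
        for x in space:
            if all(sum(a * xi for a, xi in zip(row, x)) % 2 == bi
                   for row, bi in zip(A, b)):
                best = max(best, weight(x))
        return best

    M = []
    for i in range(n + 1):
        vals = []
        for _ in range(T):
            # i random parity constraints act as a universal hash function.
            A = [[rng.randint(0, 1) for _ in range(n)] for _ in range(i)]
            b = [rng.randint(0, 1) for _ in range(i)]
            vals.append(map_with_parity(A, b))
        vals.sort()
        M.append(vals[T // 2])  # median over the T repetitions

    # Combine constrained optima across levels:
    # Z-hat = M_0 + sum_{i=0}^{n-1} M_{i+1} * 2^i
    return M[0] + sum(M[i + 1] * (2 ** i) for i in range(n))

if __name__ == "__main__":
    n = 4
    w = lambda x: math.exp(sum(x))  # toy weight function
    exact = sum(w(tuple((v >> k) & 1 for k in range(n)))
                for v in range(2 ** n))
    print(exact, wish_estimate(w, n, T=7))
```

Only n + 1 constraint levels (each with a handful of optimization calls) are needed, which is how the method replaces an exponential-size summation with a small number of MAP queries.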

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-ermon13,
  title     = {Taming the Curse of Dimensionality: Discrete Integration by Hashing and Optimization},
  author    = {Ermon, Stefano and Gomes, Carla and Sabharwal, Ashish and Selman, Bart},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {334--342},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/ermon13.pdf},
  url       = {https://proceedings.mlr.press/v28/ermon13.html},
  abstract  = {Integration is affected by the curse of dimensionality and quickly becomes intractable as the dimensionality of the problem grows. We propose a randomized algorithm that, with high probability, gives a constant-factor approximation of a general discrete integral defined over an exponentially large set. This algorithm relies on solving only a small number of instances of a discrete combinatorial optimization problem subject to randomly generated parity constraints used as a hash function. As an application, we demonstrate that with a small number of MAP queries we can efficiently approximate the partition function of discrete graphical models, which can in turn be used, for instance, for marginal computation or model selection.}
}
Endnote
%0 Conference Paper
%T Taming the Curse of Dimensionality: Discrete Integration by Hashing and Optimization
%A Stefano Ermon
%A Carla Gomes
%A Ashish Sabharwal
%A Bart Selman
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-ermon13
%I PMLR
%P 334--342
%U https://proceedings.mlr.press/v28/ermon13.html
%V 28
%N 2
%X Integration is affected by the curse of dimensionality and quickly becomes intractable as the dimensionality of the problem grows. We propose a randomized algorithm that, with high probability, gives a constant-factor approximation of a general discrete integral defined over an exponentially large set. This algorithm relies on solving only a small number of instances of a discrete combinatorial optimization problem subject to randomly generated parity constraints used as a hash function. As an application, we demonstrate that with a small number of MAP queries we can efficiently approximate the partition function of discrete graphical models, which can in turn be used, for instance, for marginal computation or model selection.
RIS
TY - CPAPER
TI - Taming the Curse of Dimensionality: Discrete Integration by Hashing and Optimization
AU - Stefano Ermon
AU - Carla Gomes
AU - Ashish Sabharwal
AU - Bart Selman
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/05/13
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-ermon13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 2
SP - 334
EP - 342
L1 - http://proceedings.mlr.press/v28/ermon13.pdf
UR - https://proceedings.mlr.press/v28/ermon13.html
AB - Integration is affected by the curse of dimensionality and quickly becomes intractable as the dimensionality of the problem grows. We propose a randomized algorithm that, with high probability, gives a constant-factor approximation of a general discrete integral defined over an exponentially large set. This algorithm relies on solving only a small number of instances of a discrete combinatorial optimization problem subject to randomly generated parity constraints used as a hash function. As an application, we demonstrate that with a small number of MAP queries we can efficiently approximate the partition function of discrete graphical models, which can in turn be used, for instance, for marginal computation or model selection.
ER -
APA
Ermon, S., Gomes, C., Sabharwal, A. & Selman, B. (2013). Taming the Curse of Dimensionality: Discrete Integration by Hashing and Optimization. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(2):334-342. Available from https://proceedings.mlr.press/v28/ermon13.html.