Taming the Curse of Dimensionality: Discrete Integration by Hashing and Optimization

Stefano Ermon, Carla Gomes, Ashish Sabharwal, Bart Selman.
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):334-342, 2013.

Abstract

Integration is affected by the curse of dimensionality and quickly becomes intractable as the dimensionality of the problem grows. We propose a randomized algorithm that, with high probability, gives a constant-factor approximation of a general discrete integral defined over an exponentially large set. This algorithm relies on solving only a small number of instances of a discrete combinatorial optimization problem subject to randomly generated parity constraints used as a hash function. As an application, we demonstrate that with a small number of MAP queries we can efficiently approximate the partition function of discrete graphical models, which can in turn be used, for instance, for marginal computation or model selection.
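The abstract's core idea, counting by hashing and optimization, can be sketched in a few lines. The sketch below is an illustrative, WISH-style toy, not the authors' implementation: brute-force enumeration stands in for the combinatorial optimization oracle (in practice a MAP/MaxSAT solver), and the quantile-style aggregation formula is an assumption based on the hashing intuition, so its constants may differ from the paper's exact estimator and guarantees.

```python
import random
from statistics import median

def wish_estimate(n, w, T=5, rng=None):
    """Sketch of a WISH-style estimator for Z = sum over x in {0,1}^n of w(x).

    For i = 0..n, it maximizes w subject to i random parity (XOR)
    constraints, which act as a pairwise-independent hash cutting the
    space down by roughly 2^i. Medians over T repetitions stabilize
    each level. Brute force replaces the optimization oracle here.
    """
    rng = rng or random.Random(0)
    space = [tuple((x >> k) & 1 for k in range(n)) for x in range(2 ** n)]

    def constrained_max(i):
        # Draw i random parity constraints A x = b (mod 2).
        A = [[rng.randrange(2) for _ in range(n)] for _ in range(i)]
        b = [rng.randrange(2) for _ in range(i)]
        feasible = (x for x in space
                    if all(sum(a * v for a, v in zip(row, x)) % 2 == bi
                           for row, bi in zip(A, b)))
        return max((w(x) for x in feasible), default=0.0)

    M = [median(constrained_max(i) for _ in range(T)) for i in range(n + 1)]
    # Quantile-style aggregation (assumed form): each level-i optimum
    # stands in for roughly 2^(i-1) configurations of comparable weight.
    return M[0] + sum(M[i] * 2 ** (i - 1) for i in range(1, n + 1))
```

For uniform weights w(x) = 1 the true partition function is 2^n, and the sketch recovers it whenever the random parity systems remain satisfiable, illustrating why only n + 1 levels of optimization queries suffice rather than 2^n summations.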