Low-density Parity Constraints for Hashing-Based Discrete Integration

Stefano Ermon, Carla Gomes, Ashish Sabharwal, Bart Selman
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):271-279, 2014.

Abstract

In recent years, a number of probabilistic inference and counting techniques have been proposed that exploit pairwise independent hash functions to infer properties of succinctly defined high-dimensional sets. While providing desirable statistical guarantees, typical constructions of such hash functions are themselves not amenable to efficient inference. Inspired by the success of LDPC codes, we propose the use of low-density parity constraints to make inference more tractable in practice. While not strongly universal, we show that such sparse constraints belong to a new class of hash functions that we call Average Universal. These weaker hash functions retain the desirable statistical guarantees needed by most such probabilistic inference methods. Thus, they continue to provide provable accuracy guarantees while at the same time making a number of algorithms significantly more scalable in practice. Using this technique, we provide new, tighter bounds for challenging discrete integration and model counting problems.
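To make the mechanism concrete, the following is a minimal Python sketch of the general hashing-based counting idea the abstract refers to: to estimate the size of a solution set S of n-bit strings, add m random parity (XOR) constraints and locate the m at which the constrained set typically becomes empty, giving |S| roughly 2^m. All names and the brute-force enumeration below are illustrative assumptions, not the paper's implementation; in particular, density = 0.5 corresponds to the classic pairwise-independent construction, while smaller values mimic the low-density constraints the paper advocates. The paper's actual algorithms replace the enumeration with calls to a combinatorial solver, which is exactly where sparse constraints pay off.

import itertools
import random

def random_parity_constraints(n, m, density):
    # Sample m XOR constraints over n binary variables. Each variable joins
    # a constraint independently with probability `density`, and each
    # constraint gets a random right-hand-side bit. density = 0.5 yields the
    # standard pairwise-independent family; density < 0.5 yields the sparse
    # (low-density) constraints studied in the paper.
    return [([i for i in range(n) if random.random() < density],
             random.randint(0, 1))
            for _ in range(m)]

def satisfies(x, constraints):
    # x is a tuple of 0/1 values; check that every parity constraint holds.
    return all(sum(x[i] for i in idxs) % 2 == rhs
               for idxs, rhs in constraints)

def estimate_log2_count(solutions, n, density=0.5, trials=30):
    # Crude estimate of log2 |solutions|: each solution survives one dense
    # random parity constraint with probability 1/2, so the constrained set
    # typically empties once m exceeds log2 |solutions|. Brute force over
    # the solution list, so only suitable for toy-sized n.
    sols = list(solutions)
    for m in range(n + 1):
        survived = 0
        for _ in range(trials):
            cs = random_parity_constraints(n, m, density)
            if any(satisfies(x, cs) for x in sols):
                survived += 1
        if survived < trials / 2:
            return m
    return n

# Toy check: S = all length-10 bit strings with at most two ones, so
# |S| = 1 + 10 + 45 = 56 and the estimate should land near log2(56) ~ 5.8.
S = [x for x in itertools.product((0, 1), repeat=10) if sum(x) <= 2]
print(estimate_log2_count(S, n=10))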

Cite this Paper

BibTeX
@InProceedings{pmlr-v32-ermon14,
  title     = {Low-density Parity Constraints for Hashing-Based Discrete Integration},
  author    = {Ermon, Stefano and Gomes, Carla and Sabharwal, Ashish and Selman, Bart},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {271--279},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/ermon14.pdf},
  url       = {https://proceedings.mlr.press/v32/ermon14.html},
  abstract  = {In recent years, a number of probabilistic inference and counting techniques have been proposed that exploit pairwise independent hash functions to infer properties of succinctly defined high-dimensional sets. While providing desirable statistical guarantees, typical constructions of such hash functions are themselves not amenable to efficient inference. Inspired by the success of LDPC codes, we propose the use of low-density parity constraints to make inference more tractable in practice. While not strongly universal, we show that such sparse constraints belong to a new class of hash functions that we call Average Universal. These weaker hash functions retain the desirable statistical guarantees needed by most such probabilistic inference methods. Thus, they continue to provide provable accuracy guarantees while at the same time making a number of algorithms significantly more scalable in practice. Using this technique, we provide new, tighter bounds for challenging discrete integration and model counting problems.}
}
Endnote
%0 Conference Paper
%T Low-density Parity Constraints for Hashing-Based Discrete Integration
%A Stefano Ermon
%A Carla Gomes
%A Ashish Sabharwal
%A Bart Selman
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-ermon14
%I PMLR
%P 271--279
%U https://proceedings.mlr.press/v32/ermon14.html
%V 32
%N 1
%X In recent years, a number of probabilistic inference and counting techniques have been proposed that exploit pairwise independent hash functions to infer properties of succinctly defined high-dimensional sets. While providing desirable statistical guarantees, typical constructions of such hash functions are themselves not amenable to efficient inference. Inspired by the success of LDPC codes, we propose the use of low-density parity constraints to make inference more tractable in practice. While not strongly universal, we show that such sparse constraints belong to a new class of hash functions that we call Average Universal. These weaker hash functions retain the desirable statistical guarantees needed by most such probabilistic inference methods. Thus, they continue to provide provable accuracy guarantees while at the same time making a number of algorithms significantly more scalable in practice. Using this technique, we provide new, tighter bounds for challenging discrete integration and model counting problems.
RIS
TY - CPAPER
TI - Low-density Parity Constraints for Hashing-Based Discrete Integration
AU - Stefano Ermon
AU - Carla Gomes
AU - Ashish Sabharwal
AU - Bart Selman
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-ermon14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 1
SP - 271
EP - 279
L1 - http://proceedings.mlr.press/v32/ermon14.pdf
UR - https://proceedings.mlr.press/v32/ermon14.html
AB - In recent years, a number of probabilistic inference and counting techniques have been proposed that exploit pairwise independent hash functions to infer properties of succinctly defined high-dimensional sets. While providing desirable statistical guarantees, typical constructions of such hash functions are themselves not amenable to efficient inference. Inspired by the success of LDPC codes, we propose the use of low-density parity constraints to make inference more tractable in practice. While not strongly universal, we show that such sparse constraints belong to a new class of hash functions that we call Average Universal. These weaker hash functions retain the desirable statistical guarantees needed by most such probabilistic inference methods. Thus, they continue to provide provable accuracy guarantees while at the same time making a number of algorithms significantly more scalable in practice. Using this technique, we provide new, tighter bounds for challenging discrete integration and model counting problems.
ER -
APA
Ermon, S., Gomes, C., Sabharwal, A. & Selman, B. (2014). Low-density Parity Constraints for Hashing-Based Discrete Integration. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):271-279. Available from https://proceedings.mlr.press/v32/ermon14.html.