Tight Variational Bounds via Random Projections and I-Projections

Lun-Kai Hsu, Tudor Achim, Stefano Ermon
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1087-1095, 2016.

Abstract

Information projections are the key building block of variational inference algorithms and are used to approximate a target probabilistic model by projecting it onto a family of tractable distributions. In general, there is no guarantee on the quality of the approximation obtained. To overcome this issue, we introduce a new class of random projections to reduce the dimensionality and hence the complexity of the original model. In the spirit of random projections, the projection preserves (with high probability) key properties of the target distribution. We show that information projections can be combined with random projections to obtain provable guarantees on the quality of the approximation obtained, regardless of the complexity of the original model. We demonstrate empirically that augmenting mean field with a random projection step dramatically improves partition function and marginal probability estimates, on both synthetic and real-world data.
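
The abstract names two ingredients: an I-projection onto a tractable family (e.g. naive mean field, whose variational objective lower-bounds the log partition function) and random projections of the state space (random parity constraints that shrink the support while preserving the partition function in expectation). The sketch below illustrates both ingredients on a toy model small enough to check by brute force. It is a minimal illustration under assumed details (the tiny pairwise model, the function names, and the median-of-estimates readout are ours), not the paper's actual algorithm, which combines the two steps to obtain provable guarantees.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Tiny binary pairwise model p(x) ∝ exp(h·x + Σ_{i<j} J_ij x_i x_j), x ∈ {0,1}^n.
n = 10
h = rng.normal(0.0, 1.0, n)
J = np.triu(rng.normal(0.0, 1.0, (n, n)), 1)   # strictly upper triangular couplings

def log_potential(x):
    return h @ x + x @ J @ x

def exact_logZ(parity=None):
    """Brute-force log Z over {0,1}^n (tiny n only). If parity = (A, b) is
    given, keep only configurations with A x = b (mod 2), i.e. the model
    after a random parity projection of the state space."""
    logs = []
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        if parity is not None:
            A, b = parity
            if np.any((A @ x) % 2 != b):
                continue
        logs.append(log_potential(x))
    return np.logaddexp.reduce(logs) if logs else -np.inf

def mean_field_lower_bound(iters=200):
    """I-projection onto the fully factorized family: coordinate ascent on
    the marginals mu_i = q(x_i = 1); the resulting ELBO lower-bounds log Z."""
    mu = np.full(n, 0.5)
    S = J + J.T                                 # symmetric couplings, zero diagonal
    for _ in range(iters):
        for i in range(n):
            mu[i] = 1.0 / (1.0 + np.exp(-(h[i] + S[i] @ mu)))
    m = np.clip(mu, 1e-12, 1 - 1e-12)           # guard the entropy terms
    entropy = -(m * np.log(m) + (1 - m) * np.log(1 - m)).sum()
    return h @ mu + mu @ J @ mu + entropy

# Each of the k random parity constraints keeps any fixed configuration with
# probability 1/2, so E[Z_projected] = Z / 2^k and hence
# log Z ≈ logZ(projected) + k·log 2 (verified here by brute force).
k = 3
proj_estimates = []
for _ in range(15):
    A = rng.integers(0, 2, (k, n))
    b = rng.integers(0, 2, k)
    proj_estimates.append(exact_logZ((A, b)) + k * np.log(2))

print("exact log Z            :", exact_logZ())
print("mean field lower bound :", mean_field_lower_bound())
print("median projected est.  :", np.median(proj_estimates))
```

On models this small everything can be enumerated; the point of the paper is that the projected models remain amenable to I-projection (e.g. mean field) when the original model is far too large to enumerate, while the parity constraints keep the estimate anchored to the true partition function with high probability.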

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-hsu16,
  title     = {Tight Variational Bounds via Random Projections and I-Projections},
  author    = {Lun-Kai Hsu and Tudor Achim and Stefano Ermon},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {1087--1095},
  year      = {2016},
  editor    = {Arthur Gretton and Christian C. Robert},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/hsu16.pdf},
  url       = {http://proceedings.mlr.press/v51/hsu16.html}
}
APA
Hsu, L., Achim, T., & Ermon, S. (2016). Tight Variational Bounds via Random Projections and I-Projections. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in PMLR 51:1087-1095.
