On Sparse, Spectral and Other Parameterizations of Binary Probabilistic Models

David Buchman, Mark Schmidt, Shakir Mohamed, David Poole, Nando De Freitas
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:173-181, 2012.

Abstract

This paper studies issues relating to the parameterization of probability distributions over binary data sets. Several such parameterizations of models for binary data are known, including the Ising, generalized Ising, canonical and full parameterizations. We also discuss a parameterization that we call the “spectral parameterization”, which has received significantly less coverage in existing literature. We provide this parameterization with a spectral interpretation by casting log-linear models in terms of orthogonal Walsh-Hadamard harmonic expansions. Using various standard and group sparse regularizers for structural learning, we provide a comprehensive theoretical and empirical comparison of these parameterizations. We show that the spectral parameterization, along with the canonical, has the best performance and sparsity levels, while the spectral does not depend on any particular reference state. The spectral interpretation also provides a new starting point for analyzing the statistics of binary data sets; we measure the magnitude of higher order interactions in the underlying distributions for several data sets.
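The spectral parameterization described in the abstract writes log p(x) as a Walsh-Hadamard expansion, log p(x) = Σ_S θ_S ∏_{i∈S} (−1)^{x_i}, so the coefficients θ are (up to scaling) the Hadamard transform of the log-probability vector. A minimal sketch of that correspondence, assuming a small fully-observed distribution (this is an illustration, not the paper's code):

```python
import numpy as np

def hadamard(n):
    """Walsh-Hadamard matrix of order 2**n (Sylvester construction)."""
    H = np.array([[1.0]])
    for _ in range(n):
        H = np.block([[H, H], [H, -H]])
    return H

n = 3  # number of binary variables
rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(2 ** n))   # arbitrary strictly positive distribution
log_p = np.log(p)

H = hadamard(n)
theta = H @ log_p / 2 ** n           # spectral coefficients theta_S

# The Walsh functions are orthogonal (H @ H = 2**n * I), so the
# expansion reconstructs log p exactly:
assert np.allclose(H @ theta, log_p)
```

Here the entry of theta indexed by the binary expansion of S measures the magnitude of the corresponding interaction; the higher-order-interaction analysis mentioned in the abstract amounts to examining these coefficients for large |S|.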

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-buchman12,
  title     = {On Sparse, Spectral and Other Parameterizations of Binary Probabilistic Models},
  author    = {Buchman, David and Schmidt, Mark and Mohamed, Shakir and Poole, David and Freitas, Nando De},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {173--181},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/buchman12/buchman12.pdf},
  url       = {https://proceedings.mlr.press/v22/buchman12.html},
  abstract  = {This paper studies issues relating to the parameterization of probability distributions over binary data sets. Several such parameterizations of models for binary data are known, including the Ising, generalized Ising, canonical and full parameterizations. We also discuss a parameterization that we call the “spectral parameterization”, which has received significantly less coverage in existing literature. We provide this parameterization with a spectral interpretation by casting log-linear models in terms of orthogonal Walsh-Hadamard harmonic expansions. Using various standard and group sparse regularizers for structural learning, we provide a comprehensive theoretical and empirical comparison of these parameterizations. We show that the spectral parameterization, along with the canonical, has the best performance and sparsity levels, while the spectral does not depend on any particular reference state. The spectral interpretation also provides a new starting point for analyzing the statistics of binary data sets; we measure the magnitude of higher order interactions in the underlying distributions for several data sets.}
}
Endnote
%0 Conference Paper
%T On Sparse, Spectral and Other Parameterizations of Binary Probabilistic Models
%A David Buchman
%A Mark Schmidt
%A Shakir Mohamed
%A David Poole
%A Nando De Freitas
%B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2012
%E Neil D. Lawrence
%E Mark Girolami
%F pmlr-v22-buchman12
%I PMLR
%P 173--181
%U https://proceedings.mlr.press/v22/buchman12.html
%V 22
%X This paper studies issues relating to the parameterization of probability distributions over binary data sets. Several such parameterizations of models for binary data are known, including the Ising, generalized Ising, canonical and full parameterizations. We also discuss a parameterization that we call the “spectral parameterization”, which has received significantly less coverage in existing literature. We provide this parameterization with a spectral interpretation by casting log-linear models in terms of orthogonal Walsh-Hadamard harmonic expansions. Using various standard and group sparse regularizers for structural learning, we provide a comprehensive theoretical and empirical comparison of these parameterizations. We show that the spectral parameterization, along with the canonical, has the best performance and sparsity levels, while the spectral does not depend on any particular reference state. The spectral interpretation also provides a new starting point for analyzing the statistics of binary data sets; we measure the magnitude of higher order interactions in the underlying distributions for several data sets.
RIS
TY  - CPAPER
TI  - On Sparse, Spectral and Other Parameterizations of Binary Probabilistic Models
AU  - David Buchman
AU  - Mark Schmidt
AU  - Shakir Mohamed
AU  - David Poole
AU  - Nando De Freitas
BT  - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
DA  - 2012/04/21
ED  - Neil D. Lawrence
ED  - Mark Girolami
ID  - pmlr-v22-buchman12
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 22
SP  - 173
EP  - 181
L1  - http://proceedings.mlr.press/v22/buchman12/buchman12.pdf
UR  - https://proceedings.mlr.press/v22/buchman12.html
AB  - This paper studies issues relating to the parameterization of probability distributions over binary data sets. Several such parameterizations of models for binary data are known, including the Ising, generalized Ising, canonical and full parameterizations. We also discuss a parameterization that we call the “spectral parameterization”, which has received significantly less coverage in existing literature. We provide this parameterization with a spectral interpretation by casting log-linear models in terms of orthogonal Walsh-Hadamard harmonic expansions. Using various standard and group sparse regularizers for structural learning, we provide a comprehensive theoretical and empirical comparison of these parameterizations. We show that the spectral parameterization, along with the canonical, has the best performance and sparsity levels, while the spectral does not depend on any particular reference state. The spectral interpretation also provides a new starting point for analyzing the statistics of binary data sets; we measure the magnitude of higher order interactions in the underlying distributions for several data sets.
ER  -
APA
Buchman, D., Schmidt, M., Mohamed, S., Poole, D. & Freitas, N.D. (2012). On Sparse, Spectral and Other Parameterizations of Binary Probabilistic Models. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:173-181. Available from https://proceedings.mlr.press/v22/buchman12.html.
