Exchangeable Variable Models

Mathias Niepert, Pedro Domingos
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):271-279, 2014.

Abstract

A sequence of random variables is exchangeable if its joint distribution is invariant under variable permutations. We introduce exchangeable variable models (EVMs) as a novel class of probabilistic models whose basic building blocks are partially exchangeable sequences, a generalization of exchangeable sequences. We prove that a family of tractable EVMs is optimal under zero-one loss for a large class of functions, including parity and threshold functions, and strictly subsumes existing tractable independence-based model families. Extensive experiments show that EVMs outperform state-of-the-art classifiers such as SVMs and probabilistic models that are based solely on independence assumptions.
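To make the notion of exchangeability concrete: a classic example (not specific to this paper) is a finite mixture of i.i.d. Bernoulli components, whose joint probability depends only on the number of ones in a sequence and is therefore invariant under variable permutations. The sketch below uses illustrative mixture weights and parameters chosen only for demonstration.

```python
import itertools

def joint(x, weights=(0.3, 0.7), thetas=(0.2, 0.8)):
    """Joint probability of a binary sequence x under a two-component
    mixture of i.i.d. Bernoulli draws (illustrative parameters)."""
    p = 0.0
    for w, t in zip(weights, thetas):
        comp = 1.0
        for xi in x:
            comp *= t if xi == 1 else (1.0 - t)
        p += w * comp
    return p

# Exchangeability check: every permutation of a sequence has the
# same joint probability, since joint() depends only on the count of 1s.
for x in itertools.product([0, 1], repeat=3):
    for perm in itertools.permutations(x):
        assert abs(joint(x) - joint(perm)) < 1e-12
```

Here the mixture plays the role that de Finetti's theorem gives to exchangeable sequences: any exchangeable distribution over binary variables can be viewed as a mixture of i.i.d. ones, which is the structural fact EVMs exploit for tractability.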

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-niepert14,
  title     = {Exchangeable Variable Models},
  author    = {Niepert, Mathias and Domingos, Pedro},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {271--279},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/niepert14.pdf},
  url       = {https://proceedings.mlr.press/v32/niepert14.html},
  abstract  = {A sequence of random variables is exchangeable if its joint distribution is invariant under variable permutations. We introduce exchangeable variable models (EVMs) as a novel class of probabilistic models whose basic building blocks are partially exchangeable sequences, a generalization of exchangeable sequences. We prove that a family of tractable EVMs is optimal under zero-one loss for a large class of functions, including parity and threshold functions, and strictly subsumes existing tractable independence-based model families. Extensive experiments show that EVMs outperform state of the art classifiers such as SVMs and probabilistic models which are solely based on independence assumptions.}
}
Endnote
%0 Conference Paper
%T Exchangeable Variable Models
%A Mathias Niepert
%A Pedro Domingos
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-niepert14
%I PMLR
%P 271--279
%U https://proceedings.mlr.press/v32/niepert14.html
%V 32
%N 2
%X A sequence of random variables is exchangeable if its joint distribution is invariant under variable permutations. We introduce exchangeable variable models (EVMs) as a novel class of probabilistic models whose basic building blocks are partially exchangeable sequences, a generalization of exchangeable sequences. We prove that a family of tractable EVMs is optimal under zero-one loss for a large class of functions, including parity and threshold functions, and strictly subsumes existing tractable independence-based model families. Extensive experiments show that EVMs outperform state of the art classifiers such as SVMs and probabilistic models which are solely based on independence assumptions.
RIS
TY - CPAPER
TI - Exchangeable Variable Models
AU - Mathias Niepert
AU - Pedro Domingos
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-niepert14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 271
EP - 279
L1 - http://proceedings.mlr.press/v32/niepert14.pdf
UR - https://proceedings.mlr.press/v32/niepert14.html
AB - A sequence of random variables is exchangeable if its joint distribution is invariant under variable permutations. We introduce exchangeable variable models (EVMs) as a novel class of probabilistic models whose basic building blocks are partially exchangeable sequences, a generalization of exchangeable sequences. We prove that a family of tractable EVMs is optimal under zero-one loss for a large class of functions, including parity and threshold functions, and strictly subsumes existing tractable independence-based model families. Extensive experiments show that EVMs outperform state of the art classifiers such as SVMs and probabilistic models which are solely based on independence assumptions.
ER -
APA
Niepert, M. & Domingos, P. (2014). Exchangeable Variable Models. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):271-279. Available from https://proceedings.mlr.press/v32/niepert14.html.