Non-exchangeable feature allocation models with sublinear growth of the feature sizes

Giuseppe Di Benedetto, Francois Caron, Yee Whye Teh
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:3208-3218, 2020.

Abstract

Feature allocation models are popular in applications such as unsupervised learning and network modeling. In particular, the Indian buffet process is a flexible and simple one-parameter feature allocation model in which the number of features grows unboundedly with the number of objects. The Indian buffet process, like most feature allocation models, satisfies a symmetry property of exchangeability: the distribution is invariant under permutation of the objects. While this property is desirable in some cases, it has strong implications. Importantly, the number of objects sharing a particular feature grows linearly with the number of objects. In this article, we describe a class of non-exchangeable feature allocation models where the number of objects sharing a given feature grows sublinearly, at a rate controlled by a tuning parameter. We derive the asymptotic properties of the model, and show that such models provide a better fit and better predictive performance on various datasets.
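To make the exchangeable baseline concrete, the one-parameter Indian buffet process mentioned in the abstract can be simulated directly: object n takes each existing feature k with probability m_k/n (so early, popular features accumulate objects linearly), then samples Poisson(alpha/n) brand-new features. This is a minimal sketch of the standard IBP generative scheme, not the authors' non-exchangeable model; the function name and parameters are illustrative.

```python
import math
import random

def indian_buffet(n_objects, alpha, seed=0):
    """Simulate the one-parameter Indian buffet process.

    Returns a list `counts` where counts[k] is the number of
    objects sharing feature k after n_objects arrivals.
    """
    rng = random.Random(seed)
    counts = []
    for n in range(1, n_objects + 1):
        # Object n takes existing feature k with probability counts[k] / n.
        for k in range(len(counts)):
            if rng.random() < counts[k] / n:
                counts[k] += 1
        # Object n also samples Poisson(alpha / n) new features
        # (Knuth-style inversion sampler, no external dependencies).
        threshold = math.exp(-alpha / n)
        new_features, p = 0, rng.random()
        while p >= threshold:
            new_features += 1
            p *= rng.random()
        counts.extend([1] * new_features)
    return counts

counts = indian_buffet(500, 5.0, seed=1)
```

Under this scheme the expected total number of features grows logarithmically (alpha times the harmonic number of n), while the count of any fixed feature grows linearly in n; the paper's non-exchangeable construction replaces that linear growth with a tunable sublinear rate.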

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-benedetto20a,
  title     = {Non-exchangeable feature allocation models with sublinear growth of the feature sizes},
  author    = {Benedetto, Giuseppe Di and Caron, Francois and Teh, Yee Whye},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {3208--3218},
  year      = {2020},
  editor    = {Silvia Chiappa and Roberto Calandra},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/benedetto20a/benedetto20a.pdf},
  url       = {http://proceedings.mlr.press/v108/benedetto20a.html},
  abstract  = {Feature allocation models are popular in applications such as unsupervised learning and network modeling. In particular, the Indian buffet process is a flexible and simple one-parameter feature allocation model in which the number of features grows unboundedly with the number of objects. The Indian buffet process, like most feature allocation models, satisfies a symmetry property of exchangeability: the distribution is invariant under permutation of the objects. While this property is desirable in some cases, it has strong implications. Importantly, the number of objects sharing a particular feature grows linearly with the number of objects. In this article, we describe a class of non-exchangeable feature allocation models where the number of objects sharing a given feature grows sublinearly, at a rate controlled by a tuning parameter. We derive the asymptotic properties of the model, and show that such models provide a better fit and better predictive performance on various datasets.}
}
Endnote
%0 Conference Paper
%T Non-exchangeable feature allocation models with sublinear growth of the feature sizes
%A Giuseppe Di Benedetto
%A Francois Caron
%A Yee Whye Teh
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-benedetto20a
%I PMLR
%P 3208--3218
%U http://proceedings.mlr.press/v108/benedetto20a.html
%V 108
%X Feature allocation models are popular in applications such as unsupervised learning and network modeling. In particular, the Indian buffet process is a flexible and simple one-parameter feature allocation model in which the number of features grows unboundedly with the number of objects. The Indian buffet process, like most feature allocation models, satisfies a symmetry property of exchangeability: the distribution is invariant under permutation of the objects. While this property is desirable in some cases, it has strong implications. Importantly, the number of objects sharing a particular feature grows linearly with the number of objects. In this article, we describe a class of non-exchangeable feature allocation models where the number of objects sharing a given feature grows sublinearly, at a rate controlled by a tuning parameter. We derive the asymptotic properties of the model, and show that such models provide a better fit and better predictive performance on various datasets.
APA
Benedetto, G.D., Caron, F. & Teh, Y.W. (2020). Non-exchangeable feature allocation models with sublinear growth of the feature sizes. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:3208-3218. Available from http://proceedings.mlr.press/v108/benedetto20a.html.