Variational Network Inference: Strong and Stable with Concrete Support

Amir Dezfouli, Edwin Bonilla, Richard Nock
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1204-1213, 2018.

Abstract

Traditional methods for the discovery of latent network structures are limited in two ways: they either assume that all the signal comes from the network (i.e. there is no source of signal outside the network) or they place constraints on the network parameters to ensure model or algorithmic stability. We address these limitations by proposing a model that incorporates a Gaussian process prior on a network-independent component and formally proving that we get algorithmic stability for free while providing a novel perspective on model stability as well as robustness results and precise intervals for key inference parameters. We show that, on three applications, our approach outperforms previous methods consistently.

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-dezfouli18a,
  title     = {Variational Network Inference: Strong and Stable with Concrete Support},
  author    = {Dezfouli, Amir and Bonilla, Edwin and Nock, Richard},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {1204--1213},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/dezfouli18a/dezfouli18a.pdf},
  url       = {https://proceedings.mlr.press/v80/dezfouli18a.html},
  abstract  = {Traditional methods for the discovery of latent network structures are limited in two ways: they either assume that all the signal comes from the network (i.e. there is no source of signal outside the network) or they place constraints on the network parameters to ensure model or algorithmic stability. We address these limitations by proposing a model that incorporates a Gaussian process prior on a network-independent component and formally proving that we get algorithmic stability for free while providing a novel perspective on model stability as well as robustness results and precise intervals for key inference parameters. We show that, on three applications, our approach outperforms previous methods consistently.}
}
Endnote
%0 Conference Paper
%T Variational Network Inference: Strong and Stable with Concrete Support
%A Amir Dezfouli
%A Edwin Bonilla
%A Richard Nock
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-dezfouli18a
%I PMLR
%P 1204--1213
%U https://proceedings.mlr.press/v80/dezfouli18a.html
%V 80
%X Traditional methods for the discovery of latent network structures are limited in two ways: they either assume that all the signal comes from the network (i.e. there is no source of signal outside the network) or they place constraints on the network parameters to ensure model or algorithmic stability. We address these limitations by proposing a model that incorporates a Gaussian process prior on a network-independent component and formally proving that we get algorithmic stability for free while providing a novel perspective on model stability as well as robustness results and precise intervals for key inference parameters. We show that, on three applications, our approach outperforms previous methods consistently.
APA
Dezfouli, A., Bonilla, E. & Nock, R. (2018). Variational Network Inference: Strong and Stable with Concrete Support. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:1204-1213. Available from https://proceedings.mlr.press/v80/dezfouli18a.html.