Nonparametric Bayesian Deep Networks with Local Competition

Konstantinos Panousis, Sotirios Chatzis, Sergios Theodoridis
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4980-4988, 2019.

Abstract

The aim of this work is to enable inference of deep networks that retain high accuracy at the lowest possible model complexity, with the latter deduced from the data during inference. To this end, we revisit deep networks that comprise competing linear units, as opposed to nonlinear units that entail no form of (local) competition. In this context, our main technical innovation consists in an inferential setup that leverages solid arguments from Bayesian nonparametrics. We infer both the needed set of connections, or locally competing sets of units, and the required floating-point precision for storing the network parameters. Specifically, we introduce auxiliary discrete latent variables that represent which components of the initial network are actually needed for modeling the data at hand, and perform Bayesian inference over them by imposing appropriate stick-breaking priors. As we experimentally show on benchmark datasets, our approach yields networks with a smaller computational footprint than the state-of-the-art, with no compromise in predictive accuracy.
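
The abstract leaves the stick-breaking construction implicit. As a rough, non-authoritative sketch: in an Indian-Buffet-Process-style stick-breaking prior, one draws stick variables u_i ~ Beta(alpha, 1), forms inclusion probabilities pi_k = prod_{i=1..k} u_i, and gates each candidate network component with a binary indicator z_k ~ Bernoulli(pi_k), so that later components are increasingly likely to be switched off. The NumPy snippet below illustrates such a prior; the function name and parameters are hypothetical, and this is not the authors' implementation.

    import numpy as np

    def sample_component_gates(K, alpha=1.0, rng=None):
        # IBP-style stick-breaking prior over K binary "utility" indicators:
        #   u_i ~ Beta(alpha, 1),  pi_k = prod_{i=1..k} u_i,  z_k ~ Bernoulli(pi_k)
        rng = np.random.default_rng() if rng is None else rng
        u = rng.beta(alpha, 1.0, size=K)   # stick-breaking fractions in (0, 1)
        pi = np.cumprod(u)                 # inclusion probabilities, decaying with k
        z = rng.binomial(1, pi)            # z_k = 1 keeps component k; z_k = 0 prunes it
        return z, pi

    # Example: for moderate alpha, most of the K candidate components end up pruned.
    z, pi = sample_component_gates(K=16, alpha=2.0)
    print("kept components:", int(z.sum()), "of", z.size)

In the paper's setting, such indicators would sit over connections or locally competing blocks of units, and posterior inference over them is what yields the reduced architecture.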

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-panousis19a,
  title     = {Nonparametric {B}ayesian Deep Networks with Local Competition},
  author    = {Panousis, Konstantinos and Chatzis, Sotirios and Theodoridis, Sergios},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4980--4988},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/panousis19a/panousis19a.pdf},
  url       = {https://proceedings.mlr.press/v97/panousis19a.html},
  abstract  = {The aim of this work is to enable inference of deep networks that retain high accuracy for the least possible model complexity, with the latter deduced from the data during inference. To this end, we revisit deep networks that comprise competing linear units, as opposed to nonlinear units that do not entail any form of (local) competition. In this context, our main technical innovation consists in an inferential setup that leverages solid arguments from Bayesian nonparametrics. We infer both the needed set of connections or locally competing sets of units, as well as the required floating-point precision for storing the network parameters. Specifically, we introduce auxiliary discrete latent variables representing which initial network components are actually needed for modeling the data at hand, and perform Bayesian inference over them by imposing appropriate stick-breaking priors. As we experimentally show using benchmark datasets, our approach yields networks with less computational footprint than the state-of-the-art, and with no compromises in predictive accuracy.}
}
EndNote
%0 Conference Paper
%T Nonparametric Bayesian Deep Networks with Local Competition
%A Konstantinos Panousis
%A Sotirios Chatzis
%A Sergios Theodoridis
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-panousis19a
%I PMLR
%P 4980--4988
%U https://proceedings.mlr.press/v97/panousis19a.html
%V 97
%X The aim of this work is to enable inference of deep networks that retain high accuracy for the least possible model complexity, with the latter deduced from the data during inference. To this end, we revisit deep networks that comprise competing linear units, as opposed to nonlinear units that do not entail any form of (local) competition. In this context, our main technical innovation consists in an inferential setup that leverages solid arguments from Bayesian nonparametrics. We infer both the needed set of connections or locally competing sets of units, as well as the required floating-point precision for storing the network parameters. Specifically, we introduce auxiliary discrete latent variables representing which initial network components are actually needed for modeling the data at hand, and perform Bayesian inference over them by imposing appropriate stick-breaking priors. As we experimentally show using benchmark datasets, our approach yields networks with less computational footprint than the state-of-the-art, and with no compromises in predictive accuracy.
APA
Panousis, K., Chatzis, S. & Theodoridis, S. (2019). Nonparametric Bayesian Deep Networks with Local Competition. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4980-4988. Available from https://proceedings.mlr.press/v97/panousis19a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v97/panousis19a/panousis19a.pdf