Stochastic Neural Networks with Monotonic Activation Functions

Siamak Ravanbakhsh, Barnabas Poczos, Jeff Schneider, Dale Schuurmans, Russell Greiner
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:809-818, 2016.

Abstract

We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBMs) that are closely linked to Bregman divergences. This family, which we call the exponential family RBM (Exp-RBM), is a subset of exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBM can learn useful representations using novel stochastic units.
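As a rough illustration of the construction described above, one reading of the Laplace approximation is that a unit with smooth monotonic activation f produces a Gaussian sample with mean f(η) and variance f′(η) for pre-activation η. The sketch below assumes that mean/variance form (the names `stochastic_unit`, `softplus`, and `sigmoid` are illustrative, not from the paper):

```python
import numpy as np

def stochastic_unit(eta, f, df, rng=None):
    """Sample a stochastic unit under the assumed Gaussian (Laplace)
    approximation: mean f(eta), variance f'(eta).

    Monotonicity of f guarantees f'(eta) >= 0, so the variance is valid.
    """
    rng = rng or np.random.default_rng(0)
    mean = f(eta)
    var = np.maximum(df(eta), 1e-12)  # clip tiny derivatives for numerical safety
    return rng.normal(mean, np.sqrt(var))

# Example with a softplus activation f(x) = log(1 + e^x), whose
# derivative f'(x) is the logistic sigmoid.
softplus = lambda x: np.log1p(np.exp(x))
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

eta = np.linspace(-2.0, 2.0, 5)
samples = stochastic_unit(eta, softplus, sigmoid)
```

Any other smooth monotonic non-linearity can be dropped in by supplying its derivative, which is what lets this one mechanism generate the whole family of units the paper studies.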

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-ravanbakhsh16,
  title     = {Stochastic Neural Networks with Monotonic Activation Functions},
  author    = {Siamak Ravanbakhsh and Barnabas Poczos and Jeff Schneider and Dale Schuurmans and Russell Greiner},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {809--818},
  year      = {2016},
  editor    = {Arthur Gretton and Christian C. Robert},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/ravanbakhsh16.pdf},
  url       = {http://proceedings.mlr.press/v51/ravanbakhsh16.html},
  abstract  = {We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, that we call exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBM can learn useful representations using novel stochastic units.}
}
Endnote
%0 Conference Paper
%T Stochastic Neural Networks with Monotonic Activation Functions
%A Siamak Ravanbakhsh
%A Barnabas Poczos
%A Jeff Schneider
%A Dale Schuurmans
%A Russell Greiner
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-ravanbakhsh16
%I PMLR
%J Proceedings of Machine Learning Research
%P 809--818
%U http://proceedings.mlr.press
%V 51
%W PMLR
%X We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, that we call exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBM can learn useful representations using novel stochastic units.
RIS
TY - CPAPER
TI - Stochastic Neural Networks with Monotonic Activation Functions
AU - Siamak Ravanbakhsh
AU - Barnabas Poczos
AU - Jeff Schneider
AU - Dale Schuurmans
AU - Russell Greiner
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
PY - 2016/05/02
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-ravanbakhsh16
PB - PMLR
SP - 809
EP - 818
DP - PMLR
L1 - http://proceedings.mlr.press/v51/ravanbakhsh16.pdf
UR - http://proceedings.mlr.press/v51/ravanbakhsh16.html
AB - We propose a Laplace approximation that creates a stochastic unit from any smooth monotonic activation function, using only Gaussian noise. This paper investigates the application of this stochastic approximation in training a family of Restricted Boltzmann Machines (RBM) that are closely linked to Bregman divergences. This family, that we call exponential family RBM (Exp-RBM), is a subset of the exponential family Harmoniums that expresses family members through a choice of smooth monotonic non-linearity for each neuron. Using contrastive divergence along with our Gaussian approximation, we show that Exp-RBM can learn useful representations using novel stochastic units.
ER -
APA
Ravanbakhsh, S., Poczos, B., Schneider, J., Schuurmans, D. & Greiner, R. (2016). Stochastic Neural Networks with Monotonic Activation Functions. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in PMLR 51:809-818.
