Deep Boltzmann Machines as Feed-Forward Hierarchies

Grégoire Montavon, Mikio Braun, Klaus-Robert Müller
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:798-804, 2012.

Abstract

The deep Boltzmann machine is a powerful model that extracts the hierarchical structure of observed data. While inference is typically slow due to its undirected nature, we argue that the emerging feature hierarchy is still explicit enough to be traversed in a feed-forward fashion. The claim is corroborated by training a set of deep neural networks on real data and measuring the evolution of the representation layer after layer. The analysis reveals that the deep Boltzmann machine produces a feed-forward hierarchy of increasingly invariant representations that clearly surpasses the layer-wise approach.
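To make the idea of traversing the learned hierarchy in a feed-forward fashion concrete, the following is a minimal sketch, not the authors' implementation: it assumes a list of trained DBM weight matrices and hidden bias vectors (the names `weights`, `biases`, and `feed_forward_representations` are illustrative) and propagates the data bottom-up through sigmoid layers, dropping the top-down terms that exact mean-field inference in the undirected model would involve. The per-layer outputs are the kind of representations whose evolution can be inspected layer after layer.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def feed_forward_representations(v, weights, biases):
        # v       : array of shape (n_samples, n_visible), the input data
        # weights : list of trained DBM weight matrices [W1, W2, ...],
        #           W_l of shape (n_units_below, n_units_l)
        # biases  : list of hidden bias vectors [b1, b2, ...]
        #
        # Returns the representation produced at every layer, so that its
        # evolution can be examined layer after layer.
        reps = []
        h = v
        for W, b in zip(weights, biases):
            # Single bottom-up pass: each layer is computed from the layer
            # below only; the top-down feedback of exact DBM inference
            # is deliberately ignored.
            h = sigmoid(h @ W + b)
            reps.append(h)
        return reps

Comparing, for instance, the representation at a given layer for an input and for a transformed version of it is one way to quantify the increasing invariance referred to in the abstract.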

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-montavon12,
  title     = {Deep Boltzmann Machines as Feed-Forward Hierarchies},
  author    = {Montavon, Gregoire and Braun, Mikio and Muller, Klaus-Robert},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {798--804},
  year      = {2012},
  editor    = {Lawrence, Neil D. and Girolami, Mark},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/montavon12/montavon12.pdf},
  url       = {https://proceedings.mlr.press/v22/montavon12.html},
  abstract  = {The deep Boltzmann machine is a powerful model that extracts the hierarchical structure of observed data. While inference is typically slow due to its undirected nature, we argue that the emerging feature hierarchy is still explicit enough to be traversed in a feed-forward fashion. The claim is corroborated by training a set of deep neural networks on real data and measuring the evolution of the representation layer after layer. The analysis reveals that the deep Boltzmann machine produces a feed-forward hierarchy of increasingly invariant representations that clearly surpasses the layer-wise approach.}
}
Endnote
%0 Conference Paper
%T Deep Boltzmann Machines as Feed-Forward Hierarchies
%A Gregoire Montavon
%A Mikio Braun
%A Klaus-Robert Muller
%B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2012
%E Neil D. Lawrence
%E Mark Girolami
%F pmlr-v22-montavon12
%I PMLR
%P 798--804
%U https://proceedings.mlr.press/v22/montavon12.html
%V 22
%X The deep Boltzmann machine is a powerful model that extracts the hierarchical structure of observed data. While inference is typically slow due to its undirected nature, we argue that the emerging feature hierarchy is still explicit enough to be traversed in a feed-forward fashion. The claim is corroborated by training a set of deep neural networks on real data and measuring the evolution of the representation layer after layer. The analysis reveals that the deep Boltzmann machine produces a feed-forward hierarchy of increasingly invariant representations that clearly surpasses the layer-wise approach.
RIS
TY - CPAPER
TI - Deep Boltzmann Machines as Feed-Forward Hierarchies
AU - Gregoire Montavon
AU - Mikio Braun
AU - Klaus-Robert Muller
BT - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
DA - 2012/03/21
ED - Neil D. Lawrence
ED - Mark Girolami
ID - pmlr-v22-montavon12
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 22
SP - 798
EP - 804
L1 - http://proceedings.mlr.press/v22/montavon12/montavon12.pdf
UR - https://proceedings.mlr.press/v22/montavon12.html
AB - The deep Boltzmann machine is a powerful model that extracts the hierarchical structure of observed data. While inference is typically slow due to its undirected nature, we argue that the emerging feature hierarchy is still explicit enough to be traversed in a feed-forward fashion. The claim is corroborated by training a set of deep neural networks on real data and measuring the evolution of the representation layer after layer. The analysis reveals that the deep Boltzmann machine produces a feed-forward hierarchy of increasingly invariant representations that clearly surpasses the layer-wise approach.
ER -
APA
Montavon, G., Braun, M. & Muller, K. (2012). Deep Boltzmann Machines as Feed-Forward Hierarchies. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:798-804. Available from https://proceedings.mlr.press/v22/montavon12.html.
