Learning Parts-based Representations with Nonnegative Restricted Boltzmann Machine

Tu Dinh Nguyen, Truyen Tran, Dinh Phung, Svetha Venkatesh
Proceedings of the 5th Asian Conference on Machine Learning, PMLR 29:133-148, 2013.

Abstract

The success of any machine learning system depends critically on effective representations of data. In many cases, especially those in vision, it is desirable that a representation scheme uncovers the parts-based, additive nature of the data. Of current representation learning schemes, restricted Boltzmann machines (RBMs) have proved to be highly effective in unsupervised settings. However, when it comes to parts-based discovery, RBMs do not usually produce satisfactory results. We enhance this capacity of RBMs by introducing nonnegativity into the model weights, resulting in a variant called the nonnegative restricted Boltzmann machine (NRBM). The NRBM not only produces a controllable decomposition of data into interpretable parts but also offers a way to estimate the intrinsic nonlinear dimensionality of the data. We demonstrate the capacity of our model on well-known datasets of handwritten digits, faces and documents. The decomposition quality on images is comparable with, or better than, that produced by nonnegative matrix factorisation (NMF), and the thematic features uncovered from text are qualitatively interpretable in a manner similar to that of latent Dirichlet allocation (LDA). However, the learnt features, when used for classification, are more discriminative than those discovered by NMF and LDA, and comparable with those of the standard RBM.
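The core idea of the abstract, an RBM whose weight matrix is constrained to be nonnegative, can be sketched in a few lines of NumPy. The sketch below is illustrative only: all names are ours, and it enforces nonnegativity by clipping the weights onto the nonnegative orthant after each contrastive-divergence (CD-1) update, which is one simple way to realise the constraint; the paper's actual optimisation scheme may differ.

```python
# Minimal sketch: an RBM with nonnegative weights trained by CD-1.
# Hypothetical implementation; nonnegativity is enforced here by
# clipping W after each update (one simple realisation of the idea).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.05):
    """One CD-1 update on a batch v0 of shape (n, d)."""
    # Positive phase: hidden probabilities and a sample, given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to visibles, then to hiddens.
    pv1 = sigmoid(h0 @ W.T + b)
    ph1 = sigmoid(pv1 @ W + c)
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    # Nonnegativity constraint: project weights onto W >= 0.
    np.clip(W, 0.0, None, out=W)
    return W, b, c

# Toy usage: 100 binary vectors of dimension 64, 16 hidden units.
d, k = 64, 16
data = (rng.random((100, d)) < 0.3).astype(float)
W = np.abs(rng.normal(0.0, 0.01, size=(d, k)))  # start nonnegative
b, c = np.zeros(d), np.zeros(k)
for _ in range(50):
    W, b, c = cd1_step(data, W, b, c)
print("min weight:", W.min())  # stays >= 0 by construction
```

Because each hidden unit can only add (never subtract) visible activity, the learnt weight columns tend to act as additive parts, which is the parts-based decomposition the abstract describes.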

Cite this Paper


BibTeX
@InProceedings{pmlr-v29-Nguyen13,
  title     = {Learning Parts-based Representations with Nonnegative Restricted Boltzmann Machine},
  author    = {Nguyen, Tu Dinh and Tran, Truyen and Phung, Dinh and Venkatesh, Svetha},
  booktitle = {Proceedings of the 5th Asian Conference on Machine Learning},
  pages     = {133--148},
  year      = {2013},
  editor    = {Ong, Cheng Soon and Ho, Tu Bao},
  volume    = {29},
  series    = {Proceedings of Machine Learning Research},
  address   = {Australian National University, Canberra, Australia},
  month     = {13--15 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v29/Nguyen13.pdf},
  url       = {https://proceedings.mlr.press/v29/Nguyen13.html},
  abstract  = {The success of any machine learning system depends critically on effective representations of data. In many cases, especially those in vision, it is desirable that a representation scheme uncovers the parts-based, additive nature of the data. Of current representation learning schemes, restricted Boltzmann machines (RBMs) have proved to be highly effective in unsupervised settings. However, when it comes to parts-based discovery, RBMs do not usually produce satisfactory results. We enhance this capacity of RBMs by introducing nonnegativity into the model weights, resulting in a variant called the \emph{nonnegative restricted Boltzmann machine} (NRBM). The NRBM not only produces a controllable decomposition of data into interpretable parts but also offers a way to estimate the intrinsic nonlinear dimensionality of the data. We demonstrate the capacity of our model on well-known datasets of handwritten digits, faces and documents. The decomposition quality on images is comparable with, or better than, that produced by nonnegative matrix factorisation (NMF), and the thematic features uncovered from text are qualitatively interpretable in a manner similar to that of latent Dirichlet allocation (LDA). However, the learnt features, when used for classification, are more discriminative than those discovered by NMF and LDA, and comparable with those of the standard RBM.}
}
Endnote
%0 Conference Paper
%T Learning Parts-based Representations with Nonnegative Restricted Boltzmann Machine
%A Tu Dinh Nguyen
%A Truyen Tran
%A Dinh Phung
%A Svetha Venkatesh
%B Proceedings of the 5th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Cheng Soon Ong
%E Tu Bao Ho
%F pmlr-v29-Nguyen13
%I PMLR
%P 133--148
%U https://proceedings.mlr.press/v29/Nguyen13.html
%V 29
%X The success of any machine learning system depends critically on effective representations of data. In many cases, especially those in vision, it is desirable that a representation scheme uncovers the parts-based, additive nature of the data. Of current representation learning schemes, restricted Boltzmann machines (RBMs) have proved to be highly effective in unsupervised settings. However, when it comes to parts-based discovery, RBMs do not usually produce satisfactory results. We enhance this capacity of RBMs by introducing nonnegativity into the model weights, resulting in a variant called the nonnegative restricted Boltzmann machine (NRBM). The NRBM not only produces a controllable decomposition of data into interpretable parts but also offers a way to estimate the intrinsic nonlinear dimensionality of the data. We demonstrate the capacity of our model on well-known datasets of handwritten digits, faces and documents. The decomposition quality on images is comparable with, or better than, that produced by nonnegative matrix factorisation (NMF), and the thematic features uncovered from text are qualitatively interpretable in a manner similar to that of latent Dirichlet allocation (LDA). However, the learnt features, when used for classification, are more discriminative than those discovered by NMF and LDA, and comparable with those of the standard RBM.
RIS
TY - CPAPER
TI - Learning Parts-based Representations with Nonnegative Restricted Boltzmann Machine
AU - Tu Dinh Nguyen
AU - Truyen Tran
AU - Dinh Phung
AU - Svetha Venkatesh
BT - Proceedings of the 5th Asian Conference on Machine Learning
DA - 2013/10/21
ED - Cheng Soon Ong
ED - Tu Bao Ho
ID - pmlr-v29-Nguyen13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 29
SP - 133
EP - 148
L1 - http://proceedings.mlr.press/v29/Nguyen13.pdf
UR - https://proceedings.mlr.press/v29/Nguyen13.html
AB - The success of any machine learning system depends critically on effective representations of data. In many cases, especially those in vision, it is desirable that a representation scheme uncovers the parts-based, additive nature of the data. Of current representation learning schemes, restricted Boltzmann machines (RBMs) have proved to be highly effective in unsupervised settings. However, when it comes to parts-based discovery, RBMs do not usually produce satisfactory results. We enhance this capacity of RBMs by introducing nonnegativity into the model weights, resulting in a variant called the nonnegative restricted Boltzmann machine (NRBM). The NRBM not only produces a controllable decomposition of data into interpretable parts but also offers a way to estimate the intrinsic nonlinear dimensionality of the data. We demonstrate the capacity of our model on well-known datasets of handwritten digits, faces and documents. The decomposition quality on images is comparable with, or better than, that produced by nonnegative matrix factorisation (NMF), and the thematic features uncovered from text are qualitatively interpretable in a manner similar to that of latent Dirichlet allocation (LDA). However, the learnt features, when used for classification, are more discriminative than those discovered by NMF and LDA, and comparable with those of the standard RBM.
ER -
APA
Nguyen, T.D., Tran, T., Phung, D. & Venkatesh, S. (2013). Learning Parts-based Representations with Nonnegative Restricted Boltzmann Machine. Proceedings of the 5th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 29:133-148. Available from https://proceedings.mlr.press/v29/Nguyen13.html.