Autoencoder Trees

Ozan İrsoy, Ethem Alpaydin
Asian Conference on Machine Learning, PMLR 45:378-390, 2016.

Abstract

We discuss an autoencoder model in which the encoding and decoding functions are implemented by decision trees. We use the soft decision tree, where internal nodes realize soft multivariate splits given by a gating function and the overall output is the average of all leaves weighted by the gating values on their path. The encoder tree takes the input and generates a lower-dimensional representation in the leaves, and the decoder tree takes this representation and reconstructs the original input. Exploiting the continuity of the trees, autoencoder trees are trained with stochastic gradient descent. On handwritten digit and news data, we see that autoencoder trees yield good reconstruction error compared to traditional autoencoder perceptrons. We also see that the autoencoder tree captures hierarchical representations of the data at different granularities on its different levels, and that the leaves capture localities in the input space.
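The forward pass the abstract describes can be sketched in a few lines: each internal node computes a sigmoid gate over the full input (a soft multivariate split), each leaf's weight is the product of the gating values along its root-to-leaf path, and the tree output is the path-weighted average of the leaf responses. Pairing an encoder tree with a decoder tree gives the autoencoder. This is a hypothetical minimal sketch under those assumptions, not the authors' implementation; all names (`SoftTree`, `forward`) are illustrative.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

class SoftTree:
    """Complete binary soft decision tree of a given depth.

    Internal node i (heap-indexed) holds a soft multivariate split
    g_i(x) = sigmoid(w_i . x + b_i); each leaf stores a response vector,
    and the output is the sum of leaf responses weighted by the product
    of gating values along each root-to-leaf path. Illustrative sketch,
    not the authors' code.
    """

    def __init__(self, in_dim, out_dim, depth=2, rng=None):
        rng = rng or np.random.default_rng(0)
        self.depth = depth
        self.W = rng.normal(0.0, 0.1, (2 ** depth - 1, in_dim))
        self.b = np.zeros(2 ** depth - 1)
        self.leaf = rng.normal(0.0, 0.1, (2 ** depth, out_dim))

    def forward(self, x):
        g = sigmoid(self.W @ x + self.b)      # gate at every internal node
        path = np.empty(2 ** self.depth)
        for leaf in range(2 ** self.depth):
            node, p = 0, 1.0
            for level in range(self.depth):
                # read the leaf's root-to-leaf turn sequence bit by bit
                bit = (leaf >> (self.depth - 1 - level)) & 1
                p *= g[node] if bit == 0 else 1.0 - g[node]
                node = 2 * node + 1 + bit     # heap-indexed children
            path[leaf] = p                    # path gating probabilities sum to 1
        return path @ self.leaf, path         # soft average over leaves

# Pairing two trees gives the autoencoder: the encoder tree maps the
# input to a low-dimensional code, the decoder tree reconstructs it.
enc = SoftTree(in_dim=8, out_dim=2, depth=2)
dec = SoftTree(in_dim=2, out_dim=8, depth=2)

x = np.random.default_rng(1).normal(size=8)
z, _ = enc.forward(x)        # low-dimensional representation
x_hat, _ = dec.forward(z)    # reconstruction
err = np.sum((x - x_hat) ** 2)
```

Because every operation is differentiable (the gates are sigmoids rather than hard threshold splits), the squared reconstruction error `err` can be backpropagated through both trees, which is what makes the stochastic gradient-descent training mentioned in the abstract possible.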

Cite this Paper


BibTeX
@InProceedings{pmlr-v45-Irsoy15,
  title     = {Autoencoder Trees},
  author    = {İrsoy, Ozan and Alpaydin, Ethem},
  booktitle = {Asian Conference on Machine Learning},
  pages     = {378--390},
  year      = {2016},
  editor    = {Holmes, Geoffrey and Liu, Tie-Yan},
  volume    = {45},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hong Kong},
  month     = {20--22 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v45/Irsoy15.pdf},
  url       = {https://proceedings.mlr.press/v45/Irsoy15.html},
  abstract  = {We discuss an autoencoder model in which the encoding and decoding functions are implemented by decision trees. We use the soft decision tree where internal nodes realize soft multivariate splits given by a gating function and the overall output is the average of all leaves weighted by the gating values on their path. The encoder tree takes the input and generates a lower dimensional representation in the leaves and the decoder tree takes this and reconstructs the original input. Exploiting the continuity of the trees, autoencoder trees are trained with stochastic gradient-descent. On handwritten digit and news data, we see that the autoencoder trees yield good reconstruction error compared to traditional autoencoder perceptrons. We also see that the autoencoder tree captures hierarchical representations at different granularities of the data on its different levels and the leaves capture the localities in the input space.}
}
Endnote
%0 Conference Paper
%T Autoencoder Trees
%A Ozan İrsoy
%A Ethem Alpaydin
%B Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Geoffrey Holmes
%E Tie-Yan Liu
%F pmlr-v45-Irsoy15
%I PMLR
%P 378--390
%U https://proceedings.mlr.press/v45/Irsoy15.html
%V 45
%X We discuss an autoencoder model in which the encoding and decoding functions are implemented by decision trees. We use the soft decision tree where internal nodes realize soft multivariate splits given by a gating function and the overall output is the average of all leaves weighted by the gating values on their path. The encoder tree takes the input and generates a lower dimensional representation in the leaves and the decoder tree takes this and reconstructs the original input. Exploiting the continuity of the trees, autoencoder trees are trained with stochastic gradient-descent. On handwritten digit and news data, we see that the autoencoder trees yield good reconstruction error compared to traditional autoencoder perceptrons. We also see that the autoencoder tree captures hierarchical representations at different granularities of the data on its different levels and the leaves capture the localities in the input space.
RIS
TY  - CPAPER
TI  - Autoencoder Trees
AU  - Ozan İrsoy
AU  - Ethem Alpaydin
BT  - Asian Conference on Machine Learning
DA  - 2016/02/25
ED  - Geoffrey Holmes
ED  - Tie-Yan Liu
ID  - pmlr-v45-Irsoy15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 45
SP  - 378
EP  - 390
L1  - http://proceedings.mlr.press/v45/Irsoy15.pdf
UR  - https://proceedings.mlr.press/v45/Irsoy15.html
AB  - We discuss an autoencoder model in which the encoding and decoding functions are implemented by decision trees. We use the soft decision tree where internal nodes realize soft multivariate splits given by a gating function and the overall output is the average of all leaves weighted by the gating values on their path. The encoder tree takes the input and generates a lower dimensional representation in the leaves and the decoder tree takes this and reconstructs the original input. Exploiting the continuity of the trees, autoencoder trees are trained with stochastic gradient-descent. On handwritten digit and news data, we see that the autoencoder trees yield good reconstruction error compared to traditional autoencoder perceptrons. We also see that the autoencoder tree captures hierarchical representations at different granularities of the data on its different levels and the leaves capture the localities in the input space.
ER  -
APA
İrsoy, O. & Alpaydin, E. (2016). Autoencoder Trees. Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 45:378-390. Available from https://proceedings.mlr.press/v45/Irsoy15.html.