Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure

Ruslan Salakhutdinov, Geoff Hinton
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:412-419, 2007.

Abstract

We show how to pretrain and fine-tune a multilayer neural network to learn a nonlinear transformation from the input space to a low-dimensional feature space in which K-nearest neighbour classification performs well. We also show how the non-linear transformation can be improved using unlabeled data. Our method achieves a much lower error rate than Support Vector Machines or standard backpropagation on a widely used version of the MNIST handwritten digit recognition task. If some of the dimensions of the low-dimensional feature space are not used for nearest neighbor classification, our method uses these dimensions to explicitly represent transformations of the digits that do not affect their identity.
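The objective behind the abstract can be illustrated with a small sketch. The paper fine-tunes the encoder so that, in the low-dimensional code space, each point stochastically picks same-class neighbours with high probability (an NCA-style criterion). The snippet below is a minimal, hedged illustration of evaluating that objective: the toy data, the random single-layer `encode` stand-in for the pretrained multilayer network, and the function names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated Gaussian classes in 10-D
# (stand-ins for MNIST digit images).
X = np.vstack([rng.normal(0, 1, (20, 10)), rng.normal(3, 1, (20, 10))])
y = np.array([0] * 20 + [1] * 20)

# Stand-in for the pretrained multilayer encoder: a single random
# tanh layer mapping 10-D inputs to a 2-D code (not actually learned here).
W = rng.normal(0, 0.1, (10, 2))

def encode(X):
    return np.tanh(X @ W)

def nca_objective(Z, y):
    """Expected number of correctly classified points when each point
    stochastically picks a neighbour with p_ij ∝ exp(-||z_i - z_j||^2)."""
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # a point never picks itself
    p = np.exp(-d2)
    p /= p.sum(axis=1, keepdims=True)
    same_class = y[:, None] == y[None, :]
    return (p * same_class).sum()

obj = nca_objective(encode(X), y)
print(obj)  # between 0 and N = 40; fine-tuning would maximize this
```

Fine-tuning in the paper adjusts the encoder weights by backpropagating through this kind of objective, so that K-nearest-neighbour classification in the code space improves.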

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-salakhutdinov07a,
  title     = {Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure},
  author    = {Ruslan Salakhutdinov and Geoff Hinton},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages     = {412--419},
  year      = {2007},
  editor    = {Marina Meila and Xiaotong Shen},
  volume    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Juan, Puerto Rico},
  month     = {21--24 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v2/salakhutdinov07a/salakhutdinov07a.pdf},
  url       = {http://proceedings.mlr.press/v2/salakhutdinov07a.html},
  abstract  = {We show how to pretrain and fine-tune a multilayer neural network to learn a nonlinear transformation from the input space to a low-dimensional feature space in which K-nearest neighbour classification performs well. We also show how the non-linear transformation can be improved using unlabeled data. Our method achieves a much lower error rate than Support Vector Machines or standard backpropagation on a widely used version of the MNIST handwritten digit recognition task. If some of the dimensions of the low-dimensional feature space are not used for nearest neighbor classification, our method uses these dimensions to explicitly represent transformations of the digits that do not affect their identity.}
}
Endnote
%0 Conference Paper
%T Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure
%A Ruslan Salakhutdinov
%A Geoff Hinton
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-salakhutdinov07a
%I PMLR
%J Proceedings of Machine Learning Research
%P 412--419
%U http://proceedings.mlr.press
%V 2
%W PMLR
%X We show how to pretrain and fine-tune a multilayer neural network to learn a nonlinear transformation from the input space to a low-dimensional feature space in which K-nearest neighbour classification performs well. We also show how the non-linear transformation can be improved using unlabeled data. Our method achieves a much lower error rate than Support Vector Machines or standard backpropagation on a widely used version of the MNIST handwritten digit recognition task. If some of the dimensions of the low-dimensional feature space are not used for nearest neighbor classification, our method uses these dimensions to explicitly represent transformations of the digits that do not affect their identity.
RIS
TY - CPAPER
TI - Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure
AU - Ruslan Salakhutdinov
AU - Geoff Hinton
BT - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
PY - 2007/03/11
DA - 2007/03/11
ED - Marina Meila
ED - Xiaotong Shen
ID - pmlr-v2-salakhutdinov07a
PB - PMLR
SP - 412
DP - PMLR
EP - 419
L1 - http://proceedings.mlr.press/v2/salakhutdinov07a/salakhutdinov07a.pdf
UR - http://proceedings.mlr.press/v2/salakhutdinov07a.html
AB - We show how to pretrain and fine-tune a multilayer neural network to learn a nonlinear transformation from the input space to a low-dimensional feature space in which K-nearest neighbour classification performs well. We also show how the non-linear transformation can be improved using unlabeled data. Our method achieves a much lower error rate than Support Vector Machines or standard backpropagation on a widely used version of the MNIST handwritten digit recognition task. If some of the dimensions of the low-dimensional feature space are not used for nearest neighbor classification, our method uses these dimensions to explicitly represent transformations of the digits that do not affect their identity.
ER -
APA
Salakhutdinov, R. & Hinton, G. (2007). Learning a Nonlinear Embedding by Preserving Class Neighbourhood Structure. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in PMLR 2:412-419.
