Manifold Preserving Hierarchical Topic Models for Quantization and Approximation

Minje Kim, Paris Smaragdis;
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):1373-1381, 2013.

Abstract

We present two complementary topic models for analyzing mixture data that lie on manifolds. First, we propose a quantization method with an additional mid-layer latent variable, which selects only the data points that best preserve the manifold structure of the input. To model the in-between parts of the manifold from this reduced representation, we introduce a second model that provides a manifold-aware interpolation method. We demonstrate the advantages of these models with experiments on handwritten digit recognition and speech source separation tasks.
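The abstract describes its two ideas only at a high level; the sketch below is a generic illustration, not the paper's topic-model formulation. It uses farthest-point sampling as a crude quantizer that spreads representatives over the data manifold, and inverse-distance weighting over nearest landmarks as a stand-in for manifold-aware interpolation. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def farthest_point_sampling(X, k, seed=0):
    """Greedily pick k landmark indices so that representatives
    spread over the data manifold (a simple quantization stand-in)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = [int(rng.integers(n))]
    # Distance from every point to the nearest chosen landmark.
    d = np.linalg.norm(X - X[idx[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))          # point worst covered so far
        idx.append(nxt)
        d = np.minimum(d, np.linalg.norm(X - X[nxt], axis=1))
    return np.array(idx)

def interpolate(x, landmarks, m=3):
    """Approximate an in-between point as a convex combination of its
    m nearest landmarks (a stand-in for manifold-aware interpolation)."""
    d = np.linalg.norm(landmarks - x, axis=1)
    nn = np.argsort(d)[:m]
    w = 1.0 / (d[nn] + 1e-12)            # inverse-distance weights
    w /= w.sum()
    return w @ landmarks[nn]
```

On data sampled from a curve (e.g. a circle), the interpolated point stays close to the curve because it only mixes nearby landmarks, which is the intuition behind reconstructing "in-between" parts of a manifold from a reduced set of points.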
