Hyperboloid GPLVM for Discovering Continuous Hierarchies via Nonparametric Estimation

Koshi Watanabe, Keisuke Maeda, Takahiro Ogawa, Miki Haseyama
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:1783-1791, 2025.

Abstract

Dimensionality reduction (DR) offers interpretable representations of complex high-dimensional data, and recent DR methods have leveraged hyperbolic geometry to obtain faithful low-dimensional embeddings of high-dimensional hierarchical relationships. However, existing methods are dependent on neighbor embedding, which frequently ruins the continuous nature of the hierarchical structures. This paper proposes hyperboloid Gaussian process latent variable models (hGP-LVMs) to embed high-dimensional hierarchical data while preserving the implicit continuity via nonparametric estimation. We adopt generative modeling using the GP, which provides effective hierarchical embedding and executes ill-posed hyperparameter tuning. This paper presents three variants of the proposed models that employ original point, sparse point, and Bayesian estimations, and we establish their learning algorithms by incorporating the Riemannian optimization and active approximation scheme of the GP-LVM. In addition, we employ the reparameterization trick for scalable learning of the latent variables in the Bayesian estimation method. The proposed hGP-LVMs were applied to several datasets, and the results demonstrate their ability to represent high-dimensional hierarchies in low-dimensional spaces.
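The abstract refers to several geometric ingredients of the hGP-LVM: latent variables living on the hyperboloid (Lorentz) model, a GP kernel driven by hyperbolic rather than Euclidean distance, and Riemannian optimization of the latent points. As a rough orientation only, the following is a minimal sketch, not the authors' implementation, of the standard Lorentz-model operations such a model builds on; the function names and the particular kernel form are illustrative assumptions, not taken from the paper.

# Minimal sketch of Lorentz (hyperboloid) model operations that a
# hyperboloid GP-LVM builds on: Lorentzian inner product, geodesic
# distance, a distance-based GP kernel, tangent projection, and the
# exponential map used by Riemannian updates of latent points.
# Illustrative only; not the authors' code.
import numpy as np

def lorentz_inner(x, y):
    """Lorentzian inner product <x, y>_L = -x0*y0 + sum_{i>=1} xi*yi."""
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def hyperbolic_distance(x, y, eps=1e-9):
    """Geodesic distance on the hyperboloid: arccosh(-<x, y>_L)."""
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0 + eps, None))

def hyperbolic_rbf_kernel(X, variance=1.0, lengthscale=1.0):
    """RBF-style kernel driven by hyperbolic distance between rows of X.
    Positive definiteness of such kernels needs care in general; this
    only illustrates the construction, not the paper's exact kernel."""
    D = hyperbolic_distance(X[:, None, :], X[None, :, :])
    return variance * np.exp(-0.5 * (D / lengthscale) ** 2)

def project_to_tangent(x, u):
    """Project an ambient vector u onto the tangent space at x."""
    ip = np.asarray(lorentz_inner(x, u))
    return u + ip[..., None] * x

def exp_map(x, v, eps=1e-9):
    """Exponential map: move from x along tangent vector v while staying
    on the hyperboloid (used in place of a Euclidean gradient step)."""
    vnorm = np.sqrt(np.clip(np.asarray(lorentz_inner(v, v)), eps, None))[..., None]
    return np.cosh(vnorm) * x + np.sinh(vnorm) * v / vnorm

# Example: two latent points on the 2D hyperboloid embedded in R^3.
origin = np.array([1.0, 0.0, 0.0])                 # satisfies <x, x>_L = -1
z = exp_map(origin, np.array([0.0, 0.5, 0.2]))     # tangent step from the origin
print(hyperbolic_distance(origin, z))              # geodesic length of that step

In a point-estimate GP-LVM of this kind, the latent points would be updated by projecting the Euclidean gradient of the GP marginal likelihood to the tangent space (project_to_tangent) and retracting with exp_map; the Bayesian variant the abstract mentions would additionally reparameterize samples of the latent points for stochastic optimization.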

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-watanabe25a,
  title     = {Hyperboloid GPLVM for Discovering Continuous Hierarchies via Nonparametric Estimation},
  author    = {Watanabe, Koshi and Maeda, Keisuke and Ogawa, Takahiro and Haseyama, Miki},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {1783--1791},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/watanabe25a/watanabe25a.pdf},
  url       = {https://proceedings.mlr.press/v258/watanabe25a.html},
  abstract  = {Dimensionality reduction (DR) offers interpretable representations of complex high-dimensional data, and recent DR methods have leveraged hyperbolic geometry to obtain faithful low-dimensional embeddings of high-dimensional hierarchical relationships. However, existing methods are dependent on neighbor embedding, which frequently ruins the continuous nature of the hierarchical structures. This paper proposes hyperboloid Gaussian process latent variable models (hGP-LVMs) to embed high-dimensional hierarchical data while preserving the implicit continuity via nonparametric estimation. We adopt generative modeling using the GP, which provides effective hierarchical embedding and executes ill-posed hyperparameter tuning. This paper presents three variants of the proposed models that employ original point, sparse point, and Bayesian estimations, and we establish their learning algorithms by incorporating the Riemannian optimization and active approximation scheme of the GP-LVM. In addition, we employ the reparameterization trick for scalable learning of the latent variables in the Bayesian estimation method. The proposed hGP-LVMs were applied to several datasets, and the results demonstrate their ability to represent high-dimensional hierarchies in low-dimensional spaces.}
}
Endnote
%0 Conference Paper
%T Hyperboloid GPLVM for Discovering Continuous Hierarchies via Nonparametric Estimation
%A Koshi Watanabe
%A Keisuke Maeda
%A Takahiro Ogawa
%A Miki Haseyama
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-watanabe25a
%I PMLR
%P 1783--1791
%U https://proceedings.mlr.press/v258/watanabe25a.html
%V 258
%X Dimensionality reduction (DR) offers interpretable representations of complex high-dimensional data, and recent DR methods have leveraged hyperbolic geometry to obtain faithful low-dimensional embeddings of high-dimensional hierarchical relationships. However, existing methods are dependent on neighbor embedding, which frequently ruins the continuous nature of the hierarchical structures. This paper proposes hyperboloid Gaussian process latent variable models (hGP-LVMs) to embed high-dimensional hierarchical data while preserving the implicit continuity via nonparametric estimation. We adopt generative modeling using the GP, which provides effective hierarchical embedding and executes ill-posed hyperparameter tuning. This paper presents three variants of the proposed models that employ original point, sparse point, and Bayesian estimations, and we establish their learning algorithms by incorporating the Riemannian optimization and active approximation scheme of the GP-LVM. In addition, we employ the reparameterization trick for scalable learning of the latent variables in the Bayesian estimation method. The proposed hGP-LVMs were applied to several datasets, and the results demonstrate their ability to represent high-dimensional hierarchies in low-dimensional spaces.
APA
Watanabe, K., Maeda, K., Ogawa, T. & Haseyama, M. (2025). Hyperboloid GPLVM for Discovering Continuous Hierarchies via Nonparametric Estimation. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:1783-1791. Available from https://proceedings.mlr.press/v258/watanabe25a.html.