Generalization Error Bound for Hyperbolic Ordinal Embedding

Atsushi Suzuki, Atsushi Nitanda, Jing Wang, Linchuan Xu, Kenji Yamanishi, Marc Cavazza
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:10011-10021, 2021.

Abstract

Hyperbolic ordinal embedding (HOE) represents entities as points in hyperbolic space so that they agree as well as possible with given constraints of the form "entity $i$ is more similar to entity $j$ than to entity $k$." It has been experimentally shown that HOE can effectively obtain representations of hierarchical data such as knowledge bases and citation networks, owing to hyperbolic space's exponential growth property. However, its theoretical analysis has been limited to ideal, noiseless settings, and the generalization error it incurs in exchange for hyperbolic space's exponential representation ability has not been guaranteed. The difficulty is that existing generalization error bound derivations for ordinal embedding, which are based on the Gramian matrix, are not applicable to HOE, since hyperbolic space is not an inner-product space. In this paper, through a novel characterization of HOE with decomposed Lorentz Gramian matrices, we provide a generalization error bound for HOE for the first time, which is at most exponential with respect to the embedding space's radius. Our comparison between the bounds for HOE and Euclidean ordinal embedding shows that HOE's generalization error comes at a reasonable cost considering its exponential representation ability.
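To make the setting concrete, the following sketch computes geodesic distances in the Lorentz (hyperboloid) model of hyperbolic space, which underlies the Lorentz Gramian characterization mentioned above, and checks an ordinal constraint "entity $i$ is more similar to entity $j$ than to entity $k$" via $d(x_i, x_j) < d(x_i, x_k)$. This is an illustrative assumption of the setup, not code from the paper; the `lift` helper and the sample points are hypothetical.

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product: -x0*y0 + sum_i xi*yi
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift(v):
    # Lift v in R^d onto the hyperboloid {x : <x,x>_L = -1, x0 > 0}
    return np.concatenate(([np.sqrt(1.0 + np.dot(v, v))], v))

def hyperbolic_dist(x, y):
    # Geodesic distance in the Lorentz model: arccosh(-<x,y>_L).
    # Clip guards against -<x,y>_L dipping below 1 from rounding error.
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))

# Hypothetical embedded entities: xi is near xj, far from xk.
xi = lift(np.array([0.1, 0.0]))
xj = lift(np.array([0.2, 0.1]))
xk = lift(np.array([2.0, -1.0]))

# The ordinal constraint "i is more similar to j than to k" holds when
# d(x_i, x_j) < d(x_i, x_k).
print(hyperbolic_dist(xi, xj) < hyperbolic_dist(xi, xk))  # → True
```

An ordinal embedding method would search for points on the hyperboloid satisfying as many such triple constraints as possible; the paper's bound controls how the empirical fraction of satisfied constraints generalizes.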

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-suzuki21a,
  title     = {Generalization Error Bound for Hyperbolic Ordinal Embedding},
  author    = {Suzuki, Atsushi and Nitanda, Atsushi and Wang, Jing and Xu, Linchuan and Yamanishi, Kenji and Cavazza, Marc},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {10011--10021},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/suzuki21a/suzuki21a.pdf},
  url       = {https://proceedings.mlr.press/v139/suzuki21a.html},
  abstract  = {Hyperbolic ordinal embedding (HOE) represents entities as points in hyperbolic space so that they agree as well as possible with given constraints in the form of entity $i$ is more similar to entity $j$ than to entity $k$. It has been experimentally shown that HOE can obtain representations of hierarchical data such as a knowledge base and a citation network effectively, owing to hyperbolic space’s exponential growth property. However, its theoretical analysis has been limited to ideal noiseless settings, and its generalization error in compensation for hyperbolic space’s exponential representation ability has not been guaranteed. The difficulty is that existing generalization error bound derivations for ordinal embedding based on the Gramian matrix are not applicable in HOE, since hyperbolic space is not inner-product space. In this paper, through our novel characterization of HOE with decomposed Lorentz Gramian matrices, we provide a generalization error bound of HOE for the first time, which is at most exponential with respect to the embedding space’s radius. Our comparison between the bounds of HOE and Euclidean ordinal embedding shows that HOE’s generalization error comes at a reasonable cost considering its exponential representation ability.}
}
Endnote
%0 Conference Paper
%T Generalization Error Bound for Hyperbolic Ordinal Embedding
%A Atsushi Suzuki
%A Atsushi Nitanda
%A Jing Wang
%A Linchuan Xu
%A Kenji Yamanishi
%A Marc Cavazza
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-suzuki21a
%I PMLR
%P 10011--10021
%U https://proceedings.mlr.press/v139/suzuki21a.html
%V 139
%X Hyperbolic ordinal embedding (HOE) represents entities as points in hyperbolic space so that they agree as well as possible with given constraints in the form of entity $i$ is more similar to entity $j$ than to entity $k$. It has been experimentally shown that HOE can obtain representations of hierarchical data such as a knowledge base and a citation network effectively, owing to hyperbolic space’s exponential growth property. However, its theoretical analysis has been limited to ideal noiseless settings, and its generalization error in compensation for hyperbolic space’s exponential representation ability has not been guaranteed. The difficulty is that existing generalization error bound derivations for ordinal embedding based on the Gramian matrix are not applicable in HOE, since hyperbolic space is not inner-product space. In this paper, through our novel characterization of HOE with decomposed Lorentz Gramian matrices, we provide a generalization error bound of HOE for the first time, which is at most exponential with respect to the embedding space’s radius. Our comparison between the bounds of HOE and Euclidean ordinal embedding shows that HOE’s generalization error comes at a reasonable cost considering its exponential representation ability.
APA
Suzuki, A., Nitanda, A., Wang, J., Xu, L., Yamanishi, K., & Cavazza, M. (2021). Generalization Error Bound for Hyperbolic Ordinal Embedding. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:10011-10021. Available from https://proceedings.mlr.press/v139/suzuki21a.html.