Scaling Up Ordinal Embedding: A Landmark Approach

Jesse Anderton, Javed Aslam
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:282-290, 2019.

Abstract

Ordinal Embedding is the problem of placing n objects into R^d to satisfy constraints like "object a is closer to b than to c." It can accommodate data that embeddings from features or distances cannot, but is a more difficult problem. We propose a novel landmark-based method as a partial solution. At small to medium scales, we present a novel combination of existing methods with some new theoretical justification. For very large values of n, optimizing over an entire embedding breaks down, so we propose a novel method which first embeds a subset of m << n objects and then embeds the remaining objects independently and in parallel. We prove a distance error bound for our method in terms of m and that it has O(dn log m) time complexity, and show empirically that it is able to produce high quality embeddings in a fraction of the time needed for any published method.
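The second stage of the two-stage method described above (after m landmarks have been embedded, place each of the remaining n - m objects independently) can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the function `embed_point`, its hinge loss, and all step-size parameters are assumptions. Given fixed landmark coordinates, it places a single object from ordinal comparisons of the form "x is closer to landmark i than to landmark j"; because each object depends only on the landmarks, all non-landmark objects can be embedded independently and in parallel.

```python
import numpy as np


def embed_point(landmarks, triplets, lr=0.05, iters=300, margin=0.05):
    """Place one object x in R^d given landmark coordinates (m x d) and
    ordinal constraints (i, j) meaning 'x is closer to landmark i than j'.

    Hypothetical sketch: minimizes a hinge loss on the difference of
    squared distances (linear in x, hence convex) by gradient descent.
    """
    rng = np.random.default_rng(0)
    # Start near the landmark centroid with a small random perturbation.
    x = landmarks.mean(axis=0) + 0.01 * rng.standard_normal(landmarks.shape[1])
    for _ in range(iters):
        grad = np.zeros_like(x)
        for i, j in triplets:
            di = x - landmarks[i]
            dj = x - landmarks[j]
            # Violated (within margin) if ||x - L_i||^2 + margin > ||x - L_j||^2.
            if di @ di + margin > dj @ dj:
                grad += 2.0 * (di - dj)
        x -= lr * grad / max(len(triplets), 1)
    return x
```

In this sketch, the squared-distance difference is linear in x, so the hinge objective is convex and a fixed small step size suffices; each call is independent, which is what makes the parallel, per-object embedding of the remaining n - m objects possible.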

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-anderton19a,
  title     = {Scaling Up Ordinal Embedding: A Landmark Approach},
  author    = {Anderton, Jesse and Aslam, Javed},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {282--290},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/anderton19a/anderton19a.pdf},
  url       = {https://proceedings.mlr.press/v97/anderton19a.html},
  abstract  = {Ordinal Embedding is the problem of placing n objects into R^d to satisfy constraints like "object a is closer to b than to c." It can accommodate data that embeddings from features or distances cannot, but is a more difficult problem. We propose a novel landmark-based method as a partial solution. At small to medium scales, we present a novel combination of existing methods with some new theoretical justification. For very large values of n optimizing over an entire embedding breaks down, so we propose a novel method which first embeds a subset of m << n objects and then embeds the remaining objects independently and in parallel. We prove a distance error bound for our method in terms of m and that it has O(dn log m) time complexity, and show empirically that it is able to produce high quality embeddings in a fraction of the time needed for any published method.}
}
Endnote
%0 Conference Paper
%T Scaling Up Ordinal Embedding: A Landmark Approach
%A Jesse Anderton
%A Javed Aslam
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-anderton19a
%I PMLR
%P 282--290
%U https://proceedings.mlr.press/v97/anderton19a.html
%V 97
%X Ordinal Embedding is the problem of placing n objects into R^d to satisfy constraints like "object a is closer to b than to c." It can accommodate data that embeddings from features or distances cannot, but is a more difficult problem. We propose a novel landmark-based method as a partial solution. At small to medium scales, we present a novel combination of existing methods with some new theoretical justification. For very large values of n optimizing over an entire embedding breaks down, so we propose a novel method which first embeds a subset of m << n objects and then embeds the remaining objects independently and in parallel. We prove a distance error bound for our method in terms of m and that it has O(dn log m) time complexity, and show empirically that it is able to produce high quality embeddings in a fraction of the time needed for any published method.
APA
Anderton, J. & Aslam, J. (2019). Scaling Up Ordinal Embedding: A Landmark Approach. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:282-290. Available from https://proceedings.mlr.press/v97/anderton19a.html.