Comparison-Based Random Forests

Siavash Haghiri, Damien Garreau, Ulrike von Luxburg
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1871-1880, 2018.

Abstract

Assume we are given a set of items from a general metric space, but we neither have access to the representation of the data nor to the distances between data points. Instead, suppose that we can actively choose a triplet of items (A, B, C) and ask an oracle whether item A is closer to item B or to item C. In this paper, we propose a novel random forest algorithm for regression and classification that relies only on such triplet comparisons. In the theory part of this paper, we establish sufficient conditions for the consistency of such a forest. In a set of comprehensive experiments, we then demonstrate that the proposed random forest is efficient both for classification and regression. In particular, it is even competitive with other methods that have direct access to the metric representation of the data.
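
The abstract describes the triplet-query setting only at a high level. As a rough illustration of how a forest might be grown from such comparisons alone, below is a minimal Python sketch. It uses Euclidean vectors purely to simulate the oracle (the comparison-based setting assumes no access to such a representation), and it splits each node by routing every item to the closer of two randomly drawn pivot items. This split rule and all function names (triplet_oracle, grow_tree, grow_forest, etc.) are illustrative assumptions, not necessarily the exact procedure from the paper.

```python
import numpy as np

def triplet_oracle(X, a, b, c):
    """Answer 'is item a closer to item b or to item c?'.

    Euclidean vectors are used here only to simulate the oracle; the
    comparison-based setting assumes no access to such a representation.
    """
    return np.linalg.norm(X[a] - X[b]) <= np.linalg.norm(X[a] - X[c])

def grow_tree(X, y, idx, rng, min_leaf=5):
    """Grow one tree using only triplet comparisons (illustrative split rule).

    At each node two pivot items are drawn at random; every item is sent
    to the side of the pivot it is closer to (one triplet query per item).
    """
    idx = np.asarray(idx)
    if len(idx) <= min_leaf:
        return {"leaf": True, "value": float(np.mean(y[idx]))}
    p1, p2 = rng.choice(idx, size=2, replace=False)
    go_left = np.array([triplet_oracle(X, i, p1, p2) for i in idx])
    if go_left.all() or (~go_left).all():  # degenerate split -> make a leaf
        return {"leaf": True, "value": float(np.mean(y[idx]))}
    return {"leaf": False, "pivots": (int(p1), int(p2)),
            "left": grow_tree(X, y, idx[go_left], rng, min_leaf),
            "right": grow_tree(X, y, idx[~go_left], rng, min_leaf)}

def tree_predict(tree, X, i):
    """Route query item i down the tree with one triplet query per level."""
    while not tree["leaf"]:
        p1, p2 = tree["pivots"]
        tree = tree["left"] if triplet_oracle(X, i, p1, p2) else tree["right"]
    return tree["value"]

def grow_forest(X, y, train_idx, n_trees=50, subsample=0.8, seed=0):
    """Grow a forest of comparison trees, each on a random subsample of items."""
    rng = np.random.default_rng(seed)
    train_idx = np.asarray(train_idx)
    size = int(subsample * len(train_idx))
    return [grow_tree(X, y, rng.choice(train_idx, size=size, replace=False), rng)
            for _ in range(n_trees)]

def forest_predict(forest, X, i):
    """Average the per-tree predictions for query item i (regression)."""
    return float(np.mean([tree_predict(t, X, i) for t in forest]))

# Toy regression demo: y = ||x||; the forest never sees coordinates directly,
# only answers to triplet queries.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = np.linalg.norm(X, axis=1)
forest = grow_forest(X, y, train_idx=np.arange(250), n_trees=25)
print(forest_predict(forest, X, 260))  # prediction for held-out item 260
```

For classification, the leaf value would be a majority vote instead of a mean; the paper also studies consistency conditions for such forests, which this sketch does not address.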

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-haghiri18a,
  title     = {Comparison-Based Random Forests},
  author    = {Haghiri, Siavash and Garreau, Damien and von Luxburg, Ulrike},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {1871--1880},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/haghiri18a/haghiri18a.pdf},
  url       = {https://proceedings.mlr.press/v80/haghiri18a.html},
  abstract  = {Assume we are given a set of items from a general metric space, but we neither have access to the representation of the data nor to the distances between data points. Instead, suppose that we can actively choose a triplet of items (A, B, C) and ask an oracle whether item A is closer to item B or to item C. In this paper, we propose a novel random forest algorithm for regression and classification that relies only on such triplet comparisons. In the theory part of this paper, we establish sufficient conditions for the consistency of such a forest. In a set of comprehensive experiments, we then demonstrate that the proposed random forest is efficient both for classification and regression. In particular, it is even competitive with other methods that have direct access to the metric representation of the data.}
}
Endnote
%0 Conference Paper
%T Comparison-Based Random Forests
%A Siavash Haghiri
%A Damien Garreau
%A Ulrike von Luxburg
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-haghiri18a
%I PMLR
%P 1871--1880
%U https://proceedings.mlr.press/v80/haghiri18a.html
%V 80
%X Assume we are given a set of items from a general metric space, but we neither have access to the representation of the data nor to the distances between data points. Instead, suppose that we can actively choose a triplet of items (A, B, C) and ask an oracle whether item A is closer to item B or to item C. In this paper, we propose a novel random forest algorithm for regression and classification that relies only on such triplet comparisons. In the theory part of this paper, we establish sufficient conditions for the consistency of such a forest. In a set of comprehensive experiments, we then demonstrate that the proposed random forest is efficient both for classification and regression. In particular, it is even competitive with other methods that have direct access to the metric representation of the data.
APA
Haghiri, S., Garreau, D. & von Luxburg, U. (2018). Comparison-Based Random Forests. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:1871-1880. Available from https://proceedings.mlr.press/v80/haghiri18a.html.
