Active Nearest Neighbor Regression Through Delaunay Refinement

Alexander Kravberg, Giovanni Luca Marchetti, Vladislav Polianskii, Anastasiia Varava, Florian T. Pokorny, Danica Kragic
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:11650-11664, 2022.

Abstract

We introduce an algorithm for active function approximation based on nearest neighbor regression. Our Active Nearest Neighbor Regressor (ANNR) relies on the Voronoi-Delaunay framework from computational geometry to subdivide the space into cells with constant estimated function value and select novel query points in a way that takes the geometry of the function graph into account. We consider the recent state-of-the-art active function approximator called DEFER, which is based on incremental rectangular partitioning of the space, as the main baseline. The ANNR addresses a number of limitations that arise from the space subdivision strategy used in DEFER. We provide a computationally efficient implementation of our method, as well as theoretical halting guarantees. Empirical results show that ANNR outperforms the baseline for both closed-form functions and real-world examples, such as gravitational wave parameter inference and exploration of the latent space of a generative model.
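The abstract describes two ingredients: a nearest-neighbor regressor whose estimate is constant on Voronoi cells, and an active query rule that accounts for the geometry of the function graph. As a rough illustration of these ideas (not the paper's actual ANNR algorithm, which uses Delaunay refinement in arbitrary dimension), here is a minimal 1-D sketch: the estimate assigns each query the value of its nearest sample, and the next query point is chosen in the interval whose segment on the function graph is longest.

```python
import numpy as np

def nn_regress(queries, points, values):
    """Piecewise-constant estimate: each query inherits the value of its
    nearest sampled point, so the estimate is constant on Voronoi cells."""
    d2 = ((queries[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return values[np.argmin(d2, axis=1)]

def next_query_1d(xs, fs):
    """Toy geometry-aware refinement in 1-D: among the intervals between
    consecutive samples, pick the one whose chord on the function graph
    is longest, and query its midpoint."""
    order = np.argsort(xs)
    x, f = xs[order], fs[order]
    graph_len = np.hypot(np.diff(x), np.diff(f))  # chord length on the graph
    i = int(np.argmax(graph_len))
    return 0.5 * (x[i] + x[i + 1])

# Actively sample f(x) = x^2 on [0, 3], starting from the endpoints.
# Queries concentrate where the graph is longest, i.e. where f varies most.
f = lambda x: x * x
xs, fs = np.array([0.0, 3.0]), np.array([0.0, 9.0])
for _ in range(5):
    x_new = next_query_1d(xs, fs)
    xs, fs = np.append(xs, x_new), np.append(fs, f(x_new))

est = nn_regress(np.array([[1.4]]), xs[:, None], fs)
```

This captures only the flavor of the method: the paper's ANNR works in higher dimensions via incremental Delaunay triangulation and comes with halting guarantees, neither of which this sketch attempts.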

Cite this Paper
BibTeX
@InProceedings{pmlr-v162-kravberg22a,
  title     = {Active Nearest Neighbor Regression Through Delaunay Refinement},
  author    = {Kravberg, Alexander and Marchetti, Giovanni Luca and Polianskii, Vladislav and Varava, Anastasiia and Pokorny, Florian T. and Kragic, Danica},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {11650--11664},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/kravberg22a/kravberg22a.pdf},
  url       = {https://proceedings.mlr.press/v162/kravberg22a.html},
  abstract  = {We introduce an algorithm for active function approximation based on nearest neighbor regression. Our Active Nearest Neighbor Regressor (ANNR) relies on the Voronoi-Delaunay framework from computational geometry to subdivide the space into cells with constant estimated function value and select novel query points in a way that takes the geometry of the function graph into account. We consider the recent state-of-the-art active function approximator called DEFER, which is based on incremental rectangular partitioning of the space, as the main baseline. The ANNR addresses a number of limitations that arise from the space subdivision strategy used in DEFER. We provide a computationally efficient implementation of our method, as well as theoretical halting guarantees. Empirical results show that ANNR outperforms the baseline for both closed-form functions and real-world examples, such as gravitational wave parameter inference and exploration of the latent space of a generative model.}
}
Endnote
%0 Conference Paper
%T Active Nearest Neighbor Regression Through Delaunay Refinement
%A Alexander Kravberg
%A Giovanni Luca Marchetti
%A Vladislav Polianskii
%A Anastasiia Varava
%A Florian T. Pokorny
%A Danica Kragic
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-kravberg22a
%I PMLR
%P 11650--11664
%U https://proceedings.mlr.press/v162/kravberg22a.html
%V 162
%X We introduce an algorithm for active function approximation based on nearest neighbor regression. Our Active Nearest Neighbor Regressor (ANNR) relies on the Voronoi-Delaunay framework from computational geometry to subdivide the space into cells with constant estimated function value and select novel query points in a way that takes the geometry of the function graph into account. We consider the recent state-of-the-art active function approximator called DEFER, which is based on incremental rectangular partitioning of the space, as the main baseline. The ANNR addresses a number of limitations that arise from the space subdivision strategy used in DEFER. We provide a computationally efficient implementation of our method, as well as theoretical halting guarantees. Empirical results show that ANNR outperforms the baseline for both closed-form functions and real-world examples, such as gravitational wave parameter inference and exploration of the latent space of a generative model.
APA
Kravberg, A., Marchetti, G.L., Polianskii, V., Varava, A., Pokorny, F.T. & Kragic, D. (2022). Active Nearest Neighbor Regression Through Delaunay Refinement. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:11650-11664. Available from https://proceedings.mlr.press/v162/kravberg22a.html.