Non-Neighbors Also Matter to Kriging: A New Contrastive-Prototypical Learning

Zhishuai Li, Yunhao Nie, Ziyue Li, Lei Bai, Yisheng Lv, Rui Zhao
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:46-54, 2024.

Abstract

Kriging aims to estimate the attributes of unseen geo-locations from observations in the spatial vicinity or with physical connections. Existing works assume that neighbors’ information offers the basis for estimating the unobserved target, and they ignore non-neighbors. However, neighbors could also be quite different or even misleading, and non-neighbors could still offer constructive information. To this end, we propose "Contrastive-Prototypical" self-supervised learning for Kriging (KCP): (1) The neighboring contrastive module coarsely pushes neighbors together and non-neighbors apart. (2) In parallel, the prototypical module identifies similar representations via exchanged prediction, so that it refines the misleading neighbors and recycles the useful non-neighbors from the neighboring contrast component. As a result, not all the neighbors, and some of the non-neighbors, will be used to infer the target. (3) To learn general and robust representations, we design an adaptive augmentation module that encourages data diversity. A theoretical bound is derived for the proposed augmentation. Extensive experiments on real-world datasets demonstrate the superior performance of KCP compared to its peers, with 6% improvements as well as exceptional transferability and robustness.
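
To make the two self-supervised components more concrete, the Python (NumPy) sketch below illustrates how such objectives are commonly formulated: a neighbor-aware contrastive (InfoNCE-like) loss and a swapped-prediction prototypical loss. This is an interpretation of the abstract, not the authors' released implementation; the function names, the temperature tau, the cosine-similarity choice, and the simplified soft prototype assignment are all assumptions.

# Minimal sketch (assumed formulation, not the authors' code) of:
#   (1) a neighboring contrastive term that pulls a target node's spatial
#       neighbors together and pushes non-neighbors apart, and
#   (2) an exchanged-prediction ("swapped") prototypical term in which two
#       augmented views predict each other's soft prototype assignments,
#       grouping nodes by representation similarity rather than adjacency.
import numpy as np

def _normalize(z, eps=1e-8):
    # Row-wise L2 normalization so dot products act as cosine similarities.
    return z / (np.linalg.norm(z, axis=-1, keepdims=True) + eps)

def neighboring_contrastive_loss(z, neighbor_mask, tau=0.1):
    """InfoNCE-like loss: for each node i, its spatial neighbors are the
    positives and all other nodes are the negatives.
    z: (N, d) node representations; neighbor_mask: (N, N) boolean matrix."""
    z = _normalize(z)
    sim = np.exp(z @ z.T / tau)                # exponentiated pairwise similarity
    np.fill_diagonal(sim, 0.0)                 # exclude self-pairs
    pos = (sim * neighbor_mask).sum(axis=1)    # neighbors pulled together
    denom = sim.sum(axis=1)                    # neighbors + non-neighbors
    return -np.mean(np.log(pos / denom + 1e-12))

def prototypical_swap_loss(z1, z2, prototypes, tau=0.1):
    """Exchanged prediction: view 1's soft prototype assignment supervises
    view 2's prediction, and vice versa. A full SwAV-style method would also
    balance assignments (e.g. via Sinkhorn-Knopp); omitted here for brevity.
    z1, z2: (N, d) two augmented views; prototypes: (K, d)."""
    z1, z2, c = _normalize(z1), _normalize(z2), _normalize(prototypes)

    def soft_assign(z):
        logits = z @ c.T / tau
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        p = np.exp(logits)
        return p / p.sum(axis=1, keepdims=True)

    p1, p2 = soft_assign(z1), soft_assign(z2)
    # Symmetric cross-entropy between the two views' assignments.
    swap = -np.mean(np.sum(p1 * np.log(p2 + 1e-12), axis=1)) \
           - np.mean(np.sum(p2 * np.log(p1 + 1e-12), axis=1))
    return 0.5 * swap

In KCP's terms, the first loss supplies the coarse neighbor/non-neighbor separation, while the swapped-prediction term clusters nodes by representation similarity, which is what could allow useful non-neighbors to be recycled and misleading neighbors to be down-weighted.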

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-li24b,
  title     = {Non-Neighbors Also Matter to {K}riging: A New Contrastive-Prototypical Learning},
  author    = {Li, Zhishuai and Nie, Yunhao and Li, Ziyue and Bai, Lei and Lv, Yisheng and Zhao, Rui},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {46--54},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/li24b/li24b.pdf},
  url       = {https://proceedings.mlr.press/v238/li24b.html},
  abstract  = {Kriging aims to estimate the attributes of unseen geo-locations from observations in the spatial vicinity or physical connections. Existing works assume that neighbors’ information offers the basis for estimating the unobserved target while ignoring non-neighbors. However, neighbors could also be quite different or even misleading, and the non-neighbors could still offer constructive information. To this end, we propose "Contrastive-Prototypical" self-supervised learning for Kriging (KCP): (1) The neighboring contrastive module coarsely pushes neighbors together and non-neighbors apart. (2) In parallel, the prototypical module identifies similar representations via exchanged prediction, such that it refines the misleading neighbors and recycles the useful non-neighbors from the neighboring contrast component. As a result, not all the neighbors and some of the non-neighbors will be used to infer the target. (3) To learn general and robust representations, we design an adaptive augmentation module that encourages data diversity. Theoretical bound is derived for the proposed augmentation. Extensive experiments on real-world datasets demonstrate the superior performance of KCP compared to its peers with 6% improvements and exceptional transferability and robustness.}
}
Endnote
%0 Conference Paper
%T Non-Neighbors Also Matter to Kriging: A New Contrastive-Prototypical Learning
%A Zhishuai Li
%A Yunhao Nie
%A Ziyue Li
%A Lei Bai
%A Yisheng Lv
%A Rui Zhao
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-li24b
%I PMLR
%P 46--54
%U https://proceedings.mlr.press/v238/li24b.html
%V 238
%X Kriging aims to estimate the attributes of unseen geo-locations from observations in the spatial vicinity or physical connections. Existing works assume that neighbors’ information offers the basis for estimating the unobserved target while ignoring non-neighbors. However, neighbors could also be quite different or even misleading, and the non-neighbors could still offer constructive information. To this end, we propose "Contrastive-Prototypical" self-supervised learning for Kriging (KCP): (1) The neighboring contrastive module coarsely pushes neighbors together and non-neighbors apart. (2) In parallel, the prototypical module identifies similar representations via exchanged prediction, such that it refines the misleading neighbors and recycles the useful non-neighbors from the neighboring contrast component. As a result, not all the neighbors and some of the non-neighbors will be used to infer the target. (3) To learn general and robust representations, we design an adaptive augmentation module that encourages data diversity. Theoretical bound is derived for the proposed augmentation. Extensive experiments on real-world datasets demonstrate the superior performance of KCP compared to its peers with 6% improvements and exceptional transferability and robustness.
APA
Li, Z., Nie, Y., Li, Z., Bai, L., Lv, Y. & Zhao, R. (2024). Non-Neighbors Also Matter to Kriging: A New Contrastive-Prototypical Learning. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:46-54. Available from https://proceedings.mlr.press/v238/li24b.html.
