Hyperbolic Kernel Convolution: A Generic Framework

Eric Qu, Lige Zhang, Habib Debaya, Yue Wu, Dongmian Zou
Proceedings of the Third Learning on Graphs Conference, PMLR 269:25:1-25:25, 2025.

Abstract

The past sexennium has witnessed rapid advancements of hyperbolic neural networks. However, it is challenging to learn good hyperbolic representations since common Euclidean neural operations, such as convolution, do not extend to the hyperbolic space. Most hyperbolic neural networks omit the convolution operation and cannot effectively extract local patterns. Others either only use non-hyperbolic convolution, or miss essential properties such as equivariance to permutation. We propose HKConv, a novel trainable hyperbolic convolution which first correlates trainable local hyperbolic features with fixed kernel points placed in the hyperbolic space, then aggregates the output features within a local neighborhood. HKConv is a generic framework where any coordinate model of the hyperbolic space can be flexibly used. We show that neural networks with HKConv layers advance state-of-the-art in various tasks. The code of our implementation is available at https://github.com/BruceZhangReve/Hyperbolic-Kernel-Convolution
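To make the mechanism concrete, here is a loose NumPy sketch of the general idea described in the abstract, not the authors' implementation: features and fixed kernel points live on the Lorentz (hyperboloid) model, each feature is correlated with every kernel point via geodesic distance, and the results are aggregated over the neighborhood, which is permutation-invariant by construction. All function names and the distance-based weighting are illustrative assumptions; see the repository linked above for the actual formulation.

```python
# Illustrative sketch only (NOT the paper's implementation): a hyperbolic
# "kernel correlation + aggregation" step in the Lorentz model.
import numpy as np

def lorentz_inner(x, y):
    # Minkowski inner product <x, y>_L = -x0*y0 + <x_1:, y_1:>
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def lorentz_dist(x, y):
    # Geodesic distance on the hyperboloid: arccosh(-<x, y>_L)
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))

def lift(v):
    # Embed a Euclidean vector v onto the hyperboloid: x0 = sqrt(1 + |v|^2)
    x0 = np.sqrt(1.0 + np.sum(v**2, axis=-1, keepdims=True))
    return np.concatenate([x0, v], axis=-1)

rng = np.random.default_rng(0)
kernels = lift(rng.normal(scale=0.1, size=(4, 2)))  # 4 fixed kernel points
feats   = lift(rng.normal(scale=0.1, size=(5, 2)))  # 5 neighbor features

# Correlate each neighbor feature with each kernel point; closer kernel
# points get larger weights (assumed distance-based correlation).
w = np.exp(-lorentz_dist(feats[:, None, :], kernels[None, :, :]))  # (5, 4)

# Aggregate over the neighborhood: summing over neighbors makes the
# result invariant to any permutation of the 5 input features.
out = (w / w.sum(axis=1, keepdims=True)).sum(axis=0)
```

In the actual HKConv layer the per-kernel responses would be transformed by trainable weights before aggregation; the sketch only shows why anchoring correlations to fixed kernel points yields a local, permutation-equivariant operation.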

Cite this Paper


BibTeX
@InProceedings{pmlr-v269-qu25a,
  title     = {Hyperbolic Kernel Convolution: A Generic Framework},
  author    = {Qu, Eric and Zhang, Lige and Debaya, Habib and Wu, Yue and Zou, Dongmian},
  booktitle = {Proceedings of the Third Learning on Graphs Conference},
  pages     = {25:1--25:25},
  year      = {2025},
  editor    = {Wolf, Guy and Krishnaswamy, Smita},
  volume    = {269},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--29 Nov},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v269/main/assets/qu25a/qu25a.pdf},
  url       = {https://proceedings.mlr.press/v269/qu25a.html},
  abstract  = {The past sexennium has witnessed rapid advancements of hyperbolic neural networks. However, it is challenging to learn good hyperbolic representations since common Euclidean neural operations, such as convolution, do not extend to the hyperbolic space. Most hyperbolic neural networks omit the convolution operation and cannot effectively extract local patterns. Others either only use non-hyperbolic convolution, or miss essential properties such as equivariance to permutation. We propose HKConv, a novel trainable hyperbolic convolution which first correlates trainable local hyperbolic features with fixed kernel points placed in the hyperbolic space, then aggregates the output features within a local neighborhood. HKConv is a generic framework where any coordinate model of the hyperbolic space can be flexibly used. We show that neural networks with HKConv layers advance state-of-the-art in various tasks. The code of our implementation is available at https://github.com/BruceZhangReve/Hyperbolic-Kernel-Convolution}
}
Endnote
%0 Conference Paper
%T Hyperbolic Kernel Convolution: A Generic Framework
%A Eric Qu
%A Lige Zhang
%A Habib Debaya
%A Yue Wu
%A Dongmian Zou
%B Proceedings of the Third Learning on Graphs Conference
%C Proceedings of Machine Learning Research
%D 2025
%E Guy Wolf
%E Smita Krishnaswamy
%F pmlr-v269-qu25a
%I PMLR
%P 25:1--25:25
%U https://proceedings.mlr.press/v269/qu25a.html
%V 269
%X The past sexennium has witnessed rapid advancements of hyperbolic neural networks. However, it is challenging to learn good hyperbolic representations since common Euclidean neural operations, such as convolution, do not extend to the hyperbolic space. Most hyperbolic neural networks omit the convolution operation and cannot effectively extract local patterns. Others either only use non-hyperbolic convolution, or miss essential properties such as equivariance to permutation. We propose HKConv, a novel trainable hyperbolic convolution which first correlates trainable local hyperbolic features with fixed kernel points placed in the hyperbolic space, then aggregates the output features within a local neighborhood. HKConv is a generic framework where any coordinate model of the hyperbolic space can be flexibly used. We show that neural networks with HKConv layers advance state-of-the-art in various tasks. The code of our implementation is available at https://github.com/BruceZhangReve/Hyperbolic-Kernel-Convolution
APA
Qu, E., Zhang, L., Debaya, H., Wu, Y. & Zou, D. (2025). Hyperbolic Kernel Convolution: A Generic Framework. Proceedings of the Third Learning on Graphs Conference, in Proceedings of Machine Learning Research 269:25:1-25:25. Available from https://proceedings.mlr.press/v269/qu25a.html.