Well-Conditioned Spectral Transforms for Dynamic Graph Representation

Bingxin Zhou, Xinliang Liu, Yuehua Liu, Yunying Huang, Pietro Lio, Yu Guang Wang
Proceedings of the First Learning on Graphs Conference, PMLR 198:12:1-12:19, 2022.

Abstract

This work establishes a fully-spectral framework to capture informative long-range temporal interactions in a dynamic system. We connect the spectral transform to the low-rank self-attention mechanisms and investigate its energy-balancing effect and computational efficiency. Based on the observations, we leverage the adaptive power method SVD and global graph framelet convolution to encode time-dependent features and graph structure for continuous-time dynamic graph representation learning. The former serves as an efficient high-order linear self-attention with determined propagation rules, and the latter establishes scalable and transferable geometric characterization for property prediction. Empirically, the proposed model learns well-conditioned hidden representations on a variety of online learning tasks, and it achieves top performance with a reduced number of learnable parameters and faster propagation speed.
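The "power method SVD" the abstract refers to can be illustrated with a standard randomized subspace (power) iteration for a truncated SVD; the function name and the specific variant below are illustrative sketches, not the paper's exact adaptive algorithm.

```python
# Minimal sketch: truncated SVD via subspace power iteration.
# This is a generic randomized-SVD-style routine, not the paper's
# exact "adaptive power method SVD"; names here are hypothetical.
import numpy as np

def power_method_svd(X, rank, n_iter=10, seed=0):
    """Approximate the top-`rank` singular triplets of X."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    Q = rng.standard_normal((n, rank))    # random starting subspace
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(X @ Q)        # orthonormal basis for range(X Q)
        Q, _ = np.linalg.qr(X.T @ Q)      # power step back through X^T
    B = X @ Q                             # project X onto the subspace
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U, s, Vt @ Q.T                 # rank-`rank` factors of X

X = np.random.default_rng(1).standard_normal((200, 50))
U, s, Vt = power_method_svd(X, rank=5)
# U @ np.diag(s) @ Vt is a rank-5 approximation of X
```

Each power step amplifies the leading singular directions, so a handful of iterations and two matrix products per step suffice, which is the source of the linear-cost, low-rank behavior the abstract attributes to this component.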

Cite this Paper


BibTeX
@InProceedings{pmlr-v198-zhou22a,
  title     = {Well-Conditioned Spectral Transforms for Dynamic Graph Representation},
  author    = {Zhou, Bingxin and Liu, Xinliang and Liu, Yuehua and Huang, Yunying and Lio, Pietro and Wang, Yu Guang},
  booktitle = {Proceedings of the First Learning on Graphs Conference},
  pages     = {12:1--12:19},
  year      = {2022},
  editor    = {Rieck, Bastian and Pascanu, Razvan},
  volume    = {198},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v198/zhou22a/zhou22a.pdf},
  url       = {https://proceedings.mlr.press/v198/zhou22a.html},
  abstract  = {This work establishes a fully-spectral framework to capture informative long-range temporal interactions in a dynamic system. We connect the spectral transform to the low-rank self-attention mechanisms and investigate its energy-balancing effect and computational efficiency. Based on the observations, we leverage the adaptive power method SVD and global graph framelet convolution to encode time-dependent features and graph structure for continuous-time dynamic graph representation learning. The former serves as an efficient high-order linear self-attention with determined propagation rules, and the latter establishes scalable and transferable geometric characterization for property prediction. Empirically, the proposed model learns well-conditioned hidden representations on a variety of online learning tasks, and it achieves top performance with a reduced number of learnable parameters and faster propagation speed.}
}
Endnote
%0 Conference Paper
%T Well-Conditioned Spectral Transforms for Dynamic Graph Representation
%A Bingxin Zhou
%A Xinliang Liu
%A Yuehua Liu
%A Yunying Huang
%A Pietro Lio
%A Yu Guang Wang
%B Proceedings of the First Learning on Graphs Conference
%C Proceedings of Machine Learning Research
%D 2022
%E Bastian Rieck
%E Razvan Pascanu
%F pmlr-v198-zhou22a
%I PMLR
%P 12:1--12:19
%U https://proceedings.mlr.press/v198/zhou22a.html
%V 198
%X This work establishes a fully-spectral framework to capture informative long-range temporal interactions in a dynamic system. We connect the spectral transform to the low-rank self-attention mechanisms and investigate its energy-balancing effect and computational efficiency. Based on the observations, we leverage the adaptive power method SVD and global graph framelet convolution to encode time-dependent features and graph structure for continuous-time dynamic graph representation learning. The former serves as an efficient high-order linear self-attention with determined propagation rules, and the latter establishes scalable and transferable geometric characterization for property prediction. Empirically, the proposed model learns well-conditioned hidden representations on a variety of online learning tasks, and it achieves top performance with a reduced number of learnable parameters and faster propagation speed.
APA
Zhou, B., Liu, X., Liu, Y., Huang, Y., Lio, P., & Wang, Y. G. (2022). Well-Conditioned Spectral Transforms for Dynamic Graph Representation. Proceedings of the First Learning on Graphs Conference, in Proceedings of Machine Learning Research 198:12:1-12:19. Available from https://proceedings.mlr.press/v198/zhou22a.html.