SPECTRE: Spectral Conditioning Helps to Overcome the Expressivity Limits of One-shot Graph Generators

Karolis Martinkus, Andreas Loukas, Nathanaël Perraudin, Roger Wattenhofer
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:15159-15179, 2022.

Abstract

We approach the graph generation problem from a spectral perspective by first generating the dominant parts of the graph Laplacian spectrum and then building a graph matching these eigenvalues and eigenvectors. Spectral conditioning allows for direct modeling of the global and local graph structure and helps to overcome the expressivity and mode collapse issues of one-shot graph generators. Our novel GAN, called SPECTRE, enables the one-shot generation of much larger graphs than previously possible with one-shot models. SPECTRE outperforms state-of-the-art deep autoregressive generators in terms of modeling fidelity, while also avoiding expensive sequential generation and dependence on node ordering. A case in point, in sizable synthetic and real-world graphs SPECTRE achieves a 4-to-170 fold improvement over the best competitor that does not overfit and is 23-to-30 times faster than autoregressive generators.
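The abstract refers to the "dominant parts of the graph Laplacian spectrum", i.e. the leading eigenvalues and eigenvectors of L = D − A. As a minimal illustration of these spectral quantities (not of SPECTRE itself), the sketch below computes them with NumPy for a small example graph; the 4-cycle adjacency matrix and the choice k = 2 are illustrative assumptions.

```python
import numpy as np

# Illustrative example graph: a 4-cycle, as an adjacency matrix.
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

# Unnormalized graph Laplacian L = D - A, with D the degree matrix.
D = np.diag(A.sum(axis=1))
L = D - A

# eigh returns eigenvalues in ascending order, so the first k columns
# of eigvecs are the low-frequency eigenvectors that capture the
# graph's global structure.
eigvals, eigvecs = np.linalg.eigh(L)
k = 2
dominant_vals = eigvals[:k]
dominant_vecs = eigvecs[:, :k]
print(dominant_vals)
print(dominant_vecs.shape)
```

For the 4-cycle the Laplacian spectrum is {0, 2, 2, 4}, so the two smallest eigenvalues are approximately 0 and 2; the zero eigenvalue corresponds to the constant eigenvector of a connected graph.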

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-martinkus22a,
  title     = {{SPECTRE}: Spectral Conditioning Helps to Overcome the Expressivity Limits of One-shot Graph Generators},
  author    = {Martinkus, Karolis and Loukas, Andreas and Perraudin, Nathana{\"e}l and Wattenhofer, Roger},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {15159--15179},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/martinkus22a/martinkus22a.pdf},
  url       = {https://proceedings.mlr.press/v162/martinkus22a.html},
  abstract  = {We approach the graph generation problem from a spectral perspective by first generating the dominant parts of the graph Laplacian spectrum and then building a graph matching these eigenvalues and eigenvectors. Spectral conditioning allows for direct modeling of the global and local graph structure and helps to overcome the expressivity and mode collapse issues of one-shot graph generators. Our novel GAN, called SPECTRE, enables the one-shot generation of much larger graphs than previously possible with one-shot models. SPECTRE outperforms state-of-the-art deep autoregressive generators in terms of modeling fidelity, while also avoiding expensive sequential generation and dependence on node ordering. A case in point, in sizable synthetic and real-world graphs SPECTRE achieves a 4-to-170 fold improvement over the best competitor that does not overfit and is 23-to-30 times faster than autoregressive generators.}
}
Endnote
%0 Conference Paper
%T SPECTRE: Spectral Conditioning Helps to Overcome the Expressivity Limits of One-shot Graph Generators
%A Karolis Martinkus
%A Andreas Loukas
%A Nathanaël Perraudin
%A Roger Wattenhofer
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-martinkus22a
%I PMLR
%P 15159--15179
%U https://proceedings.mlr.press/v162/martinkus22a.html
%V 162
%X We approach the graph generation problem from a spectral perspective by first generating the dominant parts of the graph Laplacian spectrum and then building a graph matching these eigenvalues and eigenvectors. Spectral conditioning allows for direct modeling of the global and local graph structure and helps to overcome the expressivity and mode collapse issues of one-shot graph generators. Our novel GAN, called SPECTRE, enables the one-shot generation of much larger graphs than previously possible with one-shot models. SPECTRE outperforms state-of-the-art deep autoregressive generators in terms of modeling fidelity, while also avoiding expensive sequential generation and dependence on node ordering. A case in point, in sizable synthetic and real-world graphs SPECTRE achieves a 4-to-170 fold improvement over the best competitor that does not overfit and is 23-to-30 times faster than autoregressive generators.
APA
Martinkus, K., Loukas, A., Perraudin, N. &amp; Wattenhofer, R. (2022). SPECTRE: Spectral Conditioning Helps to Overcome the Expressivity Limits of One-shot Graph Generators. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:15159-15179. Available from https://proceedings.mlr.press/v162/martinkus22a.html.