Generative Modeling of Labeled Graphs Under Data Scarcity
Proceedings of the Second Learning on Graphs Conference, PMLR 231:32:1-32:18, 2024.
Abstract
Deep graph generative modeling has gained enormous attention in recent years due to its impressive ability to directly learn the underlying hidden graph distribution. Despite their initial success, these techniques, like many existing deep generative methods, require a large number of training samples to learn a good model. Unfortunately, a large number of training samples may not always be available in scenarios such as drug discovery for rare diseases. At the same time, recent advances in few-shot learning have opened the door to applications where available training data is limited. In this work, we introduce the hitherto unexplored paradigm of graph generative modeling under data scarcity. To this end, we develop a meta-learning-based framework for labeled graph generative modeling in this setting. Our proposed model learns to transfer meta-knowledge from similar auxiliary graph datasets. Utilizing these prior experiences, our model quickly adapts to an unseen graph dataset through self-paced fine-tuning. Through extensive experiments on datasets from diverse domains with limited training samples, we establish that the proposed method generates graphs of superior fidelity compared to existing baselines.
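To make the meta-learning-plus-fine-tuning recipe concrete, the sketch below illustrates the general idea with a first-order (Reptile-style) outer loop over auxiliary datasets followed by adaptation on a scarce target dataset. This is not the paper's actual architecture or training objective; the model, loss, and data iterables are hypothetical placeholders.

```python
# Minimal sketch, assuming a generic PyTorch model and placeholder supervised loss.
# Reptile-style first-order meta-learning over auxiliary datasets, then fine-tuning
# on a scarce target dataset (not the authors' exact method).
import copy
import torch
import torch.nn as nn

def inner_adapt(model, batches, lr=1e-3, steps=5):
    """Clone the meta-model and take a few SGD steps on one auxiliary dataset."""
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=lr)
    for _ in range(steps):
        for x, y in batches:
            opt.zero_grad()
            loss = nn.functional.mse_loss(adapted(x), y)  # placeholder loss
            loss.backward()
            opt.step()
    return adapted

def meta_train(model, auxiliary_tasks, meta_lr=0.1, epochs=100):
    """Outer loop: nudge meta-parameters toward each task-adapted parameter set."""
    for _ in range(epochs):
        for batches in auxiliary_tasks:  # one auxiliary graph dataset per task
            adapted = inner_adapt(model, batches)
            with torch.no_grad():
                for p, q in zip(model.parameters(), adapted.parameters()):
                    p.add_(meta_lr * (q - p))  # interpolate toward adapted weights
    return model

def fine_tune(model, target_batches, lr=1e-4, steps=20):
    """Adapt the meta-trained model to the unseen, data-scarce target dataset."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        for x, y in target_batches:
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(x), y)
            loss.backward()
            opt.step()
    return model
```

The key design point this illustrates is that the meta-trained initialization encodes knowledge shared across auxiliary datasets, so only a handful of fine-tuning steps on the limited target data are needed to adapt.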