CoDi: Co-evolving Contrastive Diffusion Models for Mixed-type Tabular Synthesis
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:18940-18956, 2023.
Abstract
With the growing attention to tabular data, efforts to apply synthetic tables have expanded to a wide range of tasks and scenarios. Owing to recent advances in generative modeling, the fake data produced by tabular synthesis models has become sophisticated and realistic. However, modeling the discrete variables (columns) of tabular data remains difficult. In this work, we propose to process continuous and discrete variables separately, but conditioned on each other, with two diffusion models. The two diffusion models co-evolve during training by reading conditions from each other. To further bind the two diffusion models, we also introduce a contrastive learning objective with a negative sampling strategy. In experiments on 11 real-world tabular datasets against 8 baseline methods, we demonstrate the efficacy of the proposed method, called $\texttt{CoDi}$. Our code is available at https://github.com/ChaejeongLee/CoDi.
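
As a rough illustration of this co-conditioned setup, the hypothetical PyTorch sketch below pairs one denoiser for the continuous columns with one for the one-hot-encoded discrete columns, feeds each model the other's perturbed variables as its condition, and adds an InfoNCE-style contrastive term over in-batch negatives to bind the two models. The network sizes, the toy linear noise schedule, the random projection heads, and the loss weights are illustrative assumptions, not the authors' implementation (see the repository above for the actual method).

```python
# Hypothetical sketch of two co-conditioned denoisers for mixed-type tabular data.
# All names, dimensions, and the noise schedule are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Denoiser(nn.Module):
    """Small MLP that denoises one variable group, conditioned on the other group."""
    def __init__(self, dim, cond_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + cond_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x_t, cond, t):
        # x_t: noisy variables of this group, cond: the other group's noisy variables,
        # t: (batch, 1) diffusion timesteps scaled to [0, 1]
        return self.net(torch.cat([x_t, cond, t], dim=-1))

# Toy mixed-type batch: 4 continuous columns, a 6-dim one-hot block for a discrete column.
batch, cont_dim, disc_dim = 32, 4, 6
x_cont = torch.randn(batch, cont_dim)
x_disc = F.one_hot(torch.randint(0, disc_dim, (batch,)), disc_dim).float()

model_cont = Denoiser(cont_dim, disc_dim)   # continuous model, conditioned on discrete
model_disc = Denoiser(disc_dim, cont_dim)   # discrete model, conditioned on continuous
opt = torch.optim.Adam(
    list(model_cont.parameters()) + list(model_disc.parameters()), lr=1e-3
)

# One illustrative training step: each model reads the other's perturbed variables.
t = torch.rand(batch, 1)
noise_c, noise_d = torch.randn_like(x_cont), torch.randn_like(x_disc)
alpha = 1.0 - t                              # toy linear schedule, for illustration only
xc_t = alpha * x_cont + (1 - alpha) * noise_c
xd_t = alpha * x_disc + (1 - alpha) * noise_d

pred_c = model_cont(xc_t, xd_t, t)           # conditioned on the noisy discrete block
pred_d = model_disc(xd_t, xc_t, t)           # conditioned on the noisy continuous block
denoise_loss = F.mse_loss(pred_c, noise_c) + F.mse_loss(pred_d, noise_d)

# Contrastive term with in-batch negatives: outputs of the two models for the same row
# are pulled together, mismatched rows pushed apart. Random projections stand in for
# learned heads here, purely to match dimensions.
proj_c, proj_d = torch.randn(cont_dim, 16), torch.randn(disc_dim, 16)
z_c = F.normalize(pred_c @ proj_c, dim=-1)
z_d = F.normalize(pred_d @ proj_d, dim=-1)
logits = z_c @ z_d.t() / 0.1                 # temperature 0.1, an arbitrary choice
contrastive_loss = F.cross_entropy(logits, torch.arange(batch))

loss = denoise_loss + 0.1 * contrastive_loss
opt.zero_grad()
loss.backward()
opt.step()
```

The key design point the sketch tries to convey is the mutual conditioning: neither denoiser sees only its own variable group, so the continuous and discrete models are trained jointly rather than independently, and the contrastive term gives an extra signal tying the two models' outputs for the same row together.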