ACFlow: Flow Models for Arbitrary Conditional Likelihoods
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:5831-5841, 2020.
Abstract
Understanding the dependencies among features of a dataset is at the core of most unsupervised learning tasks. However, the majority of generative modeling approaches focus solely on the joint distribution p(x) and use models in which the conditional distribution of an arbitrary subset of features x_u given the remaining observed covariates x_o, p(x_u | x_o), is intractable. Traditional conditional approaches provide a model for a fixed set of covariates conditioned on another fixed set of observed covariates. Instead, in this work we develop a model capable of yielding all conditional distributions p(x_u | x_o) (for arbitrary x_u) via tractable conditional likelihoods. We propose arbitrary conditioning flow models (ACFlow), a novel extension of (change-of-variables-based) flow generative models. ACFlow can be conditioned on arbitrary subsets of observed covariates, which was previously infeasible. We further extend ACFlow to model the joint distribution p(x) and arbitrary marginal distributions p(x_u). We also apply ACFlow to feature imputation, and develop a unified platform for both multiple and single imputation by introducing an auxiliary objective that provides a principled single "best guess" for flow models. Extensive empirical evaluations show that our model achieves state-of-the-art performance in modeling arbitrary conditional likelihoods, as well as in both single and multiple imputation, on synthetic and real-world datasets.
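To make the change-of-variables idea concrete, the following is a minimal sketch (not the paper's architecture) of a conditional flow: a single affine transform of the unobserved features x_u whose scale and shift depend on the observed covariates x_o, so that log p(x_u | x_o) is computed exactly via the change-of-variables formula. The linear conditioner and its weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_o, d_u = 3, 2                       # dims of observed / unobserved features

# Hypothetical linear conditioner: maps x_o to a per-dimension
# log-scale s and shift t for the affine transform of x_u.
W_s = rng.normal(scale=0.1, size=(d_u, d_o))
W_t = rng.normal(scale=0.1, size=(d_u, d_o))

def conditional_log_likelihood(x_u, x_o):
    """log p(x_u | x_o) under z = x_u * exp(s) + t with base z ~ N(0, I)."""
    s = W_s @ x_o                     # log-scale, conditioned on x_o
    t = W_t @ x_o                     # shift, conditioned on x_o
    z = x_u * np.exp(s) + t          # forward transform into the base space
    log_base = -0.5 * np.sum(z**2) - 0.5 * d_u * np.log(2 * np.pi)
    log_det = np.sum(s)               # log|det dz/dx_u| of the affine map
    return log_base + log_det

x_o = rng.normal(size=d_o)
x_u = rng.normal(size=d_u)
print(conditional_log_likelihood(x_u, x_o))
```

In ACFlow the conditioner is far richer and shared across all observed/unobserved partitions (indicated by a mask), but the likelihood computation follows this same base-density-plus-log-determinant pattern.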