Deep Generative Missingness Pattern-Set Mixture Models

Sahra Ghalebikesabi, Rob Cornish, Chris Holmes, Luke Kelly
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:3727-3735, 2021.

Abstract

We propose a variational autoencoder architecture to model both ignorable and nonignorable missing data using pattern-set mixtures as proposed by Little (1993). Our model explicitly learns to cluster the missing data into missingness pattern sets based on the observed data and missingness masks. Underpinning our approach is the assumption that the data distribution under missingness is probabilistically semi-supervised by samples from the observed data distribution. Our setup trades off the characteristics of ignorable and nonignorable missingness and can thus be applied to data of both types. We evaluate our method on a wide range of data sets with different types of missingness and achieve state-of-the-art imputation performance. Our model outperforms many common imputation algorithms, especially when the amount of missing data is high and the missingness mechanism is nonignorable.
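The pattern-set idea can be sketched without the VAE machinery: group rows by their missingness pattern and impute each set with its own conditional model. The toy numpy sketch below is not the authors' architecture (which learns soft pattern-set assignments and imputations jointly inside a VAE); it only illustrates the hard-clustering analogue, with a per-set linear regression fit on the fully observed rows standing in for the strength-sharing that the shared latent provides in the paper. All variable names and the ignorable toy mask are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples, 4 correlated features, entries missing at random.
# (The paper targets nonignorable missingness too; this toy mask is
# ignorable and serves only to illustrate the pattern-set structure.)
Z = rng.normal(size=(200, 2))
X = Z @ rng.normal(size=(2, 4)) + 0.1 * rng.normal(size=(200, 4))
mask = rng.random(X.shape) < 0.3            # True = missing
X_miss = np.where(mask, np.nan, X)

# Group rows by their exact missingness pattern. The paper instead learns
# *soft* pattern-set assignments from observed data and masks; hard
# grouping by the mask is the simplest hand-rolled analogue.
patterns = {}
for i, m in enumerate(mask):
    patterns.setdefault(tuple(m), []).append(i)

# Per pattern set, impute missing features by regressing them on the
# observed features, with coefficients fit on the fully observed rows.
complete = X[~mask.any(axis=1)]
X_imp = X_miss.copy()
for m, rows in patterns.items():
    m = np.array(m)
    mis, obs = np.where(m)[0], np.where(~m)[0]
    if mis.size == 0:
        continue                             # nothing to impute in this set
    A = np.column_stack([complete[:, obs], np.ones(len(complete))])
    coef, *_ = np.linalg.lstsq(A, complete[:, mis], rcond=None)
    A_rows = np.column_stack([X_miss[np.ix_(rows, obs)], np.ones(len(rows))])
    X_imp[np.ix_(rows, mis)] = A_rows @ coef
```

After the loop, `X_imp` is fully imputed while every observed entry is left untouched; the per-set regressions play the role that the decoder conditioned on the pattern-set assignment plays in the paper's model.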

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-ghalebikesabi21a,
  title     = {Deep Generative Missingness Pattern-Set Mixture Models},
  author    = {Ghalebikesabi, Sahra and Cornish, Rob and Holmes, Chris and Kelly, Luke},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {3727--3735},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/ghalebikesabi21a/ghalebikesabi21a.pdf},
  url       = {https://proceedings.mlr.press/v130/ghalebikesabi21a.html},
  abstract  = {We propose a variational autoencoder architecture to model both ignorable and nonignorable missing data using pattern-set mixtures as proposed by Little (1993). Our model explicitly learns to cluster the missing data into missingness pattern sets based on the observed data and missingness masks. Underpinning our approach is the assumption that the data distribution under missingness is probabilistically semi-supervised by samples from the observed data distribution. Our setup trades off the characteristics of ignorable and nonignorable missingness and can thus be applied to data of both types. We evaluate our method on a wide range of data sets with different types of missingness and achieve state-of-the-art imputation performance. Our model outperforms many common imputation algorithms, especially when the amount of missing data is high and the missingness mechanism is nonignorable.}
}
Endnote
%0 Conference Paper
%T Deep Generative Missingness Pattern-Set Mixture Models
%A Sahra Ghalebikesabi
%A Rob Cornish
%A Chris Holmes
%A Luke Kelly
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-ghalebikesabi21a
%I PMLR
%P 3727--3735
%U https://proceedings.mlr.press/v130/ghalebikesabi21a.html
%V 130
%X We propose a variational autoencoder architecture to model both ignorable and nonignorable missing data using pattern-set mixtures as proposed by Little (1993). Our model explicitly learns to cluster the missing data into missingness pattern sets based on the observed data and missingness masks. Underpinning our approach is the assumption that the data distribution under missingness is probabilistically semi-supervised by samples from the observed data distribution. Our setup trades off the characteristics of ignorable and nonignorable missingness and can thus be applied to data of both types. We evaluate our method on a wide range of data sets with different types of missingness and achieve state-of-the-art imputation performance. Our model outperforms many common imputation algorithms, especially when the amount of missing data is high and the missingness mechanism is nonignorable.
APA
Ghalebikesabi, S., Cornish, R., Holmes, C. & Kelly, L. (2021). Deep Generative Missingness Pattern-Set Mixture Models. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:3727-3735. Available from https://proceedings.mlr.press/v130/ghalebikesabi21a.html.