GFlowOut: Dropout with Generative Flow Networks

Dianbo Liu, Moksh Jain, Bonaventure F. P. Dossou, Qianli Shen, Salem Lahlou, Anirudh Goyal, Nikolay Malkin, Chris Chinenye Emezue, Dinghuai Zhang, Nadhir Hassen, Xu Ji, Kenji Kawaguchi, Yoshua Bengio
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:21715-21729, 2023.

Abstract

Bayesian inference offers principled tools to tackle many critical problems with modern neural networks, such as poor calibration, poor generalization, and data inefficiency. However, scaling Bayesian inference to large architectures is challenging and requires restrictive approximations. Monte Carlo Dropout has been widely used as a relatively cheap way to approximate inference and estimate uncertainty with deep neural networks. Traditionally, the dropout mask is sampled independently from a fixed distribution. Recent research shows that the dropout mask can be seen as a latent variable, which can be inferred with variational inference. These methods face two important challenges: (a) the posterior distribution over masks can be highly multi-modal, which can be difficult to approximate with standard variational inference, and (b) it is not trivial to fully utilize sample-dependent information and correlation among dropout masks to improve posterior estimation. In this work, we propose GFlowOut to address these issues. GFlowOut leverages the recently proposed probabilistic framework of Generative Flow Networks (GFlowNets) to learn the posterior distribution over dropout masks. We empirically demonstrate that GFlowOut results in predictive distributions that generalize better to out-of-distribution data and provide uncertainty estimates that lead to better performance in downstream tasks.
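The abstract contrasts standard MC Dropout, where each mask is drawn i.i.d. from a fixed Bernoulli distribution, with GFlowOut's learned, input-dependent posterior over masks. The minimal PyTorch sketch below illustrates only that contrast; it is not the paper's implementation, and all names (DropoutMLP, MaskGenerator, mc_predict) are hypothetical. In particular, GFlowOut trains the mask sampler with a GFlowNet objective, which this sketch replaces with a plain Bernoulli sampler whose per-unit keep probabilities are predicted from the input.

# Illustrative sketch only (not the authors' code). Contrasts fixed-rate
# MC Dropout with a learned, input-dependent dropout-mask sampler.
import torch
import torch.nn as nn

class MaskGenerator(nn.Module):
    """Predicts per-unit keep probabilities conditioned on the input,
    instead of using one fixed rate p for every unit and every sample."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.logits = nn.Linear(in_dim, hidden_dim)

    def forward(self, x):
        keep_prob = torch.sigmoid(self.logits(x))  # per-sample, per-unit
        mask = torch.bernoulli(keep_prob)          # one sampled binary mask
        return mask, keep_prob

class DropoutMLP(nn.Module):
    def __init__(self, in_dim=784, hidden_dim=256, out_dim=10, p=0.5):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, out_dim)
        self.p = p                                 # fixed MC Dropout rate
        self.mask_gen = MaskGenerator(in_dim, hidden_dim)

    def forward(self, x, learned_mask=False):
        h = torch.relu(self.fc1(x))
        if learned_mask:
            # GFlowOut-style idea: mask sampled from a learned q(mask | x)
            mask, _ = self.mask_gen(x)
        else:
            # Standard MC Dropout: mask ~ Bernoulli(1 - p), fixed for all inputs
            mask = torch.bernoulli(torch.full_like(h, 1 - self.p))
        return self.fc2(h * mask)

@torch.no_grad()
def mc_predict(model, x, n_samples=20, learned_mask=False):
    """Monte Carlo predictive distribution: average softmax outputs over
    n_samples sampled masks; the spread across samples is the cheap
    uncertainty estimate the abstract refers to."""
    probs = torch.stack([
        torch.softmax(model(x, learned_mask=learned_mask), dim=-1)
        for _ in range(n_samples)
    ])
    return probs.mean(0), probs.std(0)

model = DropoutMLP()
x = torch.randn(8, 784)
mean, std = mc_predict(model, x)                          # fixed-mask MC Dropout
mean_l, std_l = mc_predict(model, x, learned_mask=True)   # learned per-sample masks

Keeping masks on at test time and averaging over them is what turns dropout into approximate Bayesian inference; GFlowOut's contribution is to make the mask distribution itself a learned, possibly multi-modal posterior rather than the fixed Bernoulli used above.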

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-liu23r,
  title = {{GF}low{O}ut: Dropout with Generative Flow Networks},
  author = {Liu, Dianbo and Jain, Moksh and Dossou, Bonaventure F. P. and Shen, Qianli and Lahlou, Salem and Goyal, Anirudh and Malkin, Nikolay and Emezue, Chris Chinenye and Zhang, Dinghuai and Hassen, Nadhir and Ji, Xu and Kawaguchi, Kenji and Bengio, Yoshua},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages = {21715--21729},
  year = {2023},
  editor = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume = {202},
  series = {Proceedings of Machine Learning Research},
  month = {23--29 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v202/liu23r/liu23r.pdf},
  url = {https://proceedings.mlr.press/v202/liu23r.html},
  abstract = {Bayesian inference offers principled tools to tackle many critical problems with modern neural networks such as poor calibration and generalization, and data inefficiency. However, scaling Bayesian inference to large architectures is challenging and requires restrictive approximations. Monte Carlo Dropout has been widely used as a relatively cheap way to approximate inference and estimate uncertainty with deep neural networks. Traditionally, the dropout mask is sampled independently from a fixed distribution. Recent research shows that the dropout mask can be seen as a latent variable, which can be inferred with variational inference. These methods face two important challenges: (a) the posterior distribution over masks can be highly multi-modal which can be difficult to approximate with standard variational inference and (b) it is not trivial to fully utilize sample-dependent information and correlation among dropout masks to improve posterior estimation. In this work, we propose GFlowOut to address these issues. GFlowOut leverages the recently proposed probabilistic framework of Generative Flow Networks (GFlowNets) to learn the posterior distribution over dropout masks. We empirically demonstrate that GFlowOut results in predictive distributions that generalize better to out-of-distribution data and provide uncertainty estimates which lead to better performance in downstream tasks.}
}
Endnote
%0 Conference Paper
%T GFlowOut: Dropout with Generative Flow Networks
%A Dianbo Liu
%A Moksh Jain
%A Bonaventure F. P. Dossou
%A Qianli Shen
%A Salem Lahlou
%A Anirudh Goyal
%A Nikolay Malkin
%A Chris Chinenye Emezue
%A Dinghuai Zhang
%A Nadhir Hassen
%A Xu Ji
%A Kenji Kawaguchi
%A Yoshua Bengio
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-liu23r
%I PMLR
%P 21715--21729
%U https://proceedings.mlr.press/v202/liu23r.html
%V 202
%X Bayesian inference offers principled tools to tackle many critical problems with modern neural networks such as poor calibration and generalization, and data inefficiency. However, scaling Bayesian inference to large architectures is challenging and requires restrictive approximations. Monte Carlo Dropout has been widely used as a relatively cheap way to approximate inference and estimate uncertainty with deep neural networks. Traditionally, the dropout mask is sampled independently from a fixed distribution. Recent research shows that the dropout mask can be seen as a latent variable, which can be inferred with variational inference. These methods face two important challenges: (a) the posterior distribution over masks can be highly multi-modal which can be difficult to approximate with standard variational inference and (b) it is not trivial to fully utilize sample-dependent information and correlation among dropout masks to improve posterior estimation. In this work, we propose GFlowOut to address these issues. GFlowOut leverages the recently proposed probabilistic framework of Generative Flow Networks (GFlowNets) to learn the posterior distribution over dropout masks. We empirically demonstrate that GFlowOut results in predictive distributions that generalize better to out-of-distribution data and provide uncertainty estimates which lead to better performance in downstream tasks.
APA
Liu, D., Jain, M., Dossou, B.F.P., Shen, Q., Lahlou, S., Goyal, A., Malkin, N., Emezue, C.C., Zhang, D., Hassen, N., Ji, X., Kawaguchi, K. & Bengio, Y. (2023). GFlowOut: Dropout with Generative Flow Networks. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:21715-21729. Available from https://proceedings.mlr.press/v202/liu23r.html.