Multiscale Invertible Generative Networks for High-Dimensional Bayesian Inference

Shumao Zhang, Pengchuan Zhang, Thomas Y Hou
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:12632-12641, 2021.

Abstract

We propose a Multiscale Invertible Generative Network (MsIGN) and associated training algorithm that leverages multiscale structure to solve high-dimensional Bayesian inference. To address the curse of dimensionality, MsIGN exploits the low-dimensional nature of the posterior, and generates samples from coarse to fine scale (low to high dimension) by iteratively upsampling and refining samples. MsIGN is trained in a multi-stage manner to minimize the Jeffreys divergence, which avoids mode dropping in high-dimensional cases. On two high-dimensional Bayesian inverse problems, we show superior performance of MsIGN over previous approaches in posterior approximation and multiple mode capture. On the natural image synthesis task, MsIGN achieves superior performance in bits-per-dimension over baseline models and yields great interpretability of its neurons in intermediate layers.

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-zhang21z,
  title     = {Multiscale Invertible Generative Networks for High-Dimensional Bayesian Inference},
  author    = {Zhang, Shumao and Zhang, Pengchuan and Hou, Thomas Y},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {12632--12641},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/zhang21z/zhang21z.pdf},
  url       = {https://proceedings.mlr.press/v139/zhang21z.html},
  abstract  = {We propose a Multiscale Invertible Generative Network (MsIGN) and associated training algorithm that leverages multiscale structure to solve high-dimensional Bayesian inference. To address the curse of dimensionality, MsIGN exploits the low-dimensional nature of the posterior, and generates samples from coarse to fine scale (low to high dimension) by iteratively upsampling and refining samples. MsIGN is trained in a multi-stage manner to minimize the Jeffreys divergence, which avoids mode dropping in high-dimensional cases. On two high-dimensional Bayesian inverse problems, we show superior performance of MsIGN over previous approaches in posterior approximation and multiple mode capture. On the natural image synthesis task, MsIGN achieves superior performance in bits-per-dimension over baseline models and yields great interpretability of its neurons in intermediate layers.}
}
Endnote
%0 Conference Paper
%T Multiscale Invertible Generative Networks for High-Dimensional Bayesian Inference
%A Shumao Zhang
%A Pengchuan Zhang
%A Thomas Y Hou
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-zhang21z
%I PMLR
%P 12632--12641
%U https://proceedings.mlr.press/v139/zhang21z.html
%V 139
%X We propose a Multiscale Invertible Generative Network (MsIGN) and associated training algorithm that leverages multiscale structure to solve high-dimensional Bayesian inference. To address the curse of dimensionality, MsIGN exploits the low-dimensional nature of the posterior, and generates samples from coarse to fine scale (low to high dimension) by iteratively upsampling and refining samples. MsIGN is trained in a multi-stage manner to minimize the Jeffreys divergence, which avoids mode dropping in high-dimensional cases. On two high-dimensional Bayesian inverse problems, we show superior performance of MsIGN over previous approaches in posterior approximation and multiple mode capture. On the natural image synthesis task, MsIGN achieves superior performance in bits-per-dimension over baseline models and yields great interpretability of its neurons in intermediate layers.
APA
Zhang, S., Zhang, P. &amp; Hou, T.Y. (2021). Multiscale Invertible Generative Networks for High-Dimensional Bayesian Inference. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:12632-12641. Available from https://proceedings.mlr.press/v139/zhang21z.html.