Path-Gradient Estimators for Continuous Normalizing Flows

Lorenz Vaitl, Kim Andrea Nicoli, Shinichi Nakajima, Pan Kessel
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:21945-21959, 2022.

Abstract

Recent work has established a path-gradient estimator for simple variational Gaussian distributions and has argued that the path-gradient is particularly beneficial in the regime in which the variational distribution approaches the exact target distribution. In many applications, this regime cannot, however, be reached by a simple Gaussian variational distribution. In this work, we overcome this crucial limitation by proposing a path-gradient estimator for the considerably more expressive variational family of continuous normalizing flows. We outline an efficient algorithm to calculate this estimator and establish its superior performance empirically.
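To illustrate the idea behind path gradients in the simplest setting the abstract mentions (a Gaussian variational family, not the paper's continuous-normalizing-flow case), the sketch below computes the path-gradient estimator of the reverse KL divergence by hand: the score term is dropped, and only the derivative that flows through the reparameterized sample z = mu + sigma * eps is kept. All names (`path_grad`, the standard-normal target) are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def path_grad(mu, sigma, n=100_000):
    """Path-gradient estimate of grad KL(q || p) for q = N(mu, sigma^2),
    p = N(0, 1), using the reparameterization z = mu + sigma * eps."""
    eps = rng.standard_normal(n)
    z = mu + sigma * eps
    # d/dz [log q(z) - log p(z)]; the score term d/dtheta log q_theta(z)
    # is omitted entirely -- that omission is the path-gradient estimator.
    dlogq_dz = -(z - mu) / sigma**2
    dlogp_dz = -z
    dz = dlogq_dz - dlogp_dz
    # Chain rule through the sampling path: dz/dmu = 1, dz/dsigma = eps.
    g_mu = dz.mean()
    g_sigma = (dz * eps).mean()
    return g_mu, g_sigma

# At the optimum q = p (mu=0, sigma=1) the integrand vanishes per sample,
# so the estimator is exactly zero -- zero variance, the regime the
# abstract highlights.
g_mu, g_sigma = path_grad(0.0, 1.0)   # both exactly 0.0
```

Away from the optimum the estimator remains consistent: at mu = 1, sigma = 1 the per-sample gradient with respect to mu is identically 1, matching the true gradient dKL/dmu = mu.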

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-vaitl22a,
  title     = {Path-Gradient Estimators for Continuous Normalizing Flows},
  author    = {Vaitl, Lorenz and Nicoli, Kim Andrea and Nakajima, Shinichi and Kessel, Pan},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {21945--21959},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/vaitl22a/vaitl22a.pdf},
  url       = {https://proceedings.mlr.press/v162/vaitl22a.html},
  abstract  = {Recent work has established a path-gradient estimator for simple variational Gaussian distributions and has argued that the path-gradient is particularly beneficial in the regime in which the variational distribution approaches the exact target distribution. In many applications, this regime can however not be reached by a simple Gaussian variational distribution. In this work, we overcome this crucial limitation by proposing a path-gradient estimator for the considerably more expressive variational family of continuous normalizing flows. We outline an efficient algorithm to calculate this estimator and establish its superior performance empirically.}
}
Endnote
%0 Conference Paper
%T Path-Gradient Estimators for Continuous Normalizing Flows
%A Lorenz Vaitl
%A Kim Andrea Nicoli
%A Shinichi Nakajima
%A Pan Kessel
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-vaitl22a
%I PMLR
%P 21945--21959
%U https://proceedings.mlr.press/v162/vaitl22a.html
%V 162
%X Recent work has established a path-gradient estimator for simple variational Gaussian distributions and has argued that the path-gradient is particularly beneficial in the regime in which the variational distribution approaches the exact target distribution. In many applications, this regime can however not be reached by a simple Gaussian variational distribution. In this work, we overcome this crucial limitation by proposing a path-gradient estimator for the considerably more expressive variational family of continuous normalizing flows. We outline an efficient algorithm to calculate this estimator and establish its superior performance empirically.
APA
Vaitl, L., Nicoli, K.A., Nakajima, S. &amp; Kessel, P. (2022). Path-Gradient Estimators for Continuous Normalizing Flows. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:21945-21959. Available from https://proceedings.mlr.press/v162/vaitl22a.html.

Related Material