Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows

Rob Cornish, Anthony Caterini, George Deligiannidis, Arnaud Doucet
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:2133-2143, 2020.

Abstract

We show that normalising flows become pathological when used to model targets whose supports have complicated topologies. In this scenario, we prove that a flow must become arbitrarily numerically noninvertible in order to approximate the target closely. This result has implications for all flow-based models, and especially residual flows (ResFlows), which explicitly control the Lipschitz constant of the bijection used. To address this, we propose continuously indexed flows (CIFs), which replace the single bijection used by normalising flows with a continuously indexed family of bijections, and which can intuitively "clean up" mass that would otherwise be misplaced by a single bijection. We show theoretically that CIFs are not subject to the same topological limitations as normalising flows, and obtain better empirical performance on a variety of models and benchmarks.
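The abstract's central construction — replacing the single bijection of a normalising flow with a continuously indexed family of bijections — can be illustrated in the generative direction with a minimal sketch. Everything below (the names `bijection` and `cif_sample`, and the simple affine form of the indexed map) is an illustrative assumption, not the paper's actual architecture; note also that evaluating the model's marginal density requires integrating out the index, which this sketch does not attempt.

```python
# Minimal sketch of a CIF-style sampling step (illustrative, not the
# paper's architecture): each sample gets its own continuous index u,
# and u selects which member of a family of bijections is applied.
import numpy as np

rng = np.random.default_rng(0)

def bijection(x, u):
    """Affine map x -> scale(u) * x + shift(u). For any fixed index u this
    is invertible in x; the family as a whole is not a single bijection."""
    scale = np.exp(0.5 * np.tanh(u))  # strictly positive, so invertible in x
    shift = u
    return scale * x + shift

def cif_sample(n):
    """Generative direction: draw base noise z and a continuous index u,
    then push z through the u-indexed bijection."""
    z = rng.standard_normal(n)  # sample from the base distribution
    u = rng.standard_normal(n)  # continuous index, one per sample
    return bijection(z, u)

samples = cif_sample(1000)
```

Because different samples pass through different bijections, the pushforward distribution is a continuous mixture rather than the image of the base under one map, which is why such a model can escape the topological constraints a single bijection imposes.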

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-cornish20a,
  title     = {Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows},
  author    = {Cornish, Rob and Caterini, Anthony and Deligiannidis, George and Doucet, Arnaud},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {2133--2143},
  year      = {2020},
  editor    = {Hal Daumé III and Aarti Singh},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/cornish20a/cornish20a.pdf},
  url       = {http://proceedings.mlr.press/v119/cornish20a.html},
  abstract  = {We show that normalising flows become pathological when used to model targets whose supports have complicated topologies. In this scenario, we prove that a flow must become arbitrarily numerically noninvertible in order to approximate the target closely. This result has implications for all flow-based models, and especially residual flows (ResFlows), which explicitly control the Lipschitz constant of the bijection used. To address this, we propose continuously indexed flows (CIFs), which replace the single bijection used by normalising flows with a continuously indexed family of bijections, and which can intuitively "clean up" mass that would otherwise be misplaced by a single bijection. We show theoretically that CIFs are not subject to the same topological limitations as normalising flows, and obtain better empirical performance on a variety of models and benchmarks.}
}
Endnote
%0 Conference Paper
%T Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows
%A Rob Cornish
%A Anthony Caterini
%A George Deligiannidis
%A Arnaud Doucet
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-cornish20a
%I PMLR
%P 2133--2143
%U http://proceedings.mlr.press/v119/cornish20a.html
%V 119
%X We show that normalising flows become pathological when used to model targets whose supports have complicated topologies. In this scenario, we prove that a flow must become arbitrarily numerically noninvertible in order to approximate the target closely. This result has implications for all flow-based models, and especially residual flows (ResFlows), which explicitly control the Lipschitz constant of the bijection used. To address this, we propose continuously indexed flows (CIFs), which replace the single bijection used by normalising flows with a continuously indexed family of bijections, and which can intuitively "clean up" mass that would otherwise be misplaced by a single bijection. We show theoretically that CIFs are not subject to the same topological limitations as normalising flows, and obtain better empirical performance on a variety of models and benchmarks.
APA
Cornish, R., Caterini, A., Deligiannidis, G. & Doucet, A. (2020). Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:2133-2143. Available from http://proceedings.mlr.press/v119/cornish20a.html.