Diffusion Generative Models in Infinite Dimensions

Gavin Kerrigan, Justin Ley, Padhraic Smyth
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:9538-9563, 2023.

Abstract

Diffusion generative models have recently been applied to domains where the available data can be seen as a discretization of an underlying function, such as audio signals or time series. However, these models operate directly on the discretized data, and there are no semantics in the modeling process that relate the observed data to the underlying functional forms. We generalize diffusion models to operate directly in function space by developing the foundational theory for such models in terms of Gaussian measures on Hilbert spaces. A significant benefit of our function space point of view is that it allows us to explicitly specify the space of functions we are working in, leading us to develop methods for diffusion generative modeling in Sobolev spaces. Our approach allows us to perform both unconditional and conditional generation of function-valued data. We demonstrate our methods on several synthetic and real-world benchmarks.

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-kerrigan23a,
  title = {Diffusion Generative Models in Infinite Dimensions},
  author = {Kerrigan, Gavin and Ley, Justin and Smyth, Padhraic},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages = {9538--9563},
  year = {2023},
  editor = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume = {206},
  series = {Proceedings of Machine Learning Research},
  month = {25--27 Apr},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v206/kerrigan23a/kerrigan23a.pdf},
  url = {https://proceedings.mlr.press/v206/kerrigan23a.html},
  abstract = {Diffusion generative models have recently been applied to domains where the available data can be seen as a discretization of an underlying function, such as audio signals or time series. However, these models operate directly on the discretized data, and there are no semantics in the modeling process that relate the observed data to the underlying functional forms. We generalize diffusion models to operate directly in function space by developing the foundational theory for such models in terms of Gaussian measures on Hilbert spaces. A significant benefit of our function space point of view is that it allows us to explicitly specify the space of functions we are working in, leading us to develop methods for diffusion generative modeling in Sobolev spaces. Our approach allows us to perform both unconditional and conditional generation of function-valued data. We demonstrate our methods on several synthetic and real-world benchmarks.}
}
Endnote
%0 Conference Paper
%T Diffusion Generative Models in Infinite Dimensions
%A Gavin Kerrigan
%A Justin Ley
%A Padhraic Smyth
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-kerrigan23a
%I PMLR
%P 9538--9563
%U https://proceedings.mlr.press/v206/kerrigan23a.html
%V 206
%X Diffusion generative models have recently been applied to domains where the available data can be seen as a discretization of an underlying function, such as audio signals or time series. However, these models operate directly on the discretized data, and there are no semantics in the modeling process that relate the observed data to the underlying functional forms. We generalize diffusion models to operate directly in function space by developing the foundational theory for such models in terms of Gaussian measures on Hilbert spaces. A significant benefit of our function space point of view is that it allows us to explicitly specify the space of functions we are working in, leading us to develop methods for diffusion generative modeling in Sobolev spaces. Our approach allows us to perform both unconditional and conditional generation of function-valued data. We demonstrate our methods on several synthetic and real-world benchmarks.
APA
Kerrigan, G., Ley, J. & Smyth, P. (2023). Diffusion Generative Models in Infinite Dimensions. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:9538-9563. Available from https://proceedings.mlr.press/v206/kerrigan23a.html.