Forward Operator Estimation in Generative Models with Kernel Transfer Operators

Zhichun Huang, Rudrasis Chakraborty, Vikas Singh
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:9148-9172, 2022.

Abstract

Generative models which use explicit density modeling (e.g., variational autoencoders, flow-based generative models) involve finding a mapping from a known distribution, e.g. Gaussian, to the unknown input distribution. This often requires searching over a class of non-linear functions (e.g., representable by a deep neural network). While effective in practice, the associated runtime/memory costs can increase rapidly, usually as a function of the performance desired in an application. We propose a substantially cheaper (and simpler) forward operator estimation strategy based on adapting known results on kernel transfer operators. We show that our formulation enables highly efficient distribution approximation and sampling, and offers surprisingly good empirical performance that compares favorably with powerful baselines, but with significant runtime savings. We show that the algorithm also performs well in small sample size settings (in brain imaging).
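To make the general idea concrete (this is an illustrative sketch of the kernel conditional-mean-embedding / transfer-operator viewpoint in 1-D, not the authors' actual algorithm): given samples from a known Gaussian and from the data distribution, an empirical kernel transfer operator can be estimated with nothing more than Gram matrices and a regularized linear solve, and fresh Gaussian draws can then be pushed forward through it. The helper `rbf_gram`, the pairing-by-sorting trick (a monotone coupling, valid in 1-D), and the regularizer `lam` are assumptions made for this toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_gram(A, B, sigma=1.0):
    # Gaussian RBF Gram matrix between row-sample sets A (m, d) and B (n, d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

n = 200
# Source samples z_i from the known Gaussian; target samples x_i from a
# bimodal "data" distribution. Sorting both gives a monotone 1-D pairing.
Z = np.sort(rng.normal(size=(n, 1)), axis=0)
X = np.sort(np.sign(rng.normal(size=(n, 1))) + 0.3 * rng.normal(size=(n, 1)), axis=0)

# Empirical transfer weights: regularized inverse of the source Gram matrix
# (the kernel-ridge / conditional-mean-embedding estimator).
Gz = rbf_gram(Z, Z)
lam = 1e-3
W = np.linalg.solve(Gz + n * lam * np.eye(n), np.eye(n))

# Push fresh Gaussian draws forward: kernel weights over the source samples
# yield a weighted combination of the paired target samples (a crude pre-image).
z_new = rng.normal(size=(10, 1))
alpha = rbf_gram(z_new, Z) @ W   # (10, n) transfer weights
x_gen = alpha @ X                # mapped samples in data space
```

The entire "training" step is one `n x n` linear solve, which is the kind of runtime/memory saving the abstract contrasts against fitting a deep network; the paper's actual construction handles high-dimensional data rather than this 1-D toy.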

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-huang22b,
  title =     {Forward Operator Estimation in Generative Models with Kernel Transfer Operators},
  author =    {Huang, Zhichun and Chakraborty, Rudrasis and Singh, Vikas},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages =     {9148--9172},
  year =      {2022},
  editor =    {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume =    {162},
  series =    {Proceedings of Machine Learning Research},
  month =     {17--23 Jul},
  publisher = {PMLR},
  pdf =       {https://proceedings.mlr.press/v162/huang22b/huang22b.pdf},
  url =       {https://proceedings.mlr.press/v162/huang22b.html},
  abstract =  {Generative models which use explicit density modeling (e.g., variational autoencoders, flow-based generative models) involve finding a mapping from a known distribution, e.g. Gaussian, to the unknown input distribution. This often requires searching over a class of non-linear functions (e.g., representable by a deep neural network). While effective in practice, the associated runtime/memory costs can increase rapidly, usually as a function of the performance desired in an application. We propose a substantially cheaper (and simpler) forward operator estimation strategy based on adapting known results on kernel transfer operators. We show that our formulation enables highly efficient distribution approximation and sampling, and offers surprisingly good empirical performance that compares favorably with powerful baselines, but with significant runtime savings. We show that the algorithm also performs well in small sample size settings (in brain imaging).}
}
Endnote
%0 Conference Paper
%T Forward Operator Estimation in Generative Models with Kernel Transfer Operators
%A Zhichun Huang
%A Rudrasis Chakraborty
%A Vikas Singh
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-huang22b
%I PMLR
%P 9148--9172
%U https://proceedings.mlr.press/v162/huang22b.html
%V 162
%X Generative models which use explicit density modeling (e.g., variational autoencoders, flow-based generative models) involve finding a mapping from a known distribution, e.g. Gaussian, to the unknown input distribution. This often requires searching over a class of non-linear functions (e.g., representable by a deep neural network). While effective in practice, the associated runtime/memory costs can increase rapidly, usually as a function of the performance desired in an application. We propose a substantially cheaper (and simpler) forward operator estimation strategy based on adapting known results on kernel transfer operators. We show that our formulation enables highly efficient distribution approximation and sampling, and offers surprisingly good empirical performance that compares favorably with powerful baselines, but with significant runtime savings. We show that the algorithm also performs well in small sample size settings (in brain imaging).
APA
Huang, Z., Chakraborty, R., & Singh, V. (2022). Forward Operator Estimation in Generative Models with Kernel Transfer Operators. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:9148-9172. Available from https://proceedings.mlr.press/v162/huang22b.html.