Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings

Jean-Francois Ton, Lucian Chan, Yee Whye Teh, Dino Sejdinovic
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:1099-1107, 2021.

Abstract

Current meta-learning approaches focus on learning functional representations of relationships between variables, \textit{i.e.} estimating conditional expectations in regression. In many applications, however, the conditional distributions cannot be meaningfully summarized solely by expectation (due to \textit{e.g.} multimodality). We introduce a novel technique for meta-learning conditional densities, which combines neural representations and noise contrastive estimation with the well-established literature on conditional mean embeddings into reproducing kernel Hilbert spaces. The method shows significant improvements over standard density estimation methods on synthetic and real-world data by leveraging shared representations across multiple conditional density estimation tasks.
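The noise contrastive estimation (NCE) principle underlying the paper can be illustrated with a minimal sketch: fit a classifier to distinguish real (x, y) pairs from pairs where y is drawn from a known noise density q; the optimal logit then equals log p(y|x) - log q(y), recovering the conditional density up to normalization. The sketch below uses a toy sinusoidal task and hand-crafted polynomial features in place of the paper's learned neural/kernel-mean-embedding representation, so it illustrates the NCE reduction only, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conditional-density task (assumption): y | x ~ N(sin(x), 0.5^2)
n = 2000
x = rng.uniform(-3.0, 3.0, size=n)
y = np.sin(x) + 0.5 * rng.normal(size=n)

# Noise samples from a known, easy-to-sample density q(y) = N(0, 1)
y_noise = rng.normal(size=n)

def features(x, y):
    # Hand-crafted joint features of (x, y); the paper instead learns a
    # representation with neural networks and conditional mean embeddings.
    return np.stack(
        [np.ones_like(y), y, y * y, y * np.sin(x), np.sin(x) ** 2], axis=1
    )

def log_q(y):
    # Log density of the N(0, 1) noise distribution
    return -0.5 * y * y - 0.5 * np.log(2.0 * np.pi)

# NCE reduces density estimation to classification: logistic regression
# separating real pairs (label 1) from noise pairs (label 0). The optimal
# logit equals log p(y|x) - log q(y).
X = np.vstack([features(x, y), features(x, y_noise)])
t = np.concatenate([np.ones(n), np.zeros(n)])

w = np.zeros(X.shape[1])
for _ in range(25):  # Newton (IRLS) steps for the logistic-regression fit
    p = 1.0 / (1.0 + np.exp(-np.clip(X @ w, -30.0, 30.0)))
    grad = X.T @ (t - p)
    hess = X.T @ (X * (p * (1.0 - p))[:, None]) + 1e-6 * np.eye(X.shape[1])
    w += np.linalg.solve(hess, grad)

# Unnormalized conditional log density at x0: logit + log q(y)
x0 = 1.0
grid = np.linspace(-2.0, 2.0, 401)
log_p = features(np.full_like(grid, x0), grid) @ w + log_q(grid)
y_hat = grid[np.argmax(log_p)]  # estimated conditional mode, close to sin(1.0)
```

Because the classifier's logit estimates a log-density ratio rather than a mean, the recovered conditional density can be multimodal, which is exactly the setting where plain conditional-expectation regression fails.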

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-ton21a,
  title = {Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings},
  author = {Ton, Jean-Francois and Chan, Lucian and Teh, Yee Whye and Sejdinovic, Dino},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages = {1099--1107},
  year = {2021},
  editor = {Banerjee, Arindam and Fukumizu, Kenji},
  volume = {130},
  series = {Proceedings of Machine Learning Research},
  month = {13--15 Apr},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v130/ton21a/ton21a.pdf},
  url = {https://proceedings.mlr.press/v130/ton21a.html},
  abstract = {Current meta-learning approaches focus on learning functional representations of relationships between variables, \textit{i.e.} estimating conditional expectations in regression. In many applications, however, the conditional distributions cannot be meaningfully summarized solely by expectation (due to \textit{e.g.} multimodality). We introduce a novel technique for meta-learning conditional densities, which combines neural representations and noise contrastive estimation with the well-established literature on conditional mean embeddings into reproducing kernel Hilbert spaces. The method shows significant improvements over standard density estimation methods on synthetic and real-world data by leveraging shared representations across multiple conditional density estimation tasks.}
}
Endnote
%0 Conference Paper
%T Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings
%A Jean-Francois Ton
%A Lucian Chan
%A Yee Whye Teh
%A Dino Sejdinovic
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-ton21a
%I PMLR
%P 1099--1107
%U https://proceedings.mlr.press/v130/ton21a.html
%V 130
%X Current meta-learning approaches focus on learning functional representations of relationships between variables, \textit{i.e.} estimating conditional expectations in regression. In many applications, however, the conditional distributions cannot be meaningfully summarized solely by expectation (due to \textit{e.g.} multimodality). We introduce a novel technique for meta-learning conditional densities, which combines neural representations and noise contrastive estimation with the well-established literature on conditional mean embeddings into reproducing kernel Hilbert spaces. The method shows significant improvements over standard density estimation methods on synthetic and real-world data by leveraging shared representations across multiple conditional density estimation tasks.
APA
Ton, J.-F., Chan, L., Teh, Y. W., &amp; Sejdinovic, D. (2021). Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:1099-1107. Available from https://proceedings.mlr.press/v130/ton21a.html.
