Conditional Temporal Neural Processes with Covariance Loss

Boseon Yoo, Jiwoo Lee, Janghoon Ju, Seijun Chung, Soyeon Kim, Jaesik Choi
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:12051-12061, 2021.

Abstract

We introduce a novel loss function, Covariance Loss, which is conceptually equivalent to conditional neural processes and takes the form of a regularization term, so it is applicable to many kinds of neural networks. With the proposed loss, mappings from input variables to target variables are shaped not only by the mean activations and the mean dependencies between input and target variables, but also by the dependencies among the target variables themselves. This property makes the resulting neural networks more robust to noisy observations and able to recapture missing dependencies from prior information. To demonstrate the validity of the proposed loss, we conduct extensive experiments on real-world datasets with state-of-the-art models and discuss the benefits and drawbacks of the proposed Covariance Loss.
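
The abstract does not spell out the formula for the Covariance Loss, so the following is only a minimal sketch of one plausible reading: a regularizer that pushes the Gram (covariance-like) matrix of a network's learned representations toward the corresponding matrix of the target variables, added on top of an ordinary prediction loss. The function name covariance_loss, the centering step, the weight lambda_cov, and the PyTorch framing are all illustrative assumptions, not the paper's definition.

    import torch

    def covariance_loss(features: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # features: (batch, d) learned representations; targets: (batch, k) target variables.
        # Center both so the batch Gram matrices approximate covariance structure.
        f = features - features.mean(dim=0, keepdim=True)
        y = targets - targets.mean(dim=0, keepdim=True)
        gram_f = f @ f.t() / f.shape[1]   # (batch, batch) similarity among samples in feature space
        gram_y = y @ y.t() / y.shape[1]   # (batch, batch) similarity among samples in target space
        # Penalize the mismatch between the two dependency structures (Frobenius-style).
        return ((gram_f - gram_y) ** 2).mean()

    # Illustrative usage; `model`, `x`, `y`, and `lambda_cov` are placeholders:
    # features = model.encode(x)
    # preds = model.head(features)
    # loss = torch.nn.functional.mse_loss(preds, y) + lambda_cov * covariance_loss(features, y)

Under this reading, the regularizer is what lets the dependencies among target variables influence the learned mapping, matching the behavior the abstract attributes to the loss.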

Cite this Paper

BibTeX
@InProceedings{pmlr-v139-yoo21b,
  title     = {Conditional Temporal Neural Processes with Covariance Loss},
  author    = {Yoo, Boseon and Lee, Jiwoo and Ju, Janghoon and Chung, Seijun and Kim, Soyeon and Choi, Jaesik},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {12051--12061},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/yoo21b/yoo21b.pdf},
  url       = {https://proceedings.mlr.press/v139/yoo21b.html}
}
Endnote
%0 Conference Paper
%T Conditional Temporal Neural Processes with Covariance Loss
%A Boseon Yoo
%A Jiwoo Lee
%A Janghoon Ju
%A Seijun Chung
%A Soyeon Kim
%A Jaesik Choi
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-yoo21b
%I PMLR
%P 12051--12061
%U https://proceedings.mlr.press/v139/yoo21b.html
%V 139
APA
Yoo, B., Lee, J., Ju, J., Chung, S., Kim, S. & Choi, J. (2021). Conditional Temporal Neural Processes with Covariance Loss. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:12051-12061. Available from https://proceedings.mlr.press/v139/yoo21b.html.