Whittle Networks: A Deep Likelihood Model for Time Series

Zhongjie Yu, Fabrizio G Ventola, Kristian Kersting
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:12177-12186, 2021.

Abstract

While probabilistic circuits have been extensively explored for tabular data, less attention has been paid to time series. Here, the goal is to estimate joint densities among entire time series and, in turn, to determine, for instance, conditional independence relations between them. To this end, we propose the first probabilistic circuits (PCs) approach for modeling the joint distribution of multivariate time series, called Whittle sum-product networks (WSPNs). WSPNs leverage the Whittle approximation, casting the likelihood in the frequency domain, and place a complex-valued sum-product network, the most prominent PC, over the frequencies. The conditional independence relations among the time series can then be determined efficiently in the spectral domain. Moreover, WSPNs can naturally be placed into the deep neural learning stack for time series, resulting in Whittle Networks, opening the likelihood toolbox for training deep neural models and inspecting their behaviour. Our experiments show that Whittle Networks can indeed capture complex dependencies between time series and provide a useful measure of uncertainty for neural networks.
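For context, the Whittle approximation referenced in the abstract replaces the exact Gaussian likelihood of a stationary multivariate series with a product over Fourier frequencies. A standard textbook form (not reproduced from the paper; scaling conventions vary) is

    \log p(x_{1:T}) \;\approx\; -\sum_{k} \left[ \log\det S(\omega_k) \;+\; d(\omega_k)^{H}\, S(\omega_k)^{-1}\, d(\omega_k) \right] + \mathrm{const},

where d(\omega_k) is the discrete Fourier transform of the series at Fourier frequency \omega_k and S(\omega_k) is the matrix-valued spectral density. The coefficients d(\omega_k) are complex-valued and approximately independent across frequencies, which is what makes it natural to place a complex-valued sum-product network over them, as the abstract describes.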

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-yu21c,
  title     = {Whittle Networks: A Deep Likelihood Model for Time Series},
  author    = {Yu, Zhongjie and Ventola, Fabrizio G and Kersting, Kristian},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {12177--12186},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/yu21c/yu21c.pdf},
  url       = {https://proceedings.mlr.press/v139/yu21c.html},
  abstract  = {While probabilistic circuits have been extensively explored for tabular data, less attention has been paid to time series. Here, the goal is to estimate joint densities among the entire time series and, in turn, determining, for instance, conditional independence relations between them. To this end, we propose the first probabilistic circuits (PCs) approach for modeling the joint distribution of multivariate time series, called Whittle sum-product networks (WSPNs). WSPNs leverage the Whittle approximation, casting the likelihood in the frequency domain, and place a complex-valued sum-product network, the most prominent PC, over the frequencies. The conditional independence relations among the time series can then be determined efficiently in the spectral domain. Moreover, WSPNs can naturally be placed into the deep neural learning stack for time series, resulting in Whittle Networks, opening the likelihood toolbox for training deep neural models and inspecting their behaviour. Our experiments show that Whittle Networks can indeed capture complex dependencies between time series and provide a useful measure of uncertainty for neural networks.}
}
Endnote
%0 Conference Paper
%T Whittle Networks: A Deep Likelihood Model for Time Series
%A Zhongjie Yu
%A Fabrizio G Ventola
%A Kristian Kersting
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-yu21c
%I PMLR
%P 12177--12186
%U https://proceedings.mlr.press/v139/yu21c.html
%V 139
%X While probabilistic circuits have been extensively explored for tabular data, less attention has been paid to time series. Here, the goal is to estimate joint densities among the entire time series and, in turn, determining, for instance, conditional independence relations between them. To this end, we propose the first probabilistic circuits (PCs) approach for modeling the joint distribution of multivariate time series, called Whittle sum-product networks (WSPNs). WSPNs leverage the Whittle approximation, casting the likelihood in the frequency domain, and place a complex-valued sum-product network, the most prominent PC, over the frequencies. The conditional independence relations among the time series can then be determined efficiently in the spectral domain. Moreover, WSPNs can naturally be placed into the deep neural learning stack for time series, resulting in Whittle Networks, opening the likelihood toolbox for training deep neural models and inspecting their behaviour. Our experiments show that Whittle Networks can indeed capture complex dependencies between time series and provide a useful measure of uncertainty for neural networks.
APA
Yu, Z., Ventola, F.G. & Kersting, K. (2021). Whittle Networks: A Deep Likelihood Model for Time Series. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:12177-12186. Available from https://proceedings.mlr.press/v139/yu21c.html.
