Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series

Abdul Fatir Ansari, Alvin Heng, Andre Lim, Harold Soh
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:926-951, 2023.

Abstract

Learning accurate predictive models of real-world dynamic phenomena (e.g., climate, biological) remains a challenging task. One key issue is that the data generated by both natural and artificial processes often comprise time series that are irregularly sampled and/or contain missing observations. In this work, we propose the Neural Continuous-Discrete State Space Model (NCDSSM) for continuous-time modeling of time series through discrete-time observations. NCDSSM employs auxiliary variables to disentangle recognition from dynamics, thus requiring amortized inference only for the auxiliary variables. Leveraging techniques from continuous-discrete filtering theory, we demonstrate how to perform accurate Bayesian inference for the dynamic states. We propose three flexible parameterizations of the latent dynamics and an efficient training objective that marginalizes the dynamic states during inference. Empirical results on multiple benchmark datasets across various domains show improved imputation and forecasting performance of NCDSSM over existing models.

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-ansari23a,
  title     = {Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series},
  author    = {Ansari, Abdul Fatir and Heng, Alvin and Lim, Andre and Soh, Harold},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {926--951},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/ansari23a/ansari23a.pdf},
  url       = {https://proceedings.mlr.press/v202/ansari23a.html},
  abstract  = {Learning accurate predictive models of real-world dynamic phenomena (e.g., climate, biological) remains a challenging task. One key issue is that the data generated by both natural and artificial processes often comprise time series that are irregularly sampled and/or contain missing observations. In this work, we propose the Neural Continuous-Discrete State Space Model (NCDSSM) for continuous-time modeling of time series through discrete-time observations. NCDSSM employs auxiliary variables to disentangle recognition from dynamics, thus requiring amortized inference only for the auxiliary variables. Leveraging techniques from continuous-discrete filtering theory, we demonstrate how to perform accurate Bayesian inference for the dynamic states. We propose three flexible parameterizations of the latent dynamics and an efficient training objective that marginalizes the dynamic states during inference. Empirical results on multiple benchmark datasets across various domains show improved imputation and forecasting performance of NCDSSM over existing models.}
}
Endnote
%0 Conference Paper
%T Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series
%A Abdul Fatir Ansari
%A Alvin Heng
%A Andre Lim
%A Harold Soh
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-ansari23a
%I PMLR
%P 926--951
%U https://proceedings.mlr.press/v202/ansari23a.html
%V 202
%X Learning accurate predictive models of real-world dynamic phenomena (e.g., climate, biological) remains a challenging task. One key issue is that the data generated by both natural and artificial processes often comprise time series that are irregularly sampled and/or contain missing observations. In this work, we propose the Neural Continuous-Discrete State Space Model (NCDSSM) for continuous-time modeling of time series through discrete-time observations. NCDSSM employs auxiliary variables to disentangle recognition from dynamics, thus requiring amortized inference only for the auxiliary variables. Leveraging techniques from continuous-discrete filtering theory, we demonstrate how to perform accurate Bayesian inference for the dynamic states. We propose three flexible parameterizations of the latent dynamics and an efficient training objective that marginalizes the dynamic states during inference. Empirical results on multiple benchmark datasets across various domains show improved imputation and forecasting performance of NCDSSM over existing models.
APA
Ansari, A.F., Heng, A., Lim, A. & Soh, H. (2023). Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:926-951. Available from https://proceedings.mlr.press/v202/ansari23a.html.