Reconstructing Nonlinear Dynamical Systems from Multi-Modal Time Series

Daniel Kramer, Philine L Bommer, Carlo Tombolini, Georgia Koppe, Daniel Durstewitz
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:11613-11633, 2022.

Abstract

Empirically observed time series in physics, biology, or medicine are commonly generated by some underlying dynamical system (DS) which is the target of scientific interest. There is an increasing interest in harnessing machine learning methods to reconstruct this latent DS in a data-driven, unsupervised way. In many areas of science it is common to sample time series observations from many data modalities simultaneously, e.g. electrophysiological and behavioral time series in a typical neuroscience experiment. However, current machine learning tools for reconstructing DSs usually focus on just one data modality. Here we propose a general framework for multi-modal data integration for the purpose of nonlinear DS reconstruction and the analysis of cross-modal relations. This framework is based on dynamically interpretable recurrent neural networks as general approximators of nonlinear DSs, coupled to sets of modality-specific decoder models from the class of generalized linear models. Both an expectation-maximization and a variational inference algorithm for model training are advanced and compared. We show on nonlinear DS benchmarks that our algorithms can efficiently compensate for overly noisy or missing information in one data channel by exploiting other channels, and demonstrate on experimental neuroscience data how the algorithm learns to link different data domains to the underlying dynamics.

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-kramer22a,
  title = {Reconstructing Nonlinear Dynamical Systems from Multi-Modal Time Series},
  author = {Kramer, Daniel and Bommer, Philine L and Tombolini, Carlo and Koppe, Georgia and Durstewitz, Daniel},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages = {11613--11633},
  year = {2022},
  editor = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume = {162},
  series = {Proceedings of Machine Learning Research},
  month = {17--23 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v162/kramer22a/kramer22a.pdf},
  url = {https://proceedings.mlr.press/v162/kramer22a.html},
  abstract = {Empirically observed time series in physics, biology, or medicine, are commonly generated by some underlying dynamical system (DS) which is the target of scientific interest. There is an increasing interest to harvest machine learning methods to reconstruct this latent DS in a data-driven, unsupervised way. In many areas of science it is common to sample time series observations from many data modalities simultaneously, e.g. electrophysiological and behavioral time series in a typical neuroscience experiment. However, current machine learning tools for reconstructing DSs usually focus on just one data modality. Here we propose a general framework for multi-modal data integration for the purpose of nonlinear DS reconstruction and the analysis of cross-modal relations. This framework is based on dynamically interpretable recurrent neural networks as general approximators of nonlinear DSs, coupled to sets of modality-specific decoder models from the class of generalized linear models. Both an expectation-maximization and a variational inference algorithm for model training are advanced and compared. We show on nonlinear DS benchmarks that our algorithms can efficiently compensate for too noisy or missing information in one data channel by exploiting other channels, and demonstrate on experimental neuroscience data how the algorithm learns to link different data domains to the underlying dynamics.}
}
Endnote
%0 Conference Paper
%T Reconstructing Nonlinear Dynamical Systems from Multi-Modal Time Series
%A Daniel Kramer
%A Philine L Bommer
%A Carlo Tombolini
%A Georgia Koppe
%A Daniel Durstewitz
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-kramer22a
%I PMLR
%P 11613--11633
%U https://proceedings.mlr.press/v162/kramer22a.html
%V 162
%X Empirically observed time series in physics, biology, or medicine, are commonly generated by some underlying dynamical system (DS) which is the target of scientific interest. There is an increasing interest to harvest machine learning methods to reconstruct this latent DS in a data-driven, unsupervised way. In many areas of science it is common to sample time series observations from many data modalities simultaneously, e.g. electrophysiological and behavioral time series in a typical neuroscience experiment. However, current machine learning tools for reconstructing DSs usually focus on just one data modality. Here we propose a general framework for multi-modal data integration for the purpose of nonlinear DS reconstruction and the analysis of cross-modal relations. This framework is based on dynamically interpretable recurrent neural networks as general approximators of nonlinear DSs, coupled to sets of modality-specific decoder models from the class of generalized linear models. Both an expectation-maximization and a variational inference algorithm for model training are advanced and compared. We show on nonlinear DS benchmarks that our algorithms can efficiently compensate for too noisy or missing information in one data channel by exploiting other channels, and demonstrate on experimental neuroscience data how the algorithm learns to link different data domains to the underlying dynamics.
APA
Kramer, D., Bommer, P.L., Tombolini, C., Koppe, G. & Durstewitz, D. (2022). Reconstructing Nonlinear Dynamical Systems from Multi-Modal Time Series. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:11613-11633. Available from https://proceedings.mlr.press/v162/kramer22a.html.