A New Modeling Framework for Continuous, Sequential Domains

Hailiang Dong, James Amato, Vibhav Gogate, Nicholas Ruozzi
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:11118-11131, 2023.

Abstract

Temporal models such as Dynamic Bayesian Networks (DBNs) and Hidden Markov Models (HMMs) have been widely used to model time-dependent sequential data. Typically, these approaches limit focus to discrete domains, employ first-order Markov and stationary assumptions, and limit representational power so that efficient (approximate) inference procedures can be applied. We propose a novel temporal model for continuous domains, where the transition distribution is conditionally tractable: it is modelled as a tractable continuous density over the variables at the current time slice only, while the parameters are controlled using a Recurrent Neural Network (RNN) that takes all previous observations as input. We show that, in this model, various inference tasks can be efficiently implemented using forward filtering with simple gradient ascent. Our experimental results on two different tasks over several real-world sequential datasets demonstrate the superior performance of our model against existing competitors.
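The core idea in the abstract — a tractable density over the current time slice whose parameters are produced by an RNN conditioned on all previous observations, with inference by forward filtering and gradient ascent — can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a diagonal Gaussian as the tractable density and a hand-rolled Elman RNN; all names (`step`, `gaussian_params`, `map_next`) and parameter shapes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H = 2, 8  # observed variables per time slice, RNN hidden size

# Hypothetical RNN parameters; the RNN summarizes x_{<t} into a hidden
# state that controls the parameters of the current-slice density.
Wx = rng.normal(scale=0.1, size=(H, D))
Wh = rng.normal(scale=0.1, size=(H, H))
Wo = rng.normal(scale=0.1, size=(2 * D, H))  # output -> [mu, log_sigma]

def step(h, x_prev):
    """Advance the RNN state on the observation just seen."""
    return np.tanh(Wx @ x_prev + Wh @ h)

def gaussian_params(h):
    """Map the hidden state to a diagonal Gaussian over the current slice."""
    out = Wo @ h
    return out[:D], np.exp(out[D:])  # mu, sigma

def log_density(x, mu, sigma):
    return -0.5 * np.sum(((x - mu) / sigma) ** 2
                         + 2.0 * np.log(sigma) + np.log(2.0 * np.pi))

def sequence_log_likelihood(xs):
    """Forward filtering: log p(x_1..x_T) = sum_t log p(x_t | x_{<t})."""
    h, ll = np.zeros(H), 0.0
    for x in xs:
        mu, sigma = gaussian_params(h)
        ll += log_density(x, mu, sigma)   # conditionally tractable slice density
        h = step(h, x)                    # fold the observation into the state
    return ll

def map_next(h, iters=100, lr=0.1):
    """Most likely next slice via simple gradient ascent on log p(x | x_{<t})."""
    mu, sigma = gaussian_params(h)
    x = np.zeros(D)
    for _ in range(iters):
        x += lr * (mu - x) / sigma**2  # gradient of the Gaussian log-density
    return x

xs = rng.normal(size=(5, D))
ll = sequence_log_likelihood(xs)
```

For a diagonal Gaussian the gradient ascent in `map_next` simply converges to the mean; the point of the scheme is that the same filtering-plus-ascent recipe applies when the slice density is a richer (but still tractable) continuous model.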

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-dong23a,
  title     = {A New Modeling Framework for Continuous, Sequential Domains},
  author    = {Dong, Hailiang and Amato, James and Gogate, Vibhav and Ruozzi, Nicholas},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {11118--11131},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/dong23a/dong23a.pdf},
  url       = {https://proceedings.mlr.press/v206/dong23a.html},
  abstract  = {Temporal models such as Dynamic Bayesian Networks (DBNs) and Hidden Markov Models (HMMs) have been widely used to model time-dependent sequential data. Typically, these approaches limit focus to discrete domains, employ first-order Markov and stationary assumptions, and limit representational power so that efficient (approximate) inference procedures can be applied. We propose a novel temporal model for continuous domains, where the transition distribution is conditionally tractable: it is modelled as a tractable continuous density over the variables at the current time slice only, while the parameters are controlled using a Recurrent Neural Network (RNN) that takes all previous observations as input. We show that, in this model, various inference tasks can be efficiently implemented using forward filtering with simple gradient ascent. Our experimental results on two different tasks over several real-world sequential datasets demonstrate the superior performance of our model against existing competitors.}
}
Endnote
%0 Conference Paper
%T A New Modeling Framework for Continuous, Sequential Domains
%A Hailiang Dong
%A James Amato
%A Vibhav Gogate
%A Nicholas Ruozzi
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-dong23a
%I PMLR
%P 11118--11131
%U https://proceedings.mlr.press/v206/dong23a.html
%V 206
%X Temporal models such as Dynamic Bayesian Networks (DBNs) and Hidden Markov Models (HMMs) have been widely used to model time-dependent sequential data. Typically, these approaches limit focus to discrete domains, employ first-order Markov and stationary assumptions, and limit representational power so that efficient (approximate) inference procedures can be applied. We propose a novel temporal model for continuous domains, where the transition distribution is conditionally tractable: it is modelled as a tractable continuous density over the variables at the current time slice only, while the parameters are controlled using a Recurrent Neural Network (RNN) that takes all previous observations as input. We show that, in this model, various inference tasks can be efficiently implemented using forward filtering with simple gradient ascent. Our experimental results on two different tasks over several real-world sequential datasets demonstrate the superior performance of our model against existing competitors.
APA
Dong, H., Amato, J., Gogate, V. & Ruozzi, N. (2023). A New Modeling Framework for Continuous, Sequential Domains. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:11118-11131. Available from https://proceedings.mlr.press/v206/dong23a.html.