Recurrent Neural Networks and Universal Approximation of Bayesian Filters

Adrian N. Bishop, Edwin V. Bonilla
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:6956-6967, 2023.

Abstract

We consider the Bayesian optimal filtering problem: i.e. estimating some conditional statistics of a latent time-series signal from an observation sequence. Classical approaches often rely on the use of assumed or estimated transition and observation models. Instead, we formulate a generic recurrent neural network framework and seek to learn directly a recursive mapping from observational inputs to the desired estimator statistics. The main focus of this article is the approximation capabilities of this framework. We provide approximation error bounds for filtering in general non-compact domains. We also consider strong time-uniform approximation error bounds that guarantee good long-time performance. We discuss and illustrate a number of practical concerns and implications of these results.
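The abstract's core idea, learning a recursive mapping from observations directly to estimator statistics, can be sketched minimally. The following is an illustrative toy only, not the paper's architecture: a hand-rolled tanh recurrent cell (all dimensions, weights, and the cell form are assumptions for demonstration) that consumes an observation sequence step by step and emits a per-step estimate of the latent signal.

```python
import numpy as np

# Illustrative sketch of a recursive filter map: h_t = f(h_{t-1}, y_t),
# estimate_t = g(h_t). The tanh cell and all sizes are assumptions,
# not the architecture studied in the paper.

rng = np.random.default_rng(0)

dim_y, dim_h, dim_x = 1, 8, 1  # observation, hidden, estimate dims
W_h = rng.normal(scale=0.3, size=(dim_h, dim_h))  # recurrent weights
W_y = rng.normal(scale=0.3, size=(dim_h, dim_y))  # input weights
W_o = rng.normal(scale=0.3, size=(dim_x, dim_h))  # readout weights

def rnn_filter(ys):
    """Recursively map an observation sequence to per-step estimates."""
    h = np.zeros(dim_h)
    estimates = []
    for y in ys:
        h = np.tanh(W_h @ h + W_y @ y)  # recursive hidden-state update
        estimates.append(W_o @ h)       # estimate of the latent signal
    return np.array(estimates)

# One synthetic observation sequence of length 20.
ys = rng.normal(size=(20, dim_y))
est = rnn_filter(ys)
print(est.shape)  # one estimate per observation
```

In practice the weights would be trained (e.g. by minimizing squared error against ground-truth states on simulated trajectories), so that the learned recursion approximates the Bayesian filter without an explicit transition or observation model.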

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-bishop23a,
  title     = {Recurrent Neural Networks and Universal Approximation of Bayesian Filters},
  author    = {Bishop, Adrian N. and Bonilla, Edwin V.},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {6956--6967},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/bishop23a/bishop23a.pdf},
  url       = {https://proceedings.mlr.press/v206/bishop23a.html},
  abstract  = {We consider the Bayesian optimal filtering problem: i.e. estimating some conditional statistics of a latent time-series signal from an observation sequence. Classical approaches often rely on the use of assumed or estimated transition and observation models. Instead, we formulate a generic recurrent neural network framework and seek to learn directly a recursive mapping from observational inputs to the desired estimator statistics. The main focus of this article is the approximation capabilities of this framework. We provide approximation error bounds for filtering in general non-compact domains. We also consider strong time-uniform approximation error bounds that guarantee good long-time performance. We discuss and illustrate a number of practical concerns and implications of these results.}
}
Endnote
%0 Conference Paper
%T Recurrent Neural Networks and Universal Approximation of Bayesian Filters
%A Adrian N. Bishop
%A Edwin V. Bonilla
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-bishop23a
%I PMLR
%P 6956--6967
%U https://proceedings.mlr.press/v206/bishop23a.html
%V 206
%X We consider the Bayesian optimal filtering problem: i.e. estimating some conditional statistics of a latent time-series signal from an observation sequence. Classical approaches often rely on the use of assumed or estimated transition and observation models. Instead, we formulate a generic recurrent neural network framework and seek to learn directly a recursive mapping from observational inputs to the desired estimator statistics. The main focus of this article is the approximation capabilities of this framework. We provide approximation error bounds for filtering in general non-compact domains. We also consider strong time-uniform approximation error bounds that guarantee good long-time performance. We discuss and illustrate a number of practical concerns and implications of these results.
APA
Bishop, A.N. & Bonilla, E.V. (2023). Recurrent Neural Networks and Universal Approximation of Bayesian Filters. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:6956-6967. Available from https://proceedings.mlr.press/v206/bishop23a.html.