Recurrent Model-Free RL Can Be a Strong Baseline for Many POMDPs

Tianwei Ni, Benjamin Eysenbach, Ruslan Salakhutdinov
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:16691-16723, 2022.

Abstract

Many problems in RL, such as meta-RL, robust RL, generalization in RL, and temporal credit assignment, can be cast as POMDPs. In theory, simply augmenting model-free RL with memory-based architectures, such as recurrent neural networks (RNNs), provides a general approach to solving all types of POMDPs. However, prior work has found that such recurrent model-free RL methods tend to perform worse than more specialized algorithms designed for specific types of POMDPs. This paper revisits this claim. We find that careful architecture and hyperparameter decisions can often yield a recurrent model-free implementation that performs on par with (and occasionally substantially better than) more sophisticated recent techniques. We evaluate on 21 environments drawn from 6 prior specialized methods and find that our implementation achieves greater sample efficiency and asymptotic performance than those methods on 18/21 environments. We also release a simple and efficient implementation of recurrent model-free RL for future work to use as a baseline for POMDPs.
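To make the recipe concrete, the following is a minimal sketch (in PyTorch; all class and variable names here are hypothetical, chosen for illustration) of the kind of recurrent actor the abstract describes: an RNN over the observation-action history whose hidden state stands in for the unobserved Markov state, followed by a standard model-free policy head. This is not the authors' released implementation; see the code linked from the paper page for that.

import torch
import torch.nn as nn

class RecurrentActor(nn.Module):
    """GRU over the (observation, previous action) history, then a policy head."""
    def __init__(self, obs_dim: int, act_dim: int, hidden_dim: int = 128):
        super().__init__()
        # Embed each (observation, previous action) pair before the RNN.
        self.embed = nn.Linear(obs_dim + act_dim, hidden_dim)
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.policy_head = nn.Linear(hidden_dim, act_dim)

    def forward(self, obs_seq, prev_act_seq, h0=None):
        # obs_seq: (batch, time, obs_dim); prev_act_seq: (batch, time, act_dim)
        x = torch.relu(self.embed(torch.cat([obs_seq, prev_act_seq], dim=-1)))
        out, hT = self.gru(x, h0)  # out: (batch, time, hidden_dim)
        # The recurrent state summarizes the history, serving as a surrogate for
        # the hidden state; any model-free algorithm (e.g., SAC or TD3) can then
        # train actor and critic heads on top of it.
        return torch.tanh(self.policy_head(out)), hT

# Example rollout over a 10-step history (17-dim observations, 6-dim actions).
actor = RecurrentActor(obs_dim=17, act_dim=6)
obs = torch.randn(1, 10, 17)
prev_act = torch.zeros(1, 10, 6)   # zero "previous action" at t = 0
actions, h = actor(obs, prev_act)  # actions: (1, 10, 6)

The point of the sketch is that no POMDP-specific machinery is required: the memory architecture is the only change relative to a standard model-free agent.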

Cite this Paper

BibTeX
@InProceedings{pmlr-v162-ni22a,
  title     = {Recurrent Model-Free {RL} Can Be a Strong Baseline for Many {POMDP}s},
  author    = {Ni, Tianwei and Eysenbach, Benjamin and Salakhutdinov, Ruslan},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {16691--16723},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/ni22a/ni22a.pdf},
  url       = {https://proceedings.mlr.press/v162/ni22a.html}
}
Endnote
%0 Conference Paper
%T Recurrent Model-Free RL Can Be a Strong Baseline for Many POMDPs
%A Tianwei Ni
%A Benjamin Eysenbach
%A Ruslan Salakhutdinov
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-ni22a
%I PMLR
%P 16691--16723
%U https://proceedings.mlr.press/v162/ni22a.html
%V 162
APA
Ni, T., Eysenbach, B. & Salakhutdinov, R. (2022). Recurrent Model-Free RL Can Be a Strong Baseline for Many POMDPs. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:16691-16723. Available from https://proceedings.mlr.press/v162/ni22a.html.