Learning to Combine Top-Down and Bottom-Up Signals in Recurrent Neural Networks with Attention over Modules

Sarthak Mittal, Alex Lamb, Anirudh Goyal, Vikram Voleti, Murray Shanahan, Guillaume Lajoie, Michael Mozer, Yoshua Bengio
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6972-6986, 2020.

Abstract

Robust perception relies on both bottom-up and top-down signals. Bottom-up signals consist of what’s directly observed through sensation. Top-down signals consist of beliefs and expectations based on past experience and the current reportable short-term memory, such as how the phrase ‘peanut butter and ...’ will be completed. The optimal combination of bottom-up and top-down information remains an open question, but the manner of combination must be dynamic and both context and task dependent. To effectively utilize the wealth of potential top-down information available, and to prevent the cacophony of intermixed signals in a bidirectional architecture, mechanisms are needed to restrict information flow. We explore deep recurrent neural net architectures in which bottom-up and top-down signals are dynamically combined using attention. Modularity of the architecture further restricts the sharing and communication of information. Together, attention and modularity direct information flow, which leads to reliable performance improvements in perceptual and language tasks, and in particular improves robustness to distractions and noisy data. We demonstrate on a variety of benchmarks in language modeling, sequential image classification, video prediction and reinforcement learning that the bidirectional information flow can improve results over strong baselines.
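To make the mechanism described in the abstract more concrete, the sketch below shows one way attention can decide, per recurrent module, how much to read from the bottom-up input versus a top-down context before updating that module's state. It is an illustrative toy, not the authors' architecture: the class name, GRU cells, dimensions, and single-step structure are all assumptions made for exposition.

```python
import torch
import torch.nn as nn

class TopDownBottomUpCell(nn.Module):
    """Illustrative sketch (not the paper's exact model): each recurrent module
    attends over two candidate signals, a bottom-up input and a top-down context,
    and updates its state from the attention-weighted mixture."""

    def __init__(self, input_dim, context_dim, hidden_dim, n_modules, key_dim=32):
        super().__init__()
        self.n_modules = n_modules
        # One recurrent cell per module.
        self.cells = nn.ModuleList(
            [nn.GRUCell(input_dim, hidden_dim) for _ in range(n_modules)]
        )
        # Project both signals into a shared value space the cells can consume.
        self.val_bu = nn.Linear(input_dim, input_dim)
        self.val_td = nn.Linear(context_dim, input_dim)
        # Queries come from each module's previous state; keys from each signal.
        self.query = nn.Linear(hidden_dim, key_dim)
        self.key_bu = nn.Linear(input_dim, key_dim)
        self.key_td = nn.Linear(context_dim, key_dim)

    def forward(self, x, context, h):
        # x: (B, input_dim) bottom-up; context: (B, context_dim) top-down
        # h: (B, n_modules, hidden_dim) previous module states
        keys = torch.stack([self.key_bu(x), self.key_td(context)], dim=1)  # (B, 2, K)
        vals = torch.stack([self.val_bu(x), self.val_td(context)], dim=1)  # (B, 2, D)
        new_h = []
        for m, cell in enumerate(self.cells):
            q = self.query(h[:, m])                                        # (B, K)
            attn = torch.softmax((keys @ q.unsqueeze(-1)).squeeze(-1), dim=-1)  # (B, 2)
            mixed = (attn.unsqueeze(-1) * vals).sum(dim=1)                 # (B, D)
            new_h.append(cell(mixed, h[:, m]))
        return torch.stack(new_h, dim=1)

# Example usage with dummy tensors
cell = TopDownBottomUpCell(input_dim=16, context_dim=24, hidden_dim=32, n_modules=4)
h0 = torch.zeros(8, 4, 32)
out = cell(torch.randn(8, 16), torch.randn(8, 24), h0)
print(out.shape)  # torch.Size([8, 4, 32])
```

The design choice mirrored from the abstract is the restriction of information flow: each module receives only an attention-weighted mixture of the bottom-up and top-down signals, rather than a raw concatenation of everything available.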

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-mittal20a,
  title     = {Learning to Combine Top-Down and Bottom-Up Signals in Recurrent Neural Networks with Attention over Modules},
  author    = {Mittal, Sarthak and Lamb, Alex and Goyal, Anirudh and Voleti, Vikram and Shanahan, Murray and Lajoie, Guillaume and Mozer, Michael and Bengio, Yoshua},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6972--6986},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/mittal20a/mittal20a.pdf},
  url       = {https://proceedings.mlr.press/v119/mittal20a.html},
  abstract  = {Robust perception relies on both bottom-up and top-down signals. Bottom-up signals consist of what’s directly observed through sensation. Top-down signals consist of beliefs and expectations based on past experience and the current reportable short-term memory, such as how the phrase ‘peanut butter and ...’ will be completed. The optimal combination of bottom-up and top-down information remains an open question, but the manner of combination must be dynamic and both context and task dependent. To effectively utilize the wealth of potential top-down information available, and to prevent the cacophony of intermixed signals in a bidirectional architecture, mechanisms are needed to restrict information flow. We explore deep recurrent neural net architectures in which bottom-up and top-down signals are dynamically combined using attention. Modularity of the architecture further restricts the sharing and communication of information. Together, attention and modularity direct information flow, which leads to reliable performance improvements in perceptual and language tasks, and in particular improves robustness to distractions and noisy data. We demonstrate on a variety of benchmarks in language modeling, sequential image classification, video prediction and reinforcement learning that the \emph{bidirectional} information flow can improve results over strong baselines.}
}
Endnote
%0 Conference Paper
%T Learning to Combine Top-Down and Bottom-Up Signals in Recurrent Neural Networks with Attention over Modules
%A Sarthak Mittal
%A Alex Lamb
%A Anirudh Goyal
%A Vikram Voleti
%A Murray Shanahan
%A Guillaume Lajoie
%A Michael Mozer
%A Yoshua Bengio
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-mittal20a
%I PMLR
%P 6972--6986
%U https://proceedings.mlr.press/v119/mittal20a.html
%V 119
%X Robust perception relies on both bottom-up and top-down signals. Bottom-up signals consist of what’s directly observed through sensation. Top-down signals consist of beliefs and expectations based on past experience and the current reportable short-term memory, such as how the phrase ‘peanut butter and ...’ will be completed. The optimal combination of bottom-up and top-down information remains an open question, but the manner of combination must be dynamic and both context and task dependent. To effectively utilize the wealth of potential top-down information available, and to prevent the cacophony of intermixed signals in a bidirectional architecture, mechanisms are needed to restrict information flow. We explore deep recurrent neural net architectures in which bottom-up and top-down signals are dynamically combined using attention. Modularity of the architecture further restricts the sharing and communication of information. Together, attention and modularity direct information flow, which leads to reliable performance improvements in perceptual and language tasks, and in particular improves robustness to distractions and noisy data. We demonstrate on a variety of benchmarks in language modeling, sequential image classification, video prediction and reinforcement learning that the bidirectional information flow can improve results over strong baselines.
APA
Mittal, S., Lamb, A., Goyal, A., Voleti, V., Shanahan, M., Lajoie, G., Mozer, M., & Bengio, Y. (2020). Learning to Combine Top-Down and Bottom-Up Signals in Recurrent Neural Networks with Attention over Modules. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6972-6986. Available from https://proceedings.mlr.press/v119/mittal20a.html.