Approximating Stacked and Bidirectional Recurrent Architectures with the Delayed Recurrent Neural Network

Javier Turek, Shailee Jain, Vy Vo, Mihai Capotă, Alexander Huth, Theodore Willke
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:9648-9658, 2020.

Abstract

Recent work has shown that topological enhancements to recurrent neural networks (RNNs) can increase their expressiveness and representational capacity. Two popular enhancements are stacked RNNs, which increase the capacity for learning non-linear functions, and bidirectional processing, which exploits acausal information in a sequence. In this work, we explore the delayed-RNN, a single-layer RNN with a delay between its input and output. We prove that a weight-constrained version of the delayed-RNN is equivalent to a stacked RNN. We also show that the delay gives rise to partial acausality, much like bidirectional networks. Synthetic experiments confirm that the delayed-RNN can mimic bidirectional networks, solving some acausal tasks comparably and outperforming them on others. Moreover, we show similar performance to bidirectional networks on a real-world natural language processing task. These results suggest that delayed-RNNs can approximate topologies including stacked RNNs, bidirectional RNNs, and stacked bidirectional RNNs, with equivalent or faster runtimes.
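To make the architecture concrete, below is a minimal PyTorch sketch of a delayed-RNN. This is not the authors' implementation; the class name, the zero-padding scheme, and the choice of a plain nn.RNN cell are illustrative assumptions. The key idea is that the prediction for input step t is read out delay steps later, so each output can depend on up to delay future inputs (the partial acausality described in the abstract).

import torch
import torch.nn as nn

class DelayedRNN(nn.Module):
    """Illustrative sketch of a delayed-RNN: a single-layer RNN whose
    prediction for input step t is read out `delay` steps later."""

    def __init__(self, input_size, hidden_size, output_size, delay):
        super().__init__()
        self.delay = delay
        self.rnn = nn.RNN(input_size, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # Pad the input with `delay` zero steps so the recurrence can keep
        # running after the sequence ends (time runs along dim 1).
        batch, seq_len, feat = x.shape
        pad = x.new_zeros(batch, self.delay, feat)
        h, _ = self.rnn(torch.cat([x, pad], dim=1))
        # The output aligned with input step t is taken at step t + delay,
        # so each prediction may use `delay` future inputs.
        return self.readout(h[:, self.delay:, :])

# Example: with delay=3, each prediction may use three future inputs.
model = DelayedRNN(input_size=8, hidden_size=32, output_size=4, delay=3)
out = model(torch.randn(2, 50, 8))  # shape (2, 50, 4), one output per input step

With delay = 0 this reduces to a standard single-layer RNN; increasing the delay trades output latency for future context, which is the mechanism the paper uses to approximate bidirectional and stacked topologies.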

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-turek20a,
  title     = {Approximating Stacked and Bidirectional Recurrent Architectures with the Delayed Recurrent Neural Network},
  author    = {Turek, Javier and Jain, Shailee and Vo, Vy and Capot{\u{a}}, Mihai and Huth, Alexander and Willke, Theodore},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {9648--9658},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/turek20a/turek20a.pdf},
  url       = {https://proceedings.mlr.press/v119/turek20a.html},
  abstract  = {Recent work has shown that topological enhancements to recurrent neural networks (RNNs) can increase their expressiveness and representational capacity. Two popular enhancements are stacked RNNs, which increase the capacity for learning non-linear functions, and bidirectional processing, which exploits acausal information in a sequence. In this work, we explore the delayed-RNN, a single-layer RNN with a delay between its input and output. We prove that a weight-constrained version of the delayed-RNN is equivalent to a stacked RNN. We also show that the delay gives rise to partial acausality, much like bidirectional networks. Synthetic experiments confirm that the delayed-RNN can mimic bidirectional networks, solving some acausal tasks comparably and outperforming them on others. Moreover, we show similar performance to bidirectional networks on a real-world natural language processing task. These results suggest that delayed-RNNs can approximate topologies including stacked RNNs, bidirectional RNNs, and stacked bidirectional RNNs, with equivalent or faster runtimes.}
}
Endnote
%0 Conference Paper
%T Approximating Stacked and Bidirectional Recurrent Architectures with the Delayed Recurrent Neural Network
%A Javier Turek
%A Shailee Jain
%A Vy Vo
%A Mihai Capotă
%A Alexander Huth
%A Theodore Willke
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-turek20a
%I PMLR
%P 9648--9658
%U https://proceedings.mlr.press/v119/turek20a.html
%V 119
%X Recent work has shown that topological enhancements to recurrent neural networks (RNNs) can increase their expressiveness and representational capacity. Two popular enhancements are stacked RNNs, which increase the capacity for learning non-linear functions, and bidirectional processing, which exploits acausal information in a sequence. In this work, we explore the delayed-RNN, a single-layer RNN with a delay between its input and output. We prove that a weight-constrained version of the delayed-RNN is equivalent to a stacked RNN. We also show that the delay gives rise to partial acausality, much like bidirectional networks. Synthetic experiments confirm that the delayed-RNN can mimic bidirectional networks, solving some acausal tasks comparably and outperforming them on others. Moreover, we show similar performance to bidirectional networks on a real-world natural language processing task. These results suggest that delayed-RNNs can approximate topologies including stacked RNNs, bidirectional RNNs, and stacked bidirectional RNNs, with equivalent or faster runtimes.
APA
Turek, J., Jain, S., Vo, V., Capotă, M., Huth, A. & Willke, T. (2020). Approximating Stacked and Bidirectional Recurrent Architectures with the Delayed Recurrent Neural Network. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:9648-9658. Available from https://proceedings.mlr.press/v119/turek20a.html.
