Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning

Guillaume Rabusseau, Tianyu Li, Doina Precup
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:1630-1639, 2019.

Abstract

In this paper, we unravel a fundamental connection between weighted finite automata (WFAs) and second-order recurrent neural networks (2-RNNs): in the case of sequences of discrete symbols, WFAs and 2-RNNs with linear activation functions are expressively equivalent. Motivated by this result, we build upon a recent extension of the spectral learning algorithm to vector-valued WFAs and propose the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors. This algorithm relies on estimating low rank sub-blocks of the so-called Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performances of the proposed method are assessed in a simulation study.
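To make the model concrete: a second-order RNN with linear activations updates its hidden state by a bilinear map of the previous state and the current input, and emits a scalar via an output vector. The sketch below (not the authors' code; all dimensions and names are illustrative, using NumPy) shows this recurrence and why the computed function is multilinear in the input vectors.

```python
import numpy as np

# Hypothetical sizes: hidden dimension n, input dimension d.
n, d = 4, 3
rng = np.random.default_rng(0)

# Parameters of a linear 2-RNN: initial state h0, a third-order
# transition tensor A of shape (n, d, n), and an output vector omega.
h0 = rng.standard_normal(n)
A = rng.standard_normal((n, d, n))
omega = rng.standard_normal(n)

def linear_2rnn(xs):
    """Compute f(x_1, ..., x_k) = <omega, h_k>, where the state evolves
    bilinearly: h_t = A(h_{t-1}, x_t), with no nonlinear activation."""
    h = h0
    for x in xs:
        # Bilinear update: contract A with the previous state and the input.
        h = np.einsum('idj,i,d->j', A, h, x)
    return float(omega @ h)

xs = [rng.standard_normal(d) for _ in range(5)]
print(linear_2rnn(xs))
```

Because each update is linear in `x_t` (for fixed state), the overall function is multilinear in the sequence of inputs; on one-hot (discrete-symbol) inputs, each slice `A[:, s, :]` acts as the transition matrix of symbol `s`, which is exactly the WFA correspondence the paper establishes.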

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-rabusseau19a,
  title     = {Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning},
  author    = {Rabusseau, Guillaume and Li, Tianyu and Precup, Doina},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {1630--1639},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/rabusseau19a/rabusseau19a.pdf},
  url       = {http://proceedings.mlr.press/v89/rabusseau19a.html},
  abstract  = {In this paper, we unravel a fundamental connection between weighted finite automata (WFAs) and second-order recurrent neural networks (2-RNNs): in the case of sequences of discrete symbols, WFAs and 2-RNNs with linear activation functions are expressively equivalent. Motivated by this result, we build upon a recent extension of the spectral learning algorithm to vector-valued WFAs and propose the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors. This algorithm relies on estimating low rank sub-blocks of the so-called Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performances of the proposed method are assessed in a simulation study.}
}
Endnote
%0 Conference Paper
%T Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning
%A Guillaume Rabusseau
%A Tianyu Li
%A Doina Precup
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-rabusseau19a
%I PMLR
%P 1630--1639
%U http://proceedings.mlr.press/v89/rabusseau19a.html
%V 89
%X In this paper, we unravel a fundamental connection between weighted finite automata (WFAs) and second-order recurrent neural networks (2-RNNs): in the case of sequences of discrete symbols, WFAs and 2-RNNs with linear activation functions are expressively equivalent. Motivated by this result, we build upon a recent extension of the spectral learning algorithm to vector-valued WFAs and propose the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors. This algorithm relies on estimating low rank sub-blocks of the so-called Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performances of the proposed method are assessed in a simulation study.
APA
Rabusseau, G., Li, T., & Precup, D. (2019). Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:1630-1639. Available from http://proceedings.mlr.press/v89/rabusseau19a.html.