Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning

Guillaume Rabusseau, Tianyu Li, Doina Precup;
Proceedings of Machine Learning Research, PMLR 89:1630-1639, 2019.

Abstract

In this paper, we unravel a fundamental connection between weighted finite automata (WFAs) and second-order recurrent neural networks (2-RNNs): in the case of sequences of discrete symbols, WFAs and 2-RNNs with linear activation functions are expressively equivalent. Motivated by this result, we build upon a recent extension of the spectral learning algorithm to vector-valued WFAs and propose the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors. This algorithm relies on estimating low-rank sub-blocks of the so-called Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performance of the proposed method is assessed in a simulation study.
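To make the model class concrete, the following is a minimal sketch of a linear 2-RNN: the hidden state is updated by a bilinear map (a third-order transition tensor contracted with the previous state and the current input vector), with no nonlinearity, and the output is a linear readout of the final state. All names and dimensions here are illustrative assumptions, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 3  # illustrative state and input dimensions

h0 = rng.standard_normal(n)          # initial hidden state
A = rng.standard_normal((n, d, n))   # transition tensor (bilinear map)
omega = rng.standard_normal(n)       # linear output vector

def linear_2rnn(xs, h0=h0, A=A, omega=omega):
    """Compute the output of a linear 2-RNN on a sequence of input vectors."""
    h = h0
    for x in xs:
        # Bilinear state update: contract A with the previous state and
        # the current input; no activation function is applied.
        h = np.einsum('idj,i,d->j', A, h, x)
    # Linear readout of the final state.
    return float(h @ omega)

xs = rng.standard_normal((5, d))
y = linear_2rnn(xs)
```

With one-hot input vectors, each update selects one slice of the transition tensor, which recovers the matrix-product computation of a WFA; this is the sense in which the two models coincide on discrete symbol sequences.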