Connecting Weighted Automata and Recurrent Neural Networks through Spectral Learning
Proceedings of Machine Learning Research, PMLR 89:1630–1639, 2019.
Abstract
In this paper, we unravel a fundamental connection between weighted finite automata (WFAs) and second-order recurrent neural networks (2-RNNs): in the case of sequences of discrete symbols, WFAs and 2-RNNs with linear activation functions are expressively equivalent. Motivated by this result, we build upon a recent extension of the spectral learning algorithm to vector-valued WFAs and propose the first provable learning algorithm for linear 2-RNNs defined over sequences of continuous input vectors. This algorithm relies on estimating low-rank sub-blocks of the so-called Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performance of the proposed method is assessed in a simulation study.
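To make the object of study concrete, here is a minimal sketch of a linear 2-RNN as described in the abstract: the hidden state is updated by a bilinear map (a third-order tensor contracted with the previous state and the current input vector), with no nonlinearity. The parameter names (`alpha`, `A`, `omega`), dimensions, and random initialization are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 3, 2  # hidden-state size and input dimension (illustrative choices)

# Parameters of a linear 2-RNN (names and shapes are assumptions):
alpha = rng.normal(size=n)       # initial hidden state h_0
A = rng.normal(size=(n, d, n))   # third-order transition tensor
omega = rng.normal(size=n)       # output (termination) vector

def linear_2rnn(xs):
    """Compute f(x_1, ..., x_T) = <h_T, omega>, where the update is
    h_t[k] = sum_{i,j} A[i, j, k] * h_{t-1}[i] * x_t[j]  (purely bilinear)."""
    h = alpha
    for x in xs:
        # contract A with the previous state (axis 0) and current input (axis 1)
        h = np.einsum('ijk,i,j->k', A, h, x)
    return float(h @ omega)

seq = [rng.normal(size=d) for _ in range(4)]
value = linear_2rnn(seq)
```

Because each update is bilinear, the computed function is linear in every individual input vector; this multilinear structure is what spectral-learning methods based on the Hankel tensor exploit.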