Learning of Regular Languages by Recurrent Neural Networks? (Mainly Questions)

Dana Angluin
Proceedings of 16th edition of the International Conference on Grammatical Inference, PMLR 217:4-4, 2023.

Abstract

Recurrent neural network architectures were introduced over 30 years ago. From the start attention focused on their performance at learning regular languages using some variant of gradient descent. This talk reviews some of the history of that research, includes some empirical observations, and emphasizes questions to which we still seek answers.

Cite this Paper


BibTeX
@InProceedings{pmlr-v217-angluin23a,
  title     = {Learning of Regular Languages by Recurrent Neural Networks? (Mainly Questions)},
  author    = {Angluin, Dana},
  booktitle = {Proceedings of 16th edition of the International Conference on Grammatical Inference},
  pages     = {4--4},
  year      = {2023},
  editor    = {Coste, Fran\c{c}ois and Ouardi, Faissal and Rabusseau, Guillaume},
  volume    = {217},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--13 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v217/angluin23a/angluin23a.pdf},
  url       = {https://proceedings.mlr.press/v217/angluin23a.html},
  abstract  = {Recurrent neural network architectures were introduced over 30 years ago. From the start attention focused on their performance at learning regular languages using some variant of gradient descent. This talk reviews some of the history of that research, includes some empirical observations, and emphasizes questions to which we still seek answers.}
}
Endnote
%0 Conference Paper
%T Learning of Regular Languages by Recurrent Neural Networks? (Mainly Questions)
%A Dana Angluin
%B Proceedings of 16th edition of the International Conference on Grammatical Inference
%C Proceedings of Machine Learning Research
%D 2023
%E François Coste
%E Faissal Ouardi
%E Guillaume Rabusseau
%F pmlr-v217-angluin23a
%I PMLR
%P 4--4
%U https://proceedings.mlr.press/v217/angluin23a.html
%V 217
%X Recurrent neural network architectures were introduced over 30 years ago. From the start attention focused on their performance at learning regular languages using some variant of gradient descent. This talk reviews some of the history of that research, includes some empirical observations, and emphasizes questions to which we still seek answers.
APA
Angluin, D. (2023). Learning of Regular Languages by Recurrent Neural Networks? (Mainly Questions). Proceedings of 16th edition of the International Conference on Grammatical Inference, in Proceedings of Machine Learning Research 217:4-4. Available from https://proceedings.mlr.press/v217/angluin23a.html.