Extracting Weighted Automata for Approximate Minimization in Language Modelling

Clara Lacroce, Prakash Panangaden, Guillaume Rabusseau
Proceedings of the Fifteenth International Conference on Grammatical Inference, PMLR 153:92-112, 2021.

Abstract

In this paper we study the approximate minimization problem for language modelling. We assume we are given some language model as a black box. The objective is to obtain a weighted finite automaton (WFA) that fits within a given size constraint and which mimics the behaviour of the original model while minimizing some notion of distance between the black box and the extracted WFA. We provide an algorithm for the approximate minimization of black boxes trained for language modelling of sequential data over a one-letter alphabet. By reformulating the problem in terms of Hankel matrices, we leverage classical results on the approximation of Hankel operators, namely the celebrated Adamyan-Arov-Krein (AAK) theory. This allows us to use the spectral norm to measure the distance between the black box and the WFA. We provide theoretical guarantees to study the potentially infinite-rank Hankel matrix of the black box, without accessing the training data, and we prove that our method returns an asymptotically-optimal approximation.
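The spectral-norm machinery the abstract refers to can be illustrated with a finite analogue (a sketch for intuition, not the paper's algorithm). For a finite matrix, the truncated SVD gives the best rank-k approximation in spectral norm (Eckart-Young); AAK theory addresses the harder infinite-dimensional problem where the approximant must itself be a Hankel operator (hence realizable by a small WFA), showing the same optimal error is achievable. The function f below is a hypothetical example of a series over a one-letter alphabet, chosen only so that its Hankel matrix has small finite rank.

```python
import numpy as np

# Hypothetical function over a one-letter alphabet: f(n) is a sum of three
# geometric terms, so its Hankel matrix H[i, j] = f(i + j) has rank 3.
n = 8
ns = np.arange(2 * n - 1)
f = 0.8 ** ns + 0.5 ** ns + 0.2 ** ns
H = f[np.add.outer(np.arange(n), np.arange(n))]  # finite Hankel matrix

# Best rank-k approximation in spectral norm via truncated SVD (Eckart-Young).
k = 2
U, s, Vt = np.linalg.svd(H)
H_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The spectral-norm error of the best rank-k approximation is exactly the
# (k+1)-th singular value of H.
err = np.linalg.norm(H - H_k, 2)
print(f"spectral error = {err:.6g}, sigma_(k+1) = {s[k]:.6g}")
```

Note that H_k above is generally not Hankel; the content of AAK theory, as used in the paper, is that an optimal spectral-norm approximant of the same rank can be taken to be Hankel, which is what allows the approximation to be realized as a WFA.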

Cite this Paper


BibTeX
@InProceedings{pmlr-v153-lacroce21a,
  title     = {Extracting Weighted Automata for Approximate Minimization in Language Modelling},
  author    = {Lacroce, Clara and Panangaden, Prakash and Rabusseau, Guillaume},
  booktitle = {Proceedings of the Fifteenth International Conference on Grammatical Inference},
  pages     = {92--112},
  year      = {2021},
  editor    = {Chandlee, Jane and Eyraud, Rémi and Heinz, Jeff and Jardine, Adam and van Zaanen, Menno},
  volume    = {153},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--27 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v153/lacroce21a/lacroce21a.pdf},
  url       = {https://proceedings.mlr.press/v153/lacroce21a.html},
  abstract  = {In this paper we study the approximate minimization problem for language modelling. We assume we are given some language model as a black box. The objective is to obtain a weighted finite automaton (WFA) that fits within a given size constraint and which mimics the behaviour of the original model while minimizing some notion of distance between the black box and the extracted WFA. We provide an algorithm for the approximate minimization of black boxes trained for language modelling of sequential data over a one-letter alphabet. By reformulating the problem in terms of Hankel matrices, we leverage classical results on the approximation of Hankel operators, namely the celebrated Adamyan-Arov-Krein (AAK) theory. This allows us to use the spectral norm to measure the distance between the black box and the WFA. We provide theoretical guarantees to study the potentially infinite-rank Hankel matrix of the black box, without accessing the training data, and we prove that our method returns an asymptotically-optimal approximation.}
}
EndNote
%0 Conference Paper
%T Extracting Weighted Automata for Approximate Minimization in Language Modelling
%A Clara Lacroce
%A Prakash Panangaden
%A Guillaume Rabusseau
%B Proceedings of the Fifteenth International Conference on Grammatical Inference
%C Proceedings of Machine Learning Research
%D 2021
%E Jane Chandlee
%E Rémi Eyraud
%E Jeff Heinz
%E Adam Jardine
%E Menno van Zaanen
%F pmlr-v153-lacroce21a
%I PMLR
%P 92--112
%U https://proceedings.mlr.press/v153/lacroce21a.html
%V 153
%X In this paper we study the approximate minimization problem for language modelling. We assume we are given some language model as a black box. The objective is to obtain a weighted finite automaton (WFA) that fits within a given size constraint and which mimics the behaviour of the original model while minimizing some notion of distance between the black box and the extracted WFA. We provide an algorithm for the approximate minimization of black boxes trained for language modelling of sequential data over a one-letter alphabet. By reformulating the problem in terms of Hankel matrices, we leverage classical results on the approximation of Hankel operators, namely the celebrated Adamyan-Arov-Krein (AAK) theory. This allows us to use the spectral norm to measure the distance between the black box and the WFA. We provide theoretical guarantees to study the potentially infinite-rank Hankel matrix of the black box, without accessing the training data, and we prove that our method returns an asymptotically-optimal approximation.
APA
Lacroce, C., Panangaden, P. & Rabusseau, G. (2021). Extracting Weighted Automata for Approximate Minimization in Language Modelling. Proceedings of the Fifteenth International Conference on Grammatical Inference, in Proceedings of Machine Learning Research 153:92-112. Available from https://proceedings.mlr.press/v153/lacroce21a.html.
