Dynamic Evaluation of Neural Sequence Models

Ben Krause, Emmanuel Kahembwe, Iain Murray, Steve Renals
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:2766-2775, 2018.

Abstract

We explore dynamic evaluation, where sequence models are adapted to the recent sequence history using gradient descent, assigning higher probabilities to re-occurring sequential patterns. We develop a dynamic evaluation approach that outperforms existing adaptation approaches in our comparisons. We apply dynamic evaluation to outperform all previous word-level perplexities on the Penn Treebank and WikiText-2 datasets (achieving 51.1 and 44.3 respectively) and all previous character-level cross-entropies on the text8 and Hutter Prize datasets (achieving 1.19 bits/char and 1.08 bits/char respectively).
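The core idea — score each symbol, then take a gradient step on the just-seen history so re-occurring patterns become more probable — can be sketched in a minimal form. The example below is an illustrative toy, not the paper's method: it uses a simple bigram softmax model and plain SGD (the paper applies dynamic evaluation to neural sequence models with a more elaborate update rule), and `dynamic_eval` and `logits_table` are hypothetical names.

```python
import numpy as np

def dynamic_eval(logits_table, seq, lr=0.1):
    """Score `seq` under a bigram softmax model while adapting it online.

    `logits_table[v]` holds next-symbol logits given current symbol v.
    Each step is scored *before* the update, so the loss is an honest
    held-out measure; the SGD step then nudges the model toward the
    recent history, as in dynamic evaluation.
    Returns the mean negative log-likelihood (nats per symbol).
    """
    W = logits_table.astype(float).copy()
    total_nll = 0.0
    for prev, nxt in zip(seq[:-1], seq[1:]):
        logits = W[prev]
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        total_nll += -np.log(probs[nxt])   # score first...
        grad = probs.copy()                # softmax cross-entropy gradient
        grad[nxt] -= 1.0                   # d(NLL)/d(logits) = probs - onehot
        W[prev] -= lr * grad               # ...then adapt to the history
    return total_nll / (len(seq) - 1)
```

On a sequence with a repeating pattern, the adapted model (`lr > 0`) should achieve a lower loss than the frozen one (`lr = 0`), illustrating why adaptation to recent history helps on re-occurring patterns.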

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-krause18a,
  title     = {Dynamic Evaluation of Neural Sequence Models},
  author    = {Krause, Ben and Kahembwe, Emmanuel and Murray, Iain and Renals, Steve},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {2766--2775},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/krause18a/krause18a.pdf},
  url       = {http://proceedings.mlr.press/v80/krause18a.html},
  abstract  = {We explore dynamic evaluation, where sequence models are adapted to the recent sequence history using gradient descent, assigning higher probabilities to re-occurring sequential patterns. We develop a dynamic evaluation approach that outperforms existing adaptation approaches in our comparisons. We apply dynamic evaluation to outperform all previous word-level perplexities on the Penn Treebank and WikiText-2 datasets (achieving 51.1 and 44.3 respectively) and all previous character-level cross-entropies on the text8 and Hutter Prize datasets (achieving 1.19 bits/char and 1.08 bits/char respectively).}
}
Endnote
%0 Conference Paper
%T Dynamic Evaluation of Neural Sequence Models
%A Ben Krause
%A Emmanuel Kahembwe
%A Iain Murray
%A Steve Renals
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-krause18a
%I PMLR
%P 2766--2775
%U http://proceedings.mlr.press/v80/krause18a.html
%V 80
%X We explore dynamic evaluation, where sequence models are adapted to the recent sequence history using gradient descent, assigning higher probabilities to re-occurring sequential patterns. We develop a dynamic evaluation approach that outperforms existing adaptation approaches in our comparisons. We apply dynamic evaluation to outperform all previous word-level perplexities on the Penn Treebank and WikiText-2 datasets (achieving 51.1 and 44.3 respectively) and all previous character-level cross-entropies on the text8 and Hutter Prize datasets (achieving 1.19 bits/char and 1.08 bits/char respectively).
APA
Krause, B., Kahembwe, E., Murray, I. & Renals, S. (2018). Dynamic Evaluation of Neural Sequence Models. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:2766-2775. Available from http://proceedings.mlr.press/v80/krause18a.html.