Nugget: Neural Agglomerative Embeddings of Text

Guanghui Qin, Benjamin Van Durme
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:28337-28350, 2023.

Abstract

Embedding text sequences is a widespread requirement in modern language understanding. Existing approaches focus largely on constant-size representations. This is problematic, as the amount of information contained in text often varies with the length of the input. We propose a solution called Nugget, which encodes language into a representation based on a dynamically selected subset of input tokens. These nuggets are learned through tasks like autoencoding and machine translation, and intuitively segment language into meaningful units. We demonstrate Nugget outperforms related approaches in tasks involving semantic comparison. Finally, we illustrate these compact units allow for expanding the contextual window of a language model (LM), suggesting new future LMs that can condition on significantly larger amounts of content.
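The abstract's central mechanism, encoding text as a dynamically selected subset of contextual token embeddings, can be sketched minimally as follows. This is an illustrative reconstruction based only on the abstract, not the paper's actual architecture: the linear scorer weights `scorer_w`, the fixed selection `ratio`, and the function name `select_nuggets` are all assumptions for the sketch.

```python
import numpy as np

def select_nuggets(token_states, scorer_w, ratio=0.1):
    """Pick a score-based subset of token embeddings as 'nuggets'.

    token_states: (seq_len, dim) array of contextual token embeddings.
    scorer_w: (dim,) hypothetical learned scoring vector.
    ratio: fraction of tokens to keep (illustrative default).
    Returns the selected embeddings and their (ordered) indices.
    """
    scores = token_states @ scorer_w              # one score per token
    k = max(1, int(ratio * len(token_states)))    # number of nuggets
    idx = np.sort(np.argsort(scores)[-k:])        # top-k, original order
    return token_states[idx], idx

# Example: 20 tokens, 8-dim embeddings -> 2 nuggets at ratio 0.1
rng = np.random.default_rng(0)
states = rng.standard_normal((20, 8))
nuggets, idx = select_nuggets(states, rng.standard_normal(8))
```

Because `k` scales with the input length, the representation grows with the text rather than being constant-size, which is the property the abstract contrasts against prior approaches.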

Cite this Paper

BibTeX
@InProceedings{pmlr-v202-qin23a,
  title     = {Nugget: Neural Agglomerative Embeddings of Text},
  author    = {Qin, Guanghui and Van Durme, Benjamin},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {28337--28350},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/qin23a/qin23a.pdf},
  url       = {https://proceedings.mlr.press/v202/qin23a.html},
  abstract  = {Embedding text sequences is a widespread requirement in modern language understanding. Existing approaches focus largely on constant-size representations. This is problematic, as the amount of information contained in text often varies with the length of the input. We propose a solution called Nugget, which encodes language into a representation based on a dynamically selected subset of input tokens. These nuggets are learned through tasks like autoencoding and machine translation, and intuitively segment language into meaningful units. We demonstrate Nugget outperforms related approaches in tasks involving semantic comparison. Finally, we illustrate these compact units allow for expanding the contextual window of a language model (LM), suggesting new future LMs that can condition on significantly larger amounts of content.}
}
Endnote
%0 Conference Paper
%T Nugget: Neural Agglomerative Embeddings of Text
%A Guanghui Qin
%A Benjamin Van Durme
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-qin23a
%I PMLR
%P 28337--28350
%U https://proceedings.mlr.press/v202/qin23a.html
%V 202
%X Embedding text sequences is a widespread requirement in modern language understanding. Existing approaches focus largely on constant-size representations. This is problematic, as the amount of information contained in text often varies with the length of the input. We propose a solution called Nugget, which encodes language into a representation based on a dynamically selected subset of input tokens. These nuggets are learned through tasks like autoencoding and machine translation, and intuitively segment language into meaningful units. We demonstrate Nugget outperforms related approaches in tasks involving semantic comparison. Finally, we illustrate these compact units allow for expanding the contextual window of a language model (LM), suggesting new future LMs that can condition on significantly larger amounts of content.
APA
Qin, G. & Van Durme, B. (2023). Nugget: Neural Agglomerative Embeddings of Text. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:28337-28350. Available from https://proceedings.mlr.press/v202/qin23a.html.