Effective Use of Bidirectional Language Modeling for Transfer Learning in Biomedical Named Entity Recognition

Devendra Singh Sachan, Pengtao Xie, Mrinmaya Sachan, Eric P. Xing
Proceedings of the 3rd Machine Learning for Healthcare Conference, PMLR 85:383-402, 2018.

Abstract

Biomedical named entity recognition (NER) is a fundamental task in text mining of medical documents and has many applications. Deep learning based approaches to this task have been gaining increasing attention in recent years as their parameters can be learned end-to-end without the need for hand-engineered features. However, these approaches rely on high-quality labeled data, which is expensive to obtain. To address this issue, we investigate how to use unlabeled text data to improve the performance of NER models. Specifically, we train a bidirectional language model (BiLM) on unlabeled data and transfer its weights to "pretrain" an NER model with the same architecture as the BiLM, which results in a better parameter initialization of the NER model. We evaluate our approach on four benchmark datasets for biomedical NER and show that it leads to a substantial improvement in the F1 scores compared with the state-of-the-art approaches. We also show that BiLM weight transfer leads to faster model training and the pretrained model requires fewer training examples to achieve a particular F1 score.
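The weight-transfer idea the abstract describes can be sketched in a few lines: because the NER model shares the BiLM's encoder architecture, every encoder parameter can be copied over by name, while task-specific heads (the LM's vocabulary softmax, the NER tagger) are left out. The parameter names and toy values below are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch of BiLM -> NER weight transfer by matching parameter
# names; "embedding", "lstm_fwd", etc. are made-up names for illustration.
import random

def transfer_weights(bilm_params, ner_params):
    """Copy every BiLM parameter whose name also exists in the NER model;
    NER-only parameters (e.g. the tagging head) keep their fresh init."""
    transferred = []
    for name, value in bilm_params.items():
        if name in ner_params:
            ner_params[name] = list(value)  # copy values, don't alias
            transferred.append(name)
    return transferred

# Toy "pretrained" BiLM: shared encoder plus an LM-specific softmax head.
bilm = {
    "embedding": [0.1, 0.2],
    "lstm_fwd": [0.3, 0.4],
    "lstm_bwd": [0.5, 0.6],
    "softmax_vocab": [0.7, 0.8],  # LM head, absent from the NER model
}
# Toy NER model: same encoder layout, randomly initialized, plus a tag head.
ner = {
    "embedding": [random.random() for _ in range(2)],
    "lstm_fwd": [random.random() for _ in range(2)],
    "lstm_bwd": [random.random() for _ in range(2)],
    "tag_head": [random.random() for _ in range(2)],  # NER-specific head
}

copied = transfer_weights(bilm, ner)
print(sorted(copied))  # only the shared encoder layers are transferred
```

After the transfer, fine-tuning on the labeled NER data starts from the BiLM's encoder weights rather than from a random initialization, which is the better starting point the abstract credits for the faster training and the smaller labeled-data requirement.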

Cite this Paper


BibTeX
@InProceedings{pmlr-v85-sachan18a,
  title     = {Effective Use of Bidirectional Language Modeling for Transfer Learning in Biomedical Named Entity Recognition},
  author    = {Sachan, Devendra Singh and Xie, Pengtao and Sachan, Mrinmaya and Xing, Eric P.},
  booktitle = {Proceedings of the 3rd Machine Learning for Healthcare Conference},
  pages     = {383--402},
  year      = {2018},
  editor    = {Doshi-Velez, Finale and Fackler, Jim and Jung, Ken and Kale, David and Ranganath, Rajesh and Wallace, Byron and Wiens, Jenna},
  volume    = {85},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--18 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v85/sachan18a/sachan18a.pdf},
  url       = {https://proceedings.mlr.press/v85/sachan18a.html},
  abstract  = {Biomedical named entity recognition (NER) is a fundamental task in text mining of medical documents and has many applications. Deep learning based approaches to this task have been gaining increasing attention in recent years as their parameters can be learned end-to-end without the need for hand-engineered features. However, these approaches rely on high-quality labeled data, which is expensive to obtain. To address this issue, we investigate how to use unlabeled text data to improve the performance of NER models. Specifically, we train a bidirectional language model (BiLM) on unlabeled data and transfer its weights to ``pretrain'' an NER model with the same architecture as the BiLM, which results in a better parameter initialization of the NER model. We evaluate our approach on four benchmark datasets for biomedical NER and show that it leads to a substantial improvement in the F1 scores compared with the state-of-the-art approaches. We also show that BiLM weight transfer leads to faster model training and the pretrained model requires fewer training examples to achieve a particular F1 score.}
}
Endnote
%0 Conference Paper
%T Effective Use of Bidirectional Language Modeling for Transfer Learning in Biomedical Named Entity Recognition
%A Devendra Singh Sachan
%A Pengtao Xie
%A Mrinmaya Sachan
%A Eric P. Xing
%B Proceedings of the 3rd Machine Learning for Healthcare Conference
%C Proceedings of Machine Learning Research
%D 2018
%E Finale Doshi-Velez
%E Jim Fackler
%E Ken Jung
%E David Kale
%E Rajesh Ranganath
%E Byron Wallace
%E Jenna Wiens
%F pmlr-v85-sachan18a
%I PMLR
%P 383--402
%U https://proceedings.mlr.press/v85/sachan18a.html
%V 85
%X Biomedical named entity recognition (NER) is a fundamental task in text mining of medical documents and has many applications. Deep learning based approaches to this task have been gaining increasing attention in recent years as their parameters can be learned end-to-end without the need for hand-engineered features. However, these approaches rely on high-quality labeled data, which is expensive to obtain. To address this issue, we investigate how to use unlabeled text data to improve the performance of NER models. Specifically, we train a bidirectional language model (BiLM) on unlabeled data and transfer its weights to "pretrain" an NER model with the same architecture as the BiLM, which results in a better parameter initialization of the NER model. We evaluate our approach on four benchmark datasets for biomedical NER and show that it leads to a substantial improvement in the F1 scores compared with the state-of-the-art approaches. We also show that BiLM weight transfer leads to faster model training and the pretrained model requires fewer training examples to achieve a particular F1 score.
APA
Sachan, D.S., Xie, P., Sachan, M. & Xing, E.P. (2018). Effective Use of Bidirectional Language Modeling for Transfer Learning in Biomedical Named Entity Recognition. Proceedings of the 3rd Machine Learning for Healthcare Conference, in Proceedings of Machine Learning Research 85:383-402. Available from https://proceedings.mlr.press/v85/sachan18a.html.