Multi-Task Structured Prediction for Entity Analysis: Search-Based Learning Algorithms

Chao Ma, Janardhan Rao Doppa, Prasad Tadepalli, Hamed Shahbazi, Xiaoli Fern
Proceedings of the Ninth Asian Conference on Machine Learning, PMLR 77:514-529, 2017.

Abstract

Entity analysis in natural language processing involves solving multiple structured prediction problems such as mention detection, coreference resolution, and entity linking. We explore the space of search-based learning approaches to solve the problem of multi-task structured prediction (MTSP) in the context of entity analysis. In this paper, we study three different search architectures for solving MTSP problems that make different tradeoffs between the speed and accuracy of training and inference. In all three architectures, we learn one or more scoring functions that employ both intra-task and inter-task features. In the “pipeline” architecture, which is the fastest, we solve the different tasks one after another in a pipelined fashion. In the “joint” architecture, which is the most expensive, we formulate MTSP as a single-task structured prediction problem and search the joint space of multi-task structured outputs. To improve the speed of the joint architecture, we introduce two different pruning methods and associated learning techniques. In the intermediate “cyclic” architecture, we cycle through the tasks multiple times in sequence until there is no performance improvement. Results on two benchmark domains show that the joint architecture improves over the pipeline approach as well as the previous state-of-the-art approach based on graphical models. The cyclic architecture is faster than the joint approach and achieves competitive performance.
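
The abstract describes the three architectures only at a high level. Purely as an illustrative reading of that description, and not the authors' implementation, the pipeline and cyclic architectures could be organized roughly as in the sketch below. The per-task predictor objects, their `predict` interface, the `joint_score` function, and the `max_rounds` cap are hypothetical placeholders, and the pruning methods used to speed up the joint architecture are omitted.

```python
# Conceptual sketch of the "pipeline" and "cyclic" search architectures described
# in the abstract. This is NOT the paper's implementation: `task.predict`,
# `task.name`, and `joint_score` are hypothetical placeholder interfaces.

def pipeline_predict(x, tasks):
    """Pipeline: solve the tasks one after another, feeding earlier outputs forward."""
    outputs = {}
    for task in tasks:  # e.g., mention detection -> coreference -> entity linking
        # Each predictor can use its own input features (intra-task features) and
        # the outputs of previously solved tasks (inter-task features).
        outputs[task.name] = task.predict(x, outputs)
    return outputs


def cyclic_predict(x, tasks, joint_score, max_rounds=10):
    """Cyclic: repeatedly re-solve each task given the others' current outputs,
    stopping once a full pass over the tasks yields no improvement."""
    outputs = pipeline_predict(x, tasks)      # initialize with one pipeline pass
    best = joint_score(x, outputs)
    for _ in range(max_rounds):
        improved = False
        for task in tasks:
            candidate = dict(outputs)
            candidate[task.name] = task.predict(x, candidate)
            score = joint_score(x, candidate)
            if score > best:                  # keep the re-solved task only if it helps
                outputs, best, improved = candidate, score, True
        if not improved:                      # no task changed in this round: stop cycling
            break
    return outputs
```

In this reading, the joint architecture would instead search directly over the combined output space of all tasks with a single scoring function, which is why it is the most expensive of the three and why the paper introduces pruning methods to speed it up.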

Cite this Paper


BibTeX
@InProceedings{pmlr-v77-ma17a,
  title     = {Multi-Task Structured Prediction for Entity Analysis: Search-Based Learning Algorithms},
  author    = {Ma, Chao and Doppa, Janardhan Rao and Tadepalli, Prasad and Shahbazi, Hamed and Fern, Xiaoli},
  booktitle = {Proceedings of the Ninth Asian Conference on Machine Learning},
  pages     = {514--529},
  year      = {2017},
  editor    = {Zhang, Min-Ling and Noh, Yung-Kyun},
  volume    = {77},
  series    = {Proceedings of Machine Learning Research},
  address   = {Yonsei University, Seoul, Republic of Korea},
  month     = {15--17 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v77/ma17a/ma17a.pdf},
  url       = {https://proceedings.mlr.press/v77/ma17a.html},
  abstract  = {Entity analysis in natural language processing involves solving multiple structured prediction problems such as mention detection, coreference resolution, and entity linking. We explore the space of search-based learning approaches to solve the problem of {\em multi-task structured prediction} (MTSP) in the context of entity analysis. In this paper, we study three different search architectures to solve MTSP problems that make different tradeoffs between speed and accuracy of training and inference. In all three architectures, we learn one or more scoring functions that employ both intra-task and inter-task features. In the “pipeline” architecture, which is the fastest, we solve different tasks one after another in a pipelined fashion. In the “joint” architecture, which is the most expensive, we formulate MTSP as a single-task structured prediction, and search the joint space of multi-task structured outputs. To improve the speed of joint architecture, we introduce two different pruning methods and associated learning techniques. In the intermediate “cyclic” architecture, we cycle through the tasks multiple times in sequence until there is no performance improvement. Results on two benchmark domains show that the joint architecture improves over the pipeline approach as well as the previous state-of-the-art approach based on graphical models. The cyclic architecture is faster than the joint approach and achieves competitive performance.}
}
Endnote
%0 Conference Paper
%T Multi-Task Structured Prediction for Entity Analysis: Search-Based Learning Algorithms
%A Chao Ma
%A Janardhan Rao Doppa
%A Prasad Tadepalli
%A Hamed Shahbazi
%A Xiaoli Fern
%B Proceedings of the Ninth Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Min-Ling Zhang
%E Yung-Kyun Noh
%F pmlr-v77-ma17a
%I PMLR
%P 514--529
%U https://proceedings.mlr.press/v77/ma17a.html
%V 77
%X Entity analysis in natural language processing involves solving multiple structured prediction problems such as mention detection, coreference resolution, and entity linking. We explore the space of search-based learning approaches to solve the problem of multi-task structured prediction (MTSP) in the context of entity analysis. In this paper, we study three different search architectures to solve MTSP problems that make different tradeoffs between speed and accuracy of training and inference. In all three architectures, we learn one or more scoring functions that employ both intra-task and inter-task features. In the “pipeline” architecture, which is the fastest, we solve different tasks one after another in a pipelined fashion. In the “joint” architecture, which is the most expensive, we formulate MTSP as a single-task structured prediction, and search the joint space of multi-task structured outputs. To improve the speed of joint architecture, we introduce two different pruning methods and associated learning techniques. In the intermediate “cyclic” architecture, we cycle through the tasks multiple times in sequence until there is no performance improvement. Results on two benchmark domains show that the joint architecture improves over the pipeline approach as well as the previous state-of-the-art approach based on graphical models. The cyclic architecture is faster than the joint approach and achieves competitive performance.
APA
Ma, C., Doppa, J.R., Tadepalli, P., Shahbazi, H. & Fern, X. (2017). Multi-Task Structured Prediction for Entity Analysis: Search-Based Learning Algorithms. Proceedings of the Ninth Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 77:514-529. Available from https://proceedings.mlr.press/v77/ma17a.html.