Cost-Effective Interactive Attention Learning with Neural Attention Processes

Jay Heo, Junhyeon Park, Hyewon Jeong, Kwang Joon Kim, Juho Lee, Eunho Yang, Sung Ju Hwang
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:4228-4238, 2020.

Abstract

We propose a novel interactive learning framework, which we refer to as Interactive Attention Learning (IAL), in which human supervisors interactively manipulate the allocated attention to correct the model's behaviour by updating the attention-generating network. However, such a model is prone to overfitting due to the scarcity of human annotations, and it requires costly retraining. Moreover, it is almost infeasible for human annotators to examine the attention on every instance and feature. We tackle these challenges with a sample-efficient attention mechanism and a cost-effective reranking algorithm for instances and features. First, we propose Neural Attention Processes (NAP), an attention generator that can update its behaviour by incorporating new attention-level supervision without any retraining. Second, we propose an algorithm that prioritizes instances and features by their negative impact, so that the model yields large improvements with minimal human feedback. We validate IAL on time-series datasets from multiple domains (healthcare, real estate, and computer vision), on which it significantly outperforms baselines that use conventional attention mechanisms or lack cost-effective reranking, with substantially lower retraining and human-model interaction cost.
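
To make the NAP idea concrete, below is a minimal sketch, assuming a neural-process-style design (this is not the authors' released code): an encoder summarizes human-corrected (input, attention) pairs into a pooled context representation, and a decoder produces attention for new inputs conditioned on that context. All module names, sizes, and the mean-pooling aggregation are illustrative assumptions.

    # A minimal sketch, assuming a neural-process-style attention generator.
    # Human attention corrections enter as "context" pairs, so behaviour
    # updates without any gradient retraining. Sizes/names are illustrative.
    import torch
    import torch.nn as nn

    class NeuralAttentionProcess(nn.Module):
        def __init__(self, x_dim, a_dim, r_dim=64):
            super().__init__()
            # Encoder: one (input, corrected attention) pair -> representation
            self.encoder = nn.Sequential(
                nn.Linear(x_dim + a_dim, r_dim), nn.ReLU(),
                nn.Linear(r_dim, r_dim))
            # Decoder: (pooled context, target input) -> attention in [0, 1]
            self.decoder = nn.Sequential(
                nn.Linear(r_dim + x_dim, r_dim), nn.ReLU(),
                nn.Linear(r_dim, a_dim))

        def forward(self, x_ctx, a_ctx, x_tgt):
            # Mean-pool the context set: permutation-invariant, any size
            r = self.encoder(torch.cat([x_ctx, a_ctx], dim=-1)).mean(dim=0)
            r = r.expand(x_tgt.size(0), -1)
            return torch.sigmoid(self.decoder(torch.cat([r, x_tgt], dim=-1)))

Under this reading, incorporating a new human correction amounts to appending one (input, attention) pair to the context tensors before the next forward pass; no gradient step is taken, which is how an attention generator of this kind can avoid both retraining cost and overfitting to a handful of annotations.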
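The reranking component admits an equally small hedged sketch: given any per-instance (or per-feature) estimate of negative impact, present only the most harmful items to the annotator within a fixed budget. The scoring function and names below are stand-ins, not the paper's actual influence measure.

    # A hedged sketch of negative-impact reranking. `impact_scores` is
    # dummy data; the real system would plug in its own impact estimate.
    def rerank_by_negative_impact(items, impact, budget):
        """Return the `budget` items with the largest estimated
        negative impact, for the human annotator to inspect first."""
        return sorted(items, key=impact, reverse=True)[:budget]

    impact_scores = {"patient_17": 0.9, "patient_03": 0.1, "patient_42": 0.5}
    print(rerank_by_negative_impact(impact_scores, impact_scores.get, budget=2))
    # -> ['patient_17', 'patient_42']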

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-heo20a,
  title     = {Cost-Effective Interactive Attention Learning with Neural Attention Processes},
  author    = {Heo, Jay and Park, Junhyeon and Jeong, Hyewon and Kim, Kwang Joon and Lee, Juho and Yang, Eunho and Hwang, Sung Ju},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4228--4238},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/heo20a/heo20a.pdf},
  url       = {https://proceedings.mlr.press/v119/heo20a.html}
}
Endnote
%0 Conference Paper
%T Cost-Effective Interactive Attention Learning with Neural Attention Processes
%A Jay Heo
%A Junhyeon Park
%A Hyewon Jeong
%A Kwang Joon Kim
%A Juho Lee
%A Eunho Yang
%A Sung Ju Hwang
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-heo20a
%I PMLR
%P 4228--4238
%U https://proceedings.mlr.press/v119/heo20a.html
%V 119
APA
Heo, J., Park, J., Jeong, H., Kim, K.J., Lee, J., Yang, E. & Hwang, S.J. (2020). Cost-Effective Interactive Attention Learning with Neural Attention Processes. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:4228-4238. Available from https://proceedings.mlr.press/v119/heo20a.html.