Model-Level Dual Learning

Yingce Xia, Xu Tan, Fei Tian, Tao Qin, Nenghai Yu, Tie-Yan Liu
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:5383-5392, 2018.

Abstract

Many artificial intelligence tasks appear in dual forms, such as English$\leftrightarrow$French translation and speech$\leftrightarrow$text transformation. Existing dual learning schemes, which were proposed to solve a pair of such dual tasks, explore how to leverage such dualities at the data level. In this work, we propose a new learning framework, model-level dual learning, which takes the duality of the tasks into consideration when designing the architectures of the primal/dual models, and ties the model parameters that play similar roles in the two tasks. We study both symmetric and asymmetric model-level dual learning. Our algorithms achieve significant improvements on neural machine translation and sentiment analysis.
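To make the parameter-tying idea concrete, below is a minimal PyTorch sketch of sharing components that "play similar roles" across a primal and a dual model. Everything here (the Seq2Seq class, GRU encoder/decoder, vocabulary sizes, and the choice to tie only the per-language embedding tables) is an illustrative assumption for exposition, not the paper's actual architecture.

import torch
import torch.nn as nn

# Illustrative sketch only; the paper's model-level dual learning
# architecture differs. Idea: in En->Fr and Fr->En translation, the
# components that model the same language play similar roles, so
# their parameters can be tied across the two models.

VOCAB_EN, VOCAB_FR, DIM = 10000, 12000, 256  # hypothetical sizes

# One embedding table per language, shared by both directions.
embed_en = nn.Embedding(VOCAB_EN, DIM)
embed_fr = nn.Embedding(VOCAB_FR, DIM)

class Seq2Seq(nn.Module):
    """Toy encoder-decoder; src/tgt embeddings are injected, not owned."""
    def __init__(self, src_embed, tgt_embed, tgt_vocab):
        super().__init__()
        self.src_embed, self.tgt_embed = src_embed, tgt_embed
        self.encoder = nn.GRU(DIM, DIM, batch_first=True)
        self.decoder = nn.GRU(DIM, DIM, batch_first=True)
        self.out = nn.Linear(DIM, tgt_vocab)

    def forward(self, src, tgt):
        _, h = self.encoder(self.src_embed(src))   # encode source tokens
        dec, _ = self.decoder(self.tgt_embed(tgt), h)  # decode conditioned on h
        return self.out(dec)                        # logits over target vocab

# Primal (En->Fr) and dual (Fr->En) models: the English table serves the
# encoder side of one model and the decoder side of the other, so
# gradients from either task update the shared weights.
primal = Seq2Seq(embed_en, embed_fr, VOCAB_FR)  # En -> Fr
dual   = Seq2Seq(embed_fr, embed_en, VOCAB_EN)  # Fr -> En

Under this sketch, training either direction improves the shared language-specific parameters used by the other, which is one plausible way to read "tying the model parameters that play similar roles in the two tasks".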

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-xia18a,
  title     = {Model-Level Dual Learning},
  author    = {Xia, Yingce and Tan, Xu and Tian, Fei and Qin, Tao and Yu, Nenghai and Liu, Tie-Yan},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {5383--5392},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/xia18a/xia18a.pdf},
  url       = {https://proceedings.mlr.press/v80/xia18a.html},
  abstract  = {Many artificial intelligence tasks appear in dual forms, such as English$\leftrightarrow$French translation and speech$\leftrightarrow$text transformation. Existing dual learning schemes, which were proposed to solve a pair of such dual tasks, explore how to leverage such dualities at the data level. In this work, we propose a new learning framework, model-level dual learning, which takes the duality of the tasks into consideration when designing the architectures of the primal/dual models, and ties the model parameters that play similar roles in the two tasks. We study both symmetric and asymmetric model-level dual learning. Our algorithms achieve significant improvements on neural machine translation and sentiment analysis.}
}
EndNote
%0 Conference Paper
%T Model-Level Dual Learning
%A Yingce Xia
%A Xu Tan
%A Fei Tian
%A Tao Qin
%A Nenghai Yu
%A Tie-Yan Liu
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-xia18a
%I PMLR
%P 5383--5392
%U https://proceedings.mlr.press/v80/xia18a.html
%V 80
%X Many artificial intelligence tasks appear in dual forms, such as English$\leftrightarrow$French translation and speech$\leftrightarrow$text transformation. Existing dual learning schemes, which were proposed to solve a pair of such dual tasks, explore how to leverage such dualities at the data level. In this work, we propose a new learning framework, model-level dual learning, which takes the duality of the tasks into consideration when designing the architectures of the primal/dual models, and ties the model parameters that play similar roles in the two tasks. We study both symmetric and asymmetric model-level dual learning. Our algorithms achieve significant improvements on neural machine translation and sentiment analysis.
APA
Xia, Y., Tan, X., Tian, F., Qin, T., Yu, N. & Liu, T. (2018). Model-Level Dual Learning. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:5383-5392. Available from https://proceedings.mlr.press/v80/xia18a.html.