Dual Learning: Theoretical Study and an Algorithmic Extension

Zhibing Zhao, Yingce Xia, Tao Qin, Lirong Xia, Tie-Yan Liu
Proceedings of The 12th Asian Conference on Machine Learning, PMLR 129:321-336, 2020.

Abstract

Dual learning has been successfully applied in many machine learning applications including machine translation, image-to-image transformation, etc. The high-level idea of dual learning is very intuitive: if we map an $x$ from one domain to another and then map it back, we should recover the original $x$. Although its effectiveness has been empirically verified, theoretical understanding of dual learning is still very limited. In this paper, we aim at understanding why and when dual learning works. Based on our theoretical analysis, we further extend dual learning by introducing more related mappings and propose multi-step dual learning, in which we leverage feedback signals from additional domains to improve the qualities of the mappings. We prove that multi-step dual learning can boost the performance of standard dual learning under mild conditions. Experiments on WMT 14 English↔German and MultiUN English↔French translations verify our theoretical findings on dual learning, and the results on the translations among English, French, and Spanish of MultiUN demonstrate the effectiveness of multi-step dual learning.

Cite this Paper


BibTeX
@InProceedings{pmlr-v129-zhao20a,
  title     = {Dual Learning: Theoretical Study and an Algorithmic Extension},
  author    = {Zhao, Zhibing and Xia, Yingce and Qin, Tao and Xia, Lirong and Liu, Tie-Yan},
  booktitle = {Proceedings of The 12th Asian Conference on Machine Learning},
  pages     = {321--336},
  year      = {2020},
  editor    = {Pan, Sinno Jialin and Sugiyama, Masashi},
  volume    = {129},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--20 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v129/zhao20a/zhao20a.pdf},
  url       = {https://proceedings.mlr.press/v129/zhao20a.html},
  abstract  = {Dual learning has been successfully applied in many machine learning applications including machine translation, image-to-image transformation, etc. The high-level idea of dual learning is very intuitive: if we map an $x$ from one domain to another and then map it back, we should recover the original $x$. Although its effectiveness has been empirically verified, theoretical understanding of dual learning is still very limited. In this paper, we aim at understanding why and when dual learning works. Based on our theoretical analysis, we further extend dual learning by introducing more related mappings and propose multi-step dual learning, in which we leverage feedback signals from additional domains to improve the qualities of the mappings. We prove that multi-step dual learning can boost the performance of standard dual learning under mild conditions. Experiments on WMT 14 English↔German and MultiUN English↔French translations verify our theoretical findings on dual learning, and the results on the translations among English, French, and Spanish of MultiUN demonstrate the effectiveness of multi-step dual learning.}
}
Endnote
%0 Conference Paper
%T Dual Learning: Theoretical Study and an Algorithmic Extension
%A Zhibing Zhao
%A Yingce Xia
%A Tao Qin
%A Lirong Xia
%A Tie-Yan Liu
%B Proceedings of The 12th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Sinno Jialin Pan
%E Masashi Sugiyama
%F pmlr-v129-zhao20a
%I PMLR
%P 321--336
%U https://proceedings.mlr.press/v129/zhao20a.html
%V 129
%X Dual learning has been successfully applied in many machine learning applications including machine translation, image-to-image transformation, etc. The high-level idea of dual learning is very intuitive: if we map an $x$ from one domain to another and then map it back, we should recover the original $x$. Although its effectiveness has been empirically verified, theoretical understanding of dual learning is still very limited. In this paper, we aim at understanding why and when dual learning works. Based on our theoretical analysis, we further extend dual learning by introducing more related mappings and propose multi-step dual learning, in which we leverage feedback signals from additional domains to improve the qualities of the mappings. We prove that multi-step dual learning can boost the performance of standard dual learning under mild conditions. Experiments on WMT 14 English↔German and MultiUN English↔French translations verify our theoretical findings on dual learning, and the results on the translations among English, French, and Spanish of MultiUN demonstrate the effectiveness of multi-step dual learning.
APA
Zhao, Z., Xia, Y., Qin, T., Xia, L. & Liu, T. (2020). Dual Learning: Theoretical Study and an Algorithmic Extension. Proceedings of The 12th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 129:321-336. Available from https://proceedings.mlr.press/v129/zhao20a.html.