Unsupervised Transfer Learning for Spatiotemporal Predictive Networks

Zhiyu Yao, Yunbo Wang, Mingsheng Long, Jianmin Wang
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:10778-10788, 2020.

Abstract

This paper explores a new research problem: unsupervised transfer learning across multiple spatiotemporal prediction tasks. Unlike most existing transfer learning methods, which focus on bridging the discrepancy between supervised tasks, we study how to transfer knowledge from a zoo of models pretrained without supervision to a new predictive network. Our motivation is that models from different sources are expected to understand complex spatiotemporal dynamics from different perspectives, and can therefore effectively supplement the new task, even when that task has sufficient training samples. Technically, we propose a differentiable framework named transferable memory. It adaptively distills knowledge from a bank of memory states of multiple pretrained RNNs and applies it to the target network via a novel recurrent structure called the Transferable Memory Unit (TMU). Compared with finetuning, our approach yields significant improvements on three benchmarks for spatiotemporal prediction, and benefits the target task even from less relevant pretext tasks.
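To make the mechanism concrete, below is a minimal PyTorch-style sketch of the kind of adaptive distillation the abstract describes: attention over a bank of frozen memory states from pretrained source RNNs, with the result gated into the target network's hidden state. The abstract does not specify this exact design, so all module and variable names here are hypothetical, not the authors' implementation.

    # Hedged sketch of a TMU-like layer; assumes each pretrained source RNN
    # exposes a memory state of shape (B, D), stacked into a bank of shape (B, S, D).
    import torch
    import torch.nn as nn

    class TransferableMemorySketch(nn.Module):
        """Adaptively aggregates a bank of frozen source memory states
        and gates the result into the target network's hidden state."""

        def __init__(self, hidden_dim: int):
            super().__init__()
            self.query = nn.Linear(hidden_dim, hidden_dim)    # target state -> query
            self.key = nn.Linear(hidden_dim, hidden_dim)      # source memories -> keys
            self.gate = nn.Linear(2 * hidden_dim, hidden_dim) # fuse target + distilled memory

        def forward(self, h_target, memory_bank):
            # h_target: (B, D); memory_bank: (B, S, D), S = number of source models
            q = self.query(h_target).unsqueeze(1)                  # (B, 1, D)
            k = self.key(memory_bank)                              # (B, S, D)
            attn = torch.softmax((q * k).sum(-1), dim=-1)          # (B, S): per-source relevance
            distilled = (attn.unsqueeze(-1) * memory_bank).sum(1)  # (B, D): distilled memory
            g = torch.sigmoid(self.gate(torch.cat([h_target, distilled], dim=-1)))
            return g * h_target + (1 - g) * distilled              # gated fusion

    # Usage: 3 frozen source RNNs, batch of 8, hidden size 64.
    tmu = TransferableMemorySketch(hidden_dim=64)
    h = torch.randn(8, 64)
    bank = torch.randn(8, 3, 64)   # memory states collected from the pretrained zoo
    out = tmu(h, bank)             # (8, 64)

The attention weights let the target network emphasize whichever source model best explains the current dynamics, which is one plausible reading of "adaptively distills knowledge from a bank of memory states".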

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-yao20a,
  title     = {Unsupervised Transfer Learning for Spatiotemporal Predictive Networks},
  author    = {Yao, Zhiyu and Wang, Yunbo and Long, Mingsheng and Wang, Jianmin},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {10778--10788},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/yao20a/yao20a.pdf},
  url       = {https://proceedings.mlr.press/v119/yao20a.html}
}
Endnote
%0 Conference Paper
%T Unsupervised Transfer Learning for Spatiotemporal Predictive Networks
%A Zhiyu Yao
%A Yunbo Wang
%A Mingsheng Long
%A Jianmin Wang
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-yao20a
%I PMLR
%P 10778--10788
%U https://proceedings.mlr.press/v119/yao20a.html
%V 119
APA
Yao, Z., Wang, Y., Long, M., & Wang, J. (2020). Unsupervised Transfer Learning for Spatiotemporal Predictive Networks. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:10778-10788. Available from https://proceedings.mlr.press/v119/yao20a.html.