Test-Time Training with Self-Supervision for Generalization under Distribution Shifts

Yu Sun, Xiaolong Wang, Zhuang Liu, John Miller, Alexei Efros, Moritz Hardt
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:9229-9248, 2020.

Abstract

In this paper, we propose Test-Time Training, a general approach for improving the performance of predictive models when training and test data come from different distributions. We turn a single unlabeled test sample into a self-supervised learning problem, on which we update the model parameters before making a prediction. This also extends naturally to data in an online stream. Our simple approach leads to improvements on diverse image classification benchmarks aimed at evaluating robustness to distribution shifts.
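Below is a minimal PyTorch sketch of the test-time update the abstract describes. Rotation prediction (0/90/180/270 degrees) is the self-supervised task used in the paper; the module names (encoder, ssl_head, main_head), the optimizer choice, and the step count are illustrative assumptions, not the authors' exact configuration.

import copy
import torch
import torch.nn.functional as F

def rotate_batch(x):
    # Build the 4 rotated views of each image and their rotation labels.
    views = [torch.rot90(x, k, dims=(2, 3)) for k in range(4)]
    labels = torch.arange(4, device=x.device).repeat_interleave(x.size(0))
    return torch.cat(views, dim=0), labels

def test_time_train(encoder, ssl_head, x, steps=10, lr=1e-3):
    # Adapt a copy of the shared feature extractor on one unlabeled test
    # sample (or batch) by minimizing the rotation-prediction loss.
    # Only the encoder is updated; the self-supervised head stays fixed.
    encoder = copy.deepcopy(encoder)  # fresh copy per sample (offline TTT)
    opt = torch.optim.SGD(encoder.parameters(), lr=lr)
    views, rot_labels = rotate_batch(x)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(ssl_head(encoder(views)), rot_labels)
        loss.backward()
        opt.step()
    return encoder

@torch.no_grad()
def predict(encoder, main_head, x):
    # Predict with the adapted extractor and the unchanged main head.
    return main_head(encoder(x)).argmax(dim=1)

Usage would be: adapted = test_time_train(encoder, ssl_head, x_test) followed by predict(adapted, main_head, x_test). For the online-stream variant mentioned in the abstract, one carries the adapted encoder forward to the next test sample instead of restarting from a fresh copy each time.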

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-sun20b,
  title     = {Test-Time Training with Self-Supervision for Generalization under Distribution Shifts},
  author    = {Sun, Yu and Wang, Xiaolong and Liu, Zhuang and Miller, John and Efros, Alexei and Hardt, Moritz},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {9229--9248},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/sun20b/sun20b.pdf},
  url       = {https://proceedings.mlr.press/v119/sun20b.html},
  abstract  = {In this paper, we propose Test-Time Training, a general approach for improving the performance of predictive models when training and test data come from different distributions. We turn a single unlabeled test sample into a self-supervised learning problem, on which we update the model parameters before making a prediction. This also extends naturally to data in an online stream. Our simple approach leads to improvements on diverse image classification benchmarks aimed at evaluating robustness to distribution shifts.}
}
Endnote
%0 Conference Paper
%T Test-Time Training with Self-Supervision for Generalization under Distribution Shifts
%A Yu Sun
%A Xiaolong Wang
%A Zhuang Liu
%A John Miller
%A Alexei Efros
%A Moritz Hardt
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-sun20b
%I PMLR
%P 9229--9248
%U https://proceedings.mlr.press/v119/sun20b.html
%V 119
%X In this paper, we propose Test-Time Training, a general approach for improving the performance of predictive models when training and test data come from different distributions. We turn a single unlabeled test sample into a self-supervised learning problem, on which we update the model parameters before making a prediction. This also extends naturally to data in an online stream. Our simple approach leads to improvements on diverse image classification benchmarks aimed at evaluating robustness to distribution shifts.
APA
Sun, Y., Wang, X., Liu, Z., Miller, J., Efros, A. & Hardt, M. (2020). Test-Time Training with Self-Supervision for Generalization under Distribution Shifts. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:9229-9248. Available from https://proceedings.mlr.press/v119/sun20b.html.