Self-Supervised Pretraining and Transfer Learning Enable Flu and COVID-19 Predictions in Small Mobile Sensing Datasets

Mike A. Merrill, Tim Althoff
Proceedings of the Conference on Health, Inference, and Learning, PMLR 209:191-206, 2023.

Abstract

Detailed mobile sensing data from phones and fitness trackers offer an opportunity to quantify previously unmeasurable behavioral changes to improve individual health and accelerate responses to emerging diseases. Unlike in natural language processing and computer vision, deep learning has yet to broadly impact this domain, in which the majority of research and clinical applications still rely on manually defined features or even forgo predictive modeling altogether due to insufficient accuracy. This is due to unique challenges in the behavioral health domain, including very small datasets ($\sim\!\!10^1$ participants), which frequently contain missing data, consist of long time series with critical long-range dependencies (length $>10^4$), and extreme class imbalances ($>10^3$:1). Here, we describe a neural architecture for multivariate time series classification designed to address these unique domain challenges. Our proposed behavioral representation learning approach combines novel tasks for self-supervised pretraining and transfer learning to address data scarcity, and captures long-range dependencies across long-history time series through transformer self-attention following convolutional neural network-based dimensionality reduction. We propose an evaluation framework aimed at reflecting expected real-world performance in plausible deployment scenarios. Concretely, we demonstrate (1) performance improvements over baselines of up to 0.15 ROC AUC across five influenza-related prediction tasks, (2) transfer learning-induced performance improvements, including a 16% relative increase in PR AUC in small data scenarios, and (3) the potential of transfer learning in novel disease scenarios through an exploratory case study of zero-shot COVID-19 prediction in an independent data set. Finally, we discuss potential implications for medical surveillance testing.
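The architecture described above — a convolutional front-end that shortens a very long multivariate sensor sequence before transformer-style self-attention mixes the remaining steps — can be illustrated with a minimal numpy sketch. This is not the authors' model: the shapes, kernel size, stride, shared query/key projection, and random weights are all illustrative assumptions chosen only to show why the CNN stage makes self-attention tractable on $10^4$-step inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_downsample(x, w, stride):
    """Strided 1-D convolution: (T, C_in) -> (T', C_out). w has shape (k, C_in, C_out)."""
    k = w.shape[0]
    T_out = (x.shape[0] - k) // stride + 1
    return np.stack(
        [np.einsum("kc,kcd->d", x[t * stride : t * stride + k], w) for t in range(T_out)]
    )

def self_attention(h):
    """Single-head scaled dot-product self-attention over the shortened sequence (T', d)."""
    d = h.shape[1]
    scores = h @ h.T / np.sqrt(d)  # queries/keys share weights here purely for brevity
    a = np.exp(scores - scores.max(axis=1, keepdims=True))
    a /= a.sum(axis=1, keepdims=True)  # row-wise softmax
    return a @ h

# Toy multivariate sensor sequence: 10,000 time steps, 4 channels.
x = rng.normal(size=(10_000, 4))
w = rng.normal(size=(8, 4, 16)) * 0.1   # kernel length 8, 16 output channels
h = conv1d_downsample(x, w, stride=8)   # -> (1250, 16): attention now sees 1250 steps, not 10,000
z = self_attention(h).mean(axis=0)      # transformer-style mixing, then mean pooling
logit = float(z @ rng.normal(size=16))  # linear classification head
prob = 1 / (1 + np.exp(-logit))
print(h.shape, round(prob, 3))
```

The key design point the sketch captures is cost: self-attention is quadratic in sequence length, so reducing $10^4$ steps to $\sim\!10^3$ before attention cuts the attention matrix by roughly 64x while the convolution preserves local temporal structure.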

Cite this Paper


BibTeX
@InProceedings{pmlr-v209-merrill23a,
  title     = {Self-Supervised Pretraining and Transfer Learning Enable Flu and COVID-19 Predictions in Small Mobile Sensing Datasets},
  author    = {Merrill, Mike A. and Althoff, Tim},
  booktitle = {Proceedings of the Conference on Health, Inference, and Learning},
  pages     = {191--206},
  year      = {2023},
  editor    = {Mortazavi, Bobak J. and Sarker, Tasmie and Beam, Andrew and Ho, Joyce C.},
  volume    = {209},
  series    = {Proceedings of Machine Learning Research},
  month     = {22 Jun--24 Jun},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v209/merrill23a/merrill23a.pdf},
  url       = {https://proceedings.mlr.press/v209/merrill23a.html},
  abstract  = {Detailed mobile sensing data from phones and fitness trackers offer an opportunity to quantify previously unmeasurable behavioral changes to improve individual health and accelerate responses to emerging diseases. Unlike in natural language processing and computer vision, deep learning has yet to broadly impact this domain, in which the majority of research and clinical applications still rely on manually defined features or even forgo predictive modeling altogether due to insufficient accuracy. This is due to unique challenges in the behavioral health domain, including very small datasets ($\sim\!\!10^1$ participants), which frequently contain missing data, consist of long time series with critical long-range dependencies (length $>10^4$), and extreme class imbalances ($>10^3$:1). Here, we describe a neural architecture for multivariate time series classification designed to address these unique domain challenges. Our proposed behavioral representation learning approach combines novel tasks for self-supervised pretraining and transfer learning to address data scarcity, and captures long-range dependencies across long-history time series through transformer self-attention following convolutional neural network-based dimensionality reduction. We propose an evaluation framework aimed at reflecting expected real-world performance in plausible deployment scenarios. Concretely, we demonstrate (1) performance improvements over baselines of up to 0.15 ROC AUC across five influenza-related prediction tasks, (2) transfer learning-induced performance improvements, including a 16% relative increase in PR AUC in small data scenarios, and (3) the potential of transfer learning in novel disease scenarios through an exploratory case study of zero-shot COVID-19 prediction in an independent data set. Finally, we discuss potential implications for medical surveillance testing.}
}
Endnote
%0 Conference Paper
%T Self-Supervised Pretraining and Transfer Learning Enable Flu and COVID-19 Predictions in Small Mobile Sensing Datasets
%A Mike A. Merrill
%A Tim Althoff
%B Proceedings of the Conference on Health, Inference, and Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Bobak J. Mortazavi
%E Tasmie Sarker
%E Andrew Beam
%E Joyce C. Ho
%F pmlr-v209-merrill23a
%I PMLR
%P 191--206
%U https://proceedings.mlr.press/v209/merrill23a.html
%V 209
%X Detailed mobile sensing data from phones and fitness trackers offer an opportunity to quantify previously unmeasurable behavioral changes to improve individual health and accelerate responses to emerging diseases. Unlike in natural language processing and computer vision, deep learning has yet to broadly impact this domain, in which the majority of research and clinical applications still rely on manually defined features or even forgo predictive modeling altogether due to insufficient accuracy. This is due to unique challenges in the behavioral health domain, including very small datasets ($\sim\!\!10^1$ participants), which frequently contain missing data, consist of long time series with critical long-range dependencies (length $>10^4$), and extreme class imbalances ($>10^3$:1). Here, we describe a neural architecture for multivariate time series classification designed to address these unique domain challenges. Our proposed behavioral representation learning approach combines novel tasks for self-supervised pretraining and transfer learning to address data scarcity, and captures long-range dependencies across long-history time series through transformer self-attention following convolutional neural network-based dimensionality reduction. We propose an evaluation framework aimed at reflecting expected real-world performance in plausible deployment scenarios. Concretely, we demonstrate (1) performance improvements over baselines of up to 0.15 ROC AUC across five influenza-related prediction tasks, (2) transfer learning-induced performance improvements, including a 16% relative increase in PR AUC in small data scenarios, and (3) the potential of transfer learning in novel disease scenarios through an exploratory case study of zero-shot COVID-19 prediction in an independent data set. Finally, we discuss potential implications for medical surveillance testing.
APA
Merrill, M.A. & Althoff, T. (2023). Self-Supervised Pretraining and Transfer Learning Enable Flu and COVID-19 Predictions in Small Mobile Sensing Datasets. Proceedings of the Conference on Health, Inference, and Learning, in Proceedings of Machine Learning Research 209:191-206. Available from https://proceedings.mlr.press/v209/merrill23a.html.
