Contrastive Pretraining for Stress Detection with Multimodal Wearable Sensor Data and Surveys

Zeyu Yang, Han Yu, Akane Sano
Proceedings of the sixth Conference on Health, Inference, and Learning, PMLR 287:166-178, 2025.

Abstract

Stress adversely affects mental and physical health, underscoring the importance of early detection. Some studies have utilized physiological signals from wearable sensors and other information to monitor stress levels in daily life. Recent studies use self-supervised methods due to the high cost of collecting stress labels. However, self-supervised learning using both time series and tabular features such as demographics, traits, and contextual information has been understudied. Therefore, there is a need to further investigate how a model can be effectively trained with different granularities of multimodal data and a limited number of labels. In this study, we introduce a self-supervised multimodal learning approach for stress detection that combines time series and tabular features. Our proposed method presents a promising solution for effectively monitoring stress using multimodal data.
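The abstract describes contrastive pretraining that aligns wearable time-series windows with tabular survey features. A minimal NumPy sketch of that general idea is below; the encoders, feature dimensions, and temperature here are illustrative placeholders, not the paper's actual architecture. It pairs each sample's time-series embedding with its own tabular embedding as a positive and treats the rest of the batch as negatives (an InfoNCE-style loss):

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, w):
    """Toy linear encoder followed by L2 normalization (placeholder for a real network)."""
    z = x @ w
    return z / np.linalg.norm(z, axis=1, keepdims=True)

def info_nce_loss(z_ts, z_tab, temperature=0.1):
    """InfoNCE: each time-series window is matched to its own tabular row;
    all other rows in the batch serve as negatives."""
    logits = (z_ts @ z_tab.T) / temperature          # (B, B) cosine-similarity matrix
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))              # positives lie on the diagonal

# Hypothetical shapes: 8 samples, a 32-dim flattened sensor window, 5 tabular features
x_ts = rng.normal(size=(8, 32))
x_tab = rng.normal(size=(8, 5))
w_ts, w_tab = rng.normal(size=(32, 16)), rng.normal(size=(5, 16))

loss = info_nce_loss(encode(x_ts, w_ts), encode(x_tab, w_tab))
print(f"contrastive loss: {loss:.4f}")
```

After pretraining with such an objective, the time-series encoder can be fine-tuned on the limited stress labels, which is the usual motivation for this two-stage setup.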

Cite this Paper


BibTeX
@InProceedings{pmlr-v287-yang25a,
  title     = {Contrastive Pretraining for Stress Detection with Multimodal Wearable Sensor Data and Surveys},
  author    = {Yang, Zeyu and Yu, Han and Sano, Akane},
  booktitle = {Proceedings of the sixth Conference on Health, Inference, and Learning},
  pages     = {166--178},
  year      = {2025},
  editor    = {Xu, Xuhai Orson and Choi, Edward and Singhal, Pankhuri and Gerych, Walter and Tang, Shengpu and Agrawal, Monica and Subbaswamy, Adarsh and Sizikova, Elena and Dunn, Jessilyn and Daneshjou, Roxana and Sarker, Tasmie and McDermott, Matthew and Chen, Irene},
  volume    = {287},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Jun},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v287/main/assets/yang25a/yang25a.pdf},
  url       = {https://proceedings.mlr.press/v287/yang25a.html},
  abstract  = {Stress adversely affects mental and physical health and underscores the importance of early detection. Some studies have utilized physiological signals from wearable sensors and other information to monitor stress levels in daily life. Recent studies use self-supervised methods due to the high cost of collecting stress labels. However, self-supervised learning using both time series and tabular features such as demographics, traits, and contextual information has been understudied. Therefore, there is a need to further investigate how a model can be effectively trained with different granularity of multimodal data and limited number of labels. In this study, we introduce a self-supervised multimodal learning approach for stress detection that combines time series and tabular features. Our proposed method presents a promising solution for effectively monitoring stress using multimodal data.}
}
Endnote
%0 Conference Paper
%T Contrastive Pretraining for Stress Detection with Multimodal Wearable Sensor Data and Surveys
%A Zeyu Yang
%A Han Yu
%A Akane Sano
%B Proceedings of the sixth Conference on Health, Inference, and Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Xuhai Orson Xu
%E Edward Choi
%E Pankhuri Singhal
%E Walter Gerych
%E Shengpu Tang
%E Monica Agrawal
%E Adarsh Subbaswamy
%E Elena Sizikova
%E Jessilyn Dunn
%E Roxana Daneshjou
%E Tasmie Sarker
%E Matthew McDermott
%E Irene Chen
%F pmlr-v287-yang25a
%I PMLR
%P 166--178
%U https://proceedings.mlr.press/v287/yang25a.html
%V 287
%X Stress adversely affects mental and physical health and underscores the importance of early detection. Some studies have utilized physiological signals from wearable sensors and other information to monitor stress levels in daily life. Recent studies use self-supervised methods due to the high cost of collecting stress labels. However, self-supervised learning using both time series and tabular features such as demographics, traits, and contextual information has been understudied. Therefore, there is a need to further investigate how a model can be effectively trained with different granularity of multimodal data and limited number of labels. In this study, we introduce a self-supervised multimodal learning approach for stress detection that combines time series and tabular features. Our proposed method presents a promising solution for effectively monitoring stress using multimodal data.
APA
Yang, Z., Yu, H. & Sano, A. (2025). Contrastive Pretraining for Stress Detection with Multimodal Wearable Sensor Data and Surveys. Proceedings of the sixth Conference on Health, Inference, and Learning, in Proceedings of Machine Learning Research 287:166-178. Available from https://proceedings.mlr.press/v287/yang25a.html.