Automated Estimation of Food Type from Body-worn Audio and Motion Sensors in Free-Living Environments

Mark Mirtchouk, Dana L. McGuire, Andrea L. Deierlein, Samantha Kleinberg
Proceedings of the 4th Machine Learning for Healthcare Conference, PMLR 106:641-662, 2019.

Abstract

Nutrition is fundamental to maintaining health, managing chronic diseases, and preventing illness, but unlike physical activity there is not yet a way to unobtrusively and automatically measure nutrition. While recent work has shown that body-worn sensors can be used to identify meal times, to have an impact on health and fully replace manual food logs we need to identify not only when someone is eating, but also what they are consuming. However, it is challenging to collect labeled data in daily life, while lab data does not always generalize to reality. To address this, we develop new algorithms for semi-supervised hierarchical classification that enable higher accuracy when training on data with weak labels. Using this approach, we present the first results on automated classification of foods consumed, using data collected from body-worn audio and motion sensors in free-living environments. We show that by exploiting a mix of lab and free-living data, we can achieve a classification accuracy of 88% on unrestricted meals (e.g., stir fry, pizza, salad) in unrestricted environments such as home and restaurants. Ultimately, this lays the foundation for body-worn devices that can calculate calories and macronutrients by identifying food type and quantity.
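The abstract's two technical ingredients are hierarchical classification (predict a coarse food group, then a fine-grained food type within it) and semi-supervised learning over weakly labeled free-living data (e.g., the meal's contents are known, but not which sensed event corresponds to which food). The paper's own algorithms are not reproduced here; the sketch below is only a minimal, generic illustration of that combination, pairing a coarse classifier with per-group self-training. All feature data, label counts, and thresholds are invented for illustration.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.semi_supervised import SelfTrainingClassifier

    # Synthetic stand-ins for audio/motion features: rows are sensed
    # eating events, columns are acoustic and inertial features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 32))

    # Hierarchical labels: a coarse food group and a fine-grained type
    # nested inside it. -1 marks weakly labeled free-living events whose
    # fine-grained label is unknown.
    coarse = rng.integers(0, 3, size=600)           # e.g., crunchy / soft / liquid
    fine = rng.integers(0, 4, size=600) + 4 * coarse
    fine[rng.random(600) < 0.5] = -1                # half the events unlabeled

    # Step 1: coarse classifier trained on the fully labeled food groups.
    coarse_clf = RandomForestClassifier(n_estimators=100, random_state=0)
    coarse_clf.fit(X, coarse)

    # Step 2: within each group, self-training promotes confident
    # predictions on unlabeled events to pseudo-labels for the
    # fine-grained classes.
    fine_clfs = {}
    for g in np.unique(coarse):
        idx = coarse == g
        clf = SelfTrainingClassifier(
            RandomForestClassifier(n_estimators=100, random_state=0),
            threshold=0.8,
        )
        clf.fit(X[idx], fine[idx])
        fine_clfs[g] = clf

    # Inference: route each event through its predicted coarse group.
    def predict_food_type(x):
        g = coarse_clf.predict(x.reshape(1, -1))[0]
        return fine_clfs[g].predict(x.reshape(1, -1))[0]

    print(predict_food_type(X[0]))

The hierarchical routing is what lets scarce fine-grained labels be shared within a food group, which is the general motivation the abstract gives for training on weakly labeled free-living data alongside lab data.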

Cite this Paper

BibTeX
@InProceedings{pmlr-v106-mirtchouk19a,
  title     = {Automated Estimation of Food Type from Body-worn Audio and Motion Sensors in Free-Living Environments},
  author    = {Mirtchouk, Mark and McGuire, Dana L. and Deierlein, Andrea L. and Kleinberg, Samantha},
  booktitle = {Proceedings of the 4th Machine Learning for Healthcare Conference},
  pages     = {641--662},
  year      = {2019},
  editor    = {Doshi-Velez, Finale and Fackler, Jim and Jung, Ken and Kale, David and Ranganath, Rajesh and Wallace, Byron and Wiens, Jenna},
  volume    = {106},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--10 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v106/mirtchouk19a/mirtchouk19a.pdf},
  url       = {https://proceedings.mlr.press/v106/mirtchouk19a.html}
}
Endnote
%0 Conference Paper
%T Automated Estimation of Food Type from Body-worn Audio and Motion Sensors in Free-Living Environments
%A Mark Mirtchouk
%A Dana L. McGuire
%A Andrea L. Deierlein
%A Samantha Kleinberg
%B Proceedings of the 4th Machine Learning for Healthcare Conference
%C Proceedings of Machine Learning Research
%D 2019
%E Finale Doshi-Velez
%E Jim Fackler
%E Ken Jung
%E David Kale
%E Rajesh Ranganath
%E Byron Wallace
%E Jenna Wiens
%F pmlr-v106-mirtchouk19a
%I PMLR
%P 641--662
%U https://proceedings.mlr.press/v106/mirtchouk19a.html
%V 106
APA
Mirtchouk, M., McGuire, D. L., Deierlein, A. L. & Kleinberg, S. (2019). Automated Estimation of Food Type from Body-worn Audio and Motion Sensors in Free-Living Environments. Proceedings of the 4th Machine Learning for Healthcare Conference, in Proceedings of Machine Learning Research 106:641-662. Available from https://proceedings.mlr.press/v106/mirtchouk19a.html.