Automated Estimation of Food Type from Body-worn Audio and Motion Sensors in Free-Living Environments
Proceedings of the 4th Machine Learning for Healthcare Conference, PMLR 106:641-662, 2019.
Abstract
Nutrition is fundamental to maintaining health, managing chronic diseases, and preventing illness, but unlike physical activity, there is not yet a way to measure nutrition unobtrusively and automatically. Recent work has shown that body-worn sensors can identify meal times, but to have an impact on health and fully replace manual food logs, we must identify not only when someone is eating but also what they are consuming. However, labeled data are challenging to collect in daily life, and lab data do not always generalize to real-world conditions. To address this, we develop new algorithms for semi-supervised hierarchical classification that achieve higher accuracy when training on data with weak labels. Using this approach, we present the first results on automated classification of foods consumed, using data collected from body-worn audio and motion sensors in free-living environments. We show that by exploiting a mix of lab and free-living data, we can achieve a classification accuracy of 88% on unrestricted meals (e.g., stir fry, pizza, salad) eaten in unrestricted environments such as homes and restaurants. Ultimately, this lays the foundation for body-worn devices that can calculate calories and macronutrients by identifying food type and quantity.
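To make the hierarchical, mixed-supervision setup concrete, below is a minimal sketch, not the paper's actual algorithm (which the abstract does not detail). It assumes weak labels take the form of coarse food-group labels on free-living data, while lab data carry both coarse and fine labels; the feature dimensions, synthetic data, and choice of logistic regression are all illustrative assumptions.

```python
# Hypothetical sketch: two-stage hierarchical classification where weakly
# labeled (coarse-only) free-living data supervise the coarse stage and fully
# labeled lab data supervise the fine stage. Illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Lab data: full labels (coarse food group AND fine food type).
X_lab = rng.normal(size=(200, 16))               # e.g., audio/motion features
y_group_lab = rng.integers(0, 3, size=200)       # coarse group label
y_food_lab = y_group_lab * 2 + rng.integers(0, 2, size=200)  # fine type

# Free-living data: weak labels (coarse group only, no fine food type).
X_free = rng.normal(size=(300, 16))
y_group_free = rng.integers(0, 3, size=300)

# Stage 1: coarse classifier trained on BOTH datasets, since weak labels
# still provide supervision at the group level.
group_clf = LogisticRegression(max_iter=1000).fit(
    np.vstack([X_lab, X_free]),
    np.concatenate([y_group_lab, y_group_free]),
)

# Stage 2: one fine classifier per group, trained only where fine labels exist.
fine_clfs = {}
for g in np.unique(y_group_lab):
    mask = y_group_lab == g
    fine_clfs[g] = LogisticRegression(max_iter=1000).fit(
        X_lab[mask], y_food_lab[mask]
    )

def predict_food(x):
    """Predict a fine food type by routing through the coarse classifier."""
    g = int(group_clf.predict(x.reshape(1, -1))[0])
    return int(fine_clfs[g].predict(x.reshape(1, -1))[0])

print(predict_food(rng.normal(size=16)))
```

The design point this sketch captures is that coarse labels, even without fine food types, still constrain the first stage of the hierarchy, which is one way a mix of lab and free-living data can raise overall accuracy.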