DeepFaceLIFT: Interpretable Personalized Models for Automatic Estimation of Self-Reported Pain
Proceedings of IJCAI 2017 Workshop on Artificial Intelligence in Affective Computing, PMLR 66:1-16, 2017.
Abstract
Previous research on automatic pain estimation from facial expressions has focused primarily on “one-size-fits-all” metrics (such as PSPI). In this work, we focus on directly estimating each individual’s self-reported visual-analog scale (VAS) pain metric, as this is considered the gold standard for pain measurement. The VAS pain score is highly subjective and context-dependent, and its range can vary significantly among different persons. To tackle these issues, we propose a novel two-stage personalized model, named DeepFaceLIFT, for automatic estimation of VAS. This model is based on (1) neural network and (2) Gaussian process regression models, and is used to personalize the estimation of self-reported pain via a set of hand-crafted personal features and multi-task learning. We show on the benchmark dataset for pain analysis (the UNBC-McMaster Shoulder Pain Expression Archive) that the proposed personalized model largely outperforms traditional, unpersonalized models: the intra-class correlation improves from a baseline performance of 19% to a personalized performance of 35%, while the model also provides confidence in its estimates, in contrast to existing models for the target task. Additionally, DeepFaceLIFT automatically discovers the pain-relevant facial regions for each person, allowing for an easy interpretation of the pain-related facial cues.
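To make the two-stage pipeline concrete, below is a minimal sketch (not the authors' code) of the idea described in the abstract: a first-stage neural network maps per-frame facial features to pain scores, and a second-stage Gaussian process regressor combines pooled stage-1 outputs with hand-crafted personal features to produce a per-video VAS estimate with uncertainty. The feature dimensions, pooling statistics, and personal features used here are illustrative assumptions, as is the toy data; the paper's actual multi-task training and feature design differ.

    # Sketch of a two-stage NN + GP pipeline for personalized VAS estimation.
    # All shapes, features, and data below are hypothetical placeholders.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)

    # Toy data: 40 videos, 30 frames each, 66 facial features per frame,
    # and one self-reported VAS label (0-10) per video.
    n_videos, n_frames, n_feats = 40, 30, 66
    frames = rng.normal(size=(n_videos, n_frames, n_feats))
    vas = rng.uniform(0, 10, size=n_videos)

    # Stage 1: frame-level neural network trained against the video's VAS
    # label (a weak frame-level proxy for the paper's multi-task setup).
    X1 = frames.reshape(-1, n_feats)
    y1 = np.repeat(vas, n_frames)
    nn = MLPRegressor(hidden_layer_sizes=(200, 100), max_iter=500,
                      random_state=0).fit(X1, y1)

    # Pool the stage-1 frame scores into per-video summary statistics.
    frame_scores = nn.predict(X1).reshape(n_videos, n_frames)
    stats = np.column_stack([frame_scores.mean(1), frame_scores.std(1),
                             frame_scores.min(1), frame_scores.max(1)])

    # Hand-crafted personal features (hypothetical stand-ins here),
    # concatenated with the pooled scores as inputs to the GP.
    personal = rng.normal(size=(n_videos, 2))
    X2 = np.hstack([stats, personal])

    # Stage 2: GP regression yields a VAS estimate plus a confidence
    # interval, giving the model's uncertainty about each prediction.
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                  normalize_y=True).fit(X2, vas)
    mean, std = gp.predict(X2, return_std=True)
    print(f"video 0: VAS estimate {mean[0]:.1f} +/- {1.96 * std[0]:.1f}")

The GP stage is what supplies the confidence estimates mentioned above: unlike a plain neural network regressor, its predictive distribution comes with a standard deviation for every input.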