LIMESegment: Meaningful, Realistic Time Series Explanations
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:3418-3433, 2022.
Abstract
LIME (Local Interpretable Model-agnostic Explanations) has become a popular way of generating explanations for tabular, image and natural language models, providing insight into why an instance was given a particular classification. In this paper we adapt LIME to time series classification, an under-explored area in which existing approaches fail to account for the structure of this kind of data. We frame the non-trivial challenge of adapting LIME to time series classification as the following open questions: “What is a meaningful interpretable representation of a time series?”, “How does one realistically perturb a time series?” and “What is a local neighbourhood around a time series?”. We propose solutions to all three questions and combine them into a novel time series explanation framework called LIMESegment, which outperforms existing adaptations of LIME to time series on a variety of classification tasks.
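To make the three questions concrete, the sketch below shows a generic LIME-style pipeline for a univariate time series: split the series into an interpretable representation of segments, perturb subsets of segments, weight the perturbed samples by a locality kernel, and fit a weighted linear surrogate. This is only an illustration under naive assumptions (equal-width segments, zero-masking perturbation, an RBF kernel over the fraction of removed segments); it is not LIMESegment's method, whose answers to segmentation, perturbation and locality are the contribution of the paper. All function names and parameters here are hypothetical.

```python
# Illustrative sketch only: a generic LIME-style explanation for a univariate
# time series. Equal-width segmentation, zero-masking and the RBF locality
# kernel are naive placeholders, not the proposals of LIMESegment.
import numpy as np
from sklearn.linear_model import Ridge


def explain_series(series, predict_proba, n_segments=10, n_samples=200, kernel_width=0.25):
    """Return one importance weight per segment of `series`.

    predict_proba: callable mapping an array of shape (m, len(series))
                   to class-1 probabilities of shape (m,).
    """
    rng = np.random.default_rng(0)
    length = len(series)
    bounds = np.linspace(0, length, n_segments + 1, dtype=int)

    # Interpretable representation: binary mask, 1 = segment kept, 0 = masked.
    masks = rng.integers(0, 2, size=(n_samples, n_segments))
    masks[0] = 1  # keep the unperturbed instance in the sample set

    # Perturbation: replace masked segments with a (naive) zero background.
    perturbed = np.tile(series.astype(float), (n_samples, 1))
    for i, mask in enumerate(masks):
        for s in range(n_segments):
            if mask[s] == 0:
                perturbed[i, bounds[s]:bounds[s + 1]] = 0.0

    preds = predict_proba(perturbed)

    # Locality: weight samples by how few segments were removed.
    distances = 1.0 - masks.mean(axis=1)
    weights = np.exp(-(distances ** 2) / kernel_width ** 2)

    # Surrogate: weighted linear model over the binary segment masks.
    surrogate = Ridge(alpha=1.0)
    surrogate.fit(masks, preds, sample_weight=weights)
    return surrogate.coef_  # per-segment importance


if __name__ == "__main__":
    # Toy classifier that scores only the mean of the last quarter of the series,
    # so the final segments should receive the largest weights.
    series = np.sin(np.linspace(0, 6 * np.pi, 120))
    clf = lambda X: 1.0 / (1.0 + np.exp(-5.0 * X[:, 90:].mean(axis=1)))
    print(np.round(explain_series(series, clf), 3))
```

Each of the three design points in this sketch (how segments are chosen, what replaces a masked segment, and how sample locality is measured) corresponds to one of the open questions the paper addresses.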