Influence-Aware Attention for Multivariate Temporal Point Processes

Xiao Shou, Tian Gao, Dharmashankar Subramanian, Debarun Bhattacharjya, Kristin Bennett
Proceedings of the Second Conference on Causal Learning and Reasoning, PMLR 213:499-517, 2023.

Abstract

Identifying the subset of events that influence events of interest in continuous-time datasets is valuable in many applications. Existing methods, however, often fail to produce accurate and interpretable results in a time-efficient manner. In this paper, we propose a neural model, Influence-Aware Attention for Multivariate Temporal Point Processes (IAA-MTPP), which leverages the attention mechanism of transformers within a variational inference framework to capture temporal dynamics between event types, in contrast to existing instance-to-instance attention, while maintaining interpretability. Given event sequences and a prior influence matrix, IAA-MTPP efficiently learns an approximate posterior through an Attention-to-Influence mechanism, and then models the conditional likelihood of the sequences given a sampled influence matrix through an Influence-to-Attention formulation. Both steps are completed efficiently inside a B-block multi-head self-attention layer, so end-to-end training with the parallelizable transformer architecture is faster than with sequential models such as RNNs. We demonstrate strong empirical performance compared to existing baselines on multiple synthetic and real benchmarks, and include a qualitative analysis of an application in decentralized finance.
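To make the two mechanisms concrete, below is a minimal PyTorch sketch of one plausible reading of the abstract. It is our own illustration, not the authors' implementation: the pooling rule, the logistic-normal posterior, and the crude temporal encoding are all assumptions on our part.

# Hedged sketch of the two steps named in the abstract, under assumed parameterizations:
#  (1) Attention-to-Influence: pool type-to-type attention scores into the
#      parameters of an approximate posterior over a K x K influence matrix A.
#  (2) Influence-to-Attention: sample A (reparameterized) and use it to
#      modulate instance-level self-attention over the event sequence.
import torch
import torch.nn as nn
import torch.nn.functional as F

class InfluenceAwareAttention(nn.Module):
    def __init__(self, num_types: int, d_model: int = 32):
        super().__init__()
        self.num_types = num_types
        self.type_emb = nn.Embedding(num_types, d_model)
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # maps a pooled type-pair attention score to posterior (mu, log sigma)
        self.to_post = nn.Linear(1, 2)

    def forward(self, types, times):
        # types: (T,) long tensor of event-type ids; times: (T,) float tensor
        x = self.type_emb(types) + times.unsqueeze(-1)   # crude temporal encoding (assumption)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.t() / x.shape[-1] ** 0.5          # (T, T) instance-level scores

        # (1) Attention-to-Influence: average instance-pair scores within each
        # (type i, type j) pair to get one pooled statistic per type pair.
        onehot = F.one_hot(types, self.num_types).float()        # (T, K)
        counts = onehot.t() @ torch.ones_like(scores) @ onehot   # instance-pair counts
        pooled = (onehot.t() @ scores @ onehot) / counts.clamp(min=1.0)  # (K, K)
        mu, log_sigma = self.to_post(pooled.unsqueeze(-1)).unbind(-1)

        # Reparameterized sample of the influence matrix A (logistic-normal here;
        # the paper's exact posterior family is an assumption on our part).
        A = torch.sigmoid(mu + log_sigma.exp() * torch.randn_like(mu))   # (K, K)

        # (2) Influence-to-Attention: gate instance-level attention by the
        # sampled influence between the corresponding event types.
        gate = onehot @ A @ onehot.t()                       # lift A back to (T, T)
        causal = torch.tril(torch.ones_like(scores)).bool()  # only past events attend
        attn = torch.softmax(scores.masked_fill(~causal, -1e9), dim=-1) * gate
        attn = attn / attn.sum(-1, keepdim=True).clamp(min=1e-9)
        return attn @ v, A   # contextual embeddings and sampled influence matrix

# Example: four events over three event types.
model = InfluenceAwareAttention(num_types=3)
emb, A = model(torch.tensor([0, 2, 1, 0]), torch.tensor([0.1, 0.4, 0.9, 1.3]))
print(A.shape)   # torch.Size([3, 3]): type-to-type influence

In a full model, the sampled A would also enter a KL term against the prior influence matrix, and the gated embeddings would feed a point-process likelihood to form the ELBO; the sketch omits both. Gating rather than hard masking keeps gradients flowing from the likelihood back to the posterior parameters.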

Cite this Paper


BibTeX
@InProceedings{pmlr-v213-shou23a,
  title     = {Influence-Aware Attention for Multivariate Temporal Point Processes},
  author    = {Shou, Xiao and Gao, Tian and Subramanian, Dharmashankar and Bhattacharjya, Debarun and Bennett, Kristin},
  booktitle = {Proceedings of the Second Conference on Causal Learning and Reasoning},
  pages     = {499--517},
  year      = {2023},
  editor    = {van der Schaar, Mihaela and Zhang, Cheng and Janzing, Dominik},
  volume    = {213},
  series    = {Proceedings of Machine Learning Research},
  month     = {11--14 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v213/shou23a/shou23a.pdf},
  url       = {https://proceedings.mlr.press/v213/shou23a.html}
}
Endnote
%0 Conference Paper
%T Influence-Aware Attention for Multivariate Temporal Point Processes
%A Xiao Shou
%A Tian Gao
%A Dharmashankar Subramanian
%A Debarun Bhattacharjya
%A Kristin Bennett
%B Proceedings of the Second Conference on Causal Learning and Reasoning
%C Proceedings of Machine Learning Research
%D 2023
%E Mihaela van der Schaar
%E Cheng Zhang
%E Dominik Janzing
%F pmlr-v213-shou23a
%I PMLR
%P 499--517
%U https://proceedings.mlr.press/v213/shou23a.html
%V 213
APA
Shou, X., Gao, T., Subramanian, D., Bhattacharjya, D. & Bennett, K. (2023). Influence-Aware Attention for Multivariate Temporal Point Processes. Proceedings of the Second Conference on Causal Learning and Reasoning, in Proceedings of Machine Learning Research 213:499-517. Available from https://proceedings.mlr.press/v213/shou23a.html.
