Probabilistic Attention-to-Influence Neural Models for Event Sequences

Xiao Shou, Debarun Bhattacharjya, Tian Gao, Dharmashankar Subramanian, Oktie Hassanzadeh, Kristin Bennett
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:31657-31674, 2023.

Abstract

Discovering knowledge about which types of events influence others, using datasets of event sequences without time stamps, has several practical applications. While neural sequence models are able to capture complex and potentially long-range historical dependencies, they often lack the interpretability of simpler models for event sequence dynamics. We provide a novel neural framework in this setting, a probabilistic attention-to-influence neural model, which not only captures complex instance-wise interactions between events but also learns influencers for each event type of interest. Given event sequence data and a prior distribution on type-wise influence, we efficiently learn an approximate posterior for type-wise influence through an attention-to-influence transformation using variational inference. Our method then models the conditional likelihood of sequences by sampling from this posterior to focus attention on influencing event types. We motivate our general framework and show improved performance in experiments compared to existing baselines on synthetic data as well as real-world benchmarks, for tasks involving prediction and influencing set identification.
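To make the abstract's idea concrete, the following is a minimal sketch, not the authors' implementation: it assumes a relaxed-Bernoulli (Gumbel-sigmoid) variational posterior over binary type-wise influence indicators, whose samples gate attention over the event types in the history, trained with a negative-ELBO objective (prediction loss plus a KL term against a Bernoulli prior). All names, dimensions, and the binary occurrence-prediction target are illustrative assumptions.

# Hypothetical sketch of an attention-to-influence model (illustration only,
# not the paper's code). Assumes PyTorch; all module and argument names are
# made up for this example.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.distributions import Bernoulli, RelaxedBernoulli, kl_divergence

class AttentionToInfluence(nn.Module):
    def __init__(self, num_types, d_model=32, prior_p=0.5, temperature=0.5):
        super().__init__()
        self.embed = nn.Embedding(num_types, d_model)
        self.query = nn.Linear(d_model, d_model)
        self.key = nn.Linear(d_model, d_model)
        self.score = nn.Linear(d_model, 1)
        # Variational logits for P(type j influences type i), one row per target type.
        self.influence_logits = nn.Parameter(torch.zeros(num_types, num_types))
        self.prior = Bernoulli(probs=torch.tensor(prior_p))
        self.temperature = torch.tensor(temperature)
        self.d_model = d_model

    def forward(self, history, target_type, occurs):
        # history: (B, L) ids of past event types
        # target_type: (B,) the event type of interest
        # occurs: (B,) float in {0, 1}, whether the target type occurs next
        h = self.embed(history)                                # (B, L, D)
        q = self.query(self.embed(target_type)).unsqueeze(1)   # (B, 1, D)
        k = self.key(h)                                        # (B, L, D)
        scores = (q * k).sum(-1) / self.d_model ** 0.5         # (B, L)
        # Sample soft binary influence indicators (reparameterized) and look up
        # the indicator for each (target type, history type) pair.
        post = RelaxedBernoulli(self.temperature, logits=self.influence_logits)
        z = post.rsample()                                     # (K, K) in (0, 1)
        mask = z[target_type.unsqueeze(1), history]            # (B, L)
        # Gate attention toward sampled influencers of the target type.
        attn = F.softmax(scores + torch.log(mask + 1e-8), dim=-1)
        context = (attn.unsqueeze(-1) * h).sum(1)              # (B, D)
        logit = self.score(context).squeeze(-1)                # (B,)
        nll = F.binary_cross_entropy_with_logits(logit, occurs)
        kl = kl_divergence(Bernoulli(logits=self.influence_logits),
                           self.prior).mean()
        return nll + kl  # negative ELBO, up to constants

# Usage with random toy data:
model = AttentionToInfluence(num_types=5)
history = torch.randint(0, 5, (8, 10))
target = torch.randint(0, 5, (8,))
occurs = torch.randint(0, 2, (8,)).float()
loss = model(history, target, occurs)
loss.backward()

Under these assumptions, after training, torch.sigmoid(model.influence_logits)[i] would approximate the posterior probability that each event type influences type i, which is how an influencing set per target type could be read off.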

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-shou23a,
  title     = {Probabilistic Attention-to-Influence Neural Models for Event Sequences},
  author    = {Shou, Xiao and Bhattacharjya, Debarun and Gao, Tian and Subramanian, Dharmashankar and Hassanzadeh, Oktie and Bennett, Kristin},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {31657--31674},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/shou23a/shou23a.pdf},
  url       = {https://proceedings.mlr.press/v202/shou23a.html}
}
Endnote
%0 Conference Paper
%T Probabilistic Attention-to-Influence Neural Models for Event Sequences
%A Xiao Shou
%A Debarun Bhattacharjya
%A Tian Gao
%A Dharmashankar Subramanian
%A Oktie Hassanzadeh
%A Kristin Bennett
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-shou23a
%I PMLR
%P 31657--31674
%U https://proceedings.mlr.press/v202/shou23a.html
%V 202
APA
Shou, X., Bhattacharjya, D., Gao, T., Subramanian, D., Hassanzadeh, O., & Bennett, K. (2023). Probabilistic Attention-to-Influence Neural Models for Event Sequences. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:31657-31674. Available from https://proceedings.mlr.press/v202/shou23a.html.
