Imputing Missing Events in Continuous-Time Event Streams

Hongyuan Mei, Guanghui Qin, Jason Eisner
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4475-4485, 2019.

Abstract

Events in the world may be caused by other, unobserved events. We consider sequences of events in continuous time. Given a probability model of complete sequences, we propose particle smoothing—a form of sequential importance sampling—to impute the missing events in an incomplete sequence. We develop a trainable family of proposal distributions based on a type of bidirectional continuous-time LSTM: Bidirectionality lets the proposals condition on future observations, not just on the past as in particle filtering. Our method can sample an ensemble of possible complete sequences (particles), from which we form a single consensus prediction that has low Bayes risk under our chosen loss metric. We experiment in multiple synthetic and real domains, using different missingness mechanisms, and modeling the complete sequences in each domain with a neural Hawkes process (Mei & Eisner 2017). On held-out incomplete sequences, our method is effective at inferring the ground-truth unobserved events, with particle smoothing consistently improving upon particle filtering.
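To make the inference recipe concrete, below is a minimal, self-contained sketch of the importance-sampling idea behind particle smoothing. It uses toy stand-ins: log_p_complete and propose_missing are hypothetical placeholders (in the paper these roles are played by a neural Hawkes process and a bidirectional continuous-time LSTM proposal, respectively), and the consensus step is simplified to picking the highest-weight particle rather than minimizing Bayes risk over the weighted ensemble. It is an illustration of the weighting scheme, not the authors' implementation.

    import math, random

    def log_p_complete(seq):
        # Toy complete-sequence model (hypothetical): penalize large gaps
        # between consecutive event times. In the paper this would be the
        # log-probability under a neural Hawkes process.
        times = sorted(t for t, _ in seq)
        return -sum((b - a) ** 2 for a, b in zip(times, times[1:]))

    def propose_missing(observed):
        # Toy proposal (hypothetical): insert one missing event uniformly
        # between each pair of observed events, returning the imputed events
        # and the proposal log-density log q(z | x). The paper's proposal
        # instead conditions on past and future observations via a
        # bidirectional continuous-time LSTM.
        imputed, log_q = [], 0.0
        times = sorted(t for t, _ in observed)
        for a, b in zip(times, times[1:]):
            t = random.uniform(a, b)
            imputed.append((t, "missing"))
            log_q += -math.log(b - a)  # density of Uniform(a, b)
        return imputed, log_q

    def particle_smoothing(observed, num_particles=100):
        # Importance sampling: draw candidate imputations (particles) from the
        # proposal q and weight each by p(complete sequence) / q(imputation).
        particles, log_weights = [], []
        for _ in range(num_particles):
            imputed, log_q = propose_missing(observed)
            complete = sorted(observed + imputed)
            particles.append(imputed)
            log_weights.append(log_p_complete(complete) - log_q)
        # Normalize the weights in log space for numerical stability.
        m = max(log_weights)
        weights = [math.exp(lw - m) for lw in log_weights]
        total = sum(weights)
        return particles, [w / total for w in weights]

    observed = [(0.0, "obs"), (2.0, "obs"), (5.0, "obs")]
    particles, weights = particle_smoothing(observed)
    # Simplified consensus: report the highest-weight particle.
    best = particles[max(range(len(weights)), key=weights.__getitem__)]
    print(best, max(weights))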

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-mei19a,
  title     = {Imputing Missing Events in Continuous-Time Event Streams},
  author    = {Mei, Hongyuan and Qin, Guanghui and Eisner, Jason},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4475--4485},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/mei19a/mei19a.pdf},
  url       = {https://proceedings.mlr.press/v97/mei19a.html},
  abstract  = {Events in the world may be caused by other, unobserved events. We consider sequences of events in continuous time. Given a probability model of complete sequences, we propose particle smoothing—a form of sequential importance sampling—to impute the missing events in an incomplete sequence. We develop a trainable family of proposal distributions based on a type of bidirectional continuous-time LSTM: Bidirectionality lets the proposals condition on future observations, not just on the past as in particle filtering. Our method can sample an ensemble of possible complete sequences (particles), from which we form a single consensus prediction that has low Bayes risk under our chosen loss metric. We experiment in multiple synthetic and real domains, using different missingness mechanisms, and modeling the complete sequences in each domain with a neural Hawkes process (Mei & Eisner 2017). On held-out incomplete sequences, our method is effective at inferring the ground-truth unobserved events, with particle smoothing consistently improving upon particle filtering.}
}
Endnote
%0 Conference Paper
%T Imputing Missing Events in Continuous-Time Event Streams
%A Hongyuan Mei
%A Guanghui Qin
%A Jason Eisner
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-mei19a
%I PMLR
%P 4475--4485
%U https://proceedings.mlr.press/v97/mei19a.html
%V 97
%X Events in the world may be caused by other, unobserved events. We consider sequences of events in continuous time. Given a probability model of complete sequences, we propose particle smoothing—a form of sequential importance sampling—to impute the missing events in an incomplete sequence. We develop a trainable family of proposal distributions based on a type of bidirectional continuous-time LSTM: Bidirectionality lets the proposals condition on future observations, not just on the past as in particle filtering. Our method can sample an ensemble of possible complete sequences (particles), from which we form a single consensus prediction that has low Bayes risk under our chosen loss metric. We experiment in multiple synthetic and real domains, using different missingness mechanisms, and modeling the complete sequences in each domain with a neural Hawkes process (Mei & Eisner 2017). On held-out incomplete sequences, our method is effective at inferring the ground-truth unobserved events, with particle smoothing consistently improving upon particle filtering.
APA
Mei, H., Qin, G. & Eisner, J. (2019). Imputing Missing Events in Continuous-Time Event Streams. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4475-4485. Available from https://proceedings.mlr.press/v97/mei19a.html.

Related Material