Neural Datalog Through Time: Informed Temporal Modeling via Logical Specification

Hongyuan Mei, Guanghui Qin, Minjie Xu, Jason Eisner
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6808-6819, 2020.

Abstract

Learning how to predict future events from patterns of past events is difficult when the set of possible event types is large. Training an unrestricted neural model might overfit to spurious patterns. To exploit domain-specific knowledge of how past events might affect an event’s present probability, we propose using a temporal deductive database to track structured facts over time. Rules serve to prove facts from other facts and from past events. Each fact has a time-varying state—a vector computed by a neural net whose topology is determined by the fact’s provenance, including its experience of past events. The possible event types at any time are given by special facts, whose probabilities are neurally modeled alongside their states. In both synthetic and real-world domains, we show that neural probabilistic models derived from concise Datalog programs improve prediction by encoding appropriate domain knowledge in their architecture.
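The abstract's mechanism (facts carry time-varying state vectors that are updated when relevant events occur, and special event facts expose positive intensities) can be illustrated with a toy sketch. This is not the paper's implementation; the update rule, decay constant, and softplus intensity here are illustrative stand-ins for the neurally parameterized cells the paper derives from Datalog rules.

```python
import math

# Toy sketch (not the paper's implementation): a derived fact keeps a
# time-varying state vector; an observed event updates the states of the
# facts whose rules mention it, and a special "event" fact exposes a
# positive intensity computed from its state.

def update_state(state, event_embedding, decay=0.9):
    # Hypothetical update: decayed old state plus a nonlinear mix-in of the event.
    return [decay * s + math.tanh(e) for s, e in zip(state, event_embedding)]

def intensity(state, weights):
    # Softplus keeps the event intensity positive, as is standard in
    # temporal point process models.
    z = sum(w * s for w, s in zip(weights, state))
    return math.log1p(math.exp(z))

state = [0.0, 0.0]                      # initial state of one fact
state = update_state(state, [1.0, -0.5])  # fact "experiences" an event
rate = intensity(state, [0.5, 0.5])       # current intensity of a possible event
assert rate > 0
```

In the paper, the network topology of these updates is not hand-written as above but is determined by each fact's provenance under the Datalog program's rules.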

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-mei20a,
  title     = {Neural Datalog Through Time: Informed Temporal Modeling via Logical Specification},
  author    = {Mei, Hongyuan and Qin, Guanghui and Xu, Minjie and Eisner, Jason},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6808--6819},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/mei20a/mei20a.pdf},
  url       = {https://proceedings.mlr.press/v119/mei20a.html},
  abstract  = {Learning how to predict future events from patterns of past events is difficult when the set of possible event types is large. Training an unrestricted neural model might overfit to spurious patterns. To exploit domain-specific knowledge of how past events might affect an event’s present probability, we propose using a temporal deductive database to track structured facts over time. Rules serve to prove facts from other facts and from past events. Each fact has a time-varying state—a vector computed by a neural net whose topology is determined by the fact’s provenance, including its experience of past events. The possible event types at any time are given by special facts, whose probabilities are neurally modeled alongside their states. In both synthetic and real-world domains, we show that neural probabilistic models derived from concise Datalog programs improve prediction by encoding appropriate domain knowledge in their architecture.}
}
Endnote
%0 Conference Paper
%T Neural Datalog Through Time: Informed Temporal Modeling via Logical Specification
%A Hongyuan Mei
%A Guanghui Qin
%A Minjie Xu
%A Jason Eisner
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-mei20a
%I PMLR
%P 6808--6819
%U https://proceedings.mlr.press/v119/mei20a.html
%V 119
%X Learning how to predict future events from patterns of past events is difficult when the set of possible event types is large. Training an unrestricted neural model might overfit to spurious patterns. To exploit domain-specific knowledge of how past events might affect an event’s present probability, we propose using a temporal deductive database to track structured facts over time. Rules serve to prove facts from other facts and from past events. Each fact has a time-varying state—a vector computed by a neural net whose topology is determined by the fact’s provenance, including its experience of past events. The possible event types at any time are given by special facts, whose probabilities are neurally modeled alongside their states. In both synthetic and real-world domains, we show that neural probabilistic models derived from concise Datalog programs improve prediction by encoding appropriate domain knowledge in their architecture.
APA
Mei, H., Qin, G., Xu, M., & Eisner, J. (2020). Neural Datalog Through Time: Informed Temporal Modeling via Logical Specification. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research, 119:6808-6819. Available from https://proceedings.mlr.press/v119/mei20a.html.