Parallel Neurosymbolic Integration with Concordia

Jonathan Feldstein, Modestas Jurčius, Efthymia Tsamoura
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:9870-9885, 2023.

Abstract

Parallel neurosymbolic architectures have been applied effectively in NLP by distilling knowledge from a logic theory into a deep model. However, prior art faces several limitations, including supporting restricted forms of logic theories and relying on the assumption of independence between the logic and the deep network. We present Concordia, a framework overcoming the limitations of prior art. Concordia is agnostic both to the deep network and to the logic theory, offering support for a wide range of probabilistic theories. Our framework can support supervised training of both components and unsupervised training of the neural component. Concordia has been successfully applied to tasks beyond NLP and data classification, improving state-of-the-art accuracy on collective activity detection, entity linking, and recommendation tasks.

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-feldstein23a,
  title     = {Parallel Neurosymbolic Integration with Concordia},
  author    = {Feldstein, Jonathan and Jur\v{c}ius, Modestas and Tsamoura, Efthymia},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {9870--9885},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/feldstein23a/feldstein23a.pdf},
  url       = {https://proceedings.mlr.press/v202/feldstein23a.html},
  abstract  = {Parallel neurosymbolic architectures have been applied effectively in NLP by distilling knowledge from a logic theory into a deep model. However, prior art faces several limitations including supporting restricted forms of logic theories and relying on the assumption of independence between the logic and the deep network. We present Concordia, a framework overcoming the limitations of prior art. Concordia is agnostic both to the deep network and the logic theory offering support for a wide range of probabilistic theories. Our framework can support supervised training of both components and unsupervised training of the neural component. Concordia has been successfully applied to tasks beyond NLP and data classification, improving the accuracy of state-of-the-art on collective activity detection, entity linking and recommendation tasks.}
}
Endnote
%0 Conference Paper
%T Parallel Neurosymbolic Integration with Concordia
%A Jonathan Feldstein
%A Modestas Jurčius
%A Efthymia Tsamoura
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-feldstein23a
%I PMLR
%P 9870--9885
%U https://proceedings.mlr.press/v202/feldstein23a.html
%V 202
%X Parallel neurosymbolic architectures have been applied effectively in NLP by distilling knowledge from a logic theory into a deep model. However, prior art faces several limitations including supporting restricted forms of logic theories and relying on the assumption of independence between the logic and the deep network. We present Concordia, a framework overcoming the limitations of prior art. Concordia is agnostic both to the deep network and the logic theory offering support for a wide range of probabilistic theories. Our framework can support supervised training of both components and unsupervised training of the neural component. Concordia has been successfully applied to tasks beyond NLP and data classification, improving the accuracy of state-of-the-art on collective activity detection, entity linking and recommendation tasks.
APA
Feldstein, J., Jurčius, M. & Tsamoura, E. (2023). Parallel Neurosymbolic Integration with Concordia. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:9870-9885. Available from https://proceedings.mlr.press/v202/feldstein23a.html.