A Semantic Loss Function for Deep Learning with Symbolic Knowledge

Jingyi Xu, Zilu Zhang, Tal Friedman, Yitao Liang, Guy Van den Broeck
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:5502-5511, 2018.

Abstract

This paper develops a novel methodology for using symbolic knowledge in deep learning. From first principles, we derive a semantic loss function that bridges between neural output vectors and logical constraints. This loss function captures how close the neural network is to satisfying the constraints on its output. An experimental evaluation shows that it effectively guides the learner to achieve (near-)state-of-the-art results on semi-supervised multi-class classification. Moreover, it significantly increases the ability of the neural network to predict structured objects, such as rankings and paths. These discrete concepts are tremendously difficult to learn, and benefit from a tight integration of deep learning and symbolic reasoning methods.
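To make the idea concrete: the semantic loss of a constraint is the negative log-probability that independently sampling each binary output according to the network's predicted probabilities yields an assignment satisfying the constraint (a weighted model count). The sketch below is a minimal, hypothetical illustration for the simple "exactly-one" constraint used in multi-class classification; it is not the paper's general circuit-based implementation, which handles arbitrary logical sentences.

```python
import math

def semantic_loss_exactly_one(p):
    """Semantic loss for the 'exactly-one' constraint over n binary outputs.

    Computes -log of the probability that sampling each x_i independently
    (x_i = 1 with probability p[i]) produces an assignment with exactly one
    x_i set to 1, i.e. -log of the constraint's weighted model count.
    """
    n = len(p)
    # Sum over the n satisfying assignments: output i is 1, all others are 0.
    wmc = sum(
        p[i] * math.prod(1.0 - p[j] for j in range(n) if j != i)
        for i in range(n)
    )
    return -math.log(wmc)
```

A confident one-hot prediction such as `[1.0, 0.0, 0.0]` satisfies the constraint with probability 1 and incurs zero loss, while a uniform prediction is penalized; adding this term to the usual training loss nudges unlabeled predictions toward constraint-satisfying outputs.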

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-xu18h,
  title     = {A Semantic Loss Function for Deep Learning with Symbolic Knowledge},
  author    = {Xu, Jingyi and Zhang, Zilu and Friedman, Tal and Liang, Yitao and Van den Broeck, Guy},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {5502--5511},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/xu18h/xu18h.pdf},
  url       = {http://proceedings.mlr.press/v80/xu18h.html},
  abstract  = {This paper develops a novel methodology for using symbolic knowledge in deep learning. From first principles, we derive a semantic loss function that bridges between neural output vectors and logical constraints. This loss function captures how close the neural network is to satisfying the constraints on its output. An experimental evaluation shows that it effectively guides the learner to achieve (near-)state-of-the-art results on semi-supervised multi-class classification. Moreover, it significantly increases the ability of the neural network to predict structured objects, such as rankings and paths. These discrete concepts are tremendously difficult to learn, and benefit from a tight integration of deep learning and symbolic reasoning methods.}
}
Endnote
%0 Conference Paper
%T A Semantic Loss Function for Deep Learning with Symbolic Knowledge
%A Jingyi Xu
%A Zilu Zhang
%A Tal Friedman
%A Yitao Liang
%A Guy Van den Broeck
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-xu18h
%I PMLR
%P 5502--5511
%U http://proceedings.mlr.press/v80/xu18h.html
%V 80
%X This paper develops a novel methodology for using symbolic knowledge in deep learning. From first principles, we derive a semantic loss function that bridges between neural output vectors and logical constraints. This loss function captures how close the neural network is to satisfying the constraints on its output. An experimental evaluation shows that it effectively guides the learner to achieve (near-)state-of-the-art results on semi-supervised multi-class classification. Moreover, it significantly increases the ability of the neural network to predict structured objects, such as rankings and paths. These discrete concepts are tremendously difficult to learn, and benefit from a tight integration of deep learning and symbolic reasoning methods.
APA
Xu, J., Zhang, Z., Friedman, T., Liang, Y. & Van den Broeck, G. (2018). A Semantic Loss Function for Deep Learning with Symbolic Knowledge. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:5502-5511. Available from http://proceedings.mlr.press/v80/xu18h.html.