DL2: Training and Querying Neural Networks with Logic

Marc Fischer, Mislav Balunovic, Dana Drachsler-Cohen, Timon Gehr, Ce Zhang, Martin Vechev
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:1931-1941, 2019.

Abstract

We present DL2, a system for training and querying neural networks with logical constraints. Using DL2, one can declaratively specify domain knowledge constraints to be enforced during training, as well as pose queries on the model to find inputs that satisfy a set of constraints. DL2 works by translating logical constraints into a loss function with desirable mathematical properties. The loss is then minimized with standard gradient-based methods. We evaluate DL2 by training networks with interesting constraints in unsupervised, semi-supervised and supervised settings. Our experimental evaluation demonstrates that DL2 is more expressive than prior approaches combining logic and neural networks, and its loss functions are better suited for optimization. Further, we show that for a number of queries, DL2 can find the desired inputs in seconds (even for large models such as ResNet-50 on ImageNet).
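The core idea, translating logical constraints into a loss that is zero exactly when the constraint is satisfied, can be illustrated with a minimal sketch. The function names and rule set below are illustrative assumptions, not the authors' implementation: a comparison t ≤ t' maps to max(t − t', 0), conjunction to a sum of losses, and disjunction to a product, so standard gradient-based minimization drives the constraint toward satisfaction.

```python
# Hypothetical sketch of a constraint-to-loss translation in the spirit of DL2.
# Names (le, conj, disj) are illustrative, not from the paper's codebase.

def le(a: float, b: float) -> float:
    """Loss for the constraint a <= b: zero iff satisfied, else the violation."""
    return max(a - b, 0.0)

def conj(*losses: float) -> float:
    """Conjunction: sum of sub-losses, zero iff every conjunct is satisfied."""
    return sum(losses)

def disj(*losses: float) -> float:
    """Disjunction: product of sub-losses, zero iff some disjunct is satisfied."""
    p = 1.0
    for loss in losses:
        p *= loss
    return p

# Example: the constraint (x <= 1) or (2 <= x) as a loss over a scalar x.
def outside_unit_interval_loss(x: float) -> float:
    return disj(le(x, 1.0), le(2.0, x))
```

Because each translated loss is non-negative and vanishes exactly on satisfying inputs, it can be added to a training objective or minimized directly over the input to answer a query.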

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-fischer19a,
  title     = {{DL}2: Training and Querying Neural Networks with Logic},
  author    = {Fischer, Marc and Balunovic, Mislav and Drachsler-Cohen, Dana and Gehr, Timon and Zhang, Ce and Vechev, Martin},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {1931--1941},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/fischer19a/fischer19a.pdf},
  url       = {https://proceedings.mlr.press/v97/fischer19a.html},
  abstract  = {We present DL2, a system for training and querying neural networks with logical constraints. Using DL2, one can declaratively specify domain knowledge constraints to be enforced during training, as well as pose queries on the model to find inputs that satisfy a set of constraints. DL2 works by translating logical constraints into a loss function with desirable mathematical properties. The loss is then minimized with standard gradient-based methods. We evaluate DL2 by training networks with interesting constraints in unsupervised, semi-supervised and supervised settings. Our experimental evaluation demonstrates that DL2 is more expressive than prior approaches combining logic and neural networks, and its loss functions are better suited for optimization. Further, we show that for a number of queries, DL2 can find the desired inputs in seconds (even for large models such as ResNet-50 on ImageNet).}
}
Endnote
%0 Conference Paper
%T DL2: Training and Querying Neural Networks with Logic
%A Marc Fischer
%A Mislav Balunovic
%A Dana Drachsler-Cohen
%A Timon Gehr
%A Ce Zhang
%A Martin Vechev
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-fischer19a
%I PMLR
%P 1931--1941
%U https://proceedings.mlr.press/v97/fischer19a.html
%V 97
%X We present DL2, a system for training and querying neural networks with logical constraints. Using DL2, one can declaratively specify domain knowledge constraints to be enforced during training, as well as pose queries on the model to find inputs that satisfy a set of constraints. DL2 works by translating logical constraints into a loss function with desirable mathematical properties. The loss is then minimized with standard gradient-based methods. We evaluate DL2 by training networks with interesting constraints in unsupervised, semi-supervised and supervised settings. Our experimental evaluation demonstrates that DL2 is more expressive than prior approaches combining logic and neural networks, and its loss functions are better suited for optimization. Further, we show that for a number of queries, DL2 can find the desired inputs in seconds (even for large models such as ResNet-50 on ImageNet).
APA
Fischer, M., Balunovic, M., Drachsler-Cohen, D., Gehr, T., Zhang, C. & Vechev, M. (2019). DL2: Training and Querying Neural Networks with Logic. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:1931-1941. Available from https://proceedings.mlr.press/v97/fischer19a.html.