Pylon: A PyTorch Framework for Learning with Constraints

Kareem Ahmed, Tao Li, Thy Ton, Quan Guo, Kai-Wei Chang, Parisa Kordjamshidi, Vivek Srikumar, Guy Van den Broeck, Sameer Singh
Proceedings of the NeurIPS 2021 Competitions and Demonstrations Track, PMLR 176:319-324, 2022.

Abstract

Deep learning excels at learning low-level task information from large amounts of data, but struggles with learning high-level domain knowledge, which can often be directly and succinctly expressed. In this work, we introduce Pylon, a neuro-symbolic training framework that builds on PyTorch to augment procedurally trained neural networks with declaratively specified knowledge. Pylon allows users to programmatically specify constraints as PyTorch functions, and compiles them into a differentiable loss, thus training predictive models that fit the data while satisfying the specified constraints. Pylon includes both exact and approximate compilers to efficiently compute the loss, employing fuzzy logic, sampling methods, and circuits, ensuring scalability even to complex models and constraints. A guiding principle in designing Pylon has been the ease with which any existing deep learning codebase can be extended to learn from constraints using only a few lines: a function expressing the constraint and a single line of code to compile it into a loss. We include case studies from natural language processing, computer vision, logical games, and knowledge graphs, that can be trained interactively and highlight Pylon's usage.
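The core idea described in the abstract — compiling a declarative constraint into a differentiable loss — can be illustrated with a minimal, self-contained sketch. This is not Pylon's actual API; the function names here are hypothetical. It mimics an "exact compiler" by enumerating all joint label assignments, summing the probability mass of those that satisfy a user-written boolean constraint, and returning the negative log-probability of satisfaction (which a real framework would backpropagate through the model's predicted distributions):

```python
from itertools import product
from math import log, prod

def compile_constraint_loss(constraint, probs):
    """Exact semantic loss by enumeration (conceptual sketch).

    constraint: function mapping a tuple of discrete labels to True/False.
    probs: list of per-variable probability distributions, e.g. the
           softmax outputs of a model for each variable.
    Returns -log P(constraint is satisfied) under the factorized
    distribution defined by `probs`.
    """
    p_satisfied = 0.0
    # Enumerate every joint assignment of labels to variables.
    for assignment in product(*[range(len(p)) for p in probs]):
        weight = prod(probs[i][label] for i, label in enumerate(assignment))
        if constraint(assignment):
            p_satisfied += weight
    return -log(p_satisfied)

# Example: two binary variables, constraint "exactly one is 1".
# P(satisfied) = 0.3*0.4 + 0.7*0.6 = 0.54, so loss = -log(0.54).
loss = compile_constraint_loss(lambda a: sum(a) == 1,
                               [[0.3, 0.7], [0.6, 0.4]])
```

Enumeration is exponential in the number of variables, which is why the paper also describes approximate compilers (fuzzy logic, sampling) and circuit-based compilation for tractable exact computation at scale.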

Cite this Paper


BibTeX
@InProceedings{pmlr-v176-ahmed22a,
  title     = {Pylon: A PyTorch Framework for Learning with Constraints},
  author    = {Ahmed, Kareem and Li, Tao and Ton, Thy and Guo, Quan and Chang, Kai-Wei and Kordjamshidi, Parisa and Srikumar, Vivek and Van den Broeck, Guy and Singh, Sameer},
  booktitle = {Proceedings of the NeurIPS 2021 Competitions and Demonstrations Track},
  pages     = {319--324},
  year      = {2022},
  editor    = {Kiela, Douwe and Ciccone, Marco and Caputo, Barbara},
  volume    = {176},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--14 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v176/ahmed22a/ahmed22a.pdf},
  url       = {https://proceedings.mlr.press/v176/ahmed22a.html},
  abstract  = {Deep learning excels at learning low-level task information from large amounts of data, but struggles with learning high-level domain knowledge, which can often be directly and succinctly expressed. In this work, we introduce Pylon, a neuro-symbolic training framework that builds on PyTorch to augment procedurally trained neural networks with declaratively specified knowledge. Pylon allows users to programmatically specify constraints as PyTorch functions, and compiles them into a differentiable loss, thus training predictive models that fit the data while satisfying the specified constraints. Pylon includes both exact and approximate compilers to efficiently compute the loss, employing fuzzy logic, sampling methods, and circuits, ensuring scalability even to complex models and constraints. A guiding principle in designing Pylon has been the ease with which any existing deep learning codebase can be extended to learn from constraints using only a few lines: a function expressing the constraint and a single line of code to compile it into a loss. We include case studies from natural language processing, computer vision, logical games, and knowledge graphs, that can be trained interactively and highlight Pylon{'}s usage.}
}
Endnote
%0 Conference Paper
%T Pylon: A PyTorch Framework for Learning with Constraints
%A Kareem Ahmed
%A Tao Li
%A Thy Ton
%A Quan Guo
%A Kai-Wei Chang
%A Parisa Kordjamshidi
%A Vivek Srikumar
%A Guy Van den Broeck
%A Sameer Singh
%B Proceedings of the NeurIPS 2021 Competitions and Demonstrations Track
%C Proceedings of Machine Learning Research
%D 2022
%E Douwe Kiela
%E Marco Ciccone
%E Barbara Caputo
%F pmlr-v176-ahmed22a
%I PMLR
%P 319--324
%U https://proceedings.mlr.press/v176/ahmed22a.html
%V 176
%X Deep learning excels at learning low-level task information from large amounts of data, but struggles with learning high-level domain knowledge, which can often be directly and succinctly expressed. In this work, we introduce Pylon, a neuro-symbolic training framework that builds on PyTorch to augment procedurally trained neural networks with declaratively specified knowledge. Pylon allows users to programmatically specify constraints as PyTorch functions, and compiles them into a differentiable loss, thus training predictive models that fit the data while satisfying the specified constraints. Pylon includes both exact and approximate compilers to efficiently compute the loss, employing fuzzy logic, sampling methods, and circuits, ensuring scalability even to complex models and constraints. A guiding principle in designing Pylon has been the ease with which any existing deep learning codebase can be extended to learn from constraints using only a few lines: a function expressing the constraint and a single line of code to compile it into a loss. We include case studies from natural language processing, computer vision, logical games, and knowledge graphs, that can be trained interactively and highlight Pylon's usage.
APA
Ahmed, K., Li, T., Ton, T., Guo, Q., Chang, K., Kordjamshidi, P., Srikumar, V., Van den Broeck, G. & Singh, S. (2022). Pylon: A PyTorch Framework for Learning with Constraints. Proceedings of the NeurIPS 2021 Competitions and Demonstrations Track, in Proceedings of Machine Learning Research 176:319-324. Available from https://proceedings.mlr.press/v176/ahmed22a.html.