Augment with Care: Contrastive Learning for Combinatorial Problems

Haonan Duan, Pashootan Vaezipoor, Max B Paulus, Yangjun Ruan, Chris Maddison
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:5627-5642, 2022.

Abstract

Supervised learning can improve the design of state-of-the-art solvers for combinatorial problems, but labelling large numbers of combinatorial instances is often impractical due to exponential worst-case complexity. Inspired by the recent success of contrastive pre-training for images, we conduct a scientific study of the effect of augmentation design on contrastive pre-training for the Boolean satisfiability problem. While typical graph contrastive pre-training uses label-agnostic augmentations, our key insight is that many combinatorial problems have well-studied invariances, which allow for the design of label-preserving augmentations. We find that label-preserving augmentations are critical for the success of contrastive pre-training. We show that our representations are able to achieve comparable test accuracy to fully-supervised learning while using only 1% of the labels. We also demonstrate that our representations are more transferable to larger problems from unseen domains. Our code is available at https://github.com/h4duan/contrastive-sat.
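The augmentations studied are satisfiability-preserving transformations of CNF formulas. As a concrete illustration, the sketch below shows three transformations that provably preserve the SAT/UNSAT label: renaming variables by a permutation, flipping the polarity of a variable everywhere it occurs, and reordering clauses and literals. This is our own minimal sketch, not code from the paper; the function names are illustrative and the paper's exact augmentation set may differ.

import random

# A CNF formula is a list of clauses; a clause is a list of non-zero ints
# (DIMACS convention: variable v appears as v, its negation as -v).

def permute_variables(clauses, num_vars, rng=random):
    """Rename variables by a random permutation; satisfiability is unchanged."""
    perm = list(range(1, num_vars + 1))
    rng.shuffle(perm)
    rename = {v: perm[v - 1] for v in range(1, num_vars + 1)}
    return [[rename[abs(l)] * (1 if l > 0 else -1) for l in c] for c in clauses]

def flip_polarities(clauses, num_vars, p=0.5, rng=random):
    """Negate every occurrence of a random subset of variables.
    Substituting -x for x throughout is a label-preserving relabelling."""
    flipped = {v for v in range(1, num_vars + 1) if rng.random() < p}
    return [[-l if abs(l) in flipped else l for l in c] for c in clauses]

def shuffle_clauses(clauses, rng=random):
    """Reorder clauses and the literals within them; conjunction and
    disjunction are commutative, so the formula is logically unchanged."""
    out = [list(c) for c in clauses]
    for c in out:
        rng.shuffle(c)
    rng.shuffle(out)
    return out

Pre-training itself follows the image-style contrastive recipe the abstract alludes to: two augmented views of each instance are encoded and pulled together under a SimCLR-style NT-Xent objective. Below is a minimal PyTorch version of that loss, again a sketch under the assumption of SimCLR-style training rather than the paper's implementation:

import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent loss: z1[i] and z2[i] embed two augmented views of instance i."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, d), unit norm
    sim = z @ z.t() / temperature                        # scaled cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float('-inf'))                # exclude self-pairs
    # The positive for row i is row i+N (and vice versa); all others are negatives.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)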

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-duan22b,
  title     = {Augment with Care: Contrastive Learning for Combinatorial Problems},
  author    = {Duan, Haonan and Vaezipoor, Pashootan and Paulus, Max B and Ruan, Yangjun and Maddison, Chris},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {5627--5642},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/duan22b/duan22b.pdf},
  url       = {https://proceedings.mlr.press/v162/duan22b.html},
}
APA
Duan, H., Vaezipoor, P., Paulus, M. B., Ruan, Y., & Maddison, C. (2022). Augment with Care: Contrastive Learning for Combinatorial Problems. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:5627-5642. Available from https://proceedings.mlr.press/v162/duan22b.html.
