Supervised learning can improve the design of state-of-the-art solvers for combinatorial problems, but labelling large numbers of combinatorial instances is often impractical due to exponential worst-case complexity. Inspired by the recent success of contrastive pre-training for images, we conduct a scientific study of the effect of augmentation design on contrastive pre-training for the Boolean satisfiability problem. While typical graph contrastive pre-training uses label-agnostic augmentations, our key insight is that many combinatorial problems have well-studied invariances, which allow for the design of label-preserving augmentations. We find that label-preserving augmentations are critical for the success of contrastive pre-training. We show that our representations are able to achieve comparable test accuracy to fully-supervised learning while using only 1% of the labels. We also demonstrate that our representations are more transferable to larger problems from unseen domains. Our code is available at
Augment with Care: Contrastive Learning for Combinatorial Problems. H. Duan, P. Vaezipoor, M. B. Paulus, Y. Ruan, C. Maddison. In International Conference on Machine Learning, 2022.
Bibtex Entry:
@inproceedings{duan2022augment,
	author = {Duan, Haonan and Vaezipoor, Pashootan and Paulus, Max Benedikt and Ruan, Yangjun and Maddison, Chris},
	booktitle = {International Conference on Machine Learning},
	organization = {PMLR},
	pages = {5627--5642},
	title = {Augment with Care: Contrastive Learning for Combinatorial Problems},
	year = {2022}}