by Jian Zhang, Josip Djolonga, Andreas Krause
Higher-order models have been shown to be useful for a wide range of computer vision tasks. However, existing techniques have focused mainly on MAP inference. In this paper, we present the first efficient approach to approximate Bayesian marginal inference in a general class of high-order, multi-label attractive models, where previous techniques slow down exponentially with the order (clique size). We formalize this task as performing inference in log-supermodular models under partition constraints, and present an efficient variational inference technique. The resulting optimization problems are convex and yield bounds on the partition function. We also obtain a fully factorized approximation to the posterior, which can be used in lieu of the true, more complex distribution. We empirically demonstrate the performance of our approach by comparing it to traditional inference methods on a challenging high-fidelity multi-label image segmentation dataset. We obtain state-of-the-art classification accuracy for MAP inference, and substantially improved ROC curves using the approximate marginals.
Higher-Order Inference for Multi-class Log-supermodular Models. Jian Zhang, Josip Djolonga, Andreas Krause. In International Conference on Computer Vision (ICCV), 2015.
Bibtex Entry (citation key assumed):
@inproceedings{zhang2015higher,
	Author = {Jian Zhang and Josip Djolonga and Andreas Krause},
	Booktitle = {International Conference on Computer Vision (ICCV)},
	Month = {December},
	Title = {Higher-Order Inference for Multi-class Log-supermodular Models},
	Year = {2015}}
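The partition-function bounds mentioned in the abstract can be illustrated, in the simplest binary (two-label) setting, by a standard fact about log-supermodular models p(A) ∝ exp(-F(A)) with F submodular: any point s of the base polytope of F satisfies s(A) ≤ F(A) for all A, so the modular distribution it induces gives log Z ≤ Σ_i log(1 + exp(-s_i)). The sketch below is not code from the paper; the toy function F(A) = sqrt(|A|) and the helper names are illustrative assumptions, and the bound shown is the binary-case precursor that the paper generalizes to the multi-label setting.

```python
import itertools
import math


def submodular_F(A):
    # Toy submodular function (concave of cardinality): F(A) = sqrt(|A|).
    return math.sqrt(len(A))


def greedy_base_vertex(F, order):
    # Edmonds' greedy algorithm: marginal gains along an ordering form a
    # vertex s of the base polytope, so s(A) <= F(A) for every subset A.
    s, prefix = {}, []
    for i in order:
        prev = F(prefix)
        prefix.append(i)
        s[i] = F(prefix) - prev
    return s


def exact_log_partition(F, V):
    # Brute-force log Z = log sum_{A subset of V} exp(-F(A));
    # only feasible for a tiny ground set V.
    Z = 0.0
    for r in range(len(V) + 1):
        for A in itertools.combinations(V, r):
            Z += math.exp(-F(list(A)))
    return math.log(Z)


def modular_upper_bound(s):
    # Since s(A) <= F(A), exp(-F(A)) <= exp(-s(A)), and summing the
    # modular terms factorizes: log Z <= sum_i log(1 + exp(-s_i)).
    return sum(math.log1p(math.exp(-si)) for si in s.values())


V = [0, 1, 2]
s = greedy_base_vertex(submodular_F, V)
print("exact log Z:", exact_log_partition(submodular_F, V))
print("modular upper bound:", modular_upper_bound(s))
```

Optimizing this bound over all base polytope points (a convex problem) yields the tightest such factorized approximation; the paper's contribution is extending this machinery to multi-label models via partition constraints.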