by Junyao Zhao, Josip Djolonga, Sebastian Tschiatschek, and Andreas Krause
While central to the application of probabilistic models to discrete data, the problem of marginal inference is in general intractable, and efficient approximation schemes need to exploit the problem structure. Recently, there have been efforts to develop inference techniques that do not necessarily make factorization assumptions about the distribution, but instead exploit the fact that efficient algorithms sometimes exist for finding the MAP configuration. In this paper, we theoretically prove that for discrete multi-label models the bounds on the partition function obtained by two of these approaches, Perturb-and-MAP and the bound from the infinite Rényi divergence, can only improve when any subset of the variables is clamped. For the case of log-supermodular models, we provide a more detailed analysis and develop a set of efficient strategies for choosing the order in which the variables should be clamped. Finally, we present a number of numerical experiments showcasing the improvements obtained by the proposed methods on several models.
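To illustrate the clamping identity that underlies this line of work (this is an illustrative sketch of the general principle, not the paper's algorithm): fixing a variable splits the partition function exactly into a sum over the clamped values, so applying an upper bound to each sub-problem separately can tighten the overall bound. The toy model and function names below are assumptions for the example.

```python
import itertools
import math

def partition_function(theta, n, clamp=None):
    """Brute-force partition function Z = sum_x exp(score(x)) over x in {0,1}^n.

    theta: toy log-linear model, a dict mapping index tuples to weights,
           e.g. {(0,): 0.5} is a unary term and {(0, 1): 1.0} a pairwise term.
    clamp: optional dict {variable index: fixed value in {0, 1}}.
    """
    clamp = clamp or {}
    free = [i for i in range(n) if i not in clamp]
    Z = 0.0
    for bits in itertools.product([0, 1], repeat=len(free)):
        x = dict(clamp)
        x.update(zip(free, bits))
        score = sum(w * math.prod(x[i] for i in key) for key, w in theta.items())
        Z += math.exp(score)
    return Z

# Toy 3-variable model with unary and pairwise potentials (weights are arbitrary).
theta = {(0,): 0.5, (1,): -0.3, (2,): 0.2, (0, 1): 1.0, (1, 2): -0.7}
Z = partition_function(theta, 3)

# Clamping variable 0 splits Z exactly: Z = Z(x0 = 0) + Z(x0 = 1).
Z_split = (partition_function(theta, 3, clamp={0: 0})
           + partition_function(theta, 3, clamp={0: 1}))
assert abs(Z - Z_split) < 1e-9
```

Because the split is exact, any upper bound applied to each clamped sub-problem and then summed can be no worse than bounding the original problem directly, which is the effect the paper proves for the two bounds it studies.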
Improving Optimization-Based Approximate Inference by Clamping Variables. J. Zhao, J. Djolonga, S. Tschiatschek, A. Krause. In Conference on Uncertainty in Artificial Intelligence (UAI), 2017.
Bibtex Entry:
@inproceedings{zhao2017improving,
	author = {Junyao Zhao and Josip Djolonga and Sebastian Tschiatschek and Andreas Krause},
	booktitle = {Conference on Uncertainty in Artificial Intelligence (UAI)},
	month = {August},
	title = {Improving Optimization-Based Approximate Inference by Clamping Variables},
	year = {2017}}