Submodular optimization has found many applications in machine learning and beyond. We carry out the first systematic investigation of inference in probabilistic models defined through submodular functions, generalizing regular pairwise Markov random fields (MRFs) and determinantal point processes. In particular, we present L-Field, a variational approach to general log-submodular and log-supermodular distributions based on sub- and supergradients. We obtain both lower and upper bounds on the log-partition function, which enables us to compute probability intervals for marginals, conditionals and marginal likelihoods. We also obtain fully factorized approximate posteriors, at the same computational cost as ordinary submodular optimization. Our framework yields convex problems for optimizing over differentials of submodular functions, which we show how to solve optimally. We provide theoretical guarantees on the approximation quality with respect to the curvature of the function. We further establish natural relations between our variational approach and the classical mean-field method. Lastly, we empirically demonstrate the accuracy of our inference scheme on several submodular models.
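To make the core idea concrete, here is a minimal sketch (not the paper's implementation) of one direction of the L-Field bound. For a log-supermodular distribution p(S) ∝ exp(−F(S)) with F submodular, Edmonds' greedy algorithm applied to any ordering of the ground set yields a modular lower bound s(S) ≤ F(S); substituting s for F makes the partition sum factorize over elements, giving an upper bound on log Z. The cut function, the specific edge set, and the chosen ordering below are illustrative assumptions:

```python
from itertools import combinations
import math

def cut(S, edges):
    # Graph cut value: number of edges with exactly one endpoint in S.
    # Cut functions are submodular with F(empty set) = 0.
    S = set(S)
    return sum(1 for u, v in edges if (u in S) != (v in S))

n = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]  # illustrative example graph
V = list(range(n))
F = lambda S: cut(S, edges)

# Edmonds' greedy step for an (arbitrary) ordering: marginal gains along
# the chain of prefixes give a modular function s with s(S) <= F(S) for all S.
order = [1, 2, 0, 3]
s, prev = {}, 0.0
for i, e in enumerate(order):
    val = F(order[:i + 1])
    s[e] = val - prev
    prev = val

# Exact log-partition function of p(S) proportional to exp(-F(S)),
# by brute force over all 2^n subsets (feasible only for tiny n).
logZ = math.log(sum(math.exp(-F(list(S)))
                    for k in range(n + 1) for S in combinations(V, k)))

# Since s(S) <= F(S), exp(-F(S)) <= exp(-s(S)), and the sum over subsets
# of exp(-s(S)) factorizes into a product over elements:
upper = sum(math.log1p(math.exp(-s[i])) for i in V)
print(logZ, upper)  # logZ <= upper always holds
```

The same construction with a modular upper bound (a supergradient) gives a lower bound on log Z, and together the two yield the probability intervals mentioned in the abstract. Note the bound is computed in O(n) evaluations of F, versus 2^n for the exact sum.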
From MAP to Marginals: Variational Inference in Bayesian Submodular Models. J. Djolonga and A. Krause. In Neural Information Processing Systems (NIPS), 2014.
Bibtex Entry:
@inproceedings{djolonga2014map,
	Author = {Josip Djolonga and Andreas Krause},
	Booktitle = {Neural Information Processing Systems (NIPS)},
	Month = {December},
	Title = {From MAP to Marginals: Variational Inference in Bayesian Submodular Models},
	Year = {2014}}