Abstract:
We consider the problem of variational inference in probabilistic models with both log-submodular and log-supermodular higher-order potentials. These models can represent arbitrary distributions over binary variables, and thus generalize the commonly used pairwise Markov random fields and models with log-supermodular potentials only, for which efficient approximate inference algorithms are known. While inference in the considered models is #P-hard in general, we present efficient approximate algorithms exploiting recent advances in the field of discrete optimization. We demonstrate the effectiveness of our approach in a large set of experiments, where our model allows reasoning about preferences over sets of items with complements and substitutes.
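To make the model class concrete, here is a minimal, hypothetical sketch (not the paper's code) of a mixed probabilistic submodular model p(S) ∝ exp(F(S) + G(S)) over subsets of binary items, where F is submodular (diminishing returns, modeling substitutes) and G is supermodular (modeling complements). The particular choices of F and G are illustrative assumptions; exact inference by enumeration is only feasible for tiny n, which is why the paper develops variational approximations.

```python
import itertools
import math

n = 3
items = range(n)

# Submodular part: sqrt(|S|) has diminishing returns -- items act as substitutes.
def F(S):
    return math.sqrt(len(S))

# Supermodular part: a bonus only when the complementary pair {0, 1} co-occurs.
def G(S):
    return 1.0 if {0, 1} <= S else 0.0

# Exact inference via brute-force enumeration over all 2^n subsets.
subsets = [frozenset(c) for k in range(n + 1)
           for c in itertools.combinations(items, k)]
Z = sum(math.exp(F(S) + G(S)) for S in subsets)        # partition function
p = {S: math.exp(F(S) + G(S)) / Z for S in subsets}    # exact distribution

# Marginal probability that item 0 is included in the chosen set.
marginal_0 = sum(prob for S, prob in p.items() if 0 in S)
```

For large n, computing Z this way is intractable (the source of the #P-hardness), motivating the approximate variational schemes the paper proposes.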
Reference:
Variational Inference in Mixed Probabilistic Submodular Models. J. Djolonga, S. Tschiatschek, A. Krause. In Neural Information Processing Systems (NIPS), 2016.
Bibtex Entry:
@inproceedings{djolonga16mixed,
	Author = {Josip Djolonga and Sebastian Tschiatschek and Andreas Krause},
	Booktitle = {Neural Information Processing Systems (NIPS)},
	Month = {December},
	Title = {Variational Inference in Mixed Probabilistic Submodular Models},
	Video = {https://www.youtube.com/watch?v=nGctro0ZsXY},
	Year = {2016}}