by Mohammad R. Karimi, Mario Lucic, Hamed Hassani, and Andreas Krause
Abstract:
Continuous optimization techniques, such as SGD and its extensions, are the main workhorse of modern machine learning. Nevertheless, a variety of important machine learning problems require solving discrete optimization problems with submodular objectives. The goal of this paper is to unleash the toolkit of modern continuous optimization on such discrete problems. We first introduce a framework for stochastic submodular optimization where, instead of oracle access to the underlying objective, one explicitly considers both the statistical and computational aspects of evaluating the objective. We then provide a formalization, stochastic submodular maximization, for a class of important discrete optimization problems and show how state-of-the-art techniques from continuous optimization can be lifted to the realm of discrete optimization. In an extensive experimental evaluation we demonstrate the practical impact of the proposed approach.
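To illustrate the general idea of lifting continuous optimization to discrete submodular problems, the following sketch runs stochastic gradient ascent on the multilinear extension of a weighted coverage function, then rounds the fractional solution by taking the top-k coordinates. This is a minimal illustrative example under simplifying assumptions (crude rescaling instead of an exact polytope projection, naive top-k rounding), not the paper's exact algorithm.

```python
import random

def coverage(sets, weights, chosen):
    """Weighted coverage f(A): total weight of universe elements covered by the chosen sets."""
    covered = set()
    for i in chosen:
        covered |= sets[i]
    return sum(weights[u] for u in covered)

def grad_estimate(sets, weights, x, samples, rng):
    """Monte Carlo estimate of the multilinear-extension gradient:
    dF/dx_i = E[f(R + i) - f(R - i)], with R drawn coordinate-wise with marginals x."""
    n = len(sets)
    g = [0.0] * n
    for _ in range(samples):
        R = {i for i in range(n) if rng.random() < x[i]}
        for i in range(n):
            g[i] += coverage(sets, weights, R | {i}) - coverage(sets, weights, R - {i})
    return [gi / samples for gi in g]

def stochastic_submodular_max(sets, weights, k, iters=200, samples=20, lr=0.05, seed=0):
    """Sketch: stochastic gradient ascent on the multilinear extension of a
    coverage function under a cardinality constraint, followed by top-k rounding."""
    rng = random.Random(seed)
    n = len(sets)
    x = [k / n] * n  # start from a feasible fractional point
    for _ in range(iters):
        g = grad_estimate(sets, weights, x, samples, rng)
        x = [min(1.0, max(0.0, xi + lr * gi)) for xi, gi in zip(x, g)]
        s = sum(x)
        if s > k:  # crude rescaling to stay near the cardinality polytope
            x = [xi * k / s for xi in x]
    # round: pick the k coordinates with the largest fractional values
    return sorted(range(n), key=lambda i: -x[i])[:k]
```

For example, with sets `[{0,1,2}, {3,4,5}, {0,3}, {1,4}]` over a unit-weight universe and k=2, the ascent pushes mass toward the two large disjoint sets that jointly cover everything.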
Reference:
Stochastic Submodular Maximization: The Case of Coverage Functions. M. R. Karimi, M. Lucic, H. Hassani, and A. Krause. In Neural Information Processing Systems (NIPS), 2017.
BibTeX Entry:
@inproceedings{karimi2017stochastic,
	Abstract = {Continuous optimization techniques, such as SGD and its extensions, are the main workhorse of modern machine learning. Nevertheless, a variety of important machine learning problems require solving discrete optimization problems with submodular objectives. The goal of this paper is to unleash the toolkit of modern continuous optimization on such discrete problems. We first introduce a framework for \emph{stochastic submodular optimization} where, instead of \emph{oracle} access to the underlying objective, one explicitly considers both the statistical and computational aspects of evaluating the objective. We then provide a formalization, \emph{stochastic submodular maximization}, for a class of important discrete optimization problems and show how state-of-the-art techniques from continuous optimization can be lifted to the realm of discrete optimization. In an extensive experimental evaluation we demonstrate the practical impact of the proposed approach.},
	Author = {Mohammad R. Karimi and Mario Lucic and Hamed Hassani and Andreas Krause},
	Booktitle = {Neural Information Processing Systems (NIPS)},
	Month = {December},
	Title = {Stochastic Submodular Maximization: The Case of Coverage Functions},
	Year = {2017}}