Overview
This page collects material and references related to submodular optimization,
with a particular focus on applications in machine learning and AI.
Convex optimization has become a central workhorse for many machine
learning algorithms over the past ten years. When minimizing a convex
loss function, e.g., to train a Support Vector Machine, we can rest
assured that an optimal solution will be found efficiently, even for large
problems. In recent years, another fundamental problem structure with
similarly beneficial properties has emerged as very useful in a
variety of machine learning applications: submodularity.
Submodularity is an intuitive diminishing-returns property, stating that
adding an element to a smaller set helps more than adding it to a larger set.
Similarly to convexity, submodularity allows one to efficiently find provably
(near-)optimal solutions; a small illustration is sketched below.
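As a quick illustration of the diminishing-returns property and of the kind of guarantee it buys, here is a minimal, hypothetical Python sketch (not code from any of the packages listed below; the toy data and function names are made up). It checks the submodularity inequality on a simple coverage function and runs the standard greedy heuristic, which for monotone submodular maximization under a cardinality constraint achieves at least a (1 - 1/e) fraction of the optimum (Nemhauser, Wolsey and Fisher, 1978).

```python
# Minimal sketch: a toy coverage function F(S) = |union of the sets chosen in S|
# is submodular, and the greedy heuristic picks a near-optimal subset.

def coverage(sets_by_item, selected):
    """F(S): number of ground-set elements covered by the chosen items."""
    covered = set()
    for item in selected:
        covered |= sets_by_item[item]
    return len(covered)

# Hypothetical toy instance: each item covers part of a small ground set.
sets_by_item = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {4, 5, 6},
}

# Diminishing returns: adding "c" to the smaller set A helps at least as much
# as adding it to the larger set B (with A a subset of B).
A, B, e = {"a"}, {"a", "b"}, "c"
gain_A = coverage(sets_by_item, A | {e}) - coverage(sets_by_item, A)
gain_B = coverage(sets_by_item, B | {e}) - coverage(sets_by_item, B)
assert gain_A >= gain_B  # here 3 >= 2

# Greedy maximization: repeatedly add the item with the largest marginal gain.
def greedy(sets_by_item, budget):
    chosen = set()
    for _ in range(budget):
        best = max(
            (item for item in sets_by_item if item not in chosen),
            key=lambda item: coverage(sets_by_item, chosen | {item})
            - coverage(sets_by_item, chosen),
        )
        chosen.add(best)
    return chosen

print(greedy(sets_by_item, budget=2))  # e.g. {'a', 'c'}, covering all 6 elements
```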
- High-performance implementation of the minimum norm point
algorithm for submodular function minimization with several
applications [link]
- MATLAB Toolbox for submodular function optimization [link], maintained by Andreas Krause.
Accompanying Journal of Machine Learning Research Open Source Software paper [pdf]
- Survey on Submodular Function Maximization
by Daniel Golovin and Andreas Krause. To appear as a chapter in
Tractability: Practical Approaches to Hard Problems. (This draft is for
personal use only; no further distribution without permission.)
- Class on Submodular
Functions by Jeff Bilmes
- Annotated bibliography.
- NIPS 2013 Workshop on Discrete Optimization in Machine
Learning: Theory and Applications organized by Stefanie Jegelka, Andreas Krause, Pradeep Ravikumar, Jeff Bilmes.
- Cargese Workshop on Combinatorial Optimization, Topic: Submodular
Functions organized by Samuel Fiorini, Gianpaolo Oriolo, Gautier
Stauffer and Paolo Ventura.
- NIPS 2012 Workshop on Discrete Optimization in Machine Learning: Structure and Scalability organized by Stefanie Jegelka, Andreas Krause, Pradeep Ravikumar, Jeff Bilmes.
- Modern Aspects of Submodularity workshop at GeorgiaTech organized by Shabbir Ahmed, Nina Balcan, Satoru Iwata and Prasad Tetali.
- NIPS 2011 Workshop on Discrete Optimization in Machine Learning: Uncertainty, Generalization and Feedback organized by Andreas Krause, Pradeep Ravikumar, Jeff Bilmes and Stefanie Jegelka. [videos]
- NIPS 2010 Workshop on Discrete Optimization in Machine Learning: Structures, Algorithms and Applications organized by Andreas Krause, Pradeep Ravikumar, Jeff Bilmes and Stefanie Jegelka. [videos]
- NIPS 2009 Workshop on Discrete Optimization in Machine Learning: Submodularity, Sparsity and Polyhedra organized by Andreas Krause, Pradeep Ravikumar and Jeff Bilmes.
This page is maintained by
Andreas Krause and
Carlos Guestrin. Please
send suggested additions or corrections by email.