by E. Balkanski, A. Krause, B. Mirzasoleiman, and Y. Singer
Abstract:
We consider the problem of learning sparse representations of data sets, where the goal is to reduce a data set in a manner that optimizes multiple objectives. Motivated by applications of data summarization, we develop a new model which we refer to as the two-stage submodular maximization problem. This task can be viewed as a combinatorial analogue of representation learning problems such as dictionary learning and sparse regression. The two-stage problem strictly generalizes the problem of cardinality constrained submodular maximization, though the objective function is not submodular and the techniques for submodular maximization cannot be applied. We describe a continuous optimization method which achieves an approximation ratio that asymptotically approaches 1-1/e. For instances where the asymptotics do not kick in, we design a local-search algorithm whose approximation ratio is arbitrarily close to 1/2. We empirically demonstrate the effectiveness of our methods on two multi-objective data summarization tasks, where the goal is to construct summaries via sparse representative subsets with respect to predefined objectives.
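For context, the special case the paper generalizes is cardinality constrained submodular maximization, for which the classic greedy heuristic attains the 1-1/e guarantee mentioned above. The sketch below is only illustrative and is not the paper's two-stage algorithm; the facility-location-style coverage objective and the random similarity matrix are hypothetical choices made to keep the example self-contained and runnable.

import numpy as np

def facility_location(S, sim):
    # Monotone submodular coverage value of a set S of candidate indices:
    # each data point (row) is credited with its best similarity to the chosen set.
    if not S:
        return 0.0
    return float(np.sum(sim[:, list(S)].max(axis=1)))

def greedy(sim, k):
    # Standard greedy: repeatedly add the element with the largest marginal gain.
    # For monotone submodular objectives this gives a (1 - 1/e) approximation.
    S = set()
    for _ in range(k):
        gains = {j: facility_location(S | {j}, sim) - facility_location(S, sim)
                 for j in range(sim.shape[1]) if j not in S}
        S.add(max(gains, key=gains.get))
    return S

rng = np.random.default_rng(0)
sim = rng.random((50, 20))   # hypothetical similarities of 50 points to 20 candidates
print(greedy(sim, k=5))      # a sparse representative subset of size 5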
Reference:
Learning Sparse Combinatorial Representations via Two-stage Submodular Maximization. E. Balkanski, A. Krause, B. Mirzasoleiman, and Y. Singer. In Proc. International Conference on Machine Learning (ICML), 2016.
Bibtex Entry:
@inproceedings{balkanski16learning,
author = {Erik Balkanski and Andreas Krause and Baharan Mirzasoleiman and Yaron Singer},
booktitle = {Proc. International Conference on Machine Learning (ICML)},
month = {June},
title = {Learning Sparse Combinatorial Representations via Two-stage Submodular Maximization},
year = 2016}