by Mario Lucic, Olivier Bachem, Morteza Zadimoghaddam, Andreas Krause
Abstract:
A variety of large-scale machine learning problems can be cast as instances of constrained submodular maximization. Existing approaches for distributed submodular maximization have a critical drawback: the capacity (the number of instances that can fit in memory) must grow with the data set size. In practice, while one can provision many machines, the capacity of each machine is limited by physical constraints. We propose a truly scalable approach for distributed submodular maximization under fixed capacity. The proposed framework applies to a broad class of algorithms and constraints and provides theoretical guarantees on the approximation factor for any available capacity. We empirically evaluate the proposed algorithm on a variety of data sets and demonstrate that it achieves performance competitive with the centralized greedy solution.
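For orientation, below is a minimal Python sketch of the classic centralized greedy baseline for monotone submodular maximization under a cardinality constraint (the baseline the abstract compares against), together with a naive two-round partition/merge scheme of the kind that distributed approaches build on. This is illustrative only and is not the paper's algorithm; the function names (greedy, two_round_distributed) and the toy coverage objective are assumptions.

# Sketch only, assuming a monotone submodular f given as a set function.
# Not the paper's method: the classic greedy baseline plus a naive
# two-round partition/merge scheme for the distributed setting.

def greedy(f, ground_set, k):
    """Pick k elements, each time adding the one with the largest
    marginal gain f(S + {e}) - f(S)."""
    selected = []
    candidates = set(ground_set)
    for _ in range(k):
        best = max(candidates, key=lambda e: f(selected + [e]) - f(selected))
        selected.append(best)
        candidates.remove(best)
    return selected

def two_round_distributed(f, ground_set, k, num_machines):
    """Naive two rounds: run greedy on each partition in parallel,
    then run greedy again on the union of the partial solutions."""
    elements = list(ground_set)
    partitions = [elements[i::num_machines] for i in range(num_machines)]
    partial = [e for part in partitions for e in greedy(f, part, k)]
    return greedy(f, partial, k)

if __name__ == "__main__":
    # Toy coverage objective: each element covers a set of items,
    # and f(S) is the number of items covered by S.
    covers = {0: {1, 2}, 1: {2, 3}, 2: {3, 4, 5}, 3: {5, 6}}
    f = lambda S: len(set().union(*(covers[e] for e in S))) if S else 0
    print(greedy(f, covers.keys(), 2))                     # e.g. [2, 0]
    print(two_round_distributed(f, covers.keys(), 2, 2))   # e.g. [2, 0]

Note that in such two-round schemes, each machine must hold its whole partition plus the merged candidate pool in memory, which is exactly the capacity assumption the paper removes: its fixed-capacity framework keeps per-machine memory constant as the data set grows.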
Reference:
Horizontally Scalable Submodular Maximization. M. Lucic, O. Bachem, M. Zadimoghaddam, A. Krause. In Proc. International Conference on Machine Learning (ICML), 2016.
Bibtex Entry:
@inproceedings{lucic16horizontally,
	author = {Mario Lucic and Olivier Bachem and Morteza Zadimoghaddam and Andreas Krause},
	booktitle = {Proc. International Conference on Machine Learning (ICML)},
	month = {July},
	title = {Horizontally Scalable Submodular Maximization},
	video = {http://techtalks.tv/talks/horizontally-scalable-submodular-maximization/62481},
	year = 2016}