by O. Bachem, M. Lucic, A. Krause
Abstract:
Scalable training of Bayesian nonparametric models is a notoriously difficult challenge. We explore the use of coresets - a data summarization technique originating from computational geometry - for this task. Coresets are weighted subsets of the data such that models trained on these coresets are provably competitive with models trained on the full dataset. Coresets sublinear in the dataset size allow for fast approximate inference with provable guarantees. Existing constructions, however, are limited to parametric problems. Using novel techniques in coreset construction, we show the existence of coresets for DP-Means - a prototypical nonparametric clustering problem - and provide a practical construction algorithm. We empirically demonstrate that our algorithm allows us to efficiently trade off computation time and approximation error and thus scale DP-Means to large datasets. For instance, with coresets we can obtain a computational speedup of 45x at an approximation error of only 2.4% compared to solving on the full dataset. In contrast, for the same subsample size, the "naive" approach of uniformly subsampling the data incurs an approximation error of 22.5%.
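To make the ideas in the abstract concrete, here is a minimal Python sketch of the setup: a weighted DP-Means solver (the clustering objective of Kulis and Jordan, 2012, which the paper targets) run on a small weighted subsample instead of the full dataset. The `importance_coreset` helper is a generic importance-sampling summary, labeled as such in the code; it is an illustrative stand-in, not the paper's DP-Means-specific coreset construction, and all function names and parameters here are hypothetical.

import numpy as np

def dp_means(X, lam, w=None, max_iter=100):
    """Weighted DP-Means: k-means-like updates, but any point whose squared
    distance to every center exceeds lam opens a new center. Minimizes
    sum_i w_i * min_c ||x_i - c||^2 + lam * (number of centers)."""
    n = len(X)
    w = np.ones(n) if w is None else np.asarray(w, dtype=float)
    centers = X[:1].copy()
    assign = np.zeros(n, dtype=int)
    for _ in range(max_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # (n, k)
        mind2, assign = d2.min(axis=1), d2.argmin(axis=1)
        if mind2.max() > lam:                                # a point is too far
            centers = np.vstack([centers, X[mind2.argmax()]])  # open a center there
            continue
        new = np.vstack([np.average(X[assign == j], axis=0, weights=w[assign == j])
                         for j in range(len(centers)) if (assign == j).any()])
        if new.shape == centers.shape and np.allclose(new, centers):
            break                                            # converged
        centers = new
    return centers, assign

def importance_coreset(X, m, seed=0):
    """Generic importance-sampling summary: sample proportionally to squared
    distance from the data mean (plus a uniform term) and reweight by 1/(m*q).
    NOT the paper's construction - just a simple weighted-subsample baseline."""
    rng = np.random.default_rng(seed)
    dist2 = ((X - X.mean(0)) ** 2).sum(1)
    q = 0.5 / len(X) + 0.5 * dist2 / dist2.sum()   # mixture keeps q bounded away from 0
    idx = rng.choice(len(X), size=m, p=q)
    return X[idx], 1.0 / (m * q[idx])

# Trade-off demo: solve DP-Means on a 500-point weighted summary of 20k points.
X = np.random.default_rng(1).normal(size=(20000, 2))
C, w = importance_coreset(X, m=500)
centers, _ = dp_means(C, lam=4.0, w=w)

The weights matter: because each sampled point is reweighted by the inverse of its sampling probability, the weighted objective on the summary is an unbiased estimate of the full objective, which is what distinguishes this from plain uniform subsampling.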
Reference:
O. Bachem, M. Lucic, A. Krause. Coresets for Nonparametric Estimation - the Case of DP-Means. In Proc. International Conference on Machine Learning (ICML), 2015.
Bibtex Entry:
@inproceedings{bachem15dpmeans,
  author    = {Olivier Bachem and Mario Lucic and Andreas Krause},
  booktitle = {Proc. International Conference on Machine Learning (ICML)},
  month     = {July},
  title     = {Coresets for Nonparametric Estimation - the Case of DP-Means},
  year      = {2015}
}