by Josip Djolonga, Andreas Krause, Volkan Cevher
Abstract:
Many applications in machine learning require optimizing unknown functions defined over a high-dimensional space from noisy samples that are expensive to obtain. We address this notoriously hard challenge, under the assumptions that the function varies only along some low-dimensional subspace and is smooth (i.e., it has a low norm in a Reproducing Kernel Hilbert Space). In particular, we present the SI-BO algorithm, which leverages recent low-rank matrix recovery techniques to learn the underlying subspace of the unknown function and applies Gaussian Process Upper Confidence Bound sampling for optimization of the function. We carefully calibrate the exploration–exploitation tradeoff by allocating the sampling budget to subspace estimation and function optimization, and obtain the first subexponential cumulative regret bounds and convergence rates for Bayesian optimization in high dimensions under noisy observations. Numerical results demonstrate the effectiveness of our approach in difficult scenarios.
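To make the two-phase structure described above concrete, the following is a minimal Python sketch, not the authors' SI-BO procedure: it replaces the paper's low-rank matrix recovery step with a crude stand-in (finite-difference gradient estimates followed by an SVD), assumes the effective dimension k is known, and uses an illustrative candidate-set GP-UCB loop. All names (f, estimate_subspace, beta, the budget sizes) are hypothetical choices for illustration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

d, k = 20, 2             # ambient and (assumed known) effective dimension
noise = 0.01

# Hidden test function: varies only along a random k-dimensional subspace.
A_true = np.linalg.qr(rng.standard_normal((d, k)))[0]
def f(x):
    z = A_true.T @ x
    return -np.sum(z**2) + noise * rng.standard_normal()

# Phase 1: subspace identification. SI-BO uses low-rank matrix recovery;
# here we simply estimate gradients by finite differences at random
# centers and keep the top-k left singular vectors.
def estimate_subspace(f, d, k, n_centers=30, eps=1e-2):
    grads = []
    for _ in range(n_centers):
        c = rng.uniform(-1, 1, d)
        g = np.array([(f(c + eps * e) - f(c - eps * e)) / (2 * eps)
                      for e in np.eye(d)])
        grads.append(g)
    U, _, _ = np.linalg.svd(np.array(grads).T, full_matrices=False)
    return U[:, :k]      # estimated orthonormal basis of the subspace

A_hat = estimate_subspace(f, d, k)

# Phase 2: GP-UCB on the learned k-dimensional subspace.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=noise**2)
Z = rng.uniform(-1, 1, (5, k))
y = [f(A_hat @ z) for z in Z]
candidates = rng.uniform(-1, 1, (500, k))
beta = 2.0               # exploration weight in the UCB rule
for t in range(25):
    gp.fit(Z, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    z_next = candidates[np.argmax(mu + np.sqrt(beta) * sigma)]
    Z = np.vstack([Z, z_next])
    y.append(f(A_hat @ z_next))

print("best value found:", max(y))

Because the GP is fit in k dimensions rather than d, each optimization step only has to model a low-dimensional function, which is the source of the improved regret bounds the abstract refers to.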
Reference:
J. Djolonga, A. Krause, V. Cevher. High Dimensional Gaussian Process Bandits. In Proc. Neural Information Processing Systems (NeurIPS), 2013.
Bibtex Entry:
@inproceedings{djolonga13high,
	author = {Josip Djolonga and Andreas Krause and Volkan Cevher},
	booktitle = {Proc. Neural Information Processing Systems (NeurIPS)},
	month = {December},
	title = {High Dimensional Gaussian Process Bandits},
	year = {2013}}