by M. Mutný, J. Kirschner, A. Krause
Abstract:
Bayesian optimization and kernelized bandit algorithms are widely used techniques for sequential black-box function optimization, with applications in parameter tuning, control, and robotics, among many others. To be effective in high-dimensional settings, previous approaches make additional structural assumptions, for example low-dimensional subspaces or an additive structure. In this work, we go beyond the additivity assumption and use an orthogonal projection pursuit regression model, which strictly generalizes additive models. We present a two-stage algorithm motivated by experimental design that first decorrelates the additive components; the subsequent bandit optimization then benefits from the statistically efficient additive model. Our method provably decorrelates the fully additive model and achieves optimal sublinear simple regret in terms of the number of function evaluations. To prove the rotation recovery, we derive novel concentration inequalities for linear regression on subspaces. In addition, we specifically address the issue of acquisition function optimization and present two domain-dependent efficient algorithms. We validate the algorithm numerically on synthetic as well as real-world optimization problems.
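To make the model class concrete, below is a minimal Python sketch of the orthogonal projection pursuit form f(x) = sum_i g_i(u_i^T x) with an orthogonal rotation U. The rotation and the one-dimensional components g_i chosen here are illustrative assumptions, not those used in the paper; the sketch only shows why, in the rotated coordinates z = U^T x, the function becomes fully additive, which is what the second stage exploits once the first stage has estimated U.

import numpy as np

# Illustrative sketch, not the paper's algorithm: an orthogonal
# projection pursuit model f(x) = sum_i g_i(u_i^T x).
rng = np.random.default_rng(0)
d = 3

# Random orthogonal rotation U (via QR decomposition of a Gaussian matrix).
U, _ = np.linalg.qr(rng.standard_normal((d, d)))

# Hypothetical one-dimensional component functions g_i.
components = [np.sin, np.cos, lambda z: -(z ** 2)]

def f(x):
    """Projection pursuit model: additive in rotated coordinates z = U^T x."""
    z = U.T @ x
    return sum(g(z_i) for g, z_i in zip(components, z))

# In the rotated basis the objective is a sum of one-dimensional functions,
# so an additive bandit/Bayesian optimization method applies directly.
x = rng.standard_normal(d)
print(f(x))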
Reference:
Experimental Design for Optimization of Orthogonal Projection Pursuit Models. M. Mutný, J. Kirschner, A. Krause. In Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI), 2020.
Bibtex Entry:
@inproceedings{Mutny2020,
	author = {Mutn\'{y}, Mojmir and Kirschner, Johannes and Krause, Andreas},
	booktitle = {Proceedings of the 34th AAAI Conference on Artificial Intelligence (AAAI)},
	month = {February},
	title = {Experimental Design for Optimization of Orthogonal Projection Pursuit Models},
	year = {2020}}