by Erik Daxberger, Anastasia Makarova, Matteo Turchetta, Andreas Krause
Abstract:
The optimization of expensive-to-evaluate, black-box, mixed-variable functions, i.e., functions that have continuous and discrete inputs, is a difficult yet pervasive problem in science and engineering. In Bayesian optimization (BO), special cases of this problem that consider fully continuous or fully discrete domains have been widely studied. However, few methods exist for mixed-variable domains. In this paper, we introduce MiVaBo, a novel BO algorithm for the efficient optimization of mixed-variable functions that combines a linear surrogate model based on expressive feature representations with Thompson sampling. We propose two methods to optimize its acquisition function, a challenging problem for mixed-variable domains, and we show that MiVaBo can handle complex constraints over the discrete part of the domain that other methods cannot take into account. Moreover, we provide the first convergence analysis of a mixed-variable BO algorithm. Finally, we show that MiVaBo is significantly more sample efficient than state-of-the-art mixed-variable BO algorithms on hyperparameter tuning tasks.
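To make the core idea concrete, here is a minimal sketch of one step of Thompson sampling with a Bayesian linear surrogate over a mixed (continuous + discrete) domain. The feature map `features`, the prior scale `alpha`, the noise level `noise`, and the enumeration of candidates are all illustrative assumptions; the paper's actual feature representations and acquisition optimizers are more expressive than this.

```python
import numpy as np

def features(x_cont, x_disc):
    # Hypothetical feature map (an assumption, not the paper's): continuous
    # values, discrete indicator bits, and products coupling the two parts.
    cross = np.outer(x_cont, x_disc).ravel()
    return np.concatenate(([1.0], x_cont, x_disc, cross))

def thompson_step(X, y, candidates, alpha=1.0, noise=0.1, rng=None):
    """One Thompson-sampling step with a Bayesian linear surrogate.

    X: (n, d) feature matrix of evaluated points; y: (n,) observed values.
    candidates: list of (x_cont, x_disc) pairs enumerating (part of) the
    mixed domain. Returns the candidate maximizing the sampled linear model.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]
    # Posterior over weights w ~ N(mu, Sigma) under a N(0, alpha^2 I) prior
    # and Gaussian observation noise with standard deviation `noise`.
    A = X.T @ X / noise**2 + np.eye(d) / alpha**2
    Sigma = np.linalg.inv(A)
    mu = Sigma @ X.T @ y / noise**2
    w = rng.multivariate_normal(mu, Sigma)  # one posterior sample of weights
    # Acquisition: maximize the *sampled* linear model over the candidates.
    # (In the paper, this inner mixed-variable problem is solved with
    # dedicated methods rather than brute-force enumeration.)
    scores = [w @ features(xc, xd) for xc, xd in candidates]
    return candidates[int(np.argmax(scores))]
```

Because the surrogate is linear in the features, the posterior is Gaussian in closed form, and each Thompson draw reduces acquisition optimization to maximizing a single linear function over the mixed domain.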
Reference:
Mixed-Variable Bayesian Optimization. E. Daxberger, A. Makarova, M. Turchetta, A. Krause. arXiv, 2019.
Bibtex Entry:
@misc{daxberger2019mixedvariable,
	Archiveprefix = {arXiv},
	Author = {Erik Daxberger and Anastasia Makarova and Matteo Turchetta and Andreas Krause},
	Eprint = {1907.01329},
	Month = {July},
	Primaryclass = {cs.LG},
	Title = {Mixed-Variable Bayesian Optimization},
	Year = {2019}}