by Erik Daxberger, Anastasia Makarova, Matteo Turchetta, Andreas Krause
Abstract:
The optimization of expensive-to-evaluate, black-box, mixed-variable functions, i.e., functions that have both continuous and discrete inputs, is a difficult yet pervasive problem in science and engineering. In Bayesian optimization (BO), special cases of this problem that consider fully continuous or fully discrete domains have been widely studied. However, few methods exist for mixed-variable domains. In this paper, we introduce MiVaBo, a novel BO algorithm for the efficient optimization of mixed-variable functions that combines a linear surrogate model based on expressive feature representations with Thompson sampling. We propose two methods to optimize its acquisition function, a challenging problem for mixed-variable domains, and we show that MiVaBo can handle complex constraints over the discrete part of the domain that other methods cannot take into account. Moreover, we provide the first convergence analysis of a mixed-variable BO algorithm. Finally, we show that MiVaBo is significantly more sample efficient than state-of-the-art mixed-variable BO algorithms on hyperparameter tuning tasks.
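To make the high-level recipe concrete, below is a minimal Python sketch (assuming NumPy) of Thompson sampling with a Bayesian linear surrogate over a mixed-variable feature map. The feature map phi, the prior and noise variances, and the enumeration of a finite candidate pool to optimize the acquisition are all illustrative assumptions, not the paper's actual constructions; in particular, the paper proposes dedicated methods for the mixed-variable acquisition optimization that this sketch replaces with brute-force enumeration.

import numpy as np

def phi(x_cont, x_disc):
    """Hypothetical joint feature map: bias, continuous and discrete
    terms, plus pairwise continuous-discrete interactions."""
    x_cont = np.atleast_1d(x_cont).astype(float)
    x_disc = np.atleast_1d(x_disc).astype(float)
    inter = np.outer(x_cont, x_disc).ravel()  # interaction features
    return np.concatenate(([1.0], x_cont, x_disc, inter))

def thompson_sample_weights(Phi, y, noise_var=0.1, prior_var=1.0):
    """Draw one weight vector from the Bayesian linear-regression
    posterior with a zero-mean isotropic Gaussian prior."""
    d = Phi.shape[1]
    A = Phi.T @ Phi / noise_var + np.eye(d) / prior_var  # posterior precision
    cov = np.linalg.inv(A)
    mean = cov @ Phi.T @ y / noise_var
    return np.random.multivariate_normal(mean, cov)

def select_next(candidates, Phi, y):
    """Pick the candidate maximizing one posterior sample of the linear
    model. Enumerating a finite candidate pool is a simplification of
    the acquisition optimization problem addressed in the paper."""
    w = thompson_sample_weights(Phi, y)
    feats = np.stack([phi(xc, xd) for xc, xd in candidates])
    return candidates[int(np.argmax(feats @ w))]

# Example usage on a toy domain: one continuous input in [0, 1] and one
# binary discrete input, with a few previously evaluated points.
candidates = [([c], [d]) for c in np.linspace(0.0, 1.0, 5) for d in (0, 1)]
observed = candidates[:4]
y = np.array([0.2, 0.8, 0.5, 0.1])  # toy objective values
Phi = np.stack([phi(xc, xd) for xc, xd in observed])
print(select_next(candidates, Phi, y))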
Reference:
Mixed-Variable Bayesian Optimization. E. Daxberger, A. Makarova, M. Turchetta, A. Krause. In Proc. International Joint Conference on Artificial Intelligence (IJCAI), 2020.
Bibtex Entry:
@inproceedings{daxberger2020mixedvariable,
	author = {Erik Daxberger and Anastasia Makarova and Matteo Turchetta and Andreas Krause},
	booktitle = {Proc. International Joint Conference on Artificial Intelligence (IJCAI)},
	title = {Mixed-Variable Bayesian Optimization},
	year = {2020}}