Bayesian Optimisation over Mixed Parameter Spaces

UoM administered thesis: PhD

  • Author: Marius Tudor Morar

Abstract

Bayesian optimisation is an active area of research that addresses the general parameter tuning problem. Optimisation over mixed variables (i.e. a combination of continuous, integer, and categorical parameters) is an under-studied sub-problem of Bayesian optimisation, even though it arises frequently. Although acquisition function maximisation receives little attention in the literature, we argue that the choice of acquisition function maximiser is essential to successful Bayesian optimisation, all the more so when facing a mix of continuous, integer, and categorical variables. We compare three optimisation algorithms for acquisition function maximisation, including a novel "snap to grid" scheme. To ensure good coverage of techniques, we include both gradient-based and gradient-free algorithms in the comparison. We test the three algorithms on synthetic benchmarks and on six machine learning hyperparameter optimisation tasks, and conclude that in some test cases the differences between the three optimisers are significant enough to warrant careful consideration when choosing one for a task. In particular, the gradient-free evolution strategy is competitive with the other two methods on the vast majority of tasks, and is especially well suited to tasks with many categorical variables. In addition to acquisition function maximisation, the main topic of this thesis, we also study the effect of initialisation size on Bayesian optimisation in the context of mixed variables.
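The abstract names a "snap to grid" scheme for acquisition function maximisation over mixed variables but does not reproduce its details. The sketch below illustrates only the general idea such a scheme suggests: maximise the acquisition function over a continuous relaxation of the mixed space (integers relaxed to reals, categoricals to simplex weights), then snap the relaxed optimum back onto the feasible grid. The toy acquisition surface, the random-search maximiser, and all names here are assumptions for illustration, not the thesis's method.

```python
import numpy as np

# Toy mixed space: one continuous variable in [0, 10], one integer in
# {0, ..., 5}, and one categorical in {'a', 'b', 'c'} relaxed to three
# non-negative weights. Everything below is an illustrative assumption.
CATEGORIES = ["a", "b", "c"]

def relaxed_acquisition(z):
    # z = [x_cont, x_int_relaxed, w_a, w_b, w_c]; a smooth stand-in for
    # an acquisition function, peaked at x = 3, n = 2, category 'b'.
    x, n, w = z[0], z[1], z[2:5]
    cat_bonus = w @ np.array([0.5, 1.0, 0.2])
    return -(x - 3.0) ** 2 - (n - 2.0) ** 2 + cat_bonus

def snap_to_grid(z):
    # Project the relaxed optimum back onto the feasible mixed grid:
    # clip the continuous part, round the integer part, and take the
    # most-weighted category.
    x = float(np.clip(z[0], 0.0, 10.0))
    n = int(np.clip(round(z[1]), 0, 5))
    cat = CATEGORIES[int(np.argmax(z[2:5]))]
    return x, n, cat

def maximise_acquisition(n_samples=4000, seed=0):
    # Gradient-free random search over the continuous relaxation,
    # standing in for whichever maximiser is used in practice.
    rng = np.random.default_rng(seed)
    Z = np.column_stack([
        rng.uniform(0.0, 10.0, n_samples),        # continuous dimension
        rng.uniform(0.0, 5.0, n_samples),         # relaxed integer dimension
        rng.dirichlet(np.ones(3), n_samples),     # relaxed one-hot weights
    ])
    best = Z[np.argmax([relaxed_acquisition(z) for z in Z])]
    return snap_to_grid(best)
```

Note that snapping after optimisation is one design choice; the thesis compares it against maximisers that handle the discrete dimensions natively, such as the gradient-free evolution strategy highlighted in the abstract.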

Details

Original language: English
Awarding Institution
Supervisors/Advisors
Award date: 1 Aug 2021