Bayesian optimization is a global optimization strategy for (potentially noisy) black-box functions whose derivatives are unknown. With well-chosen priors, it can find optima in fewer function evaluations than alternative methods, making it well suited to expensive objective functions.
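In outline, the method fits a probabilistic surrogate (typically a Gaussian process) to the evaluations collected so far, maximizes an acquisition function such as expected improvement to choose the next query point, evaluates the objective there, and repeats. The following is a minimal sketch of this loop in plain Julia, assuming GaussianProcesses.jl and Distributions.jl are available; the helper `expected_improvement` and all constants are illustrative choices, not part of BayesianOptimization.jl.

```julia
using GaussianProcesses, Distributions, Random

f(x) = (x - 2)^2 + 0.1randn()              # noisy 1-d objective to minimize

# Expected improvement (for minimization) of a candidate x under the surrogate.
function expected_improvement(gp, x, ybest)
    μ, σ² = predict_y(gp, [x])             # predictive mean and variance at x
    σ = sqrt(σ²[1])
    z = (ybest - μ[1]) / σ
    σ * (z * cdf(Normal(), z) + pdf(Normal(), z))
end

Random.seed!(1)
xs = rand(5) .* 6 .- 1                     # 5 random initial queries in [-1, 5]
ys = f.(xs)
for _ in 1:20
    gp = GP(xs, ys, MeanZero(), SE(0.0, 0.0), -2.0)  # refit the GP surrogate
    candidates = range(-1, 5, length = 200)          # crude grid search over the acquisition
    xnext = argmax(x -> expected_improvement(gp, x, minimum(ys)), candidates)
    push!(xs, xnext)
    push!(ys, f(xnext))                              # evaluate the costly objective
end
println("best input: ", xs[argmin(ys)], ", best value: ", minimum(ys))
```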
Well-known applications include hyperparameter tuning of machine learning models (see e.g. "Taking the Human Out of the Loop: A Review of Bayesian Optimization"). The Julia package BayesianOptimization.jl currently supports only basic Bayesian optimization methods; a sketch of its current usage follows the list below. There are multiple directions in which to improve the package, including (but not limited to):
Hybrid Bayesian Optimization (duration: 175h, expected difficulty: medium): Bayesian optimization over both discrete and continuous variables. Implement e.g. HyBO (see also here); a sketch of acquisition optimization over a mixed space follows this list.
Scalable Bayesian Optimization (duration: 175h, expected difficulty: medium): implement e.g. TuRBO or SCBO; a sketch of the TuRBO trust-region update follows this list.
Better Defaults (duration: 175h, expected difficulty: easy): write an extensive test suite and implement better defaults, drawing inspiration from e.g. dragonfly; a benchmark-style test sketch follows this list.
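For reference, this is roughly how the package is used today, following the shape of its README; the exact keyword names and defaults below are assumptions, so consult the package documentation for the current API.

```julia
using BayesianOptimization, GaussianProcesses

f(x) = sum((x .- 1).^2) + randn()          # noisy 2-d objective to minimize

# Elastic GP surrogate: data can be appended efficiently as queries come in.
model = ElasticGPE(2,                      # 2 input dimensions
                   mean = MeanConst(0.0),
                   kernel = SEArd([0.0, 0.0], 5.0),
                   logNoise = 0.0,
                   capacity = 3000)

# Re-estimate the GP hyperparameters (MAP) every 50 function evaluations.
modeloptimizer = MAPGPOptimizer(every = 50,
                                noisebounds = [-4, 3],
                                kernbounds = [[-1, -1, 0], [4, 4, 10]],
                                maxeval = 40)

opt = BOpt(f, model,
           UpperConfidenceBound(),         # acquisition function
           modeloptimizer,
           [-5.0, -5.0], [5.0, 5.0],       # lower and upper bounds of the search space
           maxiterations = 100,
           sense = Min)                    # minimize f

result = boptimize!(opt)
```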
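For the hybrid direction, the central difficulty is that the acquisition function must be optimized over a mixed discrete/continuous space (HyBO additionally builds a tailored kernel over such spaces, which the sketch below does not attempt). One simple baseline, assuming an acquisition `acq(z, x)` of a discrete value `z` and a continuous vector `x`, is to enumerate the discrete values and run a box-constrained continuous optimization, here with Optim.jl, for each:

```julia
using Optim

# Enumerate the discrete variable; optimize the continuous variables for each
# value with a box-constrained quasi-Newton method; keep the overall best.
function maximize_mixed(acq, discrete_values, lower, upper)
    best = (value = -Inf, z = first(discrete_values), x = copy(lower))
    for z in discrete_values
        x0 = (lower .+ upper) ./ 2         # start from the center of the box
        res = optimize(x -> -acq(z, x), lower, upper, x0, Fminbox(LBFGS()))
        value = -Optim.minimum(res)
        if value > best.value
            best = (value = value, z = z, x = Optim.minimizer(res))
        end
    end
    return best
end

# Toy acquisition with one discrete and two continuous variables,
# maximized at x = [z, z] for every z.
acq(z, x) = -sum(abs2, x .- z)
maximize_mixed(acq, 1:3, [-5.0, -5.0], [5.0, 5.0])
```

Enumeration scales only to small discrete sets; part of the project would be replacing it with something smarter.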
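For the scalable direction, the key idea of TuRBO (Eriksson et al., 2019, "Scalable Global Optimization via Local Bayesian Optimization") is to restrict each local surrogate to a trust region around its incumbent, expanding the region after consecutive successes and shrinking it after consecutive failures. A sketch of that bookkeeping, with illustrative threshold values:

```julia
mutable struct TrustRegion
    center::Vector{Float64}   # incumbent (best point found in this region)
    length::Float64           # side length of the hyper-rectangle
    successes::Int            # consecutive iterations that improved the incumbent
    failures::Int             # consecutive iterations that did not
end

# Update the region after one batch of evaluations; returns true if the
# region has collapsed and should be restarted from scratch.
function update!(tr::TrustRegion, xnew, improved::Bool;
                 succ_tol = 3, fail_tol = 5, maxlen = 1.6, minlen = 1e-3)
    if improved
        tr.center = xnew
        tr.successes += 1
        tr.failures = 0
    else
        tr.failures += 1
        tr.successes = 0
    end
    if tr.successes ≥ succ_tol             # expand after repeated successes
        tr.length = min(2tr.length, maxlen)
        tr.successes = 0
    elseif tr.failures ≥ fail_tol          # shrink after repeated failures
        tr.length /= 2
        tr.failures = 0
    end
    return tr.length < minlen
end
```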
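For the defaults direction, a test suite would run the optimizer with its default settings on standard benchmarks and assert that it reliably approaches the known optima. Below is a sketch of one such test on the Branin function (global minimum ≈ 0.397887); `optimize_with_defaults` is a hypothetical helper standing in for whatever entry point the improved defaults would expose.

```julia
using Test

# Branin: a standard 2-d benchmark on [-5, 10] × [0, 15].
function branin(x)
    a, b, c = 1.0, 5.1 / (4π^2), 5 / π
    r, s, t = 6.0, 10.0, 1 / (8π)
    a * (x[2] - b * x[1]^2 + c * x[1] - r)^2 + s * (1 - t) * cos(x[1]) + s
end

@testset "defaults solve Branin" begin
    # optimize_with_defaults is hypothetical: run the optimizer with default
    # settings for a fixed budget and return the best observed function value.
    best = optimize_with_defaults(branin, [-5.0, 0.0], [10.0, 15.0];
                                  maxiterations = 200)
    @test best ≤ 0.397887 + 0.05           # within tolerance of the optimum
end
```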
Recommended Skills: Familiarity with Bayesian inference, non-linear optimization, writing Julia code, and reading Python code.
Expected Outcome: Well-tested and well-documented new features.
Mentor: Johanni Brea