BOHB and Friends

Hyperparameter optimization and architecture search can easily become prohibitively expensive for standard black-box Bayesian optimization, because training and validating a single model can already take several hours. To overcome this, we introduce a tool suite for multi-fidelity Bayesian optimization that allows the specification of design spaces in Python, the efficient optimization of black-box functions using cheap approximations, and an automatic analysis of the optimization process and results to gain a better understanding.
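To illustrate how cheap approximations cut cost, here is a minimal sketch of the successive-halving idea that underlies multi-fidelity methods of this kind: many configurations are first evaluated on a small budget, and only the most promising ones are re-evaluated on larger budgets. The function names and the toy objective below are illustrative assumptions, not the tool suite's actual API.

```python
import random

def evaluate(config, budget):
    """Toy objective: the observed loss approaches the config's true
    quality as the training budget grows (a stand-in for a real
    train/validate run at a given fidelity)."""
    return config["quality"] + random.gauss(0, 1.0 / budget)

def successive_halving(configs, min_budget=1, eta=3):
    """Evaluate all configs on a small budget, keep the best 1/eta
    fraction, and repeat with eta times the budget until one remains."""
    budget = min_budget
    while len(configs) > 1:
        scores = [(evaluate(c, budget), c) for c in configs]
        scores.sort(key=lambda sc: sc[0])  # lower loss is better
        configs = [c for _, c in scores[: max(1, len(scores) // eta)]]
        budget *= eta
    return configs[0]

random.seed(0)
# Sample a toy design space: each config has an unknown true quality.
candidates = [{"quality": random.uniform(0, 1)} for _ in range(27)]
best = successive_halving(candidates, min_budget=1, eta=3)
```

Because most configurations are discarded after only cheap low-budget evaluations, the total cost stays far below evaluating every candidate at full fidelity.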