Hyperparameter optimization and architecture search can easily become prohibitively expensive for standard black-box Bayesian optimization, because training and validating a single model can already take several hours. To overcome this, we introduce a tool suite for multi-fidelity Bayesian optimization that allows the specification of design spaces in Python, the efficient optimization of black-box functions by exploiting cheap approximations, and an automatic analysis of the optimization process and results to gain a better understanding.
- https://github.com/automl/CBC: Examples and Jupyter notebooks
- https://github.com/automl/ConfigSpace: Configuration space to define the search space (see the first sketch after this list)
- https://github.com/automl/HpBandSter: Implementations of BOHB, Hyperband, Successive Halving, and random search (see the second sketch after this list)
- https://github.com/automl/CAVE: Analysis and visualization of HPO runs
- Exemplary CAVE Report (30MB)
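As a minimal sketch of defining a design space with ConfigSpace: the hyperparameter names and ranges below are made-up examples for illustration, not part of any shipped benchmark.

```python
import ConfigSpace as CS
import ConfigSpace.hyperparameters as CSH

cs = CS.ConfigurationSpace(seed=1)

# Continuous hyperparameter searched on a log scale
lr = CSH.UniformFloatHyperparameter('learning_rate', lower=1e-6, upper=1e-1, log=True)
# Integer and categorical hyperparameters
num_layers = CSH.UniformIntegerHyperparameter('num_layers', lower=1, upper=5)
optimizer = CSH.CategoricalHyperparameter('optimizer', choices=['adam', 'sgd'])
# Hyperparameter that is only active under a condition
momentum = CSH.UniformFloatHyperparameter('momentum', lower=0.0, upper=0.99)

cs.add_hyperparameters([lr, num_layers, optimizer, momentum])
# momentum is only sampled when the SGD optimizer is chosen
cs.add_condition(CS.EqualsCondition(momentum, optimizer, 'sgd'))

print(cs.sample_configuration())
```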
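And a minimal sketch of running BOHB with HpBandSter on such a space, following the library's worker/nameserver pattern; the toy objective (a quadratic whose approximation error shrinks with budget) is an illustrative stand-in for real training and validation.

```python
import ConfigSpace as CS
import hpbandster.core.nameserver as hpns
from hpbandster.core.worker import Worker
from hpbandster.optimizers import BOHB

class ToyWorker(Worker):
    def compute(self, config, budget, **kwargs):
        # 'budget' would typically be epochs or a data-subset size; here it
        # just controls how closely the cheap evaluation tracks the true loss.
        loss = (config['x'] - 0.5) ** 2 + 1.0 / budget
        return {'loss': loss, 'info': {'budget': budget}}

cs = CS.ConfigurationSpace()
cs.add_hyperparameter(CS.UniformFloatHyperparameter('x', lower=0.0, upper=1.0))

# Local nameserver through which optimizer and workers communicate
NS = hpns.NameServer(run_id='toy', host='127.0.0.1', port=None)
NS.start()

worker = ToyWorker(run_id='toy', nameserver='127.0.0.1')
worker.run(background=True)

bohb = BOHB(configspace=cs, run_id='toy', nameserver='127.0.0.1',
            min_budget=1, max_budget=27)
result = bohb.run(n_iterations=4)

bohb.shutdown(shutdown_workers=True)
NS.shutdown()

# Inspect the best configuration found
incumbent = result.get_incumbent_id()
print(result.get_id2config_mapping()[incumbent]['config'])
```

The resulting output directory of such a run can then be passed to CAVE for the automatic analysis and visualization mentioned above.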