Our Tools for HPO

In line with our open-source philosophy and principles of reproducible research, we make various packages for hyperparameter optimization (HPO) available, and these also incorporate most of our active research.

Below we link a few of our most recent and actively maintained tools:


Neural Pipeline Search (NePS)
NePS helps deep learning experts optimize the hyperparameters and/or architecture of their deep learning pipelines, with support for joint architecture and hyperparameter optimization. NePS provides modules for Bayesian optimization (BO), multi-fidelity search, and multi-fidelity BO, supports asynchronous parallelism, and offers an interface for expert prior input. A minimal usage sketch is shown below.
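The sketch below illustrates the typical workflow: define a search space, define an evaluation function, and hand both to the optimizer. It assumes the `neps.run` entry point and `Float`/`Integer` parameter types of recent `neps` releases; exact names and signatures may differ between versions, so consult the NePS documentation.

```python
# Minimal NePS sketch (assumed API; names may differ between neps versions).
import neps


def run_pipeline(learning_rate: float, num_layers: int) -> float:
    # Hypothetical training function: train a model with the sampled
    # hyperparameters and return the validation loss to be minimized.
    validation_loss = (learning_rate - 1e-3) ** 2 + 0.01 * num_layers
    return validation_loss


pipeline_space = {
    "learning_rate": neps.Float(lower=1e-5, upper=1e-1, log=True),
    "num_layers": neps.Integer(lower=1, upper=12),
}

neps.run(
    run_pipeline,
    pipeline_space=pipeline_space,
    root_directory="neps_results",   # where evaluations and optimizer state are stored
    max_evaluations_total=25,        # overall budget of pipeline evaluations
)
```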

DEHB
DEHB is a package for multi-fidelity HPO that uses Differential Evolution as the search component within HyperBand, yielding strong anytime performance. A usage sketch follows below.
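The following sketch shows how the `dehb` package is typically driven with a ConfigSpace search space. The parameter names (`min_fidelity`/`max_fidelity`, the dict returned by the objective, and the `fevals` budget argument) are assumptions based on recent releases; check the DEHB documentation for the current interface.

```python
# Minimal DEHB sketch (assumed interface; see the DEHB docs for exact names).
import ConfigSpace as CS
from dehb import DEHB


def objective(config, fidelity, **kwargs):
    # Hypothetical objective: evaluate `config` at the given fidelity
    # (e.g. number of epochs) and report fitness (to minimize) and cost.
    fitness = (config["x"] - 0.5) ** 2 / fidelity
    return {"fitness": fitness, "cost": fidelity}


cs = CS.ConfigurationSpace()
cs.add_hyperparameter(CS.UniformFloatHyperparameter("x", lower=0.0, upper=1.0))

dehb = DEHB(
    f=objective,
    cs=cs,
    min_fidelity=1,    # lowest fidelity at which HyperBand evaluates configurations
    max_fidelity=27,   # full-fidelity evaluation
    n_workers=1,
)
dehb.run(fevals=50)    # total budget of 50 function evaluations
```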


For a more comprehensive list of tools available in the larger community, see here.