
SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization

SMAC: Sequential Model-Based Algorithm Configuration

SMAC3 [Lindauer et al., 2022] offers a robust and flexible framework for Bayesian Optimization that helps users find well-performing hyperparameter configurations for their (machine learning) algorithms, datasets, and applications. Its core combines Bayesian Optimization with an aggressive racing mechanism to efficiently decide which of two configurations performs better.
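To give a flavor of the interface, here is a minimal sketch of an optimization loop, modeled after the SMAC3 quickstart; the exact class and argument names below assume the SMAC3 2.x API and may differ in other versions:

    from ConfigSpace import ConfigurationSpace, Float
    from smac import HyperparameterOptimizationFacade, Scenario

    def train(config, seed: int = 0) -> float:
        # Toy objective; SMAC minimizes the returned cost.
        return (config["x"] - 2.0) ** 2

    # Search space with a single continuous hyperparameter.
    cs = ConfigurationSpace()
    cs.add_hyperparameters([Float("x", (-5.0, 5.0))])

    # Run Bayesian Optimization for a fixed number of trials.
    scenario = Scenario(cs, n_trials=100)
    smac = HyperparameterOptimizationFacade(scenario, train)
    incumbent = smac.optimize()
    print(incumbent)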

🐍 SMAC3 is written in Python 3 and continuously tested with Python 3.8, 3.9, and 3.10. Its random forest surrogate is written in C++.

👩‍💻 We actively maintain SMAC3.

💬 We welcome questions; please open an issue in our repo.

📂 Docs can be found here.

🎁 Features

  • Rich search space with floats, ordinals, categoricals, and conditions (see the search-space sketch after this list)
  • Ask-and-Tell Interface
  • Continue and Warmstart Optimization
  • Intensification mechanism to efficiently compare configurations
  • User priors
  • Parallelization, local and on a cluster with Dask
  • Multi-fidelity optimization, e.g. when the target function can be evaluated at different resolutions
  • Multi-objective optimization with ParEGO
  • Optimization across many tasks (aka algorithm configuration)
  • The function to optimize can either be written in Python or called via a script
  • Easily extensible with callbacks
  • Many components
    • Surrogate model: Gaussian Process, Random Forest
    • Acquisition function: EI, PI, LCB, TS
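
As an example of the first feature, a conditional search space can be declared with the ConfigSpace package that SMAC3 builds on; a minimal sketch, assuming a recent ConfigSpace version with the Categorical/Float convenience constructors:

    from ConfigSpace import Categorical, ConfigurationSpace, Float
    from ConfigSpace.conditions import EqualsCondition

    cs = ConfigurationSpace()
    optimizer = Categorical("optimizer", ["sgd", "adam"])
    lr = Float("lr", (1e-5, 1e-1), log=True)    # log-scaled float
    momentum = Float("momentum", (0.0, 0.99))   # only meaningful for SGD
    cs.add_hyperparameters([optimizer, lr, momentum])

    # "momentum" is only active when optimizer == "sgd".
    cs.add_condition(EqualsCondition(momentum, optimizer, "sgd"))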

🐱‍🏍 Use Cases

In our examples we showcase different use cases, e.g.:

  • How to optimize a simple function?
  • How to optimize a Machine Learning model with cross-validation?
  • How to use our ask-and-tell interface? And how can we continue and warmstart the optimization? (See the ask-and-tell sketch after this list.)
  • How can we add our preferences in what area to search (user priors)?
  • How can we use multi-fidelity, i.e. exploit partial evaluations (e.g. only training for half the epochs)?
  • How do we set up a multi-objective problem?
  • How do we interface with the command line if our target function is not pythonic?
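
To illustrate the ask-and-tell interface: instead of handing SMAC a function to call, you can pull the next configuration and report its cost yourself. A minimal sketch, again assuming the SMAC3 2.x API (in particular that TrialValue lives in smac.runhistory.dataclasses there):

    from ConfigSpace import ConfigurationSpace, Float
    from smac import HyperparameterOptimizationFacade, Scenario
    from smac.runhistory.dataclasses import TrialValue

    def train(config, seed: int = 0) -> float:
        return (config["x"] - 2.0) ** 2

    cs = ConfigurationSpace()
    cs.add_hyperparameters([Float("x", (-5.0, 5.0))])
    scenario = Scenario(cs, n_trials=20)
    smac = HyperparameterOptimizationFacade(scenario, train)

    for _ in range(20):
        info = smac.ask()                           # configuration (and seed) to try
        cost = train(info.config, seed=info.seed)   # evaluate it ourselves
        smac.tell(info, TrialValue(cost=cost))      # report the result back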

⚡ Powered by SMAC3

SMAC3 is not only used on its own but also serves as a backend for other HPO tools, including Auto-sklearn [Feurer et al., 2015; Feurer et al., 2022], Auto-WEKA [Thornton et al., 2013; Kotthoff et al., 2017], Auto-PyTorch [Mendoza et al., 2019; Zimmer et al., 2021], and automated deep learning for time series forecasting [Deng et al., 2022].

📜 History

Did you know that SMAC originated in 2011 [Hutter et al., 2011]? After the first version in Matlab, it was open-sourced in Java and received its first major revamp in 2012. A Python implementation followed in 2016. Six years later, SMAC received its second major revamp, with a completely new user interface and core.

SMAC has sparked, and builds on, numerous publications; here is a non-exhaustive timeline:

2022: Lindauer, M., Eggensperger, K., Feurer, M., Biedenkapp, A., Deng, D., Benjamins, C., Ruhkopf, T., Sass, R., and Hutter, F.:
SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization.
JMLR, 23(54):1–9.

2018: Lindauer, M. and Hutter, F.:
Warmstarting of Model-Based Algorithm Configuration.
AAAI 2018: 1355-1362.

2015: Falkner, S., Lindauer, M., and Hutter, F.:
SpySMAC: Automated Configuration and Performance Analysis of SAT Solvers.
Proceedings of the International Conference on Theory and Applications of Satisfiability Testing (SAT'15).

2013: Hutter, F., Hoos, H. H., and Leyton-Brown, K.:
An Evaluation of Sequential Model-Based Optimization for Expensive Blackbox Functions.
GECCO (Companion) 2013: 1209-1216.

2012: Hutter, F., Hoos, H. H., and Leyton-Brown, K.:
Parallel Algorithm Configuration.
LION 2012: 55-70.

2011: Hutter, F., Hoos, H. H., and Leyton-Brown, K.:
Bayesian Optimization with Censored Response Data.
2011 NIPS Workshop on Bayesian Optimization, Experimental Design, and Bandits.

2011: Hutter, F., Hoos, H. H., and Leyton-Brown, K.:
Sequential Model-Based Optimization for General Algorithm Configuration.
LION-5, 2011.

2010: Hutter, F., Hoos, H. H., Leyton-Brown, K., and Murphy, K.:
Time-Bounded Sequential Parameter Optimization.
LION-4, 2010.

References

[Deng et al., 2022] Difan Deng, Florian Karl, Frank Hutter, Bernd Bischl, Marius Lindauer: Efficient Automated Deep Learning for Time Series Forecasting. ECML/PKDD (3) 2022: 664-680

[Feurer et al., 2015] Matthias Feurer, Aaron Klein, Katharina Eggensperger, Jost Tobias Springenberg, Manuel Blum, Frank Hutter: Efficient and Robust Automated Machine Learning. NIPS 2015: 2962-2970

[Feurer et al., 2022] Matthias Feurer, Katharina Eggensperger, Stefan Falkner, Marius Lindauer, Frank Hutter: Auto-Sklearn 2.0: Hands-free AutoML via Meta-Learning. J. Mach. Learn. Res. 23: 261:1-261:61 (2022)

[Hutter et al., 2011] Frank Hutter, Holger H. Hoos, Kevin Leyton-Brown:
Sequential Model-Based Optimization for General Algorithm Configuration. LION 2011: 507-523

[Kotthoff et al., 2017] Lars Kotthoff, Chris Thornton, Holger H. Hoos, Frank Hutter, Kevin Leyton-Brown:
Auto-WEKA 2.0: Automatic model selection and hyperparameter optimization in WEKA. J. Mach. Learn. Res. 18: 25:1-25:5 (2017)

[Lindauer et al., 2022] Marius Lindauer, Katharina Eggensperger, Matthias Feurer, André Biedenkapp, Difan Deng, Carolin Benjamins, Tim Ruhkopf, René Sass, Frank Hutter:
SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization. JMLR, 23(54):1–9 (2022)

[Mendoza et al., 2019] Hector Mendoza, Aaron Klein, Matthias Feurer, Jost Tobias Springenberg, Matthias Urban, Michael Burkart, Maximilian Dippel, Marius Lindauer, Frank Hutter:
Towards Automatically-Tuned Deep Neural Networks. Automated Machine Learning 2019: 135-149

[Thornton et al., 2013] Chris Thornton, Frank Hutter, Holger H. Hoos, Kevin Leyton-Brown:
Auto-WEKA: combined selection and hyperparameter optimization of classification algorithms. KDD 2013: 847-855

[Zimmer et al., 2021] Lucas Zimmer, Marius Lindauer, Frank Hutter: Auto-Pytorch: Multi-Fidelity MetaLearning for Efficient and Robust AutoDL. IEEE Trans. Pattern Anal. Mach. Intell. 43(9): 3079-3090 (2021)