
AutoML

What is AutoML?

Automated Machine Learning provides methods and processes to make Machine Learning available to non-Machine Learning experts, to improve the efficiency of Machine Learning, and to accelerate research on Machine Learning.

Machine learning (ML) has achieved considerable successes in recent years and an ever-growing number of disciplines rely on it. However, this success crucially relies on human machine learning experts to perform the following tasks:

  • Preprocess and clean the data.
  • Select and construct appropriate features.
  • Select an appropriate model family.
  • Optimize model hyperparameters.
  • Design the topology of neural networks (if deep learning is used).
  • Postprocess machine learning models.
  • Critically analyze the results obtained.

As the complexity of these tasks is often beyond non-ML-experts, the rapid growth of machine learning applications has created a demand for off-the-shelf machine learning methods that can be used easily and without expert knowledge. We call the resulting research area, which targets the progressive automation of machine learning, AutoML.
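
To make these manual steps concrete, here is a minimal, purely illustrative scikit-learn sketch; the dataset, the scaler, the model family and the hyperparameter grid are all arbitrary choices that a human currently has to make by hand:

```python
# A hand-written (non-automated) workflow: every choice below -- the scaler,
# the model family, and the hyperparameter grid -- must be made by a human.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([
    ("scale", StandardScaler()),            # manual preprocessing choice
    ("model", RandomForestClassifier()),    # manual model-family choice
])

param_grid = {                              # manual hyperparameter ranges
    "model__n_estimators": [50, 100, 200],
    "model__max_depth": [None, 5, 10],
}

search = GridSearchCV(pipeline, param_grid, cv=5)
search.fit(X_train, y_train)
print(search.best_params_, search.score(X_test, y_test))
```

AutoML systems aim to take over exactly these kinds of decisions.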

Examples of AutoML

Research in Automated Machine Learning is very diverse and has produced packages and methods targeted at both researchers and end users.

AutoML Systems

In recent years, several off-the-shelf packages providing automated machine learning have emerged. We have developed:

  • AutoWEKA is an approach for the simultaneous selection of a machine learning algorithm and its hyperparameters; combined with the WEKA package it automatically yields good models for a wide variety of data sets.
  • Auto-sklearn is an extension of AutoWEKA built on the Python library scikit-learn; it serves as a drop-in replacement for regular scikit-learn classifiers and regressors (a minimal usage sketch follows after this list).
  • Auto-PyTorch is based on the deep learning framework PyTorch and jointly optimizes hyperparameters and the neural architecture.
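
To give a flavor of how such a system is used, here is a minimal Auto-sklearn sketch. The dataset and the time budgets are illustrative, and the exact constructor arguments may differ between Auto-sklearn versions:

```python
# Minimal Auto-sklearn usage sketch; budgets are illustrative and argument
# names may vary across Auto-sklearn versions.
import autosklearn.classification
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

automl = autosklearn.classification.AutoSklearnClassifier(
    time_left_for_this_task=120,   # total optimization budget in seconds
    per_run_time_limit=30,         # budget per candidate pipeline
)
automl.fit(X_train, y_train)       # searches models and hyperparameters
print(accuracy_score(y_test, automl.predict(X_test)))
```

Note how the estimator is used exactly like a regular scikit-learn classifier, which is what makes it a drop-in replacement.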

Other well-known AutoML packages include:

  • AutoGluon is a multi-layer stacking approach of diverse ML models.
  • H2O AutoML provides automated model selection and ensembling for the H2O machine learning and data analytics platform.
  • MLBoX is an AutoML library with three components: preprocessing, optimization and prediction.
  • TPOT is a data-science assistant which optimizes machine learning pipelines using genetic programming (a brief usage sketch follows after this list).
  • TransmogrifAI is an AutoML library running on top of Spark.
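
As mentioned above, here is a small sketch of how TPOT is typically invoked. The number of generations, the population size and the exported file name are illustrative, and details may differ between TPOT releases:

```python
# Minimal TPOT usage sketch; generations, population size and file name are
# illustrative, and constructor arguments may differ between TPOT releases.
from tpot import TPOTClassifier
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tpot = TPOTClassifier(generations=5, population_size=20, random_state=0)
tpot.fit(X_train, y_train)          # evolves candidate pipelines via genetic programming
print(tpot.score(X_test, y_test))
tpot.export("best_pipeline.py")     # writes the winning pipeline out as Python code
```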

AutoML to Advance and Improve Research

The paper “Making a science of model search” argues that the performance of a given technique depends on both the fundamental quality of the algorithm and the details of its tuning, and that it is sometimes difficult to know whether a given technique is genuinely better or simply better tuned. To improve the situation, Bergstra et al. proposed reporting results obtained by tuning all algorithms with the same hyperparameter optimization toolkit. Sculley et al.’s ICLR’18 workshop paper “Winner’s Curse” argues in the same direction and gives recent examples in which correct hyperparameter optimization of baselines improved over the latest state-of-the-art results and newly proposed methods.

Hyperparameter optimization and algorithm configuration provide methods to automate the tedious, time-consuming and error-prone process of tuning hyperparameters for each new task at hand (a schematic sketch of what such tools automate follows below). For example, we provide the following packages for hyperparameter optimization:

  • SMAC3 – a Python re-implementation of the SMAC algorithm
  • DEHB: Differential Evolution combined with HyperBand

You can find more HPO packages here.
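
The following schematic sketch shows the core loop that such tools automate: sample a configuration, evaluate it, and keep the best one. It uses plain random search on an illustrative SVM example; packages such as SMAC3 replace the random sampling with model-based (Bayesian) optimization and add many practical refinements:

```python
# Schematic random-search sketch of the loop that HPO tools automate; the
# search space and the evaluation budget are illustrative choices.
import random
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

def sample_config():
    # log-uniform samples for the SVM's C and gamma hyperparameters
    return {"C": 10 ** random.uniform(-3, 3), "gamma": 10 ** random.uniform(-4, 1)}

best_score, best_config = -1.0, None
for _ in range(20):                              # evaluation budget
    config = sample_config()
    score = cross_val_score(SVC(**config), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_config = score, config

print(best_config, best_score)
```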

Architecture Search and Automated Deep Learning

The field of architecture search addresses the problem of finding a well-performing architecture of a deep neural network. Automated architecture search can substantially speed up the development of new deep learning applications, as developers no longer need to painstakingly evaluate different architectures by hand.

For an overview on architecture search, we refer the interested reader to our literature overview on neural architecture search.
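
To make the idea concrete, here is a toy random-search sketch over a tiny architecture space (the depth and width of a multilayer perceptron) on synthetic data; real neural architecture search methods operate on far richer search spaces, use far more sophisticated search strategies, and evaluate candidates on held-out validation data:

```python
# Toy random search over a tiny architecture space (MLP depth and width) on
# synthetic data; purely illustrative, not a real NAS method.
import random
import torch
from torch import nn

X = torch.randn(512, 20)                  # synthetic features
y = torch.randint(0, 2, (512,))           # synthetic binary labels

def build_mlp(depth, width):
    layers, in_dim = [], 20
    for _ in range(depth):
        layers += [nn.Linear(in_dim, width), nn.ReLU()]
        in_dim = width
    layers.append(nn.Linear(in_dim, 2))
    return nn.Sequential(*layers)

def train_and_score(model, epochs=20):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()
    with torch.no_grad():                 # training accuracy for simplicity;
        return (model(X).argmax(dim=1) == y).float().mean().item()  # use validation data in practice

best = None
for _ in range(10):                       # sample and evaluate 10 random architectures
    depth, width = random.choice([1, 2, 3]), random.choice([16, 64, 128])
    acc = train_and_score(build_mlp(depth, width))
    if best is None or acc > best[0]:
        best = (acc, depth, width)
print(f"best architecture: depth={best[1]}, width={best[2]}, accuracy={best[0]:.2f}")
```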

Packages for architecture search and hyperparameter optimization for deep learning include Auto-PyTorch (see above). See also here.

Further Resources