Neural Architecture Search

Neural Architecture Search (NAS) automates the design of neural network architectures. NAS approaches optimize the topology of a network, including how its nodes are connected and which operators to choose. User-defined optimization metrics, such as accuracy, model size, or inference time, can thereby guide the search toward an architecture tailored to a specific application. Due to the extremely large search space, traditional evolution- or reinforcement-learning-based AutoML algorithms tend to be computationally expensive. Recent research on the topic has therefore focused on more efficient approaches to NAS. In particular, recently developed gradient-based and multi-fidelity methods have provided a promising path and boosted research in these directions. Our group has been very active in developing state-of-the-art NAS methods and has been at the forefront of driving NAS research forward. Below, we summarize a few important recent works from our group.
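
To make the search problem concrete, here is a minimal sketch of NAS with random search over a toy cell-based search space. It is not any specific method from our papers: the encoding, the function names (sample_architecture, evaluate_architecture, random_search), and the dummy scoring are all illustrative assumptions; in a real NAS system, evaluate_architecture would build, train, and validate the candidate network, which is the expensive step that efficient NAS methods try to avoid.

```python
import random

# Toy search space: for each of 4 layers, choose one operator and one
# input connection from an earlier node (a simplified cell encoding).
OPERATORS = ["conv3x3", "conv5x5", "max_pool", "skip_connect"]
NUM_LAYERS = 4


def sample_architecture(rng):
    """Sample a random architecture as a list of (operator, input node) pairs."""
    arch = []
    for layer in range(NUM_LAYERS):
        op = rng.choice(OPERATORS)
        inp = rng.randrange(layer + 1)  # connect to any earlier node (0 = input)
        arch.append((op, inp))
    return arch


def evaluate_architecture(arch):
    """Hypothetical stand-in for training the network encoded by `arch`
    and returning its validation accuracy. A real NAS system would build
    and train the model here; we return a deterministic dummy score."""
    rng = random.Random(str(arch))
    return rng.uniform(0.80, 0.95)


def random_search(num_trials=20, seed=0):
    """Sample architectures at random and keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_acc = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        acc = evaluate_architecture(arch)
        if acc > best_acc:
            best_arch, best_acc = arch, acc
    return best_arch, best_acc


if __name__ == "__main__":
    arch, acc = random_search()
    print(f"best architecture: {arch}  (val acc: {acc:.3f})")
```

Random search of this kind is a common NAS baseline. The more efficient methods mentioned above attack the cost of the evaluation step: multi-fidelity methods score candidates with cheaper proxies (fewer epochs, smaller datasets), while gradient-based one-shot methods share weights across architectures so that candidates need not be trained from scratch.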

Selected NAS Papers

Literature Overview

NAS is one of the booming subfields of AutoML, and the number of papers is increasing rapidly. To provide a comprehensive overview of recent trends, we provide the following sources:

One-Shot NAS Methods

Meta Learning of Neural Architectures

Neural Ensemble Search

Joint NAS and Hyperparameter Optimization

Multi-Objective NAS

Application-Specific NAS

Large-Scale Study of NAS Methods

Our Blogs

Please also check out our blog posts on related work: