AutoML.org

Freiburg-Hannover-Tübingen

Dreaming of Many Worlds: Learning Contextual World Models Aids Zero-Shot Generalization

Authors: Sai Prasanna Raman, Karim Farid, Raghu Rajan, André Biedenkapp

Zero-shot generalization (ZSG) to unseen dynamics is a major challenge for creating generally capable embodied agents. To address the broader challenge, we start with the simpler setting of contextual reinforcement learning (cRL), assuming observability of the context values that parameterize the variation in the system’s […]
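As a rough illustration of the cRL setting described in this teaser (and not the architecture used in the paper), the sketch below conditions a toy dynamics model on an explicit, observed context vector; the class name, dimensions, and training step are hypothetical.

```python
import torch
import torch.nn as nn

class ContextualDynamicsModel(nn.Module):
    """Toy dynamics model for s' ~ f(s, a, c), conditioned on an explicit context vector c
    (e.g. pole length or gravity in a parameterized control task)."""
    def __init__(self, state_dim, action_dim, context_dim, hidden_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim + context_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, state_dim),  # deterministic next-state head
        )

    def forward(self, state, action, context):
        # Concatenate state, action, and the observed context before prediction.
        return self.net(torch.cat([state, action, context], dim=-1))

# Minimal usage: one gradient step on a random batch of transitions.
model = ContextualDynamicsModel(state_dim=4, action_dim=1, context_dim=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
s, a, c, s_next = (torch.randn(32, 4), torch.randn(32, 1),
                   torch.randn(32, 2), torch.randn(32, 4))
loss = nn.functional.mse_loss(model(s, a, c), s_next)
opt.zero_grad()
loss.backward()
opt.step()
```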

Read More

Introducing Hypersweeper: Bridging the HPO Gap Between AutoML Research and ML Practitioners

The lack of widespread adoption of AutoML tools in the broader ML community has been a recurring topic of discussion within the field. Is this due to a lack of trust in these systems? Do our benchmarks fail to reflect real-world use cases? Or is it simply too difficult to find and implement state-of-the-art methods? […]

Read More

One-Shot Neural Architecture Search for Time Series Forecasting

Authors: Difan Deng and Marius Lindauer

TL;DR: In our paper, we present a one-shot neural architecture search space for time series forecasting tasks. The search space is built around a hybrid network that combines a Seq Net (for instance, transformers and RNNs that process sequences directly) with a Flat Net (for instance, MLPs that handle each variable […]
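For intuition only, here is a minimal sketch of the kind of hybrid model the teaser describes: a sequence branch and a flat MLP branch whose features are fused for forecasting. The class, layer choices, and dimensions are hypothetical and are not taken from the paper's search space.

```python
import torch
import torch.nn as nn

class HybridForecaster(nn.Module):
    """Toy hybrid forecaster: a sequence branch (here a GRU) and a flat MLP branch
    whose features are concatenated before the forecasting head."""
    def __init__(self, n_vars, hist_len, horizon, hidden=64):
        super().__init__()
        self.seq_net = nn.GRU(input_size=n_vars, hidden_size=hidden, batch_first=True)
        self.flat_net = nn.Sequential(   # operates on the flattened history
            nn.Linear(n_vars * hist_len, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )
        self.head = nn.Linear(2 * hidden, horizon * n_vars)
        self.n_vars, self.horizon = n_vars, horizon

    def forward(self, x):                # x: (batch, hist_len, n_vars)
        _, h = self.seq_net(x)           # h: (num_layers, batch, hidden)
        seq_feat = h[-1]
        flat_feat = self.flat_net(x.flatten(start_dim=1))
        out = self.head(torch.cat([seq_feat, flat_feat], dim=-1))
        return out.view(-1, self.horizon, self.n_vars)

y = HybridForecaster(n_vars=3, hist_len=48, horizon=12)(torch.randn(8, 48, 3))
print(y.shape)  # torch.Size([8, 12, 3])
```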

Read More

Incorporating Structure in Deep Reinforcement Learning

Authors: Aditya Mohan, Amy Zhang, Marius Lindauer

Deep Reinforcement Learning (RL) has significantly advanced in various fields, from playing complex games to controlling robotic systems. However, its application in real-world scenarios still faces numerous challenges, such as poor data efficiency, limited generalization, and a lack of safety guarantees. This work provides a comprehensive overview of […]

Read More

Position: A Call to Action for a Human-Centered AutoML Paradigm

Authors: Marius Lindauer, Florian Karl, Anne Klier, Julia Moosbauer, Alexander Tornede, Andreas Mueller, Frank Hutter, Matthias Feurer, Bernd Bischl

Motivation: Automated Machine Learning (AutoML) has significantly transformed the machine learning landscape by automating the creation and optimization of ML models. This has opened up a path towards democratized ML, making it accessible to a […]

Read More

Review of the Year 2023 – AutoML Hannover

by the AutoML Hannover Team

The year 2023 was our most successful yet as a (still relatively young) AutoML group in Hannover. With the start of several big projects, including the ERC starting grant on interactive and explainable AutoML and a BMUV-funded project on Green AutoML, the group has grown and we were able […]

Read More

New Horizons in Parameter Regularization: A Constraint Approach

Authors: Jörg Franke, Michael Hefenbrock, Gregor Koehler, Frank Hutter

Introduction: In our recent preprint, we present a novel approach to parameter regularization for deep learning: Constrained Parameter Regularization (CPR), an alternative to traditional weight decay. Instead of applying a constant penalty uniformly to all parameters, we enforce an upper bound on a statistical […]
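To illustrate the idea of bounding a statistic rather than decaying all weights uniformly, the sketch below adds a penalty only when a parameter tensor's mean squared value exceeds an upper bound. This is a simplified stand-in, not the CPR update from the preprint; the function name, choice of statistic, bound, and strength are hypothetical.

```python
import torch
import torch.nn as nn

def bounded_l2_penalty(module, kappa=0.05, strength=1e-2):
    """Illustrative stand-in for the CPR idea: penalize a parameter tensor only when a
    statistic of it (here the mean squared value) exceeds an upper bound kappa, instead
    of applying a constant weight-decay penalty everywhere. The actual CPR update in the
    preprint uses a different, constraint-based mechanism."""
    total = 0.0
    for p in module.parameters():
        stat = (p ** 2).mean()
        total = total + strength * torch.relu(stat - kappa)  # zero while within the bound
    return total

# Usage inside an ordinary training step (model and data are placeholders):
model = nn.Linear(10, 1)
x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = nn.functional.mse_loss(model(x), y) + bounded_l2_penalty(model)
loss.backward()
```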

Read More

Rethinking Performance Measures of RNA Secondary Structure Problems

TL;DR: In our NeurIPS workshop paper, we analyze different performance measures for the evaluation of RNA secondary structure prediction algorithms, showing that commonly used measures are flawed in certain settings. We then propose the Weisfeiler-Lehman graph kernel as a competent measure for performance assessment in the field.

RNA Secondary Structure Prediction: Ribonucleic acid (RNA) […]
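As a from-scratch illustration of the proposed measure (not the paper's implementation), the sketch below turns dot-bracket structures into graphs and compares them with a simple Weisfeiler-Lehman subtree kernel; the helper names and the cosine-style normalization are assumptions.

```python
from collections import Counter

def dotbracket_to_graph(seq, structure):
    """Build an adjacency list for an RNA secondary structure: backbone edges between
    consecutive nucleotides plus base-pair edges from the dot-bracket annotation."""
    n = len(seq)
    adj = {i: set() for i in range(n)}
    for i in range(n - 1):                      # backbone
        adj[i].add(i + 1)
        adj[i + 1].add(i)
    stack = []
    for i, ch in enumerate(structure):          # base pairs
        if ch == "(":
            stack.append(i)
        elif ch == ")":
            j = stack.pop()
            adj[i].add(j)
            adj[j].add(i)
    labels = {i: seq[i] for i in range(n)}      # initial node labels = nucleotides
    return adj, labels

def wl_features(adj, labels, iterations=2):
    """Weisfeiler-Lehman refinement: repeatedly relabel each node by hashing its label
    together with the sorted labels of its neighbours, counting all labels seen."""
    feats = Counter(labels.values())
    for _ in range(iterations):
        labels = {v: hash((labels[v], tuple(sorted(labels[u] for u in adj[v]))))
                  for v in adj}
        feats.update(labels.values())
    return feats

def wl_similarity(g1, g2, iterations=2):
    f1, f2 = wl_features(*g1, iterations), wl_features(*g2, iterations)
    dot = sum(f1[k] * f2[k] for k in f1)
    norm = (sum(v * v for v in f1.values()) * sum(v * v for v in f2.values())) ** 0.5
    return dot / norm

seq = "GGGAAACCC"
pred = dotbracket_to_graph(seq, "(((...)))")
true = dotbracket_to_graph(seq, "((.....))")
print(wl_similarity(pred, true))
```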

Read More

LC-PFN: Efficient Bayesian Learning Curve Extrapolation using Prior-Data Fitted Networks

Authors: Steven Adriaensen*, Herilalaina Rakotoarison*, Samuel Müller, and Frank Hutter

TL;DR: In our paper, we propose LC-PFN, a novel method for Bayesian learning curve extrapolation. LC-PFN is a prior-data-fitted network (PFN): a transformer trained on synthetic learning curve data that performs Bayesian learning curve extrapolation in a single forward pass. We show that our […]
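For a sense of what "synthetic learning curve data" can look like, here is a toy sampler that draws noisy power-law curves. The actual LC-PFN prior described in the paper is richer, and the functional form and parameter ranges below are made up for illustration.

```python
import numpy as np

def sample_synthetic_curve(n_epochs=50, rng=None):
    """Draw one noisy synthetic learning curve from a toy power-law prior
    f(t) = c - a * t**(-alpha); the real LC-PFN prior is richer than this."""
    if rng is None:
        rng = np.random.default_rng()
    c = rng.uniform(0.5, 1.0)          # asymptotic performance
    a = rng.uniform(0.1, 0.5)          # initial gap to the asymptote
    alpha = rng.uniform(0.3, 1.5)      # convergence speed
    sigma = rng.uniform(0.005, 0.05)   # observation noise
    t = np.arange(1, n_epochs + 1)
    return c - a * t ** (-alpha) + rng.normal(0.0, sigma, size=n_epochs)

# A PFN is trained on many such prior-sampled curves: given an observed prefix of a
# curve, a single forward pass should predict the distribution of its continuation.
curves = np.stack([sample_synthetic_curve() for _ in range(4)])
print(curves.shape)  # (4, 50)
```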

Read More

Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars

Authors: Simon Schrodi, Danny Stoll, Binxin Ru, Rhea Sanjay Sukthanker, Thomas Brox, and Frank Hutter

TL;DR: We take a functional view of neural architecture search that allows us to construct highly expressive search spaces based on context-free grammars, and show that we can efficiently find well-performing architectures.

NAS is great, but… The neural architecture plays […]
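As a minimal sketch of the underlying idea, assuming a toy grammar rather than the ones constructed in the paper, the snippet below samples architecture strings from a small context-free grammar by recursively expanding nonterminals.

```python
import random

# A toy context-free grammar over architecture strings; the grammars in the paper are
# far richer and built over actual network operations, this only shows the sampling idea.
GRAMMAR = {
    "ARCH":  [["BLOCK"], ["BLOCK", "ARCH"]],               # an architecture is 1+ blocks
    "BLOCK": [["conv3x3", "ACT"], ["conv1x1", "ACT"], ["skip"]],
    "ACT":   [["relu"], ["gelu"]],
}

def sample(symbol="ARCH", rng=random.Random(0), depth=0, max_depth=6):
    """Expand a nonterminal by recursively sampling one of its productions."""
    if symbol not in GRAMMAR:                               # terminal: an operation name
        return [symbol]
    productions = GRAMMAR[symbol]
    if depth >= max_depth:                                  # cap recursion depth
        productions = [productions[0]]
    out = []
    for sym in rng.choice(productions):
        out.extend(sample(sym, rng, depth + 1, max_depth))
    return out

print(sample())  # e.g. ['conv3x3', 'relu', 'skip', ...]
```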

Read More