Friday, 10:00

Forschungszentrum Jülich

Hyperparameter Optimization

The course teaches the basics of hyperparameter optimization so that an appropriate set of numerical parameters for a learning algorithm can be found. The acquired knowledge is then deepened in a two-hour practical session using Jupyter Notebooks.


Part I: Theory

  • train / development / test sets
  • regularization techniques (dropout, L1/L2-regularization)
  • optimization algorithms
  • batch normalization
  • grid search vs. random search vs. Bayesian optimization vs. gradient-based optimization vs. evolutionary optimization
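The difference between the first two search strategies above can be sketched in a few lines of plain Python. This is a minimal illustration, not course material: `validation_loss` is a hypothetical stand-in for training a model and evaluating it on the development set.

```python
import itertools
import random

def validation_loss(lr, dropout):
    """Hypothetical objective: stands in for training a network and
    measuring its loss on the development set."""
    return (lr - 0.01) ** 2 + (dropout - 0.3) ** 2

# Grid search: evaluate every combination of the listed values.
lrs = [0.001, 0.01, 0.1]
dropouts = [0.1, 0.3, 0.5]
grid_best = min(itertools.product(lrs, dropouts),
                key=lambda p: validation_loss(*p))

# Random search: spend the same budget (9 trials) on points sampled
# from continuous ranges instead of a fixed grid.
random.seed(0)
samples = [(10 ** random.uniform(-3, -1), random.uniform(0.1, 0.5))
           for _ in range(9)]
random_best = min(samples, key=lambda p: validation_loss(*p))

print("grid best:", grid_best)      # → (0.01, 0.3)
print("random best:", random_best)
```

With the same trial budget, random search explores more distinct values per hyperparameter, which is why it often outperforms grid search when only a few hyperparameters actually matter.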

Part II: Practical exercises "Hyperparameter optimization for the improvement of neural networks"

  • hyperparameter optimization in Jupyter notebooks using the Talos library for Keras
  • good practice guidelines for hyperparameter tuning
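One widely used guideline of this kind (sketched here under our own assumptions, not taken from the course notebooks) is to sample scale-like hyperparameters such as the learning rate on a logarithmic axis:

```python
import random

random.seed(42)

# Sampling uniformly on [1e-4, 1e-1] wastes most draws near the top
# of the range: about 90% of them land above 1e-2.
uniform_lrs = [random.uniform(1e-4, 1e-1) for _ in range(1000)]

# Sampling the exponent instead spreads draws evenly across the
# orders of magnitude from 1e-4 to 1e-1.
log_lrs = [10 ** random.uniform(-4, -1) for _ in range(1000)]

frac_small_uniform = sum(lr < 1e-2 for lr in uniform_lrs) / len(uniform_lrs)
frac_small_log = sum(lr < 1e-2 for lr in log_lrs) / len(log_lrs)
print(frac_small_uniform, frac_small_log)  # roughly 0.1 vs 0.67
```

Log-scale sampling matters because learning rates of 0.0001 and 0.001 can behave very differently, while 0.09 and 0.1 usually do not.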

Dr Alexander Rüttgers and Dr Charlotte Debus, DLR (Cologne)


Requirements:

  • Laptop with Python 3 and Jupyter notebooks
  • Participants should have basic knowledge of Python and Machine Learning
  • Course Materials: All exercise notebooks are provided on GitHub.

The course is designed for 50 students. There are no course fees, but you will have to cover the travel expenses yourself.

Please register at by October 16, 2020.