Events

13. November

10:00

Forschungszentrum Jülich

Hyperparameter Optimization

The course teaches the basics of hyperparameter optimization so that an appropriate set of numerical parameters for a learning algorithm can be found. The acquired knowledge is deepened in a two-hour practical session using Jupyter Notebooks.

CONTENT

Part I: Theory

  • train / development / test sets
  • regularization techniques (dropout, L1/L2-regularization)
  • optimization algorithms
  • batch normalization
  • grid search vs. random search vs. Bayesian optimization vs. gradient-based optimization vs. evolutionary optimization
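The contrast between the first two search strategies listed above can be sketched in a few lines of plain Python. This is a toy illustration with a made-up objective function standing in for validation accuracy (a real run would train a model per configuration); it is not part of the course materials.

```python
import itertools
import random

# Hypothetical objective: pretend validation accuracy peaks at
# learning rate 0.01 and dropout 0.2 (an assumption for illustration).
def validation_score(lr, dropout):
    return 1.0 - (lr - 0.01) ** 2 * 1e4 - (dropout - 0.2) ** 2

# Grid search: evaluate every combination on a fixed grid.
lrs = [0.001, 0.01, 0.1]
dropouts = [0.0, 0.2, 0.5]
grid_best = max(itertools.product(lrs, dropouts),
                key=lambda p: validation_score(*p))

# Random search: sample the same number of configurations at random;
# in higher dimensions this often covers important parameters better.
random.seed(0)
random_trials = [(10 ** random.uniform(-3, -1), random.uniform(0.0, 0.5))
                 for _ in range(9)]
random_best = max(random_trials, key=lambda p: validation_score(*p))

print("grid search best:", grid_best)
print("random search best:", random_best)
```

Grid search scales exponentially with the number of hyperparameters, which is why the course also covers Bayesian, gradient-based, and evolutionary alternatives.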

Part II: Practical exercises "Hyperparameter optimization for the improvement of neural networks"

  • hyperparameter optimization in Jupyter notebooks using the Talos library for Keras
  • good practice guidelines for hyperparameter tuning


Trainers:
Dr Alexander Rüttgers and Dr Charlotte Debus, DLR (Cologne)

Requirements:

  • Laptop with Python 3 and Jupyter Notebook installed
  • Participants should have basic knowledge of Python and Machine Learning
  • Course Materials: All exercise notebooks are provided on GitHub.

Registration:
The course is designed for 50 students. There are no course fees, but participants must cover their own travel expenses.

Please register at hds-lee@fz-juelich.de by October 16, 2020.
