Springer, 2023. — 327 p. — ISBN: 978-981-19-5169-.
Hyperparameter tuning? Is this relevant in practice? Is it not rather an academic gimmick? This book provides a wealth of hands-on examples that illustrate how hyperparameter tuning can be applied in practice and gives deep insights into the working mechanisms of Machine Learning (ML) and Deep Learning (DL) methods. Programming code is provided so that users can reproduce the results. The aim of the book is to equip readers with the ability to achieve better results with significantly less time, cost, effort, and resources using the methods described here. The case studies presented in this book can be run on a regular desktop or notebook computer; no high-performance computing facilities are required.
ML and DL methods are becoming increasingly important and are used in many industrial production processes, e.g., Cyber-physical Production Systems (CPPS). Several hyperparameters of the methods used have to be set appropriately. Previous projects have produced inconsistent results in this regard. For example, with Support Vector Machines (SVMs) it could be observed that tuning the hyperparameters is critical to success on the same data, whereas with random forests the results do not differ much from one another despite differently chosen hyperparameter values. While some methods have only one or a few hyperparameters, others provide a large number. In the latter case, optimization over a (more or less) fine grid (grid search) quickly becomes very time-consuming and therefore infeasible. In addition, for both many and few hyperparameters, the question arises of how the optimality of a selection can be measured in a statistically valid way (test problem: training/validation/test data and resampling methods). In real-world projects, DL experts have gained profound knowledge over time as to what reasonable hyperparameter values are, i.e., Hyperparameter Tuning (HPT) skills are developed. These skills are based on human expert and domain knowledge, not on valid formal rules.
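The combinatorial explosion mentioned above can be made concrete with a small sketch. The search space below is purely hypothetical (the parameter names and value ranges are illustrative, not taken from the book); it shows why a full grid search multiplied by a resampling scheme quickly becomes infeasible, and how a fixed-budget random search sidesteps this.

```python
import random

# Hypothetical search space: 5 hyperparameters with a modest grid each.
grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "max_depth": [2, 4, 6, 8],
    "n_estimators": [100, 300, 500],
    "subsample": [0.5, 0.75, 1.0],
    "min_child_weight": [1, 5, 10],
}

# Full grid search must evaluate every combination of values.
n_grid = 1
for values in grid.values():
    n_grid *= len(values)
print(n_grid)            # 3 * 4 * 3 * 3 * 3 = 324 model fits
print(n_grid * 10)       # with 10-fold cross-validation: 3240 fits

# Random search instead evaluates a fixed budget of sampled
# configurations, independent of the grid's size.
random.seed(42)
budget = 30
samples = [
    {name: random.choice(values) for name, values in grid.items()}
    for _ in range(budget)
]
print(len(samples))      # 30 fits, regardless of how fine the grid is
```

Adding a sixth hyperparameter with five candidate values would multiply the grid-search cost by five, while the random-search budget stays fixed; this is one reason sampling-based and model-based tuners are preferred for methods with many hyperparameters.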
Part I Theory.
Tuning: Methodology.
Models.
Hyperparameter Tuning Approaches.
Ranking and Result Aggregation.
Part II Applications.
Hyperparameter Tuning and Optimization Applications.
Hyperparameter Tuning in German Official Statistics.
Case Study I: Tuning Random Forest (Ranger).
Case Study II: Tuning of Gradient Boosting (xgboost).
Case Study III: Tuning of Deep Neural Networks.
Case Study IV: Tuned Reinforcement Learning (in Python).
Global Study: Influence of Tuning.
Software Installations.