43,95 € incl. VAT
Available immediately via download
  • Format: PDF

Product description


Dive into hyperparameter tuning of machine learning models and focus on what hyperparameters are and how they work. This book discusses different techniques of hyperparameter tuning, from the basics to advanced methods.

This is a step-by-step guide to hyperparameter optimization, starting with what hyperparameters are and how they affect different aspects of machine learning models. It then goes through some basic (brute-force) algorithms of hyperparameter optimization. Further, the author addresses the problem of time and memory constraints using distributed optimization methods. Next, you'll explore Bayesian optimization for hyperparameter search, which learns from its previous history.
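
To illustrate the brute-force approach mentioned above, here is a minimal grid-search sketch using scikit-learn; the estimator, parameter grid, and toy dataset are illustrative assumptions, not code taken from the book.

```python
# Minimal sketch of brute-force hyperparameter search (grid search).
# Every combination in param_grid is trained and cross-validated.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

param_grid = {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]}
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_, search.best_score_)
```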

The book discusses different frameworks, such as Hyperopt and Optuna, which implement sequential model-based global optimization (SMBO) algorithms. During these discussions, you'll focus on different aspects, such as the creation of search spaces and distributed optimization with these libraries.
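
As a rough sketch of what search-space creation looks like in Optuna, one of the SMBO-style libraries named above: the objective function below is a stand-in for a cross-validated model score, not an example from the book.

```python
# Minimal Optuna sketch: the search space is defined inside the objective
# via trial.suggest_* calls, and the study optimizes it trial by trial.
import optuna

def objective(trial):
    c = trial.suggest_float("C", 1e-3, 1e3, log=True)
    gamma = trial.suggest_float("gamma", 1e-4, 1e1, log=True)
    # Dummy loss standing in for validation error of a real model.
    return (c - 1.0) ** 2 + (gamma - 0.1) ** 2

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```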

Hyperparameter Optimization in Machine Learning creates an understanding of how these algorithms work and how you can use them in real-life data science problems. The final chapter summarizes the role of hyperparameter optimization in automated machine learning and ends with a tutorial on creating your own AutoML script.

Hyperparameter optimization is a tedious task, so sit back and let these algorithms do your work.

You will:

  • Discover how changes in hyperparameters affect the model's performance
  • Apply different hyperparameter tuning algorithms to data science problems
  • Work with Bayesian optimization methods to create efficient machine learning and deep learning models
  • Distribute hyperparameter optimization using a cluster of machines
  • Approach automated machine learning using hyperparameter optimization



For legal reasons, this download can only be delivered with a billing address in A, B, BG, CY, CZ, D, DK, EW, E, FIN, F, GR, HR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.

About the author
Tanay is a deep learning engineer and researcher who graduated in 2019 with a Bachelor of Technology from SMVDU, J&K. He is currently working at Curl Hg on SARA, an OCR platform. He is also an advisor to Witooth Dental Services and Technologies. He started his career at MateLabs, working on Mateverse, an AutoML platform. He has worked extensively on hyperparameter optimization and has delivered talks on the topic at conferences including PyData Delhi and PyCon India.

Reviews
"The author keeps a firm grasp on the subject, going from a detailed description of what hyperparameter tuning is to the effective ways to use it. ... this book would be most useful to scholars and professionals working on machine learning models. Readers looking for implementational assistance with the performance of their models will be the best fit ... ." (Niraj Singh, Computing Reviews, December 2, 2022)