Optimization techniques are at the core of data science. An understanding of the basic techniques and their fundamental properties provides important grounding for students, researchers, and practitioners. This compact, self-contained text covers the fundamentals of optimization algorithms, focusing on the techniques most relevant to data science.
Stephen J. Wright holds the George B. Dantzig Professorship, the Sheldon Lubar Chair, and the Amar and Balinder Sohi Professorship of Computer Sciences at the University of Wisconsin-Madison and is a Discovery Fellow in the Wisconsin Institute for Discovery. He works in computational optimization and its applications to data science and many other areas of science and engineering. He is a Fellow of SIAM and a recipient of the 2014 W. R. G. Baker Award from IEEE for the most outstanding paper, the 2020 Khachiyan Prize from the INFORMS Optimization Society for lifetime achievements in optimization, and the 2020 NeurIPS Test of Time Award. Professor Wright is the author or co-author of widely used textbooks and reference books in optimization, including Primal-Dual Interior-Point Methods (1997) and Numerical Optimization (2006).
Contents
1. Introduction
2. Foundations of smooth optimization
3. Descent methods
4. Gradient methods using momentum
5. Stochastic gradient
6. Coordinate descent
7. First-order methods for constrained optimization
8. Nonsmooth functions and subgradients
9. Nonsmooth optimization methods
10. Duality and algorithms
11. Differentiation and adjoints