In this age of big data, the number of features measured on a person or object can be large and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. The authors cover the lasso for linear regression, generalized penalties, numerical methods for optimization, statistical inference methods for fitted (lasso) models, sparse multivariate analysis, graphical models, compressed sensing, and much more.
"The authors study and analyze methods using the sparsity property of some statistical models in order to recover the underlying signal in a dataset. They focus on the Lasso technique as an alternative to the standard least-squares method."
-Zentralblatt MATH 1319
"The book includes all the major branches of statistical learning. For each topic, the authors first give a concise introduction of the basic problem, evaluate conventional methods, pointing out their deficiencies, and then introduce a method based on sparsity. Thus, the book has the potential to be the standard textbook on the topic."
-Anand Panangadan, California State University, Fullerton
"It always first discusses regularized models based on equations, followed by example applications, before ending with a bibliography section detailing the historical development of the given method. Software recommendations (mostly open source R packages) are typically provided either in the main part or bibliography section of each chapter. And each chapter concludes with a set of selected exercises meant to deepen the gained knowledge on the given subject, which of course is of great help for teachers of statistics. For these reasons, we congratulate the authors of Statistical Learning with Sparsity and recommend the book to all statistically-inclined readers from intermediate to expert levels. In addition, it is worth pointing out that even for non-statisticians, the book is able to demonstrate,based on numerous real-world examples, the power of regularization."-Ivan Kondofersky and Fabian J. Theis, Institute for Computational Biology