Bradley Efron, Trevor Hastie
Computer Age Statistical Inference, Student Edition
Algorithms, Evidence, and Data Science
31,99 €
incl. VAT
Free shipping*
Publisher/manufacturer currently unable to deliver
- Paperback
Computing power has revolutionized the theory and practice of statistical inference. Now in paperback, and fortified with 130 class-tested exercises, this book explains modern statistical thinking from classical theories to state-of-the-art prediction algorithms. Anyone who applies statistical methods to data will value this landmark text.
Other customers were also interested in
- Mathematical Aspects of Computer and Information Sciences (37,99 €)
- Ingo Steinwart, Support Vector Machines (121,99 €)
- Future and Emergent Trends in Language Technology (37,99 €)
- Conformal and Probabilistic Prediction with Applications (38,99 €)
- Transactions on Large-Scale Data- and Knowledge-Centered Systems XXVIII (37,99 €)
- Web-Age Information Management (37,99 €)
- Discovery Science (37,99 €)
Product details
- Institute of Mathematical Statistics Monographs 6
- Publisher: Cambridge University Press
- Pages: 506
- Publication date: 30 June 2021
- Language: English
- Dimensions: 227mm x 151mm x 23mm
- Weight: 820g
- ISBN-13: 9781108823418
- ISBN-10: 1108823416
- Item no.: 61268437
Bradley Efron is Max H. Stein Professor, Professor of Statistics, and Professor of Biomedical Data Science at Stanford University. He has held visiting faculty appointments at Harvard, UC Berkeley, and Imperial College London. Efron has worked extensively on theories of statistical inference, and is the inventor of the bootstrap sampling technique. He received the National Medal of Science in 2005, the Guy Medal in Gold of the Royal Statistical Society in 2014, and the International Prize in Statistics in 2019.
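The bootstrap mentioned above can be illustrated with a minimal sketch: resample the data with replacement many times and take the spread of the resampled statistics as an estimate of its standard error. The data values and resample count here are illustrative, not taken from the book.

```python
import random
import statistics

random.seed(0)

# Hypothetical sample; any list of numbers works.
data = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9, 2.2, 3.6]

def bootstrap_se(sample, n_resamples=2000):
    """Bootstrap estimate of the standard error of the sample mean:
    the standard deviation of the means of resamples drawn with replacement."""
    means = []
    for _ in range(n_resamples):
        resample = [random.choice(sample) for _ in range(len(sample))]
        means.append(statistics.mean(resample))
    return statistics.stdev(means)

print(round(bootstrap_se(data), 3))
```

The same recipe applies to statistics far harder to analyze in closed form, which is what made the method so influential.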
Part I. Classic Statistical Inference: 1. Algorithms and inference; 2. Frequentist inference; 3. Bayesian inference; 4. Fisherian inference and maximum likelihood estimation; 5. Parametric models and exponential families; Part II. Early Computer-Age Methods: 6. Empirical Bayes; 7. James-Stein estimation and ridge regression; 8. Generalized linear models and regression trees; 9. Survival analysis and the EM algorithm; 10. The jackknife and the bootstrap; 11. Bootstrap confidence intervals; 12. Cross-validation and Cp estimates of prediction error; 13. Objective Bayes inference and Markov chain Monte Carlo; 14. Statistical inference and methodology in the postwar era; Part III. Twenty-First-Century Topics: 15. Large-scale hypothesis testing and false-discovery rates; 16. Sparse modeling and the lasso; 17. Random forests and boosting; 18. Neural networks and deep learning; 19. Support-vector machines and kernel methods; 20. Inference after model selection; 21. Empirical Bayes estimation strategies; Epilogue; References; Author Index; Subject Index.