Most textbooks on regression focus on theory and the simplest of examples. Real statistical problems, however, are complex and subtle. This is not a book about the theory of regression. It is about using regression to solve real problems of comparison, estimation, prediction, and causal inference. Unlike other books, it focuses on practical issues such as sample size and missing data, and on a wide range of goals and techniques. It jumps right into methods and computer code you can use immediately. Real examples and real stories from the authors' experience demonstrate what regression can do and its limitations, with practical advice for understanding assumptions and implementing methods for experiments and observational studies. The book makes a smooth transition to logistic regression and generalized linear models (GLMs). The emphasis is on computation in R and Stan rather than derivations, with code available online. Graphics and presentation aid understanding of the models and model fitting.
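To give a flavor of the "computer code you can use immediately" mentioned above, here is a minimal sketch (not taken from this listing) of an R model-fitting call in the style the description refers to, assuming the rstanarm package and the simulated data frame and variable names below:

# Assumed setup: the rstanarm package, which fits regression models with a
# Stan backend behind a standard R formula interface.
library(rstanarm)

# Hypothetical example data: outcome y depending linearly on a predictor x.
df <- data.frame(x = rnorm(100))
df$y <- 2 + 3 * df$x + rnorm(100)

# Fit a Bayesian linear regression with stan_glm(); refresh = 0 suppresses
# the sampler's progress output.
fit <- stan_glm(y ~ x, data = df, refresh = 0)

# Print posterior medians and uncertainty summaries for the coefficients.
print(fit)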
The authors are experienced researchers who have published articles in hundreds of scientific journals in fields including statistics, computer science, policy, public health, political science, economics, sociology, and engineering. They have also published articles in the Washington Post, the New York Times, Slate, and other public venues. Their previous books include Bayesian Data Analysis, Teaching Statistics: A Bag of Tricks, and Data Analysis Using Regression and Multilevel/Hierarchical Models. Andrew Gelman is Higgins Professor of Statistics and Professor of Political Science at Columbia University.
Contents
Preface
Part I. Fundamentals: 1. Overview; 2. Data and measurement; 3. Some basic methods in mathematics and probability; 4. Statistical inference; 5. Simulation
Part II. Linear Regression: 6. Background on regression modeling; 7. Linear regression with a single predictor; 8. Fitting regression models; 9. Prediction and Bayesian inference; 10. Linear regression with multiple predictors; 11. Assumptions, diagnostics, and model evaluation; 12. Transformations and regression
Part III. Generalized Linear Models: 13. Logistic regression; 14. Working with logistic regression; 15. Other generalized linear models
Part IV. Before and After Fitting a Regression: 16. Design and sample size decisions; 17. Poststratification and missing-data imputation
Part V. Causal Inference: 18. Causal inference and randomized experiments; 19. Causal inference using regression on the treatment variable; 20. Observational studies with all confounders assumed to be measured; 21. Additional topics in causal inference
Part VI. What Comes Next?: 22. Advanced regression and multilevel models
Appendices: A. Computing in R; B. 10 quick tips to improve your regression modeling
References; Author index; Subject index
Reviews
'Gelman, Hill and Vehtari provide an introductory regression book that hits an amazing trifecta: it motivates regression using real data examples, provides the necessary (but not superfluous) theory, and gives readers tools to implement these methods in their own work. The scope is ambitious - including introductions to causal inference and measurement - and the result is a book that I not only look forward to teaching from, but also keeping around as a reference for my own work.' Elizabeth Tipton, Northwestern University