Mu Zhu (Professor, University of Waterloo)
Essential Statistics for Data Science
A Concise Crash Course
- Paperback
Essential Statistics for Data Science: A Concise Crash Course is for students entering a serious graduate program in data science without a sufficient background in statistics.
Other customers were also interested in
- Mu Zhu: Essential Statistics for Data Science (77,99 €)
- Will Kurt: Bayesian Statistics the Fun Way (41,99 €)
- Tom Chivers: Everything Is Predictable (13,99 €)
- Zhe George Zhang: Fundamentals of Stochastic Models (203,99 €)
- Sharon Bertsch McGrayne: The Theory That Would Not Die (13,99 €)
- Joachim Gwinner: Uncertainty Quantification in Variational Inequalities (63,99 €)
- Saeed Ghahramani: Fundamentals of Probability (156,99 €)
Note: This item can only be shipped to a German delivery address.
Product details
- Publisher: Oxford University Press
- Number of pages: 176
- Publication date: 4 July 2023
- Language: English
- Dimensions: 226mm x 163mm x 15mm
- Weight: 304g
- ISBN-13: 9780192867742
- ISBN-10: 0192867741
- Item no.: 66742962
- Manufacturer information
- Libri GmbH
- Europaallee 1
- 36244 Bad Hersfeld
- 06621 890
Mu Zhu is Professor in the Department of Statistics & Actuarial Science at the University of Waterloo, and Fellow of the American Statistical Association. He received his AB magna cum laude in applied mathematics from Harvard University, and his PhD in statistics from Stanford University. He is currently Director of the Graduate Data Science Program at Waterloo.
Prologue
I Talking Probability
1: Eminence of Models
1.A: For brave eyes only
2: Building Vocabulary
2.1: Probability
2.1.1 Basic rules
2.2: Conditional probability
2.2.1 Independence
2.2.2 Law of total probability
2.2.3 Bayes' law
2.3: Random variables
2.3.1 Summation and integration
2.3.2 Expectations and variances
2.3.3 Two simple distributions
2.4: The bell curve
3: Gaining Fluency
3.1: Multiple random quantities
3.1.1 Higher-dimensional problems
3.2: Two "hard" problems
3.2.1 Functions of random variables
3.2.2 Compound distributions
3.A: Sums of independent random variables
3.A.1 Convolutions
3.A.2 Moment generating functions
3.A.3 Formulae for expectations and variances
II Doing Statistics
4: Overview of Statistics
4.1: Frequentist approach
4.1.1 Functions of random variables
4.2: Bayesian approach
4.2.1 Compound distributions
4.3: Two more distributions
4.3.1 Poisson distribution
4.3.2 Gamma distribution
4.A: Expectation and variance of the Poisson
4.B: Waiting time in Poisson process
5: Frequentist Approach
5.1: Maximum likelihood estimation
5.1.1 Random variables that are i.i.d.
5.1.2 Problems with covariates
5.2: Statistical properties of estimators
5.3: Some advanced techniques
5.3.1 EM algorithm
5.3.2 Latent variables
5.A: Finite mixture models
6: Bayesian Approach
6.1: Basics
6.2: Empirical Bayes
6.3: Hierarchical Bayes
6.A: General sampling algorithms
6.A.1 Metropolis algorithm
6.A.2 Some theory
6.A.3 Metropolis-Hastings algorithm
III Facing Uncertainty
7: Interval Estimation
7.1: Uncertainty quantification
7.1.1 Bayesian version
7.1.2 Frequentist version
7.2: Main difficulty
7.3: Two useful methods
7.3.1 Likelihood ratio
7.3.2 Bootstrap
8: Tests of Significance
8.1: Basics
8.1.1 Relation to interval estimation
8.1.2 The p-value
8.2: Some challenges
8.2.1 Multiple testing
8.2.2 Six degrees of separation
8.A: Intuition of Benjamini-Hochberg
IV Appendices
A: Some Further Topics
A.1 Graphical models
A.2 Regression models
A.3 Data collection
Epilogue
Bibliography
Index