Intermediate Statistical Methods
Edited by: Wetherill, G. Barrie
- Paperback
Other customers were also interested in
- G. Barrie Wetherill: Solutions to Exercises in Intermediate Statistical Methods (42,99 €)
- Regression Analysis with Applications (42,99 €)
- Advances in the Statistical Sciences: Foundations of Statistical Inference (42,99 €)
- G. P. Beaumont: Intermediate Mathematical Statistics (42,99 €)
- Peter Sprent: Applied Nonparametric Statistical Methods (37,99 €)
- B. Everitt: Introduction to Optimization Methods and their Application in Statistics (37,99 €)
- Ziegler: Contribution to Applied Statistics (42,99 €)
This book began many years ago as course notes for students at the University of Bath, and later at the University of Kent. Students used draft versions of the chapters, which were consequently revised. Second- and third-year students, as well as those taking MSc courses, have used selections of the chapters. In particular, Chapters 1 to 7 (only) have been the basis of a very successful second-year course, the more difficult sections being omitted. The aims of this particular course were: (a) to cover some interesting and useful applications of statistics with an emphasis on applications, but with really adequate theory; (b) to lay the foundations for interesting third-year courses; (c) to tie up with certain areas of pure mathematics and numerical analysis. Students will find Chapter 1 a useful means of revising the t, χ² and F procedures, which is material assumed in this text; see Section 1.1. Later sections of Chapter 1 cover robustness and can be omitted by second-year students or at a first reading. Chapter 2 introduces some simple statistical models, so that the discussion of later chapters is more meaningful.
Note: This item can only be shipped to a German delivery address.
Product details
- Publisher: Springer Netherlands / Springer, Berlin
- 1981.
- Number of pages: 408
- Publication date: 18 June 1981
- Language: English
- Dimensions: 235mm x 155mm x 23mm
- Weight: 623g
- ISBN-13: 9780412164507
- ISBN-10: 0412164507
- Item no.: 26645313
1 Some properties of basic statistical procedures.- 1.1 Problems of statistics.- 1.2 The t, χ² and F procedures.- 1.3 Standard assumptions and their plausibility.- 1.4 Tests of normality.- 1.5 Moments of x̄ and s².- 1.6 The effect of skewness and kurtosis on the t-test.- 1.7 The effect of skewness and kurtosis on inferences about variances.- 1.8 The effect of serial correlation.- 1.9 The effect of unequal variances on the two-sample t-test.- 1.10 Discussion.- Further reading.- 2 Regression and the linear model.- 2.1 Linear models.- 2.2 The method of least squares.- 2.3 Properties of the estimators and sums of squares.- 2.4 Further analysis of Example 2.1.- 2.5 The regressions of y on x and of x on y.- 2.6 Two regressor variables.- 2.7 Discussion.- 3 Statistical models and statistical inference.- 3.1 Parametric inference.- 3.2 Point estimates.- 3.3 The likelihood function.- 3.4 The method of maximum likelihood.- 3.5 The Cramér-Rao inequality.- 3.6 Sufficiency.- 3.7 The multivariate normal distribution.- 3.8 Proof of the Cramér-Rao inequality.- Further reading.- 4 Properties of the method of maximum likelihood.- 4.1 Introduction.- 4.2 Formal statements of main properties.- 4.3 Practical aspects: one-parameter case.- 4.4 Practical aspects: multiparameter case.- 4.5 Other methods of estimation.- 5 The method of least squares.- 5.1 Basic model.- 5.2 Properties of the method.- 5.3 Properties of residuals.- 5.4 Properties of sums of squares.- 5.5 Application to multiple regression.- Further reading.- 6 Multiple regression: Further analysis and interpretation.- 6.1 Testing the significance of subsets of explanatory variables.- 6.2 Application of the extra sum-of-squares principle to multiple regression.- 6.3 Problems of interpretation.- 6.4 Relationships between sums of squares.- 6.5 Departures from assumptions.- 6.6 Predictions from regression.- 6.7 Strategies for multiple regression analysis.- 6.8 Practical details.- Further reading on practical points.- 7 Polynomial regression.- 7.1 Introduction.- 7.2 General theory.- 7.3 Derivation of the polynomials.- 7.4 Tables of orthogonal polynomials.- 7.5 An illustrative example.- 8 The use of transformations.- 8.1 Introduction.- 8.2 One explanatory variable.- 8.3 Transformations for homogeneity of variance.- 8.4 An example.- 8.5 The Box-Cox transformation.- 8.6 Transformations of regressor variables.- 8.7 Application to bioassay data.- Further reading.- 9 Correlation.- 9.1 Definition and examples.- 9.2 Correlation or regression?.- 9.3 Estimation of ρ.- 9.4 Results on the distribution of R.- 9.5 Confidence intervals and hypothesis tests for ρ.- 9.6 Relationship with regression.- 9.7 Partial correlation.- 9.8 The multiple correlation coefficient.- Further reading.- 10 The analysis of variance.- 10.1 An example.- 10.2 Generalized inverses.- 10.3 Least squares using generalized inverses.- 10.4 One-way classification analysis of variance.- 10.5 A discussion of Example 10.1.- 10.6 Two-way classification.- 10.7 A discussion of Example 10.2.- 10.8 General method for analysis of variance.- Further reading.- 11 Designs with regressions in the treatment effects.- 11.1 One-way analysis.- 11.2 Parallel regressions.- 11.3 The two-way analysis.- 12 An analysis of data on trees.- 12.1 The data.- 12.2 Regression analyses.- 12.3 The analysis of covariance.- 12.4 Residuals.- 13 The analysis of variance: Subsidiary analyses.- 13.1 Multiple comparisons: Introduction.- 13.2 Multiple comparisons: Various techniques.- 13.3 Departures from underlying assumptions.- 13.4 Tests for heteroscedasticity.- 13.5 Residuals and outliers.- 13.6 Some points of experimental design: General points.- 13.7 Some points of experimental design: Randomized blocks.- Further reading on experimental design.- 14 Components of variance.- 14.1 Components of variance.- 14.2 Components of variance: Follow-up analysis.- 14.3 Nested classifications.- 14.4 Outline analysis of Example 14.3.- 14.5 Nested classifications: Finite population model.- 14.6 Sampling from finite populations.- 14.7 Nested classifications with unequal numbers.- Further reading.- 15 Crossed classifications.- 15.1 Crossed classifications and interactions.- 15.2 More about interactions.- 15.3 Analysis of a two-way equally replicated design.- 15.4 An analysis of Example 15.1.- 15.5 Unit errors.- 15.6 Random-effects models.- 15.7 Analysis of a two-way unequally replicated design.- Further reading.- 16 Further analysis of variance.- 16.1 Three-way crossed classification.- 16.2 An analysis of Example 16.1.- Further reading.- 17 The generalized linear model.- 17.1 Introduction.- 17.2 The maximum likelihood ratio test.- 17.3 The family of probability distributions permitted.- 17.4 The generalized linear model.- 17.5 The analysis of deviance.- 17.6 Illustration using the radiation experiment data.- Further reading.- References.