- Paperback
Taking readers step by step through the intricacies, theory, and practice of regression analysis, Damodar N. Gujarati writes in a clear style that doesn't overwhelm them with abstract mathematics.
Other customers were also interested in
- Jefferson M Gill, Generalized Linear Models, 51,99 €
- Robert H Bruhl, Understanding Statistical Analysis and Modeling, 170,99 €
- Dana K Keller, The Tao of Statistics, 51,99 €
- Francis P Donnelly, Exploring the U.S. Census, 83,99 €
- Matthew B Miles, Qualitative Data Analysis, 170,99 €
- Rebecca M Warner, Applied Statistics II, 199,99 €
- Neil J Salkind, Study Guide for Psychology to Accompany Salkind and Frey's Statistics for People Who (Think They) Hate Statistics, 51,99 €
Note: This item can only be shipped to a delivery address in Germany.
Product details
- Publisher: Sage Publications
- Number of pages: 272
- Publication date: 20 July 2018
- Language: English
- Dimensions: 218mm x 139mm x 17mm
- Weight: 319g
- ISBN-13: 9781544336572
- ISBN-10: 1544336578
- Item no.: 50910845
- Manufacturer information
- Person responsible for product safety
- Europaallee 1
- 36244 Bad Hersfeld
- gpsr@libri.de
Damodar Gujarati (M.B.A. and Ph.D., both from the University of Chicago) is Professor Emeritus of Economics at the United States Military Academy at West Point. Before that, he taught for 25 years at Baruch College of the City University of New York (CUNY) and at the Graduate Center of CUNY. He is the author of Government and Business (McGraw-Hill, 1984), the bestselling textbook Basic Econometrics (5th edition, 2009, with co-author Dawn Porter), and Essentials of Econometrics (4th edition, 2009, also with Dawn Porter), both published by McGraw-Hill, as well as Econometrics by Example (2nd edition, 2014, Palgrave Macmillan). His experience spans business, consulting, and academia.
List of Figures
Series Editor's Introduction
Preface
About the Author
Acknowledgments
Chapter 1: The Linear Regression Model (LRM)
1.1 Introduction
1.2 Meaning of "Linear" in Linear Regression
1.3 Estimation of the LRM: An Algebraic Approach
1.4 Goodness of Fit of a Regression Model: The Coefficient of Determination (R²)
1.5 R² for Regression Through the Origin
1.6 An Example: The Determination of the Hourly Wages in the United States
1.7 Summary
Exercises
Appendix 1A: Derivation of the Normal Equations
Chapter 2: The Classical Linear Regression Model (CLRM)
2.1 Assumptions of the CLRM
2.2 The Sampling or Probability Distributions of the OLS Estimators
2.3 Properties of OLS Estimators: The Gauss-Markov Theorem
2.4 Estimating Linear Functions of the OLS Parameters
2.5 Large-Sample Properties of OLS Estimators
2.6 Summary
Exercises
Chapter 3: The Classical Normal Linear Regression Model: The Method of Maximum Likelihood (ML)
3.1 Introduction
3.2 The Mechanics of ML
3.3 The Likelihood Function of the k-Variable Regression Model
3.4 Properties of the ML Method
3.5 Summary
Exercises
Appendix 3A: Asymptotic Efficiency of the ML Estimators of the LRM
Chapter 4: Linear Regression Model: Distribution Theory and Hypothesis Testing
4.1 Introduction
4.2 Types of Hypotheses
4.3 Procedure for Hypothesis Testing
4.4 The Determination of Hourly Wages in the United States
4.5 Testing Hypotheses About an Individual Regression Coefficient
4.6 Testing the Hypothesis That All the Regressors Collectively Have No Influence on the Regressand
4.7 Testing the Incremental Contribution of a Regressor
4.8 Confidence Interval for the Error Variance σ²
4.9 Large-Sample Tests of Hypotheses
4.10 Summary
Exercises
Appendix 4A: Constrained Least Squares: OLS Estimation Under Linear Restrictions
Chapter 5: Generalized Least Squares (GLS): Extensions of the Classical Linear Regression Model
5.1 Introduction
5.2 Estimation of β With a Nonscalar Covariance Matrix
5.3 Estimated Generalized Least Squares
5.4 Heteroscedasticity and Weighted Least Squares
5.5 White's Heteroscedasticity-Consistent Standard Errors
5.6 Autocorrelation
5.7 Summary
Exercises
Appendix 5A: ML Estimation of GLS
Chapter 6: Extensions of the Classical Linear Regression Model: The Case of Stochastic or Endogenous Regressors
6.1 Introduction
6.2 X and u Are Distributed Independently
6.3 X and u Are Contemporaneously Uncorrelated
6.4 X and u Are Neither Independently Distributed Nor Contemporaneously Uncorrelated
6.5 The Case of k Regressors
6.6 What Is the Solution? The Method of Instrumental Variables (IVs)
6.7 Hypothesis Testing Under IV Estimation
6.8 Practical Problems in the Application of the IV Method
6.9 Regression Involving More Than One Endogenous Regressor
6.10 An Illustrative Example: Earnings and Educational Attainment of Youth in the United States
6.11 Regression Involving More Than One Endogenous Regressor
6.12 Summary
Appendix 6A: Properties of OLS When Random X and u Are Independently Distributed
Appendix 6B: Properties of OLS Estimators When Random X and u Are Contemporaneously Uncorrelated
Chapter 7: Selected Topics in Linear Regression
7.1 Introduction
7.2 The Nature of Multicollinearity
7.3 Model Specification Errors
7.4 Qualitative or Dummy Regressors
7.5 Nonnormal Error Term
7.6 Summary
Exercises
Appendix 7A: Ridge Regression: A Solution to Perfect Collinearity
Appendix 7B: Specification Errors
Appendix A: Basics of Matrix Algebra
A.1 Definitions
A.2 Types of Matrices
A.3 Matrix Operations
A.4 Matrix Transposition
A.5 Matrix Inversion
A.6 Determinants
A.7 Rank of a Matrix
A.8 Finding the Inverse of a Square Matrix
A.9 Trace of a Square Matrix
A.10 Quadratic Forms and Definite Matrices
A.11 Eigenvalues and Eigenvectors
A.12 Vector and Matrix Differentiation
Appendix B: Essentials of Large-Sample Theory
B.1 Some Inequalities
B.2 Types of Convergence
B.3 The Order of Magnitude of a Sequence
B.4 The Order of Magnitude of a Stochastic Sequence
Appendix C: Small- and Large-Sample Properties of Estimators
C.1 Small-Sample Properties of Estimators
C.2 Large-Sample Properties of Estimators
Appendix D: Some Important Probability Distributions
D.1 The Normal Distribution and the Z Test
D.2 The Gamma Distribution
D.3 The Chi-Square (χ²) Distribution and the χ² Test
D.4 Student's t Distribution
D.5 Fisher's F Distribution
D.6 Relationships Among Probability Distributions
D.7 Uniform Distributions
D.8 Some Special Features of the Normal Distribution
Index