Book contents
- Frontmatter
- Contents
- List of figures
- List of tables
- List of panels
- Preface
- Part I Elementary statistical analysis
- Part II Samples and inductive statistics
- Part III Multiple linear regression
- Chapter 8 Multiple relationships
- Chapter 9 The classical linear regression model
- Chapter 10 Dummy variables and lagged values
- Chapter 11 Violating the assumptions of the classical linear regression model
- Part IV Further topics in regression analysis
- Part V Specifying and interpreting models: four case studies
- Appendix A The four data sets
- Appendix B Index numbers
- Bibliography
- Index of subjects
- Index of names
Chapter 11 - Violating the assumptions of the classical linear regression model
Published online by Cambridge University Press: 05 June 2012
Summary
We have thus far been developing the methodology of classical linear regression (CLR) using the ordinary least squares (OLS) method of estimation. This is a very powerful technique for uncovering the relationships among variables, but it has its limitations. It is clearly important to understand what these limitations are, to see how they affect the outcomes of the regression, and to suggest some procedures both to identify and to correct for these effects. That is the purpose of this chapter.
The assumptions of the classical linear regression model
The best way to proceed is first to indicate the main criteria that a model must satisfy in order to qualify as a good estimator, that is, to be what econometricians call the best linear unbiased estimator (BLUE), and then to state the conditions under which OLS methods meet these criteria.
We have previously defined BLUE in §9.2.4 but repeat it here for convenience. An estimate, b, of a regression coefficient is said to be unbiased if the mean of its sampling distribution is equal to the true (unknown) population coefficient, β. If the assumptions underlying an OLS regression model are correct, then the estimates of its regression coefficients satisfy this criterion.
This does not mean that the estimate formed from the particular sample we happen to have will equal the true β; only that the average of all the estimates we would get if we could repeat the sampling process an infinite number of times would do so.
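The point about repeated sampling can be made concrete with a small Monte Carlo simulation, a sketch not taken from the book: the model y = β₀ + β₁x + ε, the coefficient values, and the sample sizes below are all illustrative assumptions. Each simulated sample yields one OLS slope estimate b; no single b equals the true β₁, but the average across many samples comes very close.

```python
# Monte Carlo illustration of unbiasedness: a sketch, with a
# hypothetical model y = BETA0 + BETA1*x + noise (not from the book).
import random
import statistics

random.seed(42)
BETA0, BETA1 = 2.0, 0.5            # "true" population coefficients (assumed)
x = [float(i) for i in range(30)]  # fixed regressor values across samples

def ols_slope(xs, ys):
    """OLS slope: b = sum((x-mx)(y-my)) / sum((x-mx)^2)."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(xs, ys))
    sxx = sum((xi - mx) ** 2 for xi in xs)
    return sxy / sxx

# Repeat the sampling process many times; each sample gives one estimate b.
estimates = []
for _ in range(5000):
    y = [BETA0 + BETA1 * xi + random.gauss(0.0, 1.0) for xi in x]
    estimates.append(ols_slope(x, y))

# Any single b differs from BETA1, but their average is close to it.
print(statistics.fmean(estimates))
```

With 5,000 replications the average of the estimates typically lands within a few thousandths of the true slope, while individual estimates scatter around it, which is exactly what the unbiasedness criterion asserts.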
Chapter information: Making History Count: A Primer in Quantitative Methods for Historians, pp. 300-330. Publisher: Cambridge University Press. Print publication year: 2002.