7 - Multiple Regression
from PART 1 - DESCRIPTION
Published online by Cambridge University Press: 05 June 2012
Summary
As early as 1897 Mr. G. U. Yule, then my assistant, made an attempt in this direction. He fitted a line or plane by the method of least squares to a swarm of points, and this has been extended later to n-variates and is one of the best ways of reaching the multiple regression equations …
Karl Pearson

Introduction
This chapter introduces the concept of multiple regression, which in many ways is similar to bivariate regression. Both methods produce conditional predictions, though multiple regression employs more than one independent X variable to predict the value of the Y variable. Just as before, the predicted value of the dependent variable is expressed in a simple equation; in the case of least squares regression, the RMSE summarizes the likely size of the residual and the R2 statistic measures the fraction of the total variation that is explained by the regression. Once again, the OLS regression coefficients are those that minimize the SSR.
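The parallel with bivariate regression can be made concrete with a short numerical sketch. The snippet below (illustrative only; the data, variable names, and coefficient values are invented for the example, and NumPy stands in for the book's Excel-based approach) fits a two-predictor OLS regression and computes the RMSE and R2 exactly as described above: the coefficients minimize the SSR, the RMSE summarizes the typical residual, and R2 is the explained fraction of total variation.

```python
import numpy as np

# Hypothetical data: Y depends on two X variables plus noise.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

# Design matrix with an intercept column; least squares minimizes the SSR.
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ b                       # conditional predictions
resid = y - y_hat                   # residuals
rmse = np.sqrt(np.mean(resid ** 2))  # likely size of a residual
tss = np.sum((y - y.mean()) ** 2)    # total variation in Y
r2 = 1 - np.sum(resid ** 2) / tss    # fraction of variation explained

print(b, rmse, r2)
```

With more than one X variable the mechanics are unchanged from the bivariate case: only the number of columns in the design matrix grows.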
Multiple regression introduces some new issues, however. Some of the complications are purely mathematical. Although it is relatively easy to move back and forth between the algebraic expression and the pictorial (geometric) representation of the regression line in the bivariate case, most people have difficulty translating the algebraic formulation for a multiple regression into its geometric representation as a plane (in trivariate regression) or hyperplane (when there are more than two independent variables). Furthermore, the formulas for the OLS regression coefficients become very unwieldy (we discuss them in the appendix of this chapter).
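Although the scalar formulas for the coefficients are unwieldy, it may help to note (this compact statement is standard and is not taken from the chapter itself) that in matrix notation they collapse into a single expression: stacking the observations into a design matrix $X$, whose first column is all ones, and the dependent variable into a vector $y$, the OLS coefficient vector that minimizes the SSR is

```latex
\hat{\beta} = (X^{\top} X)^{-1} X^{\top} y
```

The same formula covers the bivariate, trivariate, and general $n$-variable cases; only the width of $X$ changes.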
Introductory Econometrics: Using Monte Carlo Simulation with Microsoft Excel, pp. 164-197. Publisher: Cambridge University Press. Print publication year: 2005.