Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Variation
- 3 Uncertainty
- 4 Likelihood
- 5 Models
- 6 Stochastic Models
- 7 Estimation and Hypothesis Testing
- 8 Linear Regression Models
- 9 Designed Experiments
- 10 Nonlinear Regression Models
- 11 Bayesian Models
- 12 Conditional and Marginal Inference
- Appendix A Practicals
- Bibliography
- Name Index
- Example Index
- Index
10 - Nonlinear Regression Models
Published online by Cambridge University Press: 29 March 2011
Summary
Introduction
The regression models of Chapters 8 and 9 involve a continuous response that depends linearly on the parameters. Linear models remain the backbone of most statistical data analysis, but they have their deficiencies. In many applications, response variables are discrete, or statistical or substantive considerations suggest that covariates will appear nonlinearly. Models of this sort appeared on a somewhat ad hoc basis in the literature up to about 1970, since when there has been an explosion of generalizations of the linear model. Two important developments were the use of iterative weighted least squares for fitting, and the systematic use of exponential family response distributions.

The iterative weighted least squares algorithm has wide applicability in nonlinear models, and we outline its properties in Section 10.2, giving also a discussion of likelihood inference in this context. Exponential family response densities play a central role in generalized linear models, which we describe in Section 10.3, turning to the important special cases of binomial and Poisson responses in Sections 10.4 and 10.5. These models are widely used, but real data often display too much variation for them to be taken at face value. In Section 10.6 we outline remedies for this, based on the discussion of estimating functions in Section 7.2.
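To make the iterative weighted least squares idea concrete, here is a minimal sketch for one special case mentioned above, a Poisson log-linear model. At each iteration a working response and working weights are formed from the current fit, and a weighted least squares step updates the parameter estimate. The function name and numerical details are illustrative, not taken from the book.

```python
import numpy as np

def irls_poisson(X, y, n_iter=25, tol=1e-8):
    """Fit a Poisson log-linear model by iterative weighted least squares.

    X : (n, p) design matrix; y : (n,) counts.
    Returns the maximum likelihood estimate of the coefficients.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta            # linear predictor
        mu = np.exp(eta)          # fitted means (inverse of the log link)
        W = mu                    # working weights for the canonical link
        z = eta + (y - mu) / mu   # working response (linearized link of y)
        # one weighted least squares step: solve (X'WX) beta = X'Wz
        WX = X * W[:, None]
        beta_new = np.linalg.solve(X.T @ WX, X.T @ (W * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

At convergence the score equations X'(y - mu) = 0 are satisfied, which is the defining property of the maximum likelihood estimate in this model.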
In each of these generalizations of the linear model our key notion that a few parameters summarize the entire model is retained. Section 10.7 branches out in a different direction, taking the viewpoint that the regression curve itself is more central than the parameters that summarize it.
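The curve-centred viewpoint can be illustrated by a simple nonparametric smoother: rather than estimating a few parameters, one estimates the regression function directly as a locally weighted average of the responses. The Gaussian-kernel weighting below is one common choice, shown purely as an illustration of the idea.

```python
import numpy as np

def kernel_smooth(x, y, grid, bandwidth):
    """Nadaraya-Watson estimate of the regression curve on a grid of points.

    Each fitted value is a weighted average of the y's, with weights that
    decay with the distance of x from the evaluation point.
    """
    out = np.empty(len(grid))
    for i, g in enumerate(grid):
        w = np.exp(-0.5 * ((x - g) / bandwidth) ** 2)  # Gaussian kernel weights
        out[i] = np.sum(w * y) / np.sum(w)
    return out
```

The bandwidth controls the bias-variance trade-off: a small value tracks the data closely, a large value gives a smoother but more biased curve.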
- *Statistical Models*, pp. 468-564. Publisher: Cambridge University Press. Print publication year: 2003.