
# 9 - Maximum Likelihood and Nonlinear Regression

Published online by Cambridge University Press:  01 June 2011

## Summary

### Introduction

Maximum likelihood is generally regarded as the best all-purpose approach for statistical analysis. Outside of the most common statistical procedures, when the “optimal” or “usual” method is unknown, most statisticians follow the principle of maximum likelihood for parameter estimation and statistical hypothesis tests. Bayesian statistical methods also rely heavily on maximum likelihood. The main reason for this reliance is that following the principle of maximum likelihood usually leads to very reasonable and effective estimators and tests. From a theoretical viewpoint, under very mild conditions, maximum likelihood estimators (MLEs) are consistent, asymptotically unbiased, and efficient. Moreover, MLEs are invariant under reparameterizations or transformations: the MLE of a function of the parameter is the function of the MLE. From a practical viewpoint, the estimates and test statistics can be constructed without a great deal of analysis, and large-sample standard errors can be computed. Overall, experience has shown that maximum likelihood works well most of the time.
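The invariance property mentioned above can be checked numerically. The sketch below (illustrative, not from the book) maximizes an exponential log-likelihood twice, once in the rate parameterization and once in the mean parameterization, using a simple derivative-free golden-section search; the two maximizers are related exactly as the invariance property predicts.

```python
# Illustrative sketch of MLE invariance under reparameterization.
# For an exponential sample, the MLE of the rate lambda is 1/xbar, and the
# MLE of the mean mu = 1/lambda is xbar: the MLE of a function of the
# parameter is the function of the MLE.
import math

def exp_loglik(lam, x):
    """Exponential log-likelihood: n*log(lam) - lam*sum(x)."""
    return len(x) * math.log(lam) - lam * sum(x)

def maximize_golden(f, lo, hi, tol=1e-10):
    """Golden-section maximization of a unimodal f on [lo, hi]."""
    phi = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    while b - a > tol:
        c, d = b - phi * (b - a), a + phi * (b - a)
        if f(c) < f(d):
            a = c
        else:
            b = d
    return (a + b) / 2

x = [0.5, 1.2, 0.3, 2.1, 0.9, 1.7]   # illustrative data
xbar = sum(x) / len(x)

# Maximize in the rate parameterization, then in the mean parameterization.
lam_hat = maximize_golden(lambda lam: exp_loglik(lam, x), 1e-6, 10.0)
mu_hat = maximize_golden(lambda mu: exp_loglik(1.0 / mu, x), 1e-6, 10.0)

print(abs(lam_hat - 1.0 / xbar) < 1e-6)   # lambda-hat equals 1/xbar
print(abs(mu_hat - xbar) < 1e-6)          # mu-hat equals xbar = 1/lambda-hat
```

The data and search interval here are hypothetical; any positive sample and a bracket containing the maximizer would do.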

The biggest computational challenge comes from the naive expectation that any statistical problem can be solved if the maximum of some function is found. Instead of relying solely on the unconstrained optimization methods presented in Chapter 8 to meet this unrealistic expectation, the nature of the likelihood function can be exploited in ways that are more effective for computing MLEs. Since the exploitable properties of likelihood functions follow from the large-sample theory, this chapter will begin with a summary of the consistency and asymptotic normality properties of MLEs.
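As a small illustration of exploiting likelihood structure rather than applying a generic optimizer, the following sketch (an assumed example, not the book's code) runs Fisher scoring on a Poisson mean: the score and information functions drive the iteration, and the information at the MLE supplies a large-sample standard error for free.

```python
# Illustrative sketch: Fisher scoring for the mean lambda of a Poisson sample.
# The likelihood's structure yields both the update rule and an asymptotic
# standard error, instead of treating the MLE as a generic optimization problem.
import math

def score(lam, x):
    """Score function: derivative of the Poisson log-likelihood in lam."""
    return sum(x) / lam - len(x)

def information(lam, x):
    """Observed information: negative second derivative of the log-likelihood."""
    return sum(x) / lam ** 2

x = [3, 5, 2, 4, 6, 3, 4]   # illustrative count data
lam = 1.0                    # starting value
for _ in range(50):          # scoring update: lam <- lam + score/information
    step = score(lam, x) / information(lam, x)
    lam += step
    if abs(step) < 1e-12:
        break

se = 1.0 / math.sqrt(information(lam, x))   # large-sample standard error

print(abs(lam - sum(x) / len(x)) < 1e-10)   # the MLE is the sample mean
```

For this one-parameter problem the closed form is of course available; the point is the pattern, which carries over to models where it is not.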

Publisher: Cambridge University Press
Print publication year: 2011

