Book contents
- Frontmatter
- Contents
- List of examples
- Preface
- 1 Preliminaries
- 2 Some concepts and simple applications
- 3 Significance tests
- 4 More complicated situations
- 5 Interpretations of uncertainty
- 6 Asymptotic theory
- 7 Further aspects of maximum likelihood
- 8 Additional objectives
- 9 Randomization-based analysis
- Appendix A A brief history
- Appendix B A personal view
- References
- Author index
- Subject index
7 - Further aspects of maximum likelihood
Published online by Cambridge University Press: 17 March 2011
Summary
Maximum likelihood estimation and related procedures provide effective solutions for a wide range of problems. There can, however, be difficulties, leading at worst to inappropriate procedures with properties far from those sketched above. Some of the difficulties are in a sense mathematical pathologies, but others have serious statistical implications. The first part of the chapter reviews the main possibilities for anomalous behaviour, illustrated by relatively simple examples, often with a single unknown parameter. The second part of the chapter describes some modifications of the likelihood function that sometimes allow escape from these difficulties.
Multimodal likelihoods
In some limited cases, notably connected with exponential families, convexity arguments can be used to show that the log likelihood has a unique maximum. More commonly, however, there is at least the possibility of multiple maxima and saddle-points in the log likelihood surface. See Note 7.1.
There are a number of implications. First, proofs of the convergence of algorithms offer limited comfort: convergence to a local maximum that is not in fact the overall maximum of the likelihood is unhelpful or worse. Convergence to the global maximum is nearly always required for correct interpretation. When there are two or more local maxima giving similar values of the log likelihood, it will in principle be desirable to know them all; the natural confidence set may consist of disjoint intervals surrounding these local maxima.
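The point can be illustrated numerically. The sketch below (not from the book) uses the Cauchy location model, a standard example in which the log likelihood can have several local maxima even with one unknown parameter; the sample values are invented for illustration. A local optimiser started from different points can converge to different local maxima, so no single run certifies the global maximum.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sample of four observations; the two well-separated
# pairs produce a bimodal Cauchy location log likelihood.
x = np.array([-4.0, -1.0, 3.0, 6.0])

def neg_log_lik(theta):
    # Cauchy log likelihood (up to a constant):
    #   l(theta) = -sum_i log(1 + (x_i - theta)^2)
    # We return its negative for minimisation.
    return np.sum(np.log1p((x - theta) ** 2))

# Run the same local optimiser from several starting points;
# different starts converge to different local maxima.
for start in (-5.0, 0.0, 5.0):
    r = minimize(neg_log_lik, np.array([start]), method="Nelder-Mead")
    print(f"start {start:5.1f} -> theta_hat {r.x[0]:7.3f}, "
          f"log lik {-r.fun:8.4f}")
```

Here starts on the left and right of the sample settle at distinct local maxima with similar log likelihood values, which is exactly the situation in which a confidence set formed from the likelihood may consist of disjoint intervals.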
- Principles of Statistical Inference, pp. 133-160. Publisher: Cambridge University Press. Print publication year: 2006.