Book contents
- Frontmatter
- Dedication
- Contents
- List of Illustrations
- List of Tables
- List of Contributors
- Preface
- Part I Introduction to Modeling
- Part II Parameter Estimation
- 3 Basic Parameter Estimation Techniques
- 4 Maximum Likelihood Parameter Estimation
- 5 Combining Information from Multiple Participants
- 6 Bayesian Parameter Estimation
- 7 Bayesian Parameter Estimation
- 8 Bayesian Parameter Estimation
- 9 Multilevel or Hierarchical Modeling
- Part III Model Comparison
- Part IV Models in Psychology
- Appendix A Greek Symbols
- Appendix B Mathematical Terminology
- References
- Index
4 - Maximum Likelihood Parameter Estimation
from Part II - Parameter Estimation
Published online by Cambridge University Press: 05 February 2018
Summary
In the previous chapters, we encountered one of the key issues in computational modeling: a full, quantitative specification of a model involves not just a description of the model (in the form of algorithms or equations), but also a specification of the parameters of the model and their values. Although in some cases we can use known parameter values (e.g., those determined from previous applications of the model; see Oberauer and Lewandowsky, 2008), in most cases we must estimate those parameters from the data. Chapter 3 described the basics of parameter estimation by minimizing the discrepancy between the data and the model's predictions. Chapter 4 deals with a popular and more principled alternative approach to parameter estimation called maximum likelihood estimation.
Unlike the techniques discussed in the previous chapter, maximum likelihood estimation is deeply rooted in statistical theory. Maximum likelihood estimators have known properties that are not possessed by estimates obtained via minimizing RMSD (except under specific situations detailed later); for example, maximum likelihood estimates are guaranteed to become more accurate on average with increasing sample size. Additionally, likelihood can be used to make statements about the relative weight of evidence for a particular hypothesis, either about the value of a particular parameter or about a model as a whole. This lays the groundwork for the material in upcoming chapters: likelihood plays a key role in Bayesian parameter estimation, and we will later use the idea of likelihood as the strength of evidence to explore a principled and rigorous technique for evaluating scientific models.
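The core logic of maximum likelihood estimation can be sketched in a few lines of code: given observed data and a model with a free parameter, we pick the parameter value that makes the observed data most probable. The binomial model and simple grid search below are illustrative assumptions for this sketch, not an example taken from the chapter itself.

```python
import math

def binom_loglik(theta, k, n):
    # Log-likelihood of observing k successes in n trials
    # under a binomial model with success probability theta.
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(theta) + (n - k) * math.log(1 - theta))

def mle_grid(k, n, grid_size=10001):
    # Maximum likelihood estimate of theta via grid search over (0, 1).
    # A real application would use a numerical optimizer instead.
    grid = [(i + 1) / (grid_size + 1) for i in range(grid_size)]
    return max(grid, key=lambda th: binom_loglik(th, k, n))

# For 7 successes in 10 trials, the MLE is k/n = 0.7.
theta_hat = mle_grid(7, 10)
```

For this simple model the maximum can be found analytically (the MLE is k/n), but the same maximize-the-log-likelihood recipe carries over to models where no closed-form solution exists.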
Basics of Probabilities
The term “likelihood” in common parlance is used interchangeably with probability; we might consider the likelihood of it raining tomorrow (which varies considerably between the two authors, who at the time of writing live in Australia and the UK), or the likelihood that an individual randomly selected from the population will live past the age of 80. By contrast, when considering statistical or computational modeling, the term likelihood takes on a very strict meaning which is subtly – but fundamentally – different from that of probability.
The best way to define likelihood, and to distinguish it from probability, is to start with the concept of probability itself. We all have some intuitive notion of what a probability is, and these intuitions probably make some connection with the formal definitions we will introduce here.
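One way to make the distinction concrete (a hypothetical illustration, not the chapter's own example) is with a binomial model: a probability distribution fixes the parameter and varies over the possible data, so it sums to 1 across outcomes, whereas a likelihood function fixes the observed data and varies over parameter values, and in general does not sum (or integrate) to 1.

```python
import math

def binom_pmf(k, n, theta):
    # Probability of k successes in n trials with success probability theta.
    return math.comb(n, k) * theta**k * (1 - theta)**(n - k)

# Probability: theta fixed at 0.5, varying the data k -> sums to 1.
probs = [binom_pmf(k, 10, 0.5) for k in range(11)]

# Likelihood: data fixed at k = 7, varying theta -> need not sum to 1.
thetas = [0.1 * i for i in range(1, 10)]
liks = [binom_pmf(7, 10, th) for th in thetas]
```

The same function `binom_pmf` is doing the work in both cases; only what is held fixed and what is allowed to vary changes, which is exactly the subtle difference between probability and likelihood.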
Computational Modeling of Cognition and Behavior, pp. 72–104. Publisher: Cambridge University Press. Print publication year: 2018.