
8 - Subjective Confidence and the Sampling of Knowledge

Published online by Cambridge University Press:  02 February 2010

Joshua Klayman (University of Chicago Graduate School of Business)
Jack B. Soll (INSEAD Business School, France)
Peter Juslin (Uppsala University, Sweden)
Anders Winman (Uppsala University, Sweden)
Klaus Fiedler (Ruprecht-Karls-Universität Heidelberg, Germany)
Peter Juslin (Umeå Universitet, Sweden)

Summary

INTRODUCTION

In many situations people must, at least implicitly, make subjective judgments about how certain they are. Such judgments are part of decisions about whether to collect more information, whether to undertake a risky course of action, which contingencies to plan for, and so on. Underlying such decisions are subjective judgments about the quality of the decision maker's information. Accordingly, many researchers have been interested in the mental processes underlying such judgments, which go under the general label of confidence.

There are many ways in which confidence can be expressed, both in the real world and in the research lab. For example, Yaniv and Foster (1995) present the concept of grain size. People communicate their confidence in an estimate via the precision (grain size) with which they express it. “I think it was during the last half of the nineteenth century” implies a different degree of confidence than “I think it was around 1875.” Listeners expect speakers to choose a grain size appropriate to their level of knowledge. People also use a variety of verbal probability terms to describe confidence in their predictions, choices, and estimates (e.g., “I'm pretty sure …,” “It's definitely not …,” “I really don't know, but …”), which people understand to imply different degrees of certainty (Wallsten & Budescu, 1983; Zimmer, 1984).
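To make the accuracy–informativeness trade-off behind grain size concrete, here is a minimal sketch (an illustration added in editing, not taken from the chapter; the data and the function name are invented): the coarser the grain, i.e., the wider the interval implied by an estimate, the more often that interval contains the true value, but the less it tells the listener.

```python
# Illustrative only: "grain size" treated as the width of the interval implied
# by an estimate. Wider (coarser) grains contain the truth more often but say less.

def hit_rate(truths, estimates, half_width):
    """Fraction of implied intervals [estimate - half_width, estimate + half_width]
    that contain the corresponding true value."""
    hits = sum(abs(t - e) <= half_width for t, e in zip(truths, estimates))
    return hits / len(truths)

# Invented data: true years of five events, each estimated as "around 1875".
truths = [1869, 1876, 1885, 1861, 1893]
estimates = [1875] * 5

for half_width in (5, 12, 25):  # roughly "around 1875" up to "the late nineteenth century"
    print(f"half-width {half_width:>2}: hit rate {hit_rate(truths, estimates, half_width):.1f}")
```

With these invented numbers the hit rate rises from 0.2 to 1.0 as the implied interval widens, which is the sense in which a coarse grain can remain "accurate" while signaling low confidence.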

In the lab, most studies use one of three predominant paradigms.
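The paradigms themselves are described in the chapter. As background, the sketch below shows one standard way such judgments are scored in the two-alternative, general-knowledge format associated with much of the literature cited here (e.g., Lichtenstein, Fischhoff, & Phillips, 1982): mean stated confidence is compared with the proportion of answers that were actually correct, and a positive difference is read as overconfidence. The data and function name are hypothetical, not the authors' own materials.

```python
# Illustrative only: classic scoring of two-alternative confidence judgments
# (in the spirit of Lichtenstein, Fischhoff, & Phillips, 1982). Data are invented.

def overconfidence(confidences, correct):
    """Mean stated probability of being right minus observed proportion correct.
    confidences: probabilities in [0.5, 1.0]; correct: one boolean per item."""
    mean_confidence = sum(confidences) / len(confidences)
    proportion_correct = sum(correct) / len(correct)
    return mean_confidence - proportion_correct

# Ten hypothetical general-knowledge items.
confidences = [0.9, 0.8, 1.0, 0.7, 0.6, 0.9, 0.8, 0.7, 1.0, 0.6]
correct = [True, True, False, True, False, True, False, True, True, False]

print(round(overconfidence(confidences, correct), 2))  # 0.2: judged 80% sure, only 60% correct
```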

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2005


References

Block, R. A., & Harper, D. R. (1991). Overconfidence in estimation: Testing the anchoring-and-adjustment hypothesis. Organizational Behavior and Human Decision Processes, 49, 188–207.
Budescu, D. V., Erev, I., & Wallsten, T. S. (1997). On the importance of random error in the study of probability judgment. Part I: New theoretical developments. Journal of Behavioral Decision Making, 10, 157–171.
Budescu, D. V., Wallsten, T. S., & Au, W. T. (1997). On the importance of random error in the study of probability judgment. Part II: Applying the stochastic judgment model to detect systematic trends. Journal of Behavioral Decision Making, 10, 172–188.
Clemen, R. T. (1996). Making hard decisions: An introduction to decision analysis (2nd ed.). Boston: PWS-Kent Publishing.
Clemen, R. T. (2001). Assessing 10–50–90s: A surprise. Decision Analysis Newsletter, 20(1), 2, 15.
Dawes, R. M. (2001). Everyday irrationality. Cambridge, MA: Westview Press.
Dawes, R. M., & Mulford, M. (1996). The false consensus effect and overconfidence: Flaws in judgment or flaws in how we study judgment? Organizational Behavior and Human Decision Processes, 65, 201–211.
Doherty, M. E., Mynatt, C. R., Tweney, R. D., & Schiavo, M. D. (1979). Pseudodiagnosticity. Acta Psychologica, 43, 111–121.
Epley, N., & Gilovich, T. (2001). Putting adjustment back in the anchoring and adjustment heuristic: Differential processing of self-generated and experimenter-provided anchors. Psychological Science, 12, 391–396.
Erev, I., Wallsten, T. S., & Budescu, D. V. (1994). Simultaneous over- and underconfidence: The role of error in judgment processes. Psychological Review, 101, 519–527.
Fiedler, K. (2000). Beware of samples! A cognitive-ecological sampling approach to judgment biases. Psychological Review, 107, 659–676.
Gigerenzer, G., & Goldstein, D. G. (1999). Betting on one good reason: The take the best heuristic. In Gigerenzer, G., Todd, P. M., & the ABC Research Group (Eds.), Simple heuristics that make us smart (pp. 75–96). New York: Oxford University Press.
Gigerenzer, G., Hoffrage, U., & Kleinbölting, H. (1991). Probabilistic mental models: A Brunswikian theory of confidence. Psychological Review, 98, 506–528.
Griffin, D., & Tversky, A. (1992). The weighing of evidence and the determinants of confidence. Cognitive Psychology, 24, 411–435.
Hammond, K. R. (1996). Human judgment and social policy: Irreducible uncertainty, inevitable error, unavoidable injustice. New York: Oxford University Press.
Hoch, S. J. (1984). Availability and interference in predictive judgment. Journal of Experimental Psychology: Learning, Memory, and Cognition, 10, 649–662.
Hoch, S. J. (1985). Counterfactual reasoning and accuracy in predicting personal events. Journal of Experimental Psychology: Learning, Memory, and Cognition, 11, 719–731.
Hoch, S. J., & Ha, Y.-W. (1986). Consumer learning: Advertising and the ambiguity of product experience. Journal of Consumer Research, 13, 221–233.
Juslin, P. (1993). An explanation of the hard-easy effect in studies of realism of confidence in one's general knowledge. European Journal of Cognitive Psychology, 5, 55–71.
Juslin, P. (1994). The overconfidence phenomenon as a consequence of informal experimenter-guided selection of almanac items. Organizational Behavior and Human Decision Processes, 57, 226–246.
Juslin, P., Olsson, H., & Björkman, M. (1997). Brunswikian and Thurstonian origins of bias in probability assessment: On the interpretation of stochastic components of judgment. Journal of Behavioral Decision Making, 10, 189–209.
Juslin, P., & Persson, M. (2002). PROBabilities from EXemplars (PROBEX): A “lazy” algorithm for probabilistic inference from generic knowledge. Cognitive Science, 26, 563–607.
Juslin, P., Wennerholm, P., & Olsson, H. (1999). Format dependence in subjective probability calibration. Journal of Experimental Psychology: Learning, Memory, and Cognition, 28, 1038–1052.
Juslin, P., Winman, A., & Hansson, P. (2003). The naïve intuitive statistician: A sampling model of format dependence in probability judgment. Manuscript, Department of Psychology, Uppsala University, Sweden.
Juslin, P., Winman, A., & Olsson, H. (2000). Naive empiricism and dogmatism in confidence research: A critical examination of the hard-easy effect. Psychological Review, 107, 384–396.
Juslin, P., Winman, A., & Olsson, H. (2003). Calibration, additivity, and source independence of probability judgments in general knowledge and sensory discrimination tasks. Organizational Behavior and Human Decision Processes, 92, 34–51.
Kareev, Y., Arnon, S., & Horwitz-Zeliger, R. (2002). On the misperception of variability. Journal of Experimental Psychology: General, 131, 287–297.
Klayman, J. (1995). Varieties of confirmation bias. In Busemeyer, J. R., Hastie, R., & Medin, D. L. (Eds.), Decision making from the perspective of cognitive psychology (pp. 385–418). New York: Academic Press.
Klayman, J., & Brown, K. (1993). Debias the environment instead of the judge: An alternative approach to reducing error in diagnostic (and other) judgment. Cognition, 49, 97–122.
Klayman, J., Soll, J. B., González-Vallejo, C., & Barlas, S. (1999). Overconfidence: It depends on how, what, and whom you ask. Organizational Behavior and Human Decision Processes, 79, 216–247.
Koehler, D. J. (1991). Explanation, imagination, and confidence in judgment. Psychological Bulletin, 110, 499–519.
Koehler, D. J., Brenner, L., & Griffin, D. (2002). In Gilovich, T., Griffin, D., & Kahneman, D. (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 686–715). New York: Cambridge University Press.
Koriat, A., Lichtenstein, S., & Fischhoff, B. (1980). Reasons for confidence. Journal of Experimental Psychology: Human Learning and Memory, 6, 107–118.
Lichtenstein, S., & Fischhoff, B. (1977). Do those who know more also know more about how much they know? Organizational Behavior and Human Performance, 20, 159–183.
Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of subjective probabilities: The state of the art up to 1980. In Kahneman, D., Slovic, P., & Tversky, A. (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 306–334). New York: Cambridge University Press.
McKenzie, C. R. M. (1998). Taking into account the strength of an alternative hypothesis. Journal of Experimental Psychology: Learning, Memory, and Cognition, 24, 771–792.
Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35, 136–164.
Russo, J. E., Meloy, M. G., & Medvec, V. H. (1998). Predecisional distortion of product information. Journal of Marketing Research, 35, 438–452.
Russo, J. E., & Schoemaker, P. J. H. (1992). Managing overconfidence. Sloan Management Review, 33, 7–17.
Selvidge, J. E. (1980). Assessing the extremes of probability distributions by the fractile method. Decision Sciences, 11, 493–502.
Slowiaczek, L. M., Klayman, J., Sherman, S. J., & Skov, R. B. (1992). Information selection and use in hypothesis testing: What is a good question, and what is a good answer? Memory & Cognition, 20, 392–405.
Sniezek, J. A., Paese, P. W., & Switzer, F. S. C. (1990). The effects of choosing on confidence in choice. Organizational Behavior and Human Decision Processes, 46, 264–282.
Soll, J. B. (1996). Determinants of overconfidence and miscalibration: The roles of random error and ecological structure. Organizational Behavior and Human Decision Processes, 65, 117–137.
Soll, J. B., & Klayman, J. (2004). Overconfidence in interval estimates. Journal of Experimental Psychology: Learning, Memory, and Cognition, 30, 299–314.
Spetzler, C. S., & Staël von Holstein, C.-A. S. (1975). Probability encoding in decision analysis. Management Science, 22, 340–358.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131.
Tversky, A., & Koehler, D. J. (1994). Support theory: A nonextensional representation of subjective probability. Psychological Review, 101, 547–567.
von Winterfeldt, D., & Edwards, W. (1986). Decision analysis and behavioral research. Cambridge, UK: Cambridge University Press.
Wallsten, T. S., & Budescu, D. V. (1983). Encoding subjective probabilities: A psychological and psychometric review. Management Science, 29, 151–173.
Yaniv, I., & Foster, D. P. (1995). Graininess of judgment under uncertainty: An accuracy–informativeness trade-off. Journal of Experimental Psychology: General, 124, 424–432.
Zimmer, A. C. (1984). A model for the interpretation of verbal predictions. International Journal of Man–Machine Studies, 20, 121–134.
