Published online by Cambridge University Press: 05 June 2012
When and how did humans begin to count? Where does arithmetic come from? Are humans innately endowed with arithmetical abilities, or is human numerical cognition a strictly cultural achievement? To a large extent, the answers to these questions depend on how precisely we define human numerical cognition and arithmetical abilities.
If by numerical cognition we refer to the capacity for approximation – that is, a basic appreciation of changes in quantity and a simple number sense (oneness, twoness, and threeness) – then several lines of evidence in contemporary cognitive neuroscience clearly support the view that it is an evolved, innate biological competence shared by human infants and other animals. For example, a number of studies show that both preverbal infants and animals are able to detect numerosities, discriminating between small sets of objects or sequences of sounds both within and beyond the so-called subitizing range (up to three or four objects) (Antell & Keating 1983; Wynn 1996; Davis & Pérusse 1988; Brannon & Terrace 1998; 2000; 2002; Biro & Matsuzawa 2001) – provided that, in the latter case, the comparison ratios are large enough (e.g., infants were able to discriminate 8 from 16 items, but not 8 from 12) (Xu & Spelke 2000; Lipton & Spelke 2003). Even more telling is the finding that infants as young as five months old (Wynn 1992), as well as untrained rhesus monkeys, seem to form additive and subtractive expectations when they observe or choose between arrays containing small numbers of objects.