Published online by Cambridge University Press: 20 May 2010
In our everyday lives we encounter a plethora of novel experiences in different contexts that require prompt decisions for successful actions and social interactions. Despite the seeming ease with which we perform these interactions, extracting the key information from the highly complex input of the natural world and deciding how to classify and interpret it is a computationally demanding task for the primate visual system. Accumulating evidence suggests that the brain's solution to this problem relies on combining sensory information with previous knowledge about the environment. Although evolution and development have been suggested to shape the structure and organization of the visual system (Gilbert et al. 2001a; Simoncelli and Olshausen 2001), learning through everyday experience has been proposed to play an important role in the adaptive optimization of visual functions. In particular, numerous behavioral studies have shown experience-dependent changes in visual recognition using stimuli ranging from simple features, such as oriented lines and gratings (Fahle 2004), to complex objects (Fine and Jacobs 2002). Recent neurophysiological (Logothetis et al. 1995; Rolls 1995; Kobatake et al. 1998; Rainer and Miller 2000; Jagadeesh et al. 2001; Schoups et al. 2001b; Baker et al. 2002; Ghose et al. 2002; Lee et al. 2002; Sigala and Logothetis 2002; Freedman et al. 2003; Miyashita 2004; Rainer et al. 2004; Yang and Maunsell 2004) and functional magnetic resonance imaging (fMRI) investigations (Dolan et al. 1997; Gauthier et al. 1999; Schiltz et al. 1999; Grill-Spector et al. 2000; van Turennout et al. 2000; Furmanski et al. 2004; Kourtzi et al. 2005b) have focused on elucidating the loci of brain plasticity and the changes in neuronal responses that underlie this visual learning.