We review the changing conceptions of schizophrenia over the past 50 years as the illness came to be understood as a disorder of brain function and structure in which neurocognitive dysfunction is identifiable at different illness phases. The centrality of neurocognition has been recognized, especially because neurocognitive deficits are strongly related to social and role functioning in the illness, and as a result neurocognitive measures are used routinely in the clinical assessment of individuals with schizophrenia. From the original definitions of the syndrome of schizophrenia in the early 20th century, impaired cognition, especially attention, was considered to be important. Neurocognitive impairments are found in the vast majority of individuals with schizophrenia, and they vary from mild, relatively restricted deficits to dementia-like syndromes, as early as the first psychotic episode. Neurocognitive deficits are found in the premorbid phase in a substantial minority of pre-teenage youth who later develop schizophrenia, and they apparently worsen by the prodromal, high-risk phase in a majority of those who develop the illness. While there is limited evidence that pharmacological interventions reverse these impairments in schizophrenia, promising results have emerged from cognitive remediation studies. Thus, we expect cognitive interventions to play a larger role in schizophrenia in the coming years. Moreover, because youth at risk for schizophrenia can be identified by an emergent high-risk syndrome, earlier interventions might be applied pre-emptively to reduce disability and improve adaptation. The notion of schizophrenia as a developmental neurocognitive disorder with stages opens up a window of possibilities for earlier interventions. (JINS, 2017, 23, 881–892)