Jonathan Swift wrote that ‘every man desires to live long; but no man would be old’. What keeps our brains healthy through life? Unexpectedly, a study1 of almost two million UK health records has identified that being underweight during mid-life is associated with an increased risk of dementia. Body mass index (BMI) is a relatively crude measure, and such an observational association clearly cannot establish causality, but it does raise interesting questions, such as the role of potential dietary deficiencies. In a broader expert review,2 three factors – labelled ‘intermittent challenges’ – were critical to enduring positive cognitive functioning and mood state: regular exercise, fasting and intellectual engagement. However, much in our first-world environment, characterised by food surpluses and effort-sparing technologies, militates against these evolutionary requirements to function optimally when hungry and hunting. The lifelong presence of these ‘challenges’ positively affects mitochondrial and cellular bioenergetics, neurotrophic signalling, synaptic plasticity and neurogenesis, simultaneously reducing oxidative stress, inflammation, protein aggregation and DNA damage. There are robust data demonstrating a protective effect against neurodegenerative disorders, including Alzheimer’s and Parkinson’s disease; unfortunately, sedentary lifestyles accelerate cardiometabolic and neurodegenerative decline. Science tries to keep pace with new medical, surgical and pharmacological interventions, but the Academy of Medical Royal Colleges has recently published a report3 that reminds us: exercise is medicine.
Neuromodulation in depression has generally focused on the ‘top-down’ techniques of repetitive transcranial magnetic stimulation (rTMS) and transcranial direct current stimulation (tDCS) to influence brain functioning. There has been less work exploring the ‘bottom-up’ techniques of vagus nerve stimulation (VNS) and trigeminal nerve stimulation (TNS). These are based on the principle that stimulating the relevant cranial nerve activates its brainstem nuclei, which are interconnected with the nuclei of monoaminergic projections and the limbic system. Neuroimaging data have shown that stimulating the vagus nerve leads to changes in noradrenergic and serotonergic neurons, and in distal regions associated with depression such as the frontal and anterior cingulate cortices. Despite Food and Drug Administration approval for treating some forms of depressive illness, the use of VNS has been hindered by the need to surgically implant (and intermittently replace) a watch-sized lithium battery generator in the left axillary region and connect bipolar leads to the vagus nerve. Potentially overcoming this considerable hurdle, Fang et al4 have now demonstrated that transcutaneous VNS (tVNS) to the ear significantly reduced Hamilton Rating Scale for Depression scores at 1 month (compared with sham tVNS) in participants with mild-to-moderate depressive disorders. Neuroimaging showed that it also modulated the default mode network, a network of brain regions active during introspection that has been shown to function abnormally in depression. The ear – specifically the auricular concha – was selected as it offers the only superficial access to afferent vagal fibres; participants applied tVNS at home via an ear clip for half an hour twice a day over the 1-month trial period. Given the equipment and clinician costs and the (relative) administration burden of other neuromodulatory techniques (particularly rTMS), this is an exciting development if the findings are replicated in larger trials.
The need for novel depression treatments is underscored by the fact that 1 in 20 adults will attempt suicide at some point in their lives, and that rates of completed suicide are rising – up 16% in the USA over the past decade. Unfortunately, most epidemiological risk factors are non-specific, although a previous episode of self-harm has been identified as a relatively strong predictor of future behaviour. A large population-based cohort study5 followed up over 65 000 individuals after a first episode of self-poisoning, for a median of 5 years. Over this time they had a suicide rate of 278 per 100 000 person-years, roughly 40-fold higher than the 7 per 100 000 in a matched control group, with a median time from first episode to completed suicide of 585 days (interquartile range 147–1301 days). Such individuals were also markedly more likely to have accidental deaths (hazard ratio 10.45), some of which, the authors argue, are likely to be suicides that lacked sufficient proof of intent to be so labelled. Older age, male gender, multiple previous overdoses, higher income, a diagnosis of depression and a past psychiatric history were all associated with greater risk. Although self-poisoning is a powerful predictor of subsequent suicide, the long median interval between the index episode and suicide makes it difficult to evaluate the impact of short-term post-overdose interventions. The absolute rate of suicide – while far higher than in controls – remains low. Undoubtedly, sustained vigilance is required in such individuals.
Dialectical behaviour therapy (DBT) has a good evidence base for positive outcomes in suicidal individuals with borderline personality disorder. However, it has multiple constituent components that can include individual therapy, group skills training, between-session telephone coaching and a therapist consultation team, and it has not been clear which components are most effective. Linehan et al6 randomised 100 women with borderline personality disorder – all of whom had at least two suicide attempts or non-suicidal self-injury acts in the previous 5 years – to one of three treatment groups: first, skills training plus manualised case management, but no individual therapy (DBT-S); second, individual therapy plus an activities group, but no skills training (DBT-I); or third, standard DBT (comprising both skills training and individual therapy). Over the 1-year follow-up all the interventions reduced the frequency and severity of suicidal ideation and suicide attempts, and, against the a priori prediction, standard DBT was not superior to the other modalities. However, standard DBT and DBT-S (namely the interventions that included skills training) produced fewer non-suicidal self-injury acts and greater improvement in anxiety and depressive symptoms than the DBT-I intervention that lacked such training; standard DBT also resulted in the fewest treatment drop-outs and hospital admissions.
Recent advances in understanding human decision-making have taken a pragmatic approach beyond game theory paradigms, searching for evidence in vivo in the brain. The result has been a flurry of models focusing on how evidence is accumulated over time in the prefrontal cortex (PFC) and parietal cortex, with the evaluation of decisions in the striatum (e.g. Alexander & Brown’s7 predicted response-outcome model). In this debate, what remains unclear is how the mammalian brain uses accumulated evidence – that is, how evidence goes from a continuous process of acquisition to a discrete categorical decision. Hanks et al8 propose an answer, having recorded neuronal activity in the rat PFC and posterior parietal cortex. Rats were trained to orient their noses to the left or right at a ‘go’ signal, based on the number of auditory clicks presented. Parietal neurons showed a smoothly graded response proportional to the number of clicks (i.e. representing accumulated evidence for the decision to orient left or right); however, contrary to contemporary models, the firing rates of PFC neurons divided into two categories, associated with left and right choices. It thus appears, at least in rats, that specific PFC neurons made the decision, with parietal neurons accumulating the evidence upon which it was based. This contrasts with monkey and human data on the role of prefrontal structures, which suggest that accumulation and choice are coded jointly.
The large step from animals to children: Jean Piaget posited that hypothesis-testing appeared at around 11 years of age, in the formal operations stage of development, but Stahl & Feigenson9 propose it may begin much earlier. They show that violations of expectation prompt 11-month-old infants to look for opportunities for exploration and learning. First, the infants were placed before two screens; a ball was placed behind the left screen, and both screens were then removed simultaneously. In the ‘knowledge-consistent’ condition the ball remained behind the left screen; in the ‘knowledge-violation’ condition it appeared behind the right screen, contravening expectations about object continuity. The infants were then shown a scene with a ball moving up and down, paired with a noise, for 12 seconds. Finally, a ‘learning test’ was applied by measuring how long infants fixated on the now stationary ball in the presence of a new, silent and stationary distractor object. Infants who had seen the knowledge-consistent scene showed reduced gaze fixation on the ball in the presence of the distractor, whereas infants who had seen the knowledge-violating scene showed increased fixation on the ball relative to the novel distractor. This, the authors argue, suggests that being ‘primed’ by an expectation-violating event increased the infants’ desire to accumulate more knowledge about the object, echoing Carl Sagan’s notion of children as ‘natural born scientists’.
The smaller step from children to adults: cognitive flexibility is vital both to allow focus on a goal-directed activity through exclusion of extraneous information – strategy exploitation – and to afford the ability to shift to novel, superior approaches – strategy exploration. Schuck et al10 investigated the occurrence of spontaneous strategy improvements in individuals exercising top-down cognitive control. Participants were instructed to perform a task based on the spatial location of a small stimulus composed of coloured squares; they were not told that another aspect of the task environment – the colour of the squares – could lead to the same, correct outcome. Focus on the initial instruction meant that about half did not discover the new strategy, and only about one-third adopted the colour-based approach; behavioural analyses showed an abrupt and rapid change in individuals’ response type when this happened. Neuroimaging showed that, where change occurred, the medial prefrontal cortex activated several minutes before adoption of the novel strategy, suggesting that this region escapes top-down cognitive biases and continues to plan and evaluate future strategies by simulating alternatives. The authors call this counterfactual thinking, but we prefer Cicero’s contemplation that ‘any man can make mistakes, but only a fool persists in his error’.
Finally, disentangling a genetic contribution from environmental factors in the development of alcohol and substance misuse and criminal behaviour can be especially problematic. Twin studies help clarify such factors within a generation, but adoption studies that examine cross-generational transfer are methodologically complex, and the rigorous selection of adopting parents makes them less representative of the broader population. Attempting to overcome these difficulties, Kendler et al11 undertook a novel genetic-epidemiological evaluation of over 40 000 individuals living in so-called ‘triparental families’. These are families with a rearing biological mother, a non-rearing (and minimal-contact) biological father, and a step-father (living with the individual for ≥10 of their first 15 years), thus approximating nature plus nurture, nature only, and nurture only, respectively. Both genetic and environmental factors were important in the parental transmission of drug and alcohol misuse and criminal behaviour, but genetic factors were the more important in all cases; the results were supported by cross-analytical comparison with ‘intact families’ and other ‘not-lived-with’ relatives. Those whose mothers had drug or alcohol misuse or a criminal history were most likely to show cross-generational transmission of similar behaviour. In the blues classic ‘Born under a bad sign’ Albert King lamented ‘if it wasn’t for bad luck, I wouldn’t have no luck at all’, and unfortunately, for many people, there may be a cascade of genetic loading, adverse early life experiences, social impoverishment and lack of opportunity that will require a very significant psychosocial change in order to have an impact on their trajectory.