The ‘16Up’ study conducted at the QIMR Berghofer Medical Research Institute from January 2014 to December 2018 aimed to examine the physical and mental health of young Australian twins aged 16−18 years (N = 876; 371 twin pairs and 18 triplet sets). Measurements included online questionnaires covering physical and mental health as well as information and communication technology (ICT) use, actigraphy, sleep diaries and hair samples to determine cortisol concentrations. Study participants generally rated themselves as being in good physical (79%) and mental (73%) health and reported lower rates of psychological distress and exposure to alcohol, tobacco products or other substances than previously reported for this age group in the Australian population. Daily or near-daily online activity was almost universal among study participants, with no differences noted between males and females in terms of frequency or duration of internet access. Patterns of ICT use in this sample indicated that the respondents were more likely to use online information sources for researching physical health issues than for mental health or substance use issues, and that they generally reported partial levels of satisfaction with the mental health information they found online. This suggests that internet-based mental health resources can be readily accessed by adolescent Australians, and their computer literacy augurs well for future access to online health resources. In combination with other data collected as part of the ongoing Brisbane Longitudinal Twin Study, the 16Up project provides a valuable resource for the longitudinal investigation of genetic and environmental contributions to phenotypic variation in a variety of human traits.
The present study compared the age of first solid foods in a cohort of preterm infants with term infants and identified factors influencing timing of solid food introduction.
Structured interviews on infant feeding practices, growth and medical status were conducted at term equivalence and at 3, 6, 9 and 12 months corrected postnatal age. The age of solid food introduction was compared between term and preterm infants, and the influence of maternal, infant and milk feeding factors was assessed.
This prospective longitudinal study recruited primary carers of preterm and term infants from a regional metropolitan referral hospital in eastern Australia.
One hundred and fifty infants (preterm, n 85; term, n 65).
When corrected for prematurity, preterm infants received solid foods before the recommended age for the introduction of solid foods for term infants. Median introduction of solid foods for preterm infants was 14 weeks corrected age (range 12–17 weeks). This was significantly less than 19 weeks (range 17–21 weeks) for term infants (P < 0·001). Lower maternal education and male gender were associated with earlier introduction of solid foods among preterm infants.
Preterm infants are introduced to solid foods earlier than recommended for term infants, taking account of their corrected age. Further research is needed to assess any risk or benefit associated with this pattern and thus to develop clear evidence-based feeding guidelines for preterm infants.
In this article, we suggest that social policy may be on the cusp of a large-scale adoption of the notion of lived experience. However, within social policy and allied disciplines, the growing use of the term ‘lived experience’ is unaccompanied by discussion of what it may mean or imply. We argue that now is a good time to consider what this term could mean for social policy analysis. The peculiarities of Anglo-centric usage of the broader term ‘experience’ are explored, before we identify and discuss several roots from which understandings of ‘lived experience’ as a concept and a research strategy have grown: namely, phenomenology, feminist writing and ethnography. Drawing on multiple historical and contemporary international literatures, we identify a set of dilemmas and propositions around: assumed authenticity, questioning taken-for-grantedness, intercorporeality, embodied subjectivity; political strategies of recognition, risks of essentialising, and immediacy of unique personal experiences versus inscription of discourse. We argue that lived experience can inform sharp critique and offer an innovative window on aspects of the ‘shared typical’. Our central intention is to encourage and frame debate over what lived experience could mean theoretically and methodologically within social policy contexts and what the implications may be for its continued use.
In the beginning there was a formless void of emptiness known as chaos; from this darkness emerged a black bird known as Nyx (the goddess of night). Eventually the bird laid a golden egg, out of which was born Eros, the god of love. The shell of the egg broke into pieces, one of which rose into the air and became the sky (which Eros called Uranus) and the other became the Earth (called Gaia).
This is one version of the Greek creation myth. It considers that we started with ‘nothing’ and evolved fairly rapidly towards the environment which we experience today. In fact, this is a feature of nearly all creation myths - the Sun, the Earth, its inhabitants, and by inference, the planetary system around us, all formed soon after a divine event had acted to add purpose to the pre-existing nothingness, or chaos.
The details of the traditional scientific view are somewhat different. The Universe was created about 14 Ga ago, in the Big Bang (the exact age is unclear although somewhere between 13.7 and 13.9 Ga is the current consensus). Clumps of material then formed into galaxies, and galaxies spawned stars. From that time until the present day, the cycle of stellar birth and death has continued remorselessly. Our own Solar System formed around 4.56 Ga ago from materials that had been cycled in and out of stars several times (see Box 8.1).
There have been many different theories of how our Solar System formed. These can be split into theories suggesting that the processes that formed the Sun and the planets took place simultaneously in a single integrated event, versus theories suggesting that the planetary system was added to a pre-existing Sun, some time after the Sun's formation. These two approaches are referred to as monistic (single event) and dualistic (two separate events). An example of a dualistic theory of Solar System formation would be the theory that another star passed close to the Sun, causing matter to be pulled from the Sun into a single filament, which then broke up along its length to form individual planets.
‘To the bereaved nothing but the return of the lost person can bring true comfort; should what we provide fall short of that it is felt almost as an insult.’
Bereavement is not a pathological process, but it can lead to significant mortality and morbidity. Some children may suffer significant psychological consequences (Pettle-Michael & Lansdown, 1986) but depression is rare (Pfeffer et al, 2000). The evidence for the efficacy or usefulness of therapeutic work is limited (Harrington & Harrison, 1999; Currier et al, 2007). Research suggests that positive outcomes from therapeutic work are more likely to be achieved if certain groups of children are selected and provided ‘timely’ treatment (Currier et al, 2007). The corollary of this is that many children do well with family and community support and never need to see child mental health services (Dyregrov, 2008).
Indications for bereavement work
Children may need support at times of family bereavement. There are a number of reasons why the impact of bereavement on children's development might be more pronounced.
• Bereavement may be associated with circumstances in which the normal supportive family influences are severely hampered; such circumstances include parental mental illness (Van Eerdewegh et al, 1985), catastrophic parental bereavement responses, and emotionally abusive or neglecting parents (Elizur & Kaffman, 1983; Bifulco et al, 1987).
• Severe psychological trauma associated with the death, including parental suicide (Wright & Partridge, 1999; Pfeffer et al, 2000; Department of Health, 2008).
• Repeated bereavement.
• Prolonged disruption to the child's life.
• Family system changes (Wasserman, 1988).
• Extreme circumstances such as war (Goldstein et al, 1997).
Childhood bereavement services look at the effects of bereavement on children in a number of ways.
• Diagnostically: bereavement can lead to emotional or behavioural problems that have social or educational effects and that represent a diagnosable entity.
• Adult mental health: there may be effects on the parenting available to the child before or after the bereavement.
• Child protection: bereavement may upset parents’ emotional or physical care of a child.
• Systemically: there may be systemic effects that represent risk factors for the child.
• Developmentally: the circumstances surrounding the bereavement may damage the child's development.
• Attributionally: beliefs and attributions regarding the death, in either the child or the family, may be damaging.
‘Life is short, the craft long to learn, opportunity fleeting, experiment deceptive and judgement difficult. Not only must the physician be ready to do his duty but the patient, the attendants, and external circumstances must all conduce to a cure.’
The routine problems presenting to a CAMHS are likely to be addressed at Tier 2 by an individual specialist mental health professional working with the problem. The demands of this everyday CAMHS work require all the specialist skills available in the service. No professional, unless at a very inexperienced stage, should be unavailable for Tier 2 work. To some extent, the expenditure of energy on developing the more high-profile ‘specialist’ Tier 3 teams is secondary and needs to be carefully managed to maintain the availability of Tier 2 specialist provision. Treatments such as CBT are often offered by Tier 3 teams, although service members with relevant skills will also offer them as part of their Tier 2 work. The system needs to be coordinated and managed so that there is equity of access to required services at the most effective tier.
Requisites of Tier 2
‘Critical mass’ of staff
Meeting the needs of the community and providing a comprehensive range of services requires a critical mass of CAMHS staff with a multidisciplinary skill mix and a clear recognition of professional function.
Assessment represents the first stage of any therapeutic relationship and professionals working at Tier 2 need a clear model of assessment.
Continuum of care
The Tier 2 professional, who may link up with Tier 1 workers, will also be in a position to access and make use of Tier 3 and Tier 4 provision where required. This highlights the importance of communication both within CAMHS and with other agencies, as well as underpinning the principle that all disciplines should be involved in this area of service provision.
Training and supervision
Staff of all disciplines require access to affordable and relevant training. Training budgets are limited and unequal in their distribution. It may be that units develop alternative funding strategies to support less well-resourced disciplines. In-house training initiatives and multi-agency and multidisciplinary training programmes are effective and keep costs down. Professional supervision is a prerequisite for effective professional functioning.
It is unclear to what extent the traditional distinction between neurological and psychiatric disorders reflects biological differences.
To examine neuroimaging evidence for the distinction between neurological and psychiatric disorders.
We performed an activation likelihood estimation meta-analysis on voxel-based morphometry studies reporting decreased grey matter in 14 neurological and 10 psychiatric disorders, and compared the regional and network-level alterations for these two classes of disease. In addition, we estimated neuroanatomical heterogeneity within and between the two classes.
Basal ganglia, insula, sensorimotor and temporal cortex showed greater impairment in neurological disorders, whereas cingulate, medial frontal, superior frontal and occipital cortex showed greater impairment in psychiatric disorders. The two classes of disorders affected distinct functional networks. Similarity within classes was higher than between classes; furthermore, similarity within class was higher for neurological than psychiatric disorders.
From a neuroimaging perspective, neurological and psychiatric disorders represent two distinct classes of disorders.
Variation in human cognitive ability is of consequence to a large number of health and social outcomes and is substantially heritable. Genetic linkage, genome-wide association, and copy number variant studies have investigated the contribution of genetic variation to individual differences in normal cognitive ability, but little research has considered the role of rare genetic variants. Exome sequencing studies have already met with success in discovering novel trait-gene associations for other complex traits. Here, we use exome sequencing to investigate the effects of rare variants on general cognitive ability. Unrelated Scottish individuals were selected for high scores on a general component of intelligence (g). The frequency of rare genetic variants (in n = 146) was compared with that in Scottish controls (total n = 486) who scored in the lower to middle range of the g distribution or on a proxy measure of g. Biological pathway analysis highlighted enrichment of the mitochondrial inner membrane component and apical part of cell gene ontology terms. Global burden analysis showed a greater total number of rare variants carried by high g cases versus controls, which is inconsistent with a mutation load hypothesis whereby mutations negatively affect g. The general finding of greater non-synonymous (vs. synonymous) variant effects is in line with evolutionary hypotheses for g. Although this first sequencing study of high g was small, promising results were found, suggesting that the study of rare variants in larger samples would be worthwhile.
Shorter telomere length (TL) has been found to be associated with lower birth weight and with lower cognitive ability and psychiatric disorders. However, the direction of causation of these associations and the extent to which they are genetically or environmentally mediated are unclear. Within-pair comparisons of monozygotic (MZ) and dizygotic (DZ) twins can throw light on these questions. We investigated correlations of within-pair differences in telomere length, IQ, and anxiety/depression in an initial sample from Brisbane (242 MZ pairs, 245 DZ same sex (DZSS) pairs) and in replication samples from Amsterdam (514 MZ pairs, 233 DZSS pairs) and Melbourne (19 pairs selected for extreme high or low birth weight difference). Intra-pair differences of birth weight and telomere length were significantly correlated in MZ twins, but not in DZSS twins. Greater intra-pair differences of telomere length were observed in the 10% of MZ twins with the greatest difference in birth weight compared to the bottom 90% in both samples and also in the Melbourne sample. Intra-pair differences of telomere length and IQ, but not of TL and anxiety/depression, were correlated in MZ twins, and to a smaller extent in DZSS twins. Our findings suggest that the same prenatal effects that reduce birth weight also influence telomere length in MZ twins. The association between telomere length and IQ is partly driven by the same prenatal effects that decrease birth weight.
People who have read War and Peace more than once, and enjoyed it immensely, can often scarcely remember a thing about it.
The concept of redundancy employed in this essay is the one used in mathematics and linguistics to designate symbols that do not add information to a sequence. One of the hazards of teaching twentieth-century war literature is the tacit inference of redundancy by readers, namely that the representational conventions as well as the facts and values represented are ‘predictable from … context’. The claim that twentieth-century war writing is made superfluous by War and Peace (1869) is polemical, but it is also intended to do serious work: to draw attention to representations of war which are not predictable from context, and to renew questions such as why representing war as irrational, murderous activity is inefficacious, and why we would imagine otherwise.
The designs of War and Peace as war writing can be recognised as early as 1853, when Tolstoy published a story drawing on his own military experience in the Caucasus:
War always interested me: not war in the sense of manoeuvres devised by great generals – my imagination refused to follow such immense movements, I did not understand them – but the reality of war, the actual killing. I was more interested to know in what way and under the influence of what feeling one soldier kills another than to know how the armies were arranged at Austerlitz and Borodino.
In October 1099, following the conquest of Jerusalem, First Crusade forces led by Duke Godfrey of Bouillon laid siege to the city of Arsuf, about fifteen miles north of modern Tel Aviv. According to the early-twelfth-century chronicler Albert of Aachen, the city's defenders attempted to distract Godfrey by crucifying one of Godfrey's men, Gerard of Avesnes. They placed him on the city walls within sight of the siege forces. Dying yet still able to talk, Gerard begged Godfrey to avenge his suffering and death. Godfrey told Gerard that, unfortunately, he could not avenge him; diverting men to do so would cost them the city. Furthermore, he added, ‘Certainly if you have to die, it is more useful that you alone should die than that our decree be violated and this city remain always unsafe for pilgrims. For if you die to the present life, you will have life with Christ in heaven.’ With that Gerard was left to his fate, while the crusaders continued to assault the city.
The assault failed dramatically, prompting reflection on the potential causes of God's disfavour. In particular, Godfrey's response to Gerard's request for vengeance was called into question. Arnulf of Chocques, the newly appointed patriarch of Jerusalem, roundly condemned Godfrey not only for abandoning Gerard to his fate, but especially for failing to avenge his death. Arnulf described Godfrey's actions as ‘treachery and hardheartedness … impiety … base filth of all crimes’.
The officer stared at him curiously, as though doubting the evidence of his own ears.
‘A war?’ he chuckled at last, as though the word had amused him. ‘I'm afraid you're rather simplifying the issue, aren't you? The conception of war, you know, is rather an old-fashioned one, don't you agree? There's surely not much distinction nowadays between being at war and being at peace.’
Most writing about British Cold War culture has concentrated on nuclearism, pacifism, decolonisation, socialism, postmodernism, Americanisation – in short, on everything but war. One effect of the attention paid to these various narratives has been to obscure the fact that citizens of the USSR and those of Western capitalist democracies alike understood and feared the Cold War as war, even if later accounts have tended to lose sight of what Holger Nehring has called the ‘warlike character’ of their experiences. If the Cold War is to have any explanatory force as a context for literary works beyond serving as a useful periodising shorthand, then we need to know in what sense, if any, the literature of the Cold War era understood itself as a war literature. ‘What kind of war was this?’ asks the historian Anders Stephanson. ‘The two sides never went to war with each other. There is no obvious beginning, no single moment of initial aggression, no declaration of war, no crossing of a certain line, and no open military engagement.’
W. H. Auden wrote that the Great War was ‘the decisive experience’ of Wilfred Owen's life. In the absence of such experience, Auden and his generation struggled to find grounds from which to write during the 1930s. In the present essay, I show that Auden's ‘Journal of an Airman’, which is Book II of The Orators (1932), reflects the legacy of the Great War for interwar English writers and ‘the guilt that every noncombatant feels’, as he calls it. This guilt is a manifestation of a larger cultural turn that military historians have traced back to the Enlightenment, in which non-transmittable knowledge is understood to be gained exclusively through sensory experience on the battlefield. That knowledge grants combat veterans the ‘authority of flesh-witnessing’. After discussing this cultural turn and how it affected Auden and his generation, I explore how positions of optical dominance inform the ‘Journal of an Airman’ and the text's concern with the idea of the poet as wartime orator – elements that presage Auden's World War II poetics. I conclude by considering an often-overlooked episode in Auden's life, when he was sent to assess the effects of air bombing on German morale as part of the 1945 US Strategic Bombing Survey.