This study aimed to examine the predictors of cognitive performance in patients with pediatric mild traumatic brain injury (pmTBI) and to determine whether group differences in cognitive performance on a computerized test battery could be observed between pmTBI patients and healthy controls (HC) in the sub-acute (SA) and the early chronic (EC) phases of injury.
203 pmTBI patients recruited from emergency settings and 159 age- and sex-matched HC aged 8–18 rated their ongoing post-concussive symptoms (PCS) on the Post-Concussion Symptom Inventory and completed the Cogstate brief battery in the SA (1–11 days) phase of injury. A subset (156 pmTBI patients; 144 HC) completed testing in the EC (∼4 months) phase.
Within the SA phase, a group difference was observed only for the visual learning task (One-Card Learning), with pmTBI patients being less accurate relative to HC. Follow-up analyses indicated that higher ongoing PCS and higher 5P clinical risk scores were significant predictors of lower One-Card Learning accuracy within the SA phase, whereas premorbid variables (estimates of intellectual functioning, parental education, and presence of learning disabilities or attention-deficit/hyperactivity disorder) were not.
The absence of group differences in the EC phase supports cognitive recovery by 4 months post-injury. While the severity of ongoing PCS and the 5P score were better overall predictors of cognitive performance on the Cogstate in the SA phase than premorbid variables, the full regression model explained only 4.1% of the variance, highlighting the need for future work on predictors of cognitive outcomes.
American black bears (Ursus americanus) are by far the most abundant species of bear, numbering more than twice that of all other bear species combined. Many US states share a history of black bear decline and resurgence, and today have burgeoning bear populations. To a large extent, the comeback of this species has been a consequence of restrictions on killing, and a fundamental change in how the public perceives and reacts to black bears. However, the success of this species is also due to its biological adaptiveness – its ability to live in a vast array of habitats, to adapt to radically variable food conditions, and to tolerate the presence of people and the changes they have imposed on the landscape. This chapter highlights the adaptability of the black bear using an extensive and diverse data set spanning 38 years. We explore reasons for their commonness, using a long-term case study from near the geographic center of this species’ range: Minnesota, USA.
Animal-derived dietary protein ingestion and physical activity stimulate myofibrillar protein synthesis rates in older adults. We determined whether a non-animal-derived diet can support daily myofibrillar protein synthesis rates to the same extent as an omnivorous diet. Nineteen healthy older adults (age 66±1 y; BMI 24±1 kg·m⁻²; m=12, f=7) participated in a randomised, parallel-group, controlled trial during which they consumed a 3-day isocaloric high-protein (1.8 g·kg body mass⁻¹·d⁻¹) diet, with the protein provided from predominantly (71%) animal (OMNI; n=9; m=6, f=3) or exclusively vegan (VEG; n=10; m=6, f=4; mycoprotein providing 57% of daily protein intake) sources. During the dietary control period participants conducted a daily bout of unilateral resistance-type leg extension exercise. Prior to the dietary control period participants ingested 400 mL of deuterated water, with 50 mL doses consumed daily thereafter. Saliva samples were collected throughout to determine body water deuterium (²H) enrichments, and muscle samples were collected from rested and exercised muscle to determine daily myofibrillar protein synthesis rates. Deuterated water dosing resulted in body water ²H enrichments of ~0.78±0.03%. Daily myofibrillar protein synthesis rates were 13±8% (P=0.169) and 12±4% (P=0.016) greater in the exercised compared with the rested leg (1.59±0.12 vs 1.77±0.12 %·d⁻¹ and 1.76±0.14 vs 1.93±0.12 %·d⁻¹) in the OMNI and VEG groups, respectively. Daily myofibrillar protein synthesis rates did not differ between OMNI and VEG in either rested or exercised muscle (P>0.05). Over the course of a three-day intervention, omnivorous or vegan dietary protein sources can support equivalent rested and exercised daily myofibrillar protein synthesis rates in healthy older adults consuming a high-protein diet.
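With the deuterated water method, daily fractional synthesis rates (FSR) are typically derived from the enrichment of protein-bound alanine relative to mean body water enrichment via the precursor–product principle. A minimal sketch of that calculation, using the ~0.78% body water enrichment reported above and an illustrative, hypothetical protein-bound alanine enrichment (the study's exact computation may differ):

```python
def myofibrillar_fsr(e_protein_ala, e_body_water, days, exchange_factor=3.7):
    """Fractional synthesis rate (%/day) via the precursor-product method.

    e_protein_ala   -- 2H enrichment of protein-bound alanine (%)
    e_body_water    -- mean body water 2H enrichment (%)
    days            -- tracer incorporation period (days)
    exchange_factor -- mean number of 2H-labelled positions in alanine (~3.7)
    """
    return e_protein_ala / (e_body_water * exchange_factor * days) * 100

# Illustrative values: 0.78% body water enrichment (as reported), a 3-day
# period, and a hypothetical protein-bound alanine enrichment of 0.1385%.
print(round(myofibrillar_fsr(0.1385, 0.78, 3), 2))  # 1.6 (%/day)
```

With these inputs the sketch returns ~1.6 %·d⁻¹, in the range of the rates reported above; the alanine enrichment value here is chosen purely for illustration.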
Can multicellular life be distinguished from unicellular life on an exoplanet? We hypothesize that abundant upright photosynthetic multicellular life (trees) will cast shadows at high sun angles that distinguish it from unicellular life, and we test this using Earth as an exoplanet. We first test the concept using unmanned aerial vehicles at a replica moon-landing site near Flagstaff, Arizona, and show that trees have both a distinctive reflectance signature (red edge) and a geometric signature (shadows at high sun angles) that can distinguish them from replica moon craters. Next, we calculate reflectance signatures for Earth at several phase angles with POLDER (Polarization and Directionality of Earth's reflectance) satellite directional reflectance measurements and then reduce Earth to a single pixel. We compare Earth to other planetary bodies (Mars, the Moon, Venus and Uranus) and hypothesize that Earth's directional reflectance will fall between that of strongly backscattering rocky bodies with no weathering (like Mars and the Moon) and that of cloudy bodies with more isotropic scattering (like Venus and Uranus). Our modelling results put Earth in line with strongly backscattering Mars, while our empirical results put Earth in line with the more isotropically scattering Venus. We identify potential weaknesses in both the modelled and empirical results and suggest additional steps to determine whether this technique could distinguish upright multicellular life on exoplanets.
Advanced imaging techniques are enhancing research capacity focussed on the developmental origins of adult health and disease (DOHaD) hypothesis, and consequently increasing awareness of future health risks across various subareas of DOHaD research. Understanding how these advanced imaging techniques in animal models and human population studies can be used, both additively and synergistically, alongside traditional techniques in DOHaD-focussed laboratories is therefore of great interest. Global experts in advanced imaging techniques congregated at the advanced imaging workshop at the 2019 DOHaD World Congress in Melbourne, Australia. This review summarizes the presentations of new imaging modalities and their novel applications to DOHaD research, as well as discussions among DOHaD researchers currently utilizing advanced imaging techniques, including MRI, hyperpolarized MRI, ultrasound, and synchrotron-based techniques, to aid their DOHaD research focus.
Patients with psychiatric illness present a unique challenge to clinicians: in contrast to the traditional medical model, in which patients are conceptualised as being stricken by a disease, patients with certain psychiatric illnesses may seem complicit in the illness. Questions of free will, choice and the role of the physician can cause clinicians to feel helpless, uninterested or even resentful. These tensions are a lasting legacy of centuries of mind–body dualism. Over the past several decades, modern tools have finally allowed us to break down this false dichotomy. Integrating a modern neuroscience perspective into practice allows clinicians to conceptualise individuals with psychiatric illness in a way that promotes empathy and enhances patient care. Specifically, a strong grasp of neuroscience prevents clinicians from falling into the trap in which behavioural aspects of a patient's presentation are perceived as being separate from the disease process. We demonstrate the value of incorporating neuroscience into a biopsychosocial formulation through the example of a ‘difficult patient’.
There is increasing evidence that domestic violence (DV) is an important risk factor for suicidal behaviour. The level of risk and its contribution to the overall burden of suicidal behaviour among men and women have not been quantified in South Asia. We carried out a large case-control study to examine the association between DV and self-poisoning in Sri Lanka.
Cases (N = 291) were patients aged ⩾18 years admitted to a tertiary hospital in Kandy, Sri Lanka, for self-poisoning. Sex- and age-frequency-matched controls were recruited from the hospital's outpatient department (N = 490) and the local population (N = 450). Exposure to DV was assessed using the Humiliation, Afraid, Rape, Kick questionnaire. Multivariable logistic regression models were used to estimate the association between DV and self-poisoning, and population attributable fractions (PAFs) were calculated.
Exposure to at least one type of DV within the previous 12 months was strongly associated with self-poisoning for women [adjusted OR (AOR) 4.08, 95% CI 1.60–4.78] and men (AOR 2.52, 95% CI 1.51–4.21), compared to those reporting no abuse. Among women, the association was strongest for physical violence (AOR 14.07, 95% CI 5.87–33.72), whereas among men, emotional abuse showed the highest risk (AOR 2.75, 95% CI 1.57–4.82). PAF% for exposure to at least one type of DV was 38% (95% CI 32–43) in women and 22% (95% CI 14–29) in men.
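Population attributable fractions of the kind reported above are commonly estimated from the odds ratio and the prevalence of exposure among cases (Miettinen's case-based formula, with the OR approximating the relative risk). A minimal sketch, using the women's AOR reported above and an illustrative, hypothetical 50% exposure prevalence among cases (not a figure taken from the study):

```python
def paf_from_or(p_exposed_cases, odds_ratio):
    """Miettinen's case-based PAF: the prevalence of exposure among cases
    times the excess fraction (OR - 1) / OR, using the OR to approximate RR."""
    return p_exposed_cases * (odds_ratio - 1) / odds_ratio

# Hypothetical 50% exposure prevalence among female cases; AOR 4.08 as reported.
print(round(paf_from_or(0.50, 4.08), 2))  # 0.38, i.e. ~38%
```

Under these illustrative inputs the sketch reproduces a PAF of roughly 38%; the actual study estimate would rest on the observed exposure prevalence and the adjusted model.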
Multi-sectoral interventions to address DV including enhanced identification in health care settings, community-based strategies, and integration of DV support and psychological services may substantially reduce suicidal behaviour in Sri Lanka.
Cognitive behavior therapy (CBT) is effective for most patients with social anxiety disorder (SAD), but a substantial proportion fail to remit. Experimental and clinical research suggests that enhancing CBT with imagery-based techniques could improve outcomes. It was hypothesized that imagery-enhanced CBT (IE-CBT) would be superior to verbally based CBT (VB-CBT) on pre-registered outcomes.
A randomized controlled trial of IE-CBT v. VB-CBT for social anxiety was completed in a community mental health clinic setting. Participants were randomized to IE (n = 53) or VB (n = 54) CBT, with 1-month (primary end point) and 6-month follow-up assessments. Participants completed 12 weekly 2-hour sessions of IE-CBT or VB-CBT, plus a 1-month follow-up session.
Intention-to-treat analyses showed very large within-treatment effect sizes on social interaction anxiety at all time points (ds = 2.09–2.62), with no between-treatment differences on this outcome or on clinician-rated severity [1-month OR = 1.45 (0.45, 4.62), p = 0.53; 6-month OR = 1.31 (0.42, 4.08), p = 0.65], SAD remission (1-month: IE = 61.04%, VB = 55.09%, p = 0.59; 6-month: IE = 58.73%, VB = 61.89%, p = 0.77), or secondary outcomes. Three adverse events were noted (substance abuse, n = 1 in IE-CBT; temporary increase in suicide risk, n = 1 in each condition, with one participant withdrawn at 1-month follow-up).
Group IE-CBT and VB-CBT were safe and there were no significant differences in outcomes. Both treatments were associated with very large within-group effect sizes and the majority of patients remitted following treatment.
This article draws upon six social research studies completed by members of the Dementia and Ageing Research Team at The University of Manchester and their associated networks over an eight-year period (2011–2019). Its aim is to construct a definition of ‘being in the moment’ and situate it within a continuum of moments that could be used to contextualise and frame the lived experience of dementia. Using the approach formulated by Pound et al. (2005) for synthesising qualitative studies, we identified this continuum of moments as comprising four sequential and interlinked steps: (a) ‘creating the moment’, defined as the processes and procedures necessary to enable being in the moment to take place – the time necessary for this to occur can range from fleeting to prolonged; (b) ‘being in the moment’, which refers to the multi-sensory processes involved in a personal or relational interaction and embodied engagement – being in the moment can be sustained through creativity and flow; (c) ‘ending the moment’, defined as the point at which a specific moment is disengaged from – this can be triggered by the person(s) involved, consciously or subconsciously, or caused by a distraction in the environment or suchlike; and (d) ‘reliving the moment’, which refers to the opportunity for the experience(s) involved in ‘being in the moment’ to be later remembered and shared, however fragmentary, supported or full the recall.
This chapter reviews the spread of irrigation technology across the Sahara in antiquity, and its effects on settlement, agriculture and the movement of people. Recent work has stressed the close connections between the introduction of foggara technology and the rise of Garamantian civilisation, which featured intensive agriculture and incipient urbanism. However, many oases achieved substantial size through the use of well technologies, artesian springs or a combination of technologies. Another key question relates to the effects of the eventual decline and failure of these irrigation systems in terms of population movement and the fragmentation of states such as the Garamantes. After presenting new AMS dating evidence for Garamantian foggaras, the chapter examines the wider picture of foggara distribution within a survey of the evidence for irrigation technologies across the Sahara, asking whether, and to what extent, the distribution of foggaras beyond the core Garamantian heartlands might be seen as an indication of Garamantian control or influence. It explores what foggaras, wells and new crop introductions might suggest about agricultural intensification and organisation in the ancient Sahara. Finally, it considers the causes and possible effects of irrigation failure and, in some cases, collapse.
The collapse of a gas or vapour bubble near a solid boundary produces a jet directed towards the boundary. The high surface pressure and shear stress induced by this jet can damage, or clean, the surface. More complex geometries result in changes in collapse behaviour, in particular the direction of the jet, yet the majority of prior research has focused on simple flat boundaries or cases with limited complexity, and little is currently known about how complex geometries affect bubble collapse. We numerically and experimentally investigate how a slot in a flat boundary affects the jet direction for a single bubble. We use a boundary element model to predict how the jet direction depends on key geometric parameters and show that the results collapse to a single curve when the parameters are normalised appropriately. We then experimentally validate the predictions using laser-induced cavitation and compare the experimental results to the predicted dependencies. This research reveals a tendency for the jet to be directed away from a slot and shows that the jet direction is independent of slot height for slots of sufficient height.
Overreliance on herbicides for weed control has led to the evolution of herbicide-resistant Palmer amaranth populations. Farm managers should consider the long-term consequences of their short-term management decisions, especially with regard to the soil weed seedbank. The objectives of this research were to (1) determine how soybean population and POST herbicide application timing affect in-season Palmer amaranth control and soybean yield, and (2) determine how those variables influence Palmer amaranth densities and cotton yield the following season. Soybeans were planted (19-cm row spacing) at a low-, medium-, and high-density population (268,000, 546,000, and 778,000 plants ha⁻¹, respectively). Fomesafen and clethodim (280 and 210 g ai ha⁻¹, respectively) were applied at the VE, V1, or V2 to V3 soybean growth stage. Nontreated plots were also included to assess the effect of soybean population alone. The following season, cotton was planted into these plots to assess the effects of soybean planting population on Palmer amaranth densities in the subsequent crop. When an herbicide application occurred at the V1 or V2 to V3 soybean stage, weed control in the high-density soybean population increased 17% to 23% compared to the low-density population. Economic return was not influenced by soybean population and was increased 72% to 94% with herbicide application compared to no treatment. In the subsequent cotton crop, Palmer amaranth densities were 24% to 39% lower 3 wk after planting following soybean sprayed with herbicides compared to soybean without herbicides. Additionally, Palmer amaranth densities in cotton were 19% lower when soybean was treated at the VE stage compared to later stages. Thus, increasing soybean population can improve Palmer amaranth control without adversely affecting economic returns and can reduce future weed densities. Reducing the weed seedbank and the selection pressure from herbicides is critical to mitigating resistance evolution.
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in the AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). Results suggest that common variants of large effect do not influence septic shock susceptibility, mortality and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
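A polygenic risk score of the kind used above is, at its core, a weighted sum of risk-allele dosages: each variant's dosage (0, 1 or 2 copies) is multiplied by its effect size from an external GWAS and the products are summed per individual. A minimal sketch with hypothetical variants and weights (not the actual score or weights used in the study):

```python
def polygenic_risk_score(dosages, weights):
    """Weighted sum of risk-allele dosages (0, 1, or 2 copies per variant),
    using per-variant effect sizes (e.g. GWAS betas) as weights."""
    if len(dosages) != len(weights):
        raise ValueError("one dosage per variant required")
    return sum(d * w for d, w in zip(dosages, weights))

# Hypothetical effect sizes for three variants and one individual's dosages.
betas = [0.21, -0.08, 0.05]
dosages = [2, 1, 0]
print(round(polygenic_risk_score(dosages, betas), 2))  # 0.34
```

In practice such raw scores are typically standardised across the cohort before being tested for association with outcomes such as 28-day mortality.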