The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway, plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and more recent anthropogenic climate change. Inter- and intra-specific interactions along sharp local environmental gradients shape distributions and community structure and hence ecosystem functioning. Shifts from domination by fucoids in shelter to barnacles and mussels in exposure are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or becoming restricted to estuarine refuges, driven by greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, hence affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
Longitudinal studies of first-episode psychosis (FEP) patients are critical to understanding the dynamic clinical factors influencing functional outcomes; negative symptoms and verbal memory (VM) deficits are two such factors that remain a therapeutic challenge. This study uses white-gray matter contrast at the inner edge of the cortex, in addition to cortical thickness, to probe changes in microstructure and their relation to negative symptoms, as well as possible intersections with verbal memory.
T1-weighted images and clinical data were collected longitudinally for patients (N = 88) over a two-year period. Cognitive data were also collected at baseline. Relationships between baseline VM (immediate/delayed recall) and rate of change in two negative symptom dimensions, amotivation and expressivity, were assessed at the behavioral level, as well as at the level of brain structure.
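The modelling approach described above can be illustrated with a hedged sketch: a single-region linear model in which baseline recall, the rate of symptom change, and their interaction predict the rate of cortical change. The input file and all column names are hypothetical, and this simplifies the vertex-wise, covariate-adjusted analyses such studies typically run.

```python
# Illustrative only: the input file and column names are assumptions,
# not the study's actual data or analysis code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fep_longitudinal.csv")  # hypothetical per-subject data

# Does the effect of expressivity change on the rate of cortical change
# depend on baseline verbal memory? The interaction term tests this.
model = smf.ols(
    "thickness_slope ~ recall_baseline * expressivity_slope + age + C(sex)",
    data=df,
).fit()
print(model.summary())
```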
VM, particularly immediate recall, was significantly and positively associated with a steeper rate of expressivity symptom decline (r = 0.32, q = 0.012). Significant interaction effects between baseline delayed recall and change in expressivity were uncovered in somatomotor regions bilaterally for both white-gray matter contrast and cortical thickness. Furthermore, interaction effects between immediate recall and change in expressivity on cortical thickness rates were uncovered across higher-order regions of the language processing network.
This study shows common neural correlates of language-related brain areas underlying expressivity and VM in FEP, suggesting that deficits in these domains may be linked more to speech production than to general cognitive capacity. Together, white-gray matter contrast and cortical thickness may optimally inform clinical investigations aiming to capture peri-cortical microstructural changes.
Laboratory identification of carbapenem-resistant Enterobacteriaceae (CRE) is a key step in controlling their spread. Our survey showed that most Veterans Affairs laboratories follow VA guidelines for initial CRE identification, whereas 55.0% use PCR to confirm carbapenemase production. Most respondents were knowledgeable about CRE guidelines. Barriers included staffing, training, and financial resources.
Objective: Post-stroke cognitive impairment is common, but mechanisms and risk factors are poorly understood. Frailty may be an important risk factor for cognitive impairment after stroke. We investigated the association between pre-stroke frailty and acute post-stroke cognition. Methods: We studied consecutively admitted acute stroke patients in a single urban teaching hospital during three recruitment waves between May 2016 and December 2017. Cognition was assessed using the mini-Montreal Cognitive Assessment (min=0; max=12). A Frailty Index was used to generate frailty scores for each patient (min=0; max=100). Clinical and demographic information was collected, including pre-stroke cognition, delirium, and stroke severity. We conducted univariate and multiple linear regression analyses with covariates forced in (covariates: age, sex, stroke severity, stroke type, pre-stroke cognitive impairment, delirium, previous stroke/transient ischemic attack) to investigate the association between pre-stroke frailty and post-stroke cognition. Results: Complete data were available for 154 stroke patients. Mean age was 68 years (SD=11; range=32–97); 93 (60%) were male. Median mini-Montreal Cognitive Assessment score was 8 (IQR=4–12). Mean Frailty Index score was 18 (SD=11). Pre-stroke cognitive impairment was apparent in 13/154 (8%) patients. Pre-stroke frailty was significantly associated with lower post-stroke cognition (Standardized Beta=−0.40; p<0.001), and this association was independent of covariates (Unstandardized Beta=−0.05; p=0.005). Additional significant variables in the multiple regression model were age (Unstandardized Beta=−0.05; p=0.002), delirium (Unstandardized Beta=−2.81; p<0.001), pre-stroke cognitive impairment (Unstandardized Beta=−2.28; p=0.001), and stroke severity (Unstandardized Beta=−0.20; p<0.001). Conclusions: Pre-stroke frailty may be a moderator of post-stroke cognition, independent of other well-established post-stroke cognitive impairment risk factors. (JINS, 2019, 25, 501–506)
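As a rough illustration of the forced-entry regression described above, the sketch below fits one ordinary least squares model with all covariates entered simultaneously. The dataset and variable names are hypothetical stand-ins, not the study's data.

```python
# A minimal sketch, assuming a CSV with one row per patient and
# hypothetical column names; not the study's actual analysis code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("stroke_cohort.csv")  # hypothetical dataset

# Outcome: mini-MoCA (0-12); exposure: Frailty Index (0-100).
# All covariates are forced in together rather than selected stepwise.
model = smf.ols(
    "mini_moca ~ frailty_index + age + C(sex) + stroke_severity"
    " + C(stroke_type) + C(prestroke_cog_impairment) + C(delirium)"
    " + C(prior_stroke_tia)",
    data=df,
).fit()
print(model.params["frailty_index"], model.pvalues["frailty_index"])
```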
Depression is a common post-stroke complication. Pre-stroke depression may be an important contributor; however, the epidemiology of pre-stroke depression is poorly understood. Using systematic review and meta-analysis, we described the prevalence of pre-stroke depression and its association with post-stroke depression.
We searched multiple cross-disciplinary databases from inception to July 2017 and extracted data on the prevalence of pre-stroke depression and its association with post-stroke depression. We assessed the risk of bias (RoB) using validated tools. We described summary estimates of prevalence and summary odds ratio (OR) for association with post-stroke depression, using random-effects models. We performed subgroup analysis describing the effect of depression assessment method. We used a funnel plot to describe potential publication bias. The strength of evidence presented in this review was summarised via ‘GRADE’.
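For readers unfamiliar with random-effects pooling, the sketch below implements the standard DerSimonian-Laird estimator for a set of study odds ratios. The per-study estimates in the example are invented for illustration and are not the review's data.

```python
# DerSimonian-Laird random-effects pooling of odds ratios: a generic
# sketch of the method named above, with made-up example inputs.
import numpy as np

def pool_random_effects(log_or, se):
    """Pool log odds ratios; returns pooled OR with a 95% CI."""
    log_or, se = np.asarray(log_or), np.asarray(se)
    w = 1.0 / se**2                          # inverse-variance weights
    fixed = np.sum(w * log_or) / np.sum(w)   # fixed-effect mean
    q = np.sum(w * (log_or - fixed) ** 2)    # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(log_or) - 1)) / c)  # between-study variance
    w_re = 1.0 / (se**2 + tau2)              # random-effects weights
    pooled = np.sum(w_re * log_or) / np.sum(w_re)
    se_p = np.sqrt(1.0 / np.sum(w_re))
    return tuple(np.exp([pooled, pooled - 1.96 * se_p, pooled + 1.96 * se_p]))

# Hypothetical per-study odds ratios and standard errors of log(OR)
log_ors = np.log([2.5, 3.4, 2.9, 4.1])
ses = [0.30, 0.25, 0.40, 0.35]
print(pool_random_effects(log_ors, ses))  # (pooled OR, lower CI, upper CI)
```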
Of 11 884 studies identified, 29 were included (total participants n = 164 993). Pooled prevalence of pre-stroke depression was 11.6% [95% confidence interval (CI) 9.2–14.7]; range: 0.4–24% (I² = 95.8%). Prevalence of pre-stroke depression varied by assessment method (p = 0.02), with clinical interview suggesting greater pre-stroke depression prevalence (~17%) than case-note review (9%) or self-report (11%). Pre-stroke depression was associated with increased odds of post-stroke depression; summary OR 3.0 (95% CI 2.3–4.0). All studies were judged to be at risk of bias (RoB): 59% of included studies had an uncertain RoB in stroke assessment; 83% had high or uncertain RoB for pre-stroke depression assessment. A funnel plot indicated no risk of publication bias. The strength of evidence based on GRADE was ‘very low’.
One in six stroke patients has experienced depression before their stroke. Reported rates may be routinely underestimated due to limitations around assessment. Pre-stroke depression significantly increases the odds of post-stroke depression.
We evaluated rates of clinically confirmed long-term-care facility-onset Clostridium difficile infections from April 2014 through December 2016 in 132 Veterans Affairs facilities after the implementation of a prevention initiative. The quarterly pooled rate decreased 36.1% from the baseline (P<.0009 for trend) by the end of the analysis period.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify at-risk patients not already known to them, and whether there were sufficient community-based services to respond to the care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its uptake in the short term, with a focus on the highest-risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
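As a hedged illustration of how such stepped-wedge data might be analysed, the sketch below fits a Poisson model for admission counts with an offset for time at risk, calendar-period effects, and clustering by practice. Column names and the model form are assumptions; the trial's actual pre-specified analysis may differ.

```python
# Sketch only: hypothetical column names, not the trial's pre-specified
# model. Counts are modelled with a log time-at-risk offset so effects
# are rates per participant-year at risk.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("prism_linked_outcomes.csv")  # hypothetical linked data
df["log_time_at_risk"] = np.log(df["years_at_risk"])

model = smf.gee(
    "emergency_admissions ~ intervention_phase + C(period) + C(risk_group)",
    groups="practice_cluster",      # account for clustering by practice
    data=df,
    family=sm.families.Poisson(),
    offset="log_time_at_risk",
).fit()
print(model.summary())
```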
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, δ = 0.011 (95% confidence interval, CI, 0.010–0.013). Patients in the intervention phase spent more days in hospital per year: adjusted δ = 0.029 (95% CI 0.026–0.031). Both effects were consistent across risk groups.
Primary care activity also increased in the intervention phase overall (δ = 0.011; 95% CI 0.007–0.014), except in the two highest risk groups, which showed a decrease in the number of days with recorded activity.
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pounds sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
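A hedged sketch of this costing step is shown below: attach a standard unit cost to each resource-use count, sum per patient, and relate the phase difference in cost to the phase difference in emergency admissions. The unit-cost values, column names, and 0/1 phase coding are illustrative assumptions, and the crude comparison stands in for the trial's adjusted models.

```python
# Illustrative costing sketch; tariffs, columns, and phase coding
# (0 = control, 1 = intervention) are assumptions, not the study's values.
import pandas as pd

UNIT_COSTS = {                  # GBP per event or bed-day; made-up values
    "ed_attendances": 120.0,
    "outpatient_visits": 110.0,
    "emergency_bed_days": 350.0,
    "elective_bed_days": 300.0,
    "gp_contacts": 35.0,
}

df = pd.read_csv("prism_resource_use.csv")  # hypothetical linked data
df["total_cost"] = sum(df[item] * cost for item, cost in UNIT_COSTS.items())

# Crude phase comparison; the trial itself used adjusted models.
means = df.groupby("intervention_phase")[["total_cost", "emergency_admissions"]].mean()
delta_cost = means.loc[1, "total_cost"] - means.loc[0, "total_cost"]
delta_adm = means.loc[1, "emergency_admissions"] - means.loc[0, "emergency_admissions"]
print("Cost per additional admission (GBP):", delta_cost / delta_adm)
```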
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted δ = GBP76; 95% CI GBP46 to GBP106), an effect that was consistent and generally increased with risk level.
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
The Randolph Glacier Inventory (RGI) is a globally complete collection of digital outlines of glaciers, excluding the ice sheets, developed to meet the needs of the Fifth Assessment of the Intergovernmental Panel on Climate Change for estimates of past and future mass balance. The RGI was created with limited resources in a short period. Priority was given to completeness of coverage, but a limited, uniform set of attributes is attached to each of the ~198 000 glaciers in its latest version, 3.2. Satellite imagery from 1999–2010 provided most of the outlines. Their total extent is estimated as 726 800 ± 34 000 km2. The uncertainty, about ±5%, is derived from careful single-glacier and basin-scale uncertainty estimates and comparisons with inventories that were not sources for the RGI. The main contributors to uncertainty are probably misinterpretation of seasonal snow cover and debris cover. These errors appear not to be normally distributed, and quantifying them reliably is an unsolved problem. Combined with digital elevation models, the RGI glacier outlines yield hypsometries that can be combined with atmospheric data or model outputs for analysis of the impacts of climatic change on glaciers. The RGI has already proved its value in the generation of significantly improved aggregate estimates of glacier mass changes and total volume, and thus actual and potential contributions to sea-level rise.
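As a small illustration of the hypsometry step mentioned above, the sketch below bins DEM elevations inside a glacier outline into fixed elevation bands. The array inputs, cell area, and 50 m band width are assumptions for illustration, not RGI tooling.

```python
# Minimal hypsometry sketch: area per elevation band for one glacier,
# given a DEM and a boolean outline mask. Inputs are hypothetical.
import numpy as np

def hypsometry(dem, glacier_mask, cell_area_km2, band_m=50):
    """Return (band lower edges in m, glacier area per band in km^2)."""
    elev = dem[glacier_mask]                    # elevations inside outline
    lo = np.floor(elev.min() / band_m) * band_m
    bands = np.arange(lo, elev.max() + band_m, band_m)
    counts, edges = np.histogram(elev, bins=bands)
    return edges[:-1], counts * cell_area_km2

# e.g., for a 30 m DEM each cell covers 0.0009 km^2:
# edges, areas = hypsometry(dem, mask, cell_area_km2=0.0009)
```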
Calving difficulty (CD) is a key functional trait with significant influence on herd profitability and animal welfare. Breeding plays an important role in managing CD at both farm and industry level. An alternative to the economic value approach to determining the CD penalty is to complement the economic models with analysis of farmer-perceived on-farm impacts of CD. The aim of this study was to explore dairy and beef farmer views and perceptions of the economic and non-economic on-farm consequences of CD, to ultimately inform future genetic selection tools for the beef and dairy industries in Ireland. A standardised quantitative online survey was released to all farmers with e-mail addresses on the Irish Cattle Breeding Federation database. In total, 271 farmers completed the survey (173 beef farmers and 98 dairy farmers). Both dairy and beef farmers considered CD a very important issue with economic and non-economic components. However, CD was seen as more problematic by dairy farmers, who mostly preferred to slightly reduce its incidence, than by beef farmers, who tended to support increases in calf value even if this implied a slight increase in CD incidence. Farm size was related to dairy farmer views of CD, with farmers from larger farms considering CD more problematic than farmers from smaller farms. The CD breeding value was reported to be critical when selecting beef sires to mate with either beef or dairy cows, whereas when selecting dairy sires, CD had lower importance than breeding values for other traits. There was considerable variability in the importance farmers gave to CD breeding values that could not be explained by farm type or the type of sire used, which might be related to farmers' non-economic motives. The economic value farmers attach to incremental increases in CD rises substantially as the CD level considered increases. This non-linear relationship cannot be reflected in a standard linear index weighting. The results of this paper provide key underpinning support for the development of non-linear index weightings for CD in Irish national indexes.
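To make the closing point concrete, the sketch below contrasts a standard linear index weighting with a convex (non-linear) one, in which each additional point of CD incidence attracts a larger marginal penalty. The functional form and coefficients are illustrative assumptions, not the weightings derived in the paper.

```python
# Illustrative only: a convex penalty grows faster than a linear one,
# mirroring farmers' perceived costs of high CD levels. Coefficients
# here are made up, not the paper's estimates.

def linear_penalty(cd_pct: float, w: float = 2.0) -> float:
    """Standard linear weighting: penalty proportional to CD incidence."""
    return w * cd_pct

def nonlinear_penalty(cd_pct: float, w: float = 2.0, k: float = 0.15) -> float:
    """Convex weighting: marginal penalty increases with CD incidence."""
    return w * cd_pct + k * cd_pct**2

for cd in (2, 5, 10, 15):
    print(f"CD {cd:>2}%: linear {linear_penalty(cd):6.1f}, "
          f"non-linear {nonlinear_penalty(cd):6.1f}")
```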
Aletta Bonn, Friedrich Schiller University Jena, German Centre for Integrative Biodiversity Research (iDiv)
Tim Allott, University of Manchester
Martin Evans, University of Manchester
Hans Joosten, Ernst Moritz Arndt University of Greifswald
Rob Stoneman, Yorkshire Wildlife Trust, UK
In September 1997, the airports of Singapore and Kuala Lumpur shut down for several days. Fires from drained peatlands in Indonesia, over 1000 km away, were emitting vast clouds of smoke that caused haze and poor visibility across large parts of South East Asia in the extremely dry El Niño year. Schools and businesses had to close, and people were admitted to hospitals with acute breathing problems. The amount of CO2 emitted from these fires was equivalent to 13–40% of annual global emissions from fossil fuels (Page et al. 2002). Economic losses due to the 1997–1998 wildfires exceeded several billion US dollars (ADB 1999).
In the hot August of 2010, people in Moscow were advised to stay at home, keep their windows closed and wear gauze masks to avoid inhaling ash particles when walking on the streets. Again the cause was fires, this time raging across nearly 2000 km2 of degraded peatlands in Russia. Carbon monoxide levels in the capital reached six times the maximum acceptable levels and death rates doubled due to heat and smog (Barriopedro et al. 2011).
These fires, made possible by peatland drainage and degradation, dramatically highlight the huge liability that peatlands pose once degraded, especially in a changing climate. In sharp contrast, there is now wide recognition of the importance to human well-being of the ecosystem services delivered by the peatland environment, not least the wildlife that underpins those services. While peatlands cover less than 3% of the world’s surface, they hold twice as much carbon as the entire global forest biomass pool and represent more than 30% of the total global soil carbon store (see Chapter 4). As long-term carbon sinks, they provide crucial global climate-regulating services. If they are not safeguarded, however, the release of this carbon could further exacerbate climate change.
The range of peatland ecosystem services extends far beyond their role in the carbon cycle. Other pivotal peatland ecosystem services include, for example, the provision of high-quality drinking water from peatland catchments. Peatlands also play a role in flood-water regulation, especially in lowland or coastal settings.