We present the first general theory of glacier surging that includes both temperate and polythermal glacier surges, based on coupled mass and enthalpy budgets. Enthalpy (in the form of thermal energy and water) is gained at the glacier bed from geothermal heating plus frictional heating (expenditure of potential energy) as a consequence of ice flow. Enthalpy losses occur by conduction and loss of meltwater from the system. Because enthalpy directly impacts flow speeds, mass and enthalpy budgets must simultaneously balance if a glacier is to maintain a steady flow. If not, glaciers undergo out-of-phase mass and enthalpy cycles, manifest as quiescent and surge phases. We illustrate the theory using a lumped element model, which parameterizes key thermodynamic and hydrological processes, including surface-to-bed drainage and distributed and channelized drainage systems. Model output exhibits many of the observed characteristics of polythermal and temperate glacier surges, including the association of surging behaviour with particular combinations of climate (precipitation, temperature), geometry (length, slope) and bed properties (hydraulic conductivity). Enthalpy balance theory explains a broad spectrum of observed surging behaviour in a single framework, and offers an answer to the wider question of why the majority of glaciers do not surge.
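The coupled-budget idea can be sketched numerically. The following is a minimal, illustrative toy and not the paper's lumped element model: all parameter values and functional forms are assumptions chosen for demonstration. Ice thickness grows by accumulation and drains by ice flux, while basal enthalpy is driven by geothermal plus frictional heating and drained by meltwater loss; sliding speed increases sharply once enthalpy is high.

```python
import numpy as np

# Toy coupled mass/enthalpy budget (illustrative only; all parameters and
# functional forms are assumptions, not the paper's lumped element model).
def simulate(years=2000.0, dt=0.1):
    n = round(years / dt)
    h = np.empty(n)  # ice thickness [m]
    e = np.empty(n)  # basal enthalpy (dimensionless proxy for heat + water)
    h[0], e[0] = 300.0, 0.0
    for i in range(n - 1):
        u_def = 0.05 * h[i]                                   # slow deformation flow [m/yr]
        u_sli = 200.0 / (1.0 + np.exp(-5.0 * (e[i] - 2.0)))   # fast sliding at high enthalpy
        u = u_def + u_sli
        friction = 1e-4 * u * h[i]                            # frictional heating ~ speed * traction
        h[i + 1] = h[i] + dt * (0.5 - u * h[i] / 1e4)         # accumulation - dynamic outflux
        e[i + 1] = e[i] + dt * (0.1 + friction - 0.2 * e[i])  # geothermal + friction - drainage
    return h, e

h, e = simulate()
```

Whether the two budgets settle into a steady state or into out-of-phase mass and enthalpy cycles (quiescent thickening, surge-like thinning) depends on the parameter combination, mirroring the paper's point that surging occupies a particular region of climate, geometry and bed-property space.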
Calling in staff and preparing the operating room for an urgent surgical procedure is a significant draw on hospital resources and disrupts the care of other patients. It has been common practice to treat open fractures on an urgent basis. Health technology assessment (HTA) methods can be applied to examine this prioritization of care, just as they are applied to the acquisition of drugs and devices.
Our center completed a rapid systematic review of guidelines, systematic reviews, and primary clinical evidence on urgent surgical debridement and stabilization of open fractures of long bones (“urgent” being defined as within six hours of the injury) compared with surgical debridement and reduction performed at a later time point. Meta-analyses were performed for infection and non-union outcomes, and the GRADE system was used to assess the strength of evidence for each conclusion.
We found no published clinical guidelines on the urgency of treating open fractures. A high-quality systematic review on the topic was published in 2012. We found six cohort studies published since completion of that review. The summary odds ratio for any infection with later treatment was 0.97 (95% confidence interval (CI) 0.78–1.22; sixteen studies, 3,615 patients) and for deep or “major” infections was 1.00 (95% CI 0.74–1.34; nine studies, 2,013 patients). The summary odds ratio for non-union with later treatment was 0.95 (95% CI 0.65–1.41; six studies, 1,308 patients). There was no significant heterogeneity in any of the results (I² = 0%) and no apparent trend in the results as a function of study size or publication date. We graded the strength of each conclusion as very low because it was based on cohort studies in which the treating physician could elect immediate treatment for patients with severe soft-tissue injuries or patients at risk of complications, raising the risk of spectrum bias.
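Summary odds ratios of this kind are typically produced by inverse-variance pooling on the log-OR scale. A minimal fixed-effect sketch is shown below; the study values are made up for illustration and are not the review's data.

```python
import math

# Fixed-effect inverse-variance pooling of odds ratios on the log scale.
# Each study is (OR, CI_low, CI_high); the values below are hypothetical.
studies = [(0.90, 0.60, 1.35), (1.10, 0.70, 1.73), (0.95, 0.55, 1.64)]

def pool(studies, z=1.96):
    weights, terms = [], []
    for or_, lo_i, hi_i in studies:
        se = (math.log(hi_i) - math.log(lo_i)) / (2 * z)  # back out SE from the CI
        w = 1.0 / se**2
        weights.append(w)
        terms.append(w * math.log(or_))
    log_pooled = sum(terms) / sum(weights)
    se_pooled = 1.0 / math.sqrt(sum(weights))
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se_pooled),
            math.exp(log_pooled + z * se_pooled))

or_hat, lo, hi = pool(studies)
```

Because the weights are inverse variances, the pooled confidence interval is narrower than that of any single study, which is why a pooled estimate near 1.0 with a tight interval can be informative even when individual studies are inconclusive.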
Default urgent scheduling of patients with open fractures for surgical debridement and stabilization does not appear to reduce the risk of infection or fracture non-union. Based on this information, our surgery department managers no longer schedule patients with open fractures for immediate surgery unless there are specific circumstances necessitating it.
Alteplase is an effective treatment for patients with ischaemic stroke, and it is widely available at primary stroke centres. The effectiveness of alteplase is highly time-dependent. Large tertiary centres have reported significant improvements in their door-to-needle (DTN) times; however, similar improvements have not been reported at community hospitals.
Red Deer Regional Hospital Centre (RDRHC) is a 370-bed community hospital serving approximately 150,000 people in its acute stroke catchment area. The RDRHC participated in a provincial DTN improvement initiative and implemented a streamlined algorithm for the treatment of stroke patients. During this intervention period, the following changes were implemented: early alert of an incoming acute stroke patient to the neurologist and care team, meeting the patient immediately upon arrival, parallel work processes, transporting the patient to the CT scanner on the Emergency Medical Services stretcher, and administering alteplase in the imaging area. Door-to-needle data were collected from July 2007 to December 2017.
A total of 289 patients were treated from July 2007 to December 2017. In the pre-intervention period, 165 patients received alteplase and the median DTN time was 77 minutes [interquartile range (IQR): 60–103 minutes]; in the post-intervention period, 104 patients received alteplase and the median DTN time was 30 minutes (IQR: 22–42 minutes) (p < 0.001). The annual number of patients receiving alteplase increased from 9–29 per year in the pre-intervention period to 41–63 per year in the post-intervention period.
Community hospitals staffed with community neurologists can achieve median DTN times of 30 minutes or less.
Estimates of the incubation period for Q fever vary substantially between different reviews and expert advice documents. We systematically reviewed and quality appraised the literature to provide an evidence-based estimate of the incubation period of Q fever acquired by the aerosolised infection route. Medline (OvidSP) and EMBASE were searched, with the search limited to human studies published in English. Eligible studies included persons with symptomatic, acute Q fever and a defined exposure to Coxiella burnetii. After review of 7,115 titles and abstracts, 320 records were screened at full-text level. Of these, 23 studies contained potentially useful data and were quality assessed; eight studies (providing 403 individual cases for which the incubation period could be derived) were of sufficient quality and provided the individual-level data needed to produce a pooled summary. We found a median incubation period of 18 days, with 95% of cases expected to occur between 7 and 32 days after exposure.
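Pooled incubation-period summaries of this kind are often obtained by fitting a lognormal distribution to individual exposure-to-onset delays. A minimal sketch follows; the intervals below are made up for illustration, not the review's data.

```python
import math

# Hypothetical exposure-to-onset intervals in days (illustrative only).
days = [12, 14, 15, 16, 18, 18, 19, 21, 23, 26, 29]

# Fit a lognormal by the mean and SD of the log-delays.
logs = [math.log(d) for d in days]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / (len(logs) - 1))

median = math.exp(mu)              # lognormal median incubation period
low = math.exp(mu - 1.96 * sigma)  # ~2.5th percentile of cases
high = math.exp(mu + 1.96 * sigma) # ~97.5th percentile of cases
```

The percentile bounds answer the operational question an incubation-period review supports: how long after an exposure event new cases should still be expected.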
To determine whether antimicrobial-impregnated textiles decrease the acquisition of pathogens by healthcare provider (HCP) clothing.
We completed a 3-arm randomized controlled trial to test the efficacy of 2 types of antimicrobial-impregnated clothing compared to standard HCP clothing. Cultures were obtained from each nurse participant, the healthcare environment, and patients during each shift. The primary outcome was the change in total contamination on nurse scrubs, measured as the sum of colony-forming units (CFU) of bacteria.
Nurses working in medical and surgical ICUs in a 936-bed tertiary-care hospital.
Nurse subjects wore standard cotton-polyester surgical scrubs (control), scrubs that contained a complex element compound with a silver-alloy embedded in its fibers (Scrub 1), or scrubs impregnated with an organosilane-based quaternary ammonium and a hydrophobic fluoroacrylate copolymer emulsion (Scrub 2). Nurse participants were blinded to scrub type and randomly participated in all 3 arms during 3 consecutive 12-hour shifts in the intensive care unit.
In total, 40 nurses were enrolled and completed 3 shifts. Analyses of 2,919 cultures from the environment and 2,185 from HCP clothing showed that scrub type was not associated with a change in HCP clothing contamination (P=.70). Mean difference estimates were 0.118 for the Scrub 1 arm (95% confidence interval [CI], −0.206 to 0.441; P=.48) and 0.009 for the Scrub 2 arm (95% CI, −0.323 to 0.342; P=.96) compared to the control. HCP became newly contaminated with important pathogens during 19 of the 120 shifts (16%).
Antimicrobial-impregnated scrubs were not effective at reducing HCP contamination. However, the environment is an important source of HCP clothing contamination.
In this paper we undertake a quantitative analysis of the dynamic process by which ice underneath a dry porous debris layer melts. We show that the incorporation of debris-layer airflow into a theoretical model of glacial melting can capture the empirically observed features of the so-called Østrem curve (a plot of the melt rate as a function of debris depth). Specifically, we show that the turning point in the Østrem curve can be caused by two distinct mechanisms: the increase in the proportion of ice that is debris-covered and/or a reduction in the evaporative heat flux as the debris layer thickens. This second effect causes an increased melt rate because the reduction in (latent) energy used for evaporation increases the amount of energy available for melting. Our model provides an explicit prediction for the melt rate and the temperature distribution within the debris layer, and provides insight into the relative importance of the two effects responsible for the maximum in the Østrem curve. We use the data of Nicholson and Benn (2006) to show that our model is consistent with existing empirical measurements.
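The qualitative shape of the Østrem curve can be illustrated with a toy trade-off; this sketch is not the paper's model, and its functional forms and parameter values are assumptions. The debris surface is taken to warm as the layer thickens (absorbing more energy and losing less to evaporation), while the conductive flux delivered to the ice falls off with thickness.

```python
import numpy as np

# Toy Østrem-type curve (illustrative assumptions, not the paper's model):
# the surface temperature saturates with debris thickness d, while the
# conductive flux through the layer scales like T/d.
d = np.linspace(0.001, 1.0, 1000)          # debris thickness [m]
t_surf = 10.0 * (1.0 - np.exp(-d / 0.05))  # warmer surface as debris thickens [°C]
melt = t_surf / (d + 0.01)                 # melt rate proxy ~ conductive flux

peak = d[np.argmax(melt)]                  # turning point at intermediate thickness
```

The competition between the two effects produces a maximum at an intermediate debris thickness, which is the characteristic turning point of the Østrem curve.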
We have obtained a K band image of the central 30 × 40 arcminutes of the Galaxy at a scale of 1.4″/pixel using a 256 × 256 Pt:Si Schottky barrier diode array detector provided by the Hughes Aircraft Company. The excellent cosmetic quality and large field of this device provide an unprecedented view of the inner Galaxy. Images of the central 10 arcminutes at a scale of 0.9″/pixel in the H (1.65 μm) and K (2.2 μm) bands produced with the same detector array have been combined to produce a color picture, which clearly shows the circumnuclear molecular ring in absorption; this picture demonstrates directly that the southwestern side of the ring lies in front of, and the northeastern side behind, the Galactic center.
To what extent do political campaigns mobilize voters? Despite the central role of campaigns in American politics and despite many experiments on campaigning, we know little about the aggregate effects of an entire campaign on voter participation. Drawing upon inside information from presidential campaigns and utilizing a geographic research design that exploits media markets spanning state boundaries, we estimate the aggregate effects of a large-scale campaign. We estimate that the 2012 presidential campaigns increased turnout in highly targeted states by 7–8 percentage points, on average, indicating that modern campaigns can significantly alter the size and composition of the voting population. Further evidence suggests that the predominant mechanism behind this effect is traditional ground campaigning, which has dramatically increased in scale in the last few presidential elections. Additionally, we find no evidence of diminishing marginal returns to ground campaigning, meaning that voter contacts, each likely exhibiting small individual effects, may aggregate to large effects over the course of a campaign.
Research was conducted from 2011 to 2014 to determine weed population dynamics and frequency of glyphosate-resistant (GR) Palmer amaranth with herbicide programs consisting of glyphosate, dicamba, and residual herbicides in dicamba-tolerant cotton. Five treatments were maintained in the same plots over the duration of the experiment: three sequential POST applications of glyphosate with or without pendimethalin plus diuron PRE; three sequential POST applications of glyphosate plus dicamba with and without the PRE herbicides; and a POST application of glyphosate plus dicamba plus acetochlor followed by one or two POST applications of glyphosate plus dicamba without PRE herbicides. Additional treatments included alternating years with three sequential POST applications of glyphosate only and glyphosate plus dicamba POST with and without PRE herbicides. The greatest population of Palmer amaranth was observed when glyphosate was the only POST herbicide throughout the experiment. Although diuron plus pendimethalin PRE in a program with only glyphosate POST improved control during the first 2 yr, these herbicides were ineffective by the final 2 yr on the basis of weed counts from soil cores. The lowest population of Palmer amaranth was observed when glyphosate plus dicamba were applied regardless of PRE herbicides or inclusion of acetochlor POST. Frequency of GR Palmer amaranth was 8% or less when the experiment was initiated. Frequency of GR Palmer amaranth varied by herbicide program during 2012 but was similar among all herbicide programs in 2013 and 2014. Similar frequency of GR Palmer amaranth across all treatments at the end of the experiment most likely resulted from pollen movement from Palmer amaranth treated with glyphosate only to any surviving female plants regardless of PRE or POST treatment. 
These data suggest that GR Palmer amaranth can be controlled by dicamba and that dicamba is an effective alternative mode of action to glyphosate in fields where GR Palmer amaranth exists.
In November 2013, national public health agencies in England and Scotland identified an increase in laboratory-confirmed Salmonella Mikawasima. The role of proton pump inhibitors (PPIs) as a risk factor for salmonellosis is unclear; we therefore captured information on PPI usage as part of our outbreak investigation. We conducted a case-control study, comparing each case with two controls. Adjusted odds ratios (aORs) and 95% confidence intervals (CIs) were estimated using multivariable logistic regression. Thirty-nine of 61 eligible cases were included in the study. The median age of cases was 45 years, and 56% were female. Of the included cases, 33% were admitted to hospital and 31% reported taking PPIs. We identified an association between PPIs and non-typhoidal salmonellosis (aOR 8·8, 95% CI 2·0–38·3). There is increasing evidence of an association between salmonellosis and PPIs; however, biological studies are needed to understand the role of PPIs in the pathogenesis of Salmonella infection. We recommend that future outbreak studies investigate PPI usage to strengthen the evidence on the relevance of PPIs in Salmonella infection. These findings should be used to support the development of guidelines for patients and prescribers on PPI usage and the risk of gastrointestinal infection.
PSR 1822–09 is a nearby Galactic disk pulsar with a typical 0.769 s period. Its large period derivative indicates a young timing age of 250,000 years and a strong estimated surface magnetic field of 6.4 × 10¹² G. We have observed this pulsar at frequencies around 1700 MHz and 2650 MHz. The mean pulse profile at these frequencies consists of a double-peaked main pulse with two dissimilar components and a weak interpulse 185° of pulse longitude after the second component of the main pulse. The interpulse is detected in both frequency ranges, and the main pulse–interpulse separation does not change between these frequencies and the low frequencies around 327 MHz where the interpulse was discovered by Cady and Ritchings (1977).
Paranoia is one of the commonest symptoms of psychosis but has rarely been studied in a population at risk of developing psychosis. Based on existing theoretical models, including the proposed distinction between ‘poor me’ and ‘bad me’ paranoia, we aimed to test specific predictions about associations between negative cognition, metacognitive beliefs and negative emotions and paranoid ideation and the belief that persecution is deserved (deservedness).
We used data from 117 participants from the Early Detection and Intervention Evaluation for people at risk of psychosis (EDIE-2) trial of cognitive–behaviour therapy, comparing them with samples of psychiatric in-patients and healthy students from a previous study. Multi-level modelling was utilized to examine predictors of both paranoia and deservedness, with post-hoc planned comparisons conducted to test whether person-level predictor variables were associated differentially with paranoia or with deservedness.
Our sample of at-risk mental state participants was not as paranoid, but reported higher levels of ‘bad-me’ deservedness, compared with psychiatric in-patients. We found several predictors of paranoia and deservedness. Negative beliefs about self were related to deservedness but not paranoia, whereas negative beliefs about others were positively related to paranoia but negatively with deservedness. Both depression and negative metacognitive beliefs about paranoid thinking were specifically related to paranoia but not deservedness.
This study provides evidence for the role of negative cognition, metacognition and negative affect in the development of paranoid beliefs, which has implications for psychological interventions and our understanding of psychosis.
Large numbers of evacuees arrived in Dallas, Texas, from Hurricanes Katrina and Rita just 3 weeks apart in 2005 and from Hurricanes Gustav and Ike just 3 weeks apart again in 2008. The Dallas community needed to locate, organize, and manage the response to provide shelter and health care with locally available resources. With each successive hurricane, disaster response leaders applied many lessons learned from prior operations to become more efficient and effective in the provision of services. Mental health services proved to be an essential component. From these experiences, a set of operating guidelines for large evacuee shelter mental health services in Dallas was developed, with involvement of key stakeholders. A generic description of the processes and procedures used in Dallas that highlights the important concepts, key considerations, and organizational steps was then created for potential adaptation by other communities. (Disaster Med Public Health Preparedness. 2015;9:423–429)
(See the commentary by Pfeiffer and Beldavs, on pages 984–986.)
To describe the epidemiology of carbapenem-resistant Enterobacteriaceae (CRE) and to examine the effect of lower carbapenem breakpoints on CRE detection.
Inpatient care at community hospitals.
All patients with CRE-positive cultures were included.
CRE isolated from 25 community hospitals were prospectively entered into a centralized database from January 2008 through December 2012. Microbiology laboratory practices were assessed using questionnaires.
A total of 305 CRE isolates were detected at 16 hospitals (64%). Patients with CRE had symptomatic infection in 180 cases (59%) and asymptomatic colonization in the remainder (125 cases; 41%). Klebsiella pneumoniae (277 isolates; 91%) was the most prevalent species. The majority of cases were healthcare associated (288 cases; 94%). The rate of CRE detection increased more than fivefold from 2008 (0.26 cases per 100,000 patient-days) to 2012 (1.4 cases per 100,000 patient-days; incidence rate ratio (IRR), 5.3 [95% confidence interval (CI), 1.22–22.7]; P = .01). Only 5 hospitals (20%) had adopted the 2010 Clinical and Laboratory Standards Institute (CLSI) carbapenem breakpoints. The 5 hospitals that adopted the lower carbapenem breakpoints were more likely to detect CRE after implementation of breakpoints than before (4.1 vs 0.5 cases per 100,000 patient-days; P < .001; IRR, 8.1 [95% CI, 2.7–24.6]). Hospitals that implemented the lower carbapenem breakpoints were more likely to detect CRE than were hospitals that did not (3.3 vs 1.1 cases per 100,000 patient-days; P = .01).
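Incidence rate ratios like those reported can be computed from case counts and patient-days, with a Poisson-based confidence interval on the log scale. The counts in this sketch are hypothetical values chosen to be illustrative, not the study's raw data.

```python
import math

# Hypothetical counts (illustrative only): cases and patient-days per period.
cases_2008, pd_2008 = 2, 770_000    # ~0.26 cases per 100,000 patient-days
cases_2012, pd_2012 = 11, 790_000   # ~1.4 cases per 100,000 patient-days

# Incidence rate ratio and Wald CI assuming Poisson case counts.
irr = (cases_2012 / pd_2012) / (cases_2008 / pd_2008)
se = math.sqrt(1 / cases_2012 + 1 / cases_2008)  # SE of log(IRR)
lo = irr * math.exp(-1.96 * se)
hi = irr * math.exp(1.96 * se)
```

With few events in the baseline period, the interval is very wide even when the ratio itself is large, which is why an IRR above 5 can still have a lower confidence bound barely above 1.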
The rate of CRE detection increased fivefold in community hospitals in the southeastern United States from 2008 to 2012. Despite this, our estimates are likely underestimates of the true rate of CRE detection, given the low adoption of the carbapenem breakpoints recommended in the 2010 CLSI guidelines.
Many citizens abstain from the political process, and the reasons for this abstention are of great interest and importance. Most scholars and pundits assume that greater electoral competition and the increased chance of pivotality will motivate citizens to participate. We test this hypothesis through a large-scale field experiment that exploits the rare opportunity of a tied election for major political office. Informing citizens that an upcoming election will be close has little mobilizing effect. Any effect that we do detect is concentrated among a small set of frequent voters. The evidence suggests that increased pivotality is not a solution to low turnout and the predominant models of turnout focusing on pivotality are of little practical use.
A 6-year-old girl presented with haematuria, and her 5-year-old sister presented with haematuria and proteinuria. The family history showed multiple individuals with end-stage renal failure on the paternal side of the pedigree. Following kidney biopsies in the father and paternal grandmother, the pathological diagnosis was focal segmental glomerulosclerosis (FSGS). Exome sequencing was undertaken in the proband's sister and grandmother. Genetic variants shared by both affected individuals were interrogated to identify the genetic cause of disease. Candidate variants were then sequenced in all family members to determine segregation with the disease. A mutation of COL4A5 known to cause Alport syndrome segregated with disease on the paternal side of the pedigree, and a variant in NPHS1 was present in both paediatric cases and inherited from their mother. This study highlights the advantages of exome sequencing over single-gene testing: disease presentation can be heterogeneous, with several genes representing plausible candidates; candidate gene(s) may be unavailable as a diagnostic test; and consecutive single-gene testing typically concludes once a single causal mutation is identified. In this family, we were able to confirm a diagnosis of Alport syndrome, which will facilitate testing in other family members.
For people with psychosis, contact with informal caregivers is an important source of social support, associated with recovery, and with better outcomes following individual cognitive therapy (CBTp). In this study, we tested whether increased flexibility in delusional thinking, an established predictor of positive outcome following CBTp, was a possible mechanism underlying this effect.
A total of 219 participants with delusions (mean age 38 years; 71% male; 75% White) were grouped according to the presence of a caregiver (37% with a caregiver) and caregiver level of expressed emotion (High/Low EE, 64% Low). Delusional belief flexibility was compared between groups, controlling for interpersonal functioning, severity of psychotic symptoms, and other hypothesised outcome predictors.
Participants with caregivers were nearly three times more likely than those without to show flexibility (OR = 2.7, 95% CI 1.5–5.0, p = 0.001), and five times more likely if the caregiving relationship was Low EE (OR = 5.0, 95% CI 2.0–13.0, p = 0.001). ORs remained consistent after controlling for interpersonal functioning and other predictors of outcome.
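Odds ratios of this kind can be derived from a 2×2 table of belief flexibility by caregiver status, with a Woolf logit confidence interval. The cell counts in this sketch are made up for illustration and are not the study's data.

```python
import math

# Hypothetical 2x2 table (illustrative only):
#                  flexible   not flexible
a, b = 45, 36    # with caregiver
c, d = 43, 95    # without caregiver

odds_ratio = (a * d) / (b * c)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # Woolf SE of log(OR)
lo = odds_ratio * math.exp(-1.96 * se)
hi = odds_ratio * math.exp(1.96 * se)
```

The Woolf interval is symmetric on the log scale, so the point estimate sits geometrically (not arithmetically) between the two bounds, as in the intervals reported above.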
This is the first evidence that having supportive caregiving relationships is associated with a specific cognitive attribute in people with psychosis, suggesting a potential cognitive mechanism by which outcomes following CBTp, and perhaps more generally, are improved by social support.
To assess the knowledge, attitudes, and practices of infection control among staff in a residential care facility for children and young adults with neurologic and neurodevelopmental conditions.
Residential care facility (facility A).
Facility A staff (N = 200).
We distributed a survey to staff at facility A. We classified staff with direct care responsibilities as clinical (ie, physicians, nurses, and therapists) or nonclinical (ie, habilitation assistants, volunteers, and teachers) and used χ² tests to test for differences in agreement between staff groups.
Of 248 surveys distributed, 200 (81%) were completed; median respondent age was 36 years; 85% were female; and 151 were direct care staff (50 clinical, 101 nonclinical). Among direct care staff respondents, 86% agreed they could identify residents with respiratory symptoms, 70% stayed home from work when ill with respiratory infection, 64% agreed that facility administration encouraged them to stay home when ill with respiratory infection, and 72% reported that ill residents with respiratory infections were separated from well residents. Clinical and nonclinical staff differed in agreement about using waterless hand gel as a substitute for handwashing (96% vs 78%; P = .005) and whether handwashing was done after touching residents (92% vs 75%; P = .04).
Respondents' knowledge, attitudes, and practices regarding infection control could be improved, especially among nonclinical staff. Facilities caring for children and young adults with neurologic and neurodevelopmental conditions should encourage adherence to infection control best practices among all staff having direct contact with residents.
Social capital and community activity are thought to increase voter turnout, but reverse causation and omitted variables may bias the results of previous studies. This article exploits saint's day fiestas in Mexico as a natural experiment to test this causal relationship. Saint's day fiestas provide temporary but large shocks to the connectedness and trust within a community, and the timing of these fiestas is quasi-random. For both cross-municipality and within-municipality estimates, saint's day fiestas occurring near an election decrease turnout by 2.5 to 3.5 percentage points. Community activities that generate social capital can therefore inhibit political participation. These findings may give pause to scholars and policy makers who assume that such community activity and social capital will improve the performance of democracy.