To estimate the impact of California’s antimicrobial stewardship program (ASP) mandate on methicillin-resistant Staphylococcus aureus (MRSA) and Clostridioides difficile infection (CDI) rates in acute-care hospitals.
Centers for Medicare and Medicaid Services (CMS)–certified acute-care hospitals in the United States.
2013–2017 data from the CMS Hospital Compare, Provider of Service File and Medicare Cost Reports.
Difference-in-differences model with hospital fixed effects to compare California with all other states before and after the ASP mandate. The outcomes were standardized infection ratios (SIRs) for MRSA and CDI. We analyzed the following time-variant covariates: medical school affiliation, bed count, quality accreditation, number of changes in ownership, compliance with CMS requirements, % intensive care unit beds, average length of stay, patient safety index, and 30-day readmission rate.
In 2013, California hospitals had an average MRSA SIR of 0.79 versus 0.94 in other states, and an average CDI SIR of 1.01 versus 0.77 in other states. California hospitals had increases (P < .05) of 23%, 30%, and 20% in their MRSA SIRs in 2015, 2016, and 2017, respectively. California hospitals were associated with a 20% (P < .001) decrease in the CDI SIR only in 2017.
The mandate was associated with a decrease in CDI SIR and an increase in MRSA SIR.
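The difference-in-differences logic described in the methods can be sketched in a few lines. The SIR values below are hypothetical illustrations, not the study's data, and the actual model additionally includes hospital fixed effects and time-variant covariates:

```python
# Hypothetical hospital-level SIRs by group and period (illustration only)
sirs = {
    ("CA", "pre"):     [0.79, 0.85, 0.70],
    ("CA", "post"):    [0.95, 1.00, 0.90],
    ("other", "pre"):  [0.94, 0.90, 1.00],
    ("other", "post"): [0.96, 0.93, 1.02],
}

def mean(xs):
    return sum(xs) / len(xs)

# DiD estimate: change in California minus change in the comparison states
did = (mean(sirs[("CA", "post")]) - mean(sirs[("CA", "pre")])) - (
    mean(sirs[("other", "post")]) - mean(sirs[("other", "pre")])
)
print(f"DiD estimate: {did:+.3f}")  # positive value = relative increase in SIR
```

The estimate is the treated group's pre-to-post change net of the change the comparison group experienced over the same period, which is what allows the design to absorb secular trends common to all hospitals.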
Background: Suspicion of urinary tract infection (UTI) is the most common justification for prescribing antibiotics in nursing homes. More than half of antibiotic prescriptions for treatment of UTI in nursing homes are either unnecessary or inappropriate. A better understanding of the factors that underlie UTI treatment decisions is necessary to improve the quality of antibiotic prescribing in nursing homes. An ongoing hybrid type 2 effectiveness-implementation cluster randomized trial of a recently developed nursing home UTI recognition and management tool kit provided us with an opportunity to explore the influence of organizational, clinical, and staff attributes on UTI antibiotic prescribing practices in nursing homes. Methods: Data on antibiotic starts for suspected UTIs were collected in 29 nursing homes over a 9-month period. Antibiotic practices evaluated included total antibiotic starts per 1,000 resident days, % antibiotic starts with treatment duration >7 days, % antibiotic starts in which the initial antibiotic choice was a fluoroquinolone, and % antibiotic starts meeting UTI tool-kit criteria of appropriateness. Prior research and bivariate analyses were used to select clinical and organizational attributes, as well as nursing staff retention rates, for inclusion in a stepwise linear regression model for each antibiotic practice outcome. Results: In total, 602 UTI antibiotic events were evaluated. Four attributes were associated with total antibiotic starts: nursing home urine culture rate, ICP status, nonprofit status, and part-time LPN retention. Nursing homes with higher full-time LPN retention had a lower rate of antibiotic treatment durations >7 days. Retention of full-time CNAs and part-time LPNs, as well as for-profit status, was associated with the proportion of antibiotic starts that were fluoroquinolones. No attributes were associated with the proportion of antibiotic starts meeting appropriateness criteria (Fig. 1).
Urine culture rates are driving overall nursing home antibiotic prescribing. Conclusions: Urine culture practices were strongly associated with UTI treatment rates in nursing homes. A variety of organizational characteristics were also associated with UTI treatment rates, as well as with other UTI antibiotic prescribing practices. Some of these associations appear paradoxical but may reflect increasing resident acuity and an increased capacity to standardize practices through organizational centralization.
Funding: Support for the project provided by the Wisconsin Partnership Program.
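The stepwise linear regression described in the methods can be sketched as a greedy forward selection that adds, at each step, the predictor giving the largest gain in R². The covariate names and synthetic data below are purely illustrative and do not reproduce the study's actual model specification:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (intercept included)."""
    Xi = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(Xi, y, rcond=None)
    resid = y - Xi @ coef
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def forward_stepwise(X, y, names, min_gain=0.01):
    """Greedy forward selection: repeatedly add the predictor that most
    improves R^2, stopping once the best available gain drops below min_gain."""
    selected, remaining, best = [], list(range(X.shape[1])), 0.0
    while remaining:
        r2, j = max((r_squared(X[:, selected + [j]], y), j) for j in remaining)
        if r2 - best < min_gain:
            break
        selected.append(j)
        remaining.remove(j)
        best = r2
    return [names[j] for j in selected]

# Synthetic demo: outcome depends only on columns 0 and 2
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = 3 * X[:, 0] + 2 * X[:, 2] + rng.normal(scale=0.5, size=300)
picked = forward_stepwise(
    X, y, ["culture_rate", "bed_count", "lpn_retention", "ownership", "icp_status"]
)
```

The `min_gain` stopping rule is one common convention; published stepwise procedures more often use F-tests or p-value thresholds for entry and removal.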
Lewy body dementia, consisting of both dementia with Lewy bodies (DLB) and Parkinson's disease dementia (PDD), is considerably under-recognised clinically compared with its frequency in autopsy series.
This study investigated the clinical diagnostic pathways of patients with Lewy body dementia to assess if difficulties in diagnosis may be contributing to these differences.
We reviewed the medical notes of 74 people with DLB and 72 with non-DLB dementia matched for age, gender and cognitive performance, together with 38 people with PDD and 35 with Parkinson's disease, matched for age and gender, from two geographically distinct UK regions.
The cases of individuals with DLB took longer to reach a final diagnosis (1.2 v. 0.6 years, P = 0.017), underwent more scans (1.7 v. 1.2, P = 0.002) and had more alternative prior diagnoses (0.8 v. 0.4, P = 0.002) than the cases of those with non-DLB dementia. Individuals diagnosed in one region of the UK had significantly more core features (2.1 v. 1.5, P = 0.007) than those in the other region, and were less likely to have dopamine transporter imaging (P < 0.001). For patients with PDD, more than 1.4 years before receiving a dementia diagnosis, 46% (12 of 26) had documented impairment of activities of daily living because of cognitive impairment, 57% (16 of 28) had cognitive impairment in multiple domains (38% [6 of 16] had both), and 39% (9 of 23) were already receiving anti-dementia drugs.
Our results show that the pathway to diagnosis of DLB is longer and more complex than that for non-DLB dementia. There were also marked differences between regions in the thresholds clinicians adopt for diagnosing DLB and in the use of dopamine transporter imaging. For PDD, a diagnosis of dementia was delayed well beyond symptom onset and even beyond the start of anti-dementia treatment.
Wild sheep and many primitive domesticated breeds have two coats: coarse hairs covering shorter, finer fibres. Both are shed annually. Exploitation of wool for apparel in the Bronze Age encouraged breeding for denser fleeces and continuously growing white fibres. The Merino is regarded as the culmination of this process. Archaeological discoveries, ancient images and parchment records portray this as an evolutionary progression, spanning millennia. However, examination of the fleeces from feral, two-coated and woolled sheep has revealed a ready facility of the follicle population to change from shedding to continuous growth and to revert from domesticated to primitive states. Modifications to coat structure, colour and composition have occurred over timeframes and in sheep populations of sizes that exclude the likelihood of the variations arising from new mutations and natural selection. The features are characteristic of the domestication phenotype: an assemblage of developmental, physiological, skeletal and hormonal modifications common to a wide variety of species under human control. The phenotypic similarities appear to result from an accumulation of cryptic genetic changes early during vertebrate evolution. Because they did not affect fitness in the wild, the mutations were protected from adverse selection, becoming apparent only after exposure to a domestic environment. The neural crest, a transient embryonic cell population unique to vertebrates, has been implicated in the manifestations of the domesticated phenotype. This hypothesis is discussed with reference to the development of the wool follicle population and the particular roles of Notch pathway genes, culminating in the specific cell interactions that typify follicle initiation.
To measure the association between statewide adoption of the Centers for Disease Control and Prevention’s (CDC’s) Core Elements for Hospital Antimicrobial Stewardship Programs (Core Elements) and hospital-associated methicillin-resistant Staphylococcus aureus bacteremia (MRSA) and Clostridioides difficile infection (CDI) rates in the United States. We hypothesized that states with a higher percentage of reported compliance with the Core Elements have significantly lower MRSA and CDI rates.
All US states.
Observational longitudinal study.
We used 2014–2016 data from Hospital Compare, Provider of Service files, Medicare cost reports, and the CDC’s Patient Safety Atlas website. Outcomes were MRSA standardized infection ratio (SIR) and CDI SIR. The key explanatory variable was the percentage of hospitals that meet the Core Elements in each state. We estimated state and time fixed-effects models with time-variant controls, and we weighted our analyses for the number of hospitals in the state.
The percentage of hospitals reporting compliance with the Core Elements between 2014 and 2016 increased in all states. A 1% increase in reported ASP compliance was associated with a 0.3% decrease (P < .01) in CDIs in 2016 relative to 2014. We did not find an association for MRSA infections.
Increasing documentation of the Core Elements may be associated with decreases in the CDI SIR. We did not find evidence of such an association for the MRSA SIR, probably due to the short study period and the variety of stewardship strategies that ASPs may encompass.
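The state and time fixed-effects estimation described in the methods can be illustrated with a two-way within (demeaning) transformation on a synthetic balanced panel. Every number here, including the assumed effect size, is invented for illustration, and the hospital-count weighting used in the actual analysis is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_years, beta = 5, 3, -0.3  # assumed effect per 100 points of compliance

# Synthetic balanced panel: SIR = constant + state effect + year effect + slope * compliance
state_fx = rng.normal(0, 0.2, n_states)[:, None]
year_fx = rng.normal(0, 0.1, n_years)[None, :]
x = rng.uniform(0, 100, (n_states, n_years))   # % of hospitals meeting Core Elements
y = 1.0 + state_fx + year_fx + beta / 100 * x  # noiseless, for a clean recovery

def demean(a):
    """Two-way within transformation: subtract state and year means, add grand mean."""
    return a - a.mean(axis=1, keepdims=True) - a.mean(axis=0, keepdims=True) + a.mean()

# OLS slope on the demeaned data; the additive state and year effects drop out exactly
beta_hat = (demean(x) * demean(y)).sum() / (demean(x) ** 2).sum()
```

On a balanced panel with purely additive fixed effects, double demeaning removes them exactly, so `beta_hat` recovers the true slope of -0.003 SIR per percentage point of compliance.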
The Fontan procedure is the final stage of surgical palliation for a single-ventricle circulation. Significant complications are common including rhythm disturbance necessitating implantation of a permanent pacemaker. This has been widely considered a negative prognostic indicator.
This single-centre, retrospective case-control study included all patients who underwent the Fontan procedure at the Leeds Congenital Heart Unit between 1990 and 2015 and had regular follow-up in Yorkshire and Humber, United Kingdom. In total, 167 Fontan patients were identified, of whom 2 were excluded for having a pre-procedure pacemaker. Of the remainder, 23 patients required a pacemaker. Outcomes were survival, early and late complications, need for further intervention, and oxygen saturation at long-term follow-up.
There was no difference in survival (30-day survival: pacemaker 92.6%, sinus rhythm 90.5%, p = 0.66; 1-year: pacemaker 11.1%, sinus rhythm 10.1%, p = 1). The pacemaker group was more likely to have cerebral or renal complications in the first year post-procedure (acute kidney injury: sinus rhythm 0.8%, pacemaker 19.1%, p = 0.002). No difference was observed in longer-term complications, including protein-losing enteropathy (sinus rhythm 3.5%, pacemaker 0%, p = 1). There was no difference in saturations between the two groups at follow-up. Paced patients were more likely to have required further intervention, with a higher incidence of cardiopulmonary bypass procedures (sinus rhythm 6.3%, pacemaker 35%, p < 0.001).
Despite an increase in early complications and the need for further interventions, pacemaker requirement does not appear to affect long-term survival following the Fontan procedure.
How galaxies reionized the universe remains an open question, but we can gain insights from the low-redshift Green Pea galaxies, one of the few known populations of Lyman continuum (LyC) emitters. Using VLA H i 21 cm observations and HST UV spectra of Green Peas, we investigate how neutral gas content and geometry influence LyC and Lyα escape. Our results suggest that LyC emitters may have high ratios of star formation rate to H i mass. Low gas covering fractions are common among the population, but not all sightlines are optically thin. Based on the observed relationship between high ionization parameters, low metallicities, and narrow Lyα profiles, we propose that weak stellar feedback at low metallicities results in a gas geometry of dense clumps within a low-density medium, which facilitates Lyα and LyC escape. We address the implications of these results for identifying LyC emitters at high redshift with JWST and ALMA.
A liver transplant recipient developed hospital-acquired symptomatic hepatitis C virus (HCV) genotype 6a infection 14 months post transplant.
Standard outbreak investigation.
We performed a patient chart review, interviews of patients and staff, an observational study of patient care practices, environmental surveillance, blood collection simulation experiments, and a phylogenetic study of HCV strains using partial envelope gene (E1–E2) sequences of HCV genotype 6a strains from the suspected source patient, the environment, and the index patient.
Investigations and data review revealed no further cases of HCV genotype 6a infection in the transplant unit. However, a suspected source with a high HCV load was identified. HCV genotype 6a was found in a contaminated reusable blood-collection tube holder with barely visible blood, which was identified as the only shared item posing a risk of transmission to the index case patient. Also, 14 episodes of sequential blood collection from the source patient and the index case patient were noted on the computerized time log of the laboratory barcoding system during their 13 days of cohospitalization in the liver transplant ward. Disinfection of the tube holders was not performed between patients. Blood collection simulation experiments showed that HCV and technetium isotope contaminating the tip of the sleeve capping the sleeved needle can reflux from the vacuum-specimen-tube side to the patient side.
A reusable blood-collection tube holder used without disinfection between patients can cause nosocomial HCV infection. Single-use disposable tube holders should be used, in accordance with the recommendations of the Occupational Safety and Health Administration and the World Health Organization.
While much is known about dyslexia in school-age children and adolescents, less is known about its effects on quality of life in adults. Using data from the Connecticut Longitudinal Study, we provide the first estimates of the monetary value of improving reading, speaking, and cognitive skills to dyslexic and nondyslexic adults. Using a stated-preference survey, we find that dyslexic and nondyslexic individuals value improvements in their skills in reading speed, reading aloud, pronunciation, memory, and information retrieval at about the same rate. Because dyslexics have lower self-reported levels on these skills, their total willingness to pay to achieve a high level of skill is substantially greater than for nondyslexics. However, dyslexic individuals’ willingness to pay (averaging $3000 for an improvement in all skills simultaneously) is small compared with the difference in earnings between dyslexic and nondyslexic adults. We estimate that dyslexic individuals earn 15% less per year (about $8000) than nondyslexic individuals. Although improvements in reading, speaking, and cognitive skills in adulthood are unlikely to eliminate the earnings difference that reflects differences in educational attainment and other factors, stated-preference estimates of the value of cognitive skills may substantially underestimate the value derived from effects on lifetime earnings and health.
Objectives: Performance on neurocognitive tasks develops with age, but it is still unknown whether this performance differs between children from different cultures. We compared cross-sectionally the development of neurocognitive functions in 3- to 15-year-old children from three countries: Finland, Italy, and the United States (N=2745). Methods: Language, face memory, emotion recognition, theory of mind, and visuospatial processing subtests from the NEPSY-II standardizations in Finland, Italy, and the United States were used to evaluate whether children and adolescents from different linguistic and cultural backgrounds differ in performance on these measures. Results: We found significant differences in performance on the tasks between the countries. Generally, the differences were more pronounced in the younger age groups. Some subtests showed greater country effects than others, with performance on these subtests being generally higher in one country than in the others, or showing different patterns of age-associated change in test performance. Conclusions: Significant differences in neurocognitive performance between children from Finland, Italy, and the United States were found. These findings may be due to cultural or educational differences that impact test performance, or to factors associated with the adaptation of measures from one culture to another. The finding of performance differences across countries on similar tasks indicates that cross-cultural and background variables impact performance on neuropsychological measures. Therefore, clinicians need to consider a child’s cultural background when evaluating performance on neuropsychological assessments. The results also indicate that future cross-cultural studies are needed to further examine the underlying cultural factors that influence neurocognitive performance. (JINS, 2017, 23, 367–380)
The origins of contemporary exclusion of surgical methods from patenting lie in the complexities of managing credit claims in operative surgery, recognized in the nineteenth century. While surgical methods were not deemed patentable, surgeons were nevertheless embedded within patent culture. In an atmosphere of heightened awareness about the importance of ‘inventors’, how surgeons should be recognized and rewarded for their inventions was an important question. I examine an episode during the 1840s which seemed to concretize the inapplicability of patents to surgical practice, before looking at alternatives to patenting, used by surgeons to gain social and financial credit for inventions.
Previous studies of language contact in multilingual urban neighborhoods in Europe claim the emergence of new varieties spoken by immigrant-background youth. This paper examines the sociolinguistic conditioning of variation in allophones of Swedish /ε:/ among young people of immigrant and nonimmigrant background in Stockholm and Gothenburg. Although speaker background and sex condition the variation, their effects differ in each city. In Stockholm there are no significant social differences and the allophonic difference appears to have been neutralized. Gothenburg speakers are divided into three groups, based on speaker origin and sex, each of which orients toward different norms. Our conclusions appeal to dialectal diffusion and the desire to mark ethnic identity in a diverse sociolinguistic context. These results demonstrate that language contact and dialect change should be considered together when investigating language variation in modern-day cities.
We report results of an experimental investigation into the effects of small-scale (mm–cm) heterogeneities on solute spreading and mixing in a Berea sandstone core. Pulse-tracer tests have been carried out in the Péclet number regime
and are supplemented by a unique combination of two imaging techniques. X-ray computed tomography (CT) is used to quantify subcore-scale heterogeneities in terms of permeability contrasts at a spatial resolution of approximately
, while [11C] positron emission tomography (PET) is applied to image the spatial and temporal evolution of the full tracer plume non-invasively. To account for both advective spreading and local (Fickian) mixing as driving mechanisms for solute transport, a streamtube model is applied that is based on the one-dimensional advection–dispersion equation. We refer to our modelling approach as semideterministic, because the spatial arrangement of the streamtubes and the corresponding solute travel times are known from the rock’s measured permeability map, which required only small adjustments to match the measured tracer breakthrough curve. The model reproduces the three-dimensional PET measurements accurately by capturing the larger-scale tracer plume deformation as well as subcore-scale mixing, while confirming negligible transverse dispersion over the scale of the experiment. We suggest that the obtained longitudinal dispersivity (
cm) is rock rather than sample specific, because of the ability of the model to decouple subcore-scale permeability heterogeneity effects from those of local dispersion. As such, the approach presented here proves to be very valuable, if not necessary, in the context of reservoir core analyses, because rock samples can rarely be regarded as ‘uniformly heterogeneous’.
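The one-dimensional advection–dispersion equation underlying the streamtube model admits, for a continuous step injection, the classical Ogata-Banks solution. A minimal sketch of its leading term follows (the second boundary term is negligible at the high Péclet numbers typical of core floods); the core length, velocity, and dispersion coefficient are hypothetical, not the experiment's values:

```python
from math import erfc, sqrt

def step_injection_conc(x, t, v, D):
    """Leading term of the Ogata-Banks solution to the 1-D
    advection-dispersion equation, dC/dt = D d2C/dx2 - v dC/dx,
    for a continuous step input of concentration C0 at x = 0:
        C/C0 ~= 0.5 * erfc((x - v*t) / (2*sqrt(D*t)))
    """
    return 0.5 * erfc((x - v * t) / (2.0 * sqrt(D * t)))

# Hypothetical parameters: 10 cm core, pore velocity 0.1 cm/s, D = 0.01 cm^2/s
breakthrough = [step_injection_conc(10.0, t, 0.1, 0.01) for t in (50.0, 100.0, 150.0)]
```

At t = x/v, i.e. after one pore volume has been injected, the normalized concentration at the outlet is exactly 0.5, a standard consistency check for symmetric breakthrough curves; fitting such a curve to the measured effluent history is one way a longitudinal dispersivity can be extracted.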