The reported incidence of Clostridioides difficile infection (CDI) has increased in recent years, partly due to the broadening adoption of nucleic acid amplification tests (NAATs) in place of enzyme immunoassay (EIA) methods. Our aim was to quantify the impact of this switch on reported CDI rates using a large, multihospital, empirical dataset.
We analyzed 9 years of retrospective CDI data (2009–2017) from 47 hospitals in the southeastern United States; 37 hospitals switched to NAAT during this period, including 24 with sufficient pre- and post-switch data for statistical analyses. Poisson regression was used to quantify the NAAT-over-EIA incidence rate ratio (IRR) at hospital and network levels while controlling for longitudinal trends, the proportion of intensive care unit patient days, changes in surveillance methodology, and previously detected infection cluster periods. We additionally used change-point detection methods to identify shifts in the mean and/or slope of hospital-level CDI rates, and we compared results to recorded switch dates.
For hospitals that transitioned to NAAT, average unadjusted CDI rates increased substantially after the test switch from 10.9 to 23.9 per 10,000 patient days. Individual hospital IRRs ranged from 0.75 to 5.47, with a network-wide IRR of 1.75 (95% confidence interval, 1.62–1.89). Reported CDI rates significantly changed 1.6 months on average after switching to NAAT testing (standard deviation, 1.9 months).
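As a rough illustration of the unadjusted comparison, an incidence rate ratio and its Wald confidence interval can be computed directly from case counts and patient days. The counts below are invented to mirror the reported network-average rates; the study's adjusted network-wide estimate of 1.75 came from Poisson regression controlling for covariates, which this sketch does not attempt.

```python
import math

def incidence_rate_ratio(cases_a, days_a, cases_b, days_b, z=1.96):
    """Unadjusted IRR of period B vs period A with a Wald 95% CI.

    Rates are cases per patient day; the CI uses the standard
    log-scale normal approximation with SE = sqrt(1/cases_a + 1/cases_b).
    """
    irr = (cases_b / days_b) / (cases_a / days_a)
    se = math.sqrt(1 / cases_a + 1 / cases_b)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical hospital with EIA-period and NAAT-period rates of
# 10.9 and 23.9 per 10,000 patient days (counts are made up).
irr, lo, hi = incidence_rate_ratio(cases_a=109, days_a=100_000,
                                   cases_b=239, days_b=100_000)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# → IRR 2.19 (95% CI 1.75-2.75)
```

The gap between this crude ratio (about 2.2) and the adjusted network-wide IRR of 1.75 is what the regression covariates (secular trends, ICU patient-day mix, surveillance changes, cluster periods) absorb.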
Hospitals that switched from EIA to NAAT testing experienced an average post-switch increase of 75% in reported CDI rates after adjusting for other factors, and this increase was often gradual or delayed.
In this age of global warming, the automotive industry is seeking to minimize the energy required to manufacture and operate its products without sacrificing performance and safety or increasing cost. Toward this end, whether cars and trucks are powered by internal-combustion engines or batteries, lowering vehicle weight is a major contributor to reducing energy consumption by increasing fuel efficiency. “The industry is driven by fuel efficiency,” said David Matlock of the Colorado School of Mines, who has helped develop advanced high-strength steels (AHSSs) used in autos.
Facial transplantation is emerging as a therapeutic option for self-inflicted gunshot wounds. The self-inflicted nature of this injury raises questions about the appropriate role of self-harm in determining patient eligibility. Potential candidates for facial transplantation undergo extensive psychosocial screening. The presence of a self-inflicted gunshot wound warrants special attention to ensure that a patient is prepared to undergo a demanding procedure that poses significant risk, as well as stringent lifelong management. Herein, we explore the ethics of considering mechanism of injury in the patient selection process, referring to the precedent set forth in solid organ transplantation. We also consider the available evidence regarding outcomes of individuals transplanted for self-inflicted mechanisms of injury in both solid organ and facial transplantation. We conclude that while the presence of a self-inflicted gunshot wound is significant in the overall evaluation of the candidate, it does not on its own warrant exclusion from consideration for a facial transplantation.
Compulsory admission is commonly regarded as necessary and justified for patients whose psychiatric condition represents a severe danger to themselves and others. However, while studies on compulsory admissions have reported on various clinical and social outcomes, little research has focused specifically on dangerousness, which in many countries is the core reason for compulsory admission.
To study changes in dangerousness over time in adult psychiatric patients admitted by compulsory court order, and to relate these changes to these patients' demographic and clinical characteristics.
In this explorative prospective observational cohort study of adult psychiatric patients admitted by compulsory court order, demographic and clinical data were collected at baseline. At baseline and at 6 and 12 month follow-up, dangerousness was assessed using the Dangerousness Inventory, an instrument based on the eight types of dangerousness towards self or others specified in Dutch legislation on compulsory admissions. We used descriptive statistics and logistic regression to analyse the data.
We included 174 participants with a court-ordered compulsory admission. At baseline, the most common dangerousness criterion was inability to cope in society. Any type of severe or very severe dangerousness decreased from 86.2% at baseline to 36.2% at 6 months and to 28.7% at 12 months. Being homeless at baseline was the only variable which was significantly associated with persistently high levels of dangerousness.
Dangerousness decreased in about two-thirds of the patients after court-ordered compulsory admission. It persisted, however, in a substantial minority (approximately one-third).
Massive stars are the drivers of the chemical evolution of dwarf galaxies. We review here the basics of massive star evolution and the specificities of stellar evolution in low-Z environments. We discuss nucleosynthetic aspects and what observations could constrain our view on the first generations of stars.
Since 1997, execution in China has been increasingly performed by lethal injection. The current criteria for determination of death for execution by lethal injection (cessation of heartbeat, cessation of respiration, and dilated pupils) neither conform to current medical science nor to any standard of medical ethics. In practice, death is pronounced in China within tens of seconds after starting the lethal injection. At this stage, however, neither the common criteria for cardiopulmonary death (irreversible cessation of heartbeat and breathing) nor that of brain death (irreversible cessation of brain functions) have been met. To declare a still-living person dead is incompatible with human dignity, regardless of the processes following death pronouncement. This ethical concern is further aggravated if organs are procured from the prisoners. Analysis of postmortem blood thiopental level data from the United States indicates that thiopental, as used, may not provide sufficient surgical anesthesia. The dose of thiopental used in China is kept secret. It cannot be excluded that some of the organ explantation surgeries on prisoners subjected to lethal injection are performed under insufficient anesthesia in China. In such cases, the inmate may potentially experience asphyxiation and pain. Yet this can be easily overlooked by the medical professionals performing the explantation surgery because pancuronium prevents muscle responses to pain, resulting in an extremely inhumane situation. We call for an immediate revision of the death determination criteria in execution by lethal injection in China. Biological death must be ensured before death pronouncement, regardless of whether organ procurement is involved or not.
Introduction: The proportion of Canadians receiving anticoagulation medication is increasing. Falls in the elderly are the most common cause of minor head injury, and an increasing proportion of these patients are prescribed anticoagulation. Emergency department (ED) guidelines advise performing a CT head scan for all anticoagulated head-injured patients, but the risk of intracranial hemorrhage (ICH) after a minor head injury (patients who have a Glasgow Coma Scale [GCS] score of 15) is unclear. We conducted a systematic review and meta-analysis to determine the point incidence of ICH in anticoagulated ED patients presenting with a minor head injury. Methods: We systematically searched PubMed, EMBASE, the Cochrane database, DARE, Google Scholar, and conference abstracts (May 2017). Experts were contacted. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) guidelines were followed, with two authors reviewing titles, four authors reviewing full text, and four authors performing data extraction. We included all prospective studies recruiting consecutive anticoagulated ED patients presenting with a head injury. We obtained additional data from the authors of the included studies on the subset of GCS 15 patients. We performed a meta-analysis to estimate the point incidence of ICH among patients with a GCS score of 15 using a random-effects model. Results: A total of five studies (and 4,080 GCS 15, anticoagulated patients) from the Netherlands, Italy, France, USA, and UK were included in the analysis. One study contributed 2,871 patients. Direct oral anticoagulants were prescribed in only 60 (1.5%) patients. There was significant heterogeneity between studies with regard to mechanism of injury, CT scanning, and follow-up method (I2=93%). The random-effects pooled incidence of ICH was 8.9% (95% CI, 5.0%–13.8%). Conclusion: We found little data to reflect contemporary anticoagulant prescribing practice.
Around 9% of warfarinized patients with a minor head injury develop ICH. Future studies should evaluate the safety of selective CT head scanning in this population.
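A minimal sketch of the random-effects pooling the review describes is the DerSimonian–Laird estimator, shown here on raw proportions. The five study counts below are invented, not the review's data, and logit or arcsine transforms (common refinements for pooling proportions) are omitted for brevity.

```python
import math

def dersimonian_laird(events, totals):
    """Pool study proportions with a DerSimonian-Laird random-effects model.

    Each study contributes a proportion p with variance p*(1-p)/n;
    between-study variance tau^2 is estimated from Cochran's Q.
    """
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]
    w = [1 / vi for vi in v]                       # fixed-effect weights
    fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - fixed) ** 2 for wi, pi in zip(w, p))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(p) - 1)) / c)        # between-study variance
    w_re = [1 / (vi + tau2) for vi in v]           # random-effects weights
    pooled = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Five hypothetical studies with ICH proportions from ~5% to ~13%,
# one much larger than the rest (loosely echoing the review's setup).
events = [30, 45, 250, 20, 12]
totals = [600, 350, 2871, 160, 99]
pooled, lo, hi = dersimonian_laird(events, totals)
print(f"pooled incidence {100 * pooled:.1f}% "
      f"(95% CI {100 * lo:.1f}-{100 * hi:.1f}%)")
```

With high heterogeneity (the review reports I2=93%), tau^2 grows and the weights flatten, so the one very large study dominates the pooled estimate less than it would in a fixed-effect analysis.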
Patient days and days present were compared to directly measured person time to quantify how choice of different denominator metrics may affect antimicrobial use rates. Overall, days present were approximately one-third higher than patient days. This difference varied among hospitals and units and was influenced by short length of stay.
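The arithmetic consequence is direct: if days present run about one-third higher than patient days, the same antimicrobial-use numerator yields a rate roughly 25% lower. A toy calculation (all numbers invented):

```python
# Toy numbers, not from the study.
days_of_therapy = 500                  # antimicrobial-use numerator
patient_days = 10_000
days_present = patient_days * 4 / 3    # "about one-third higher"

rate_pd = 1000 * days_of_therapy / patient_days   # per 1,000 patient days
rate_dp = 1000 * days_of_therapy / days_present   # per 1,000 days present

print(rate_pd, rate_dp)        # 50.0 37.5
print(1 - rate_dp / rate_pd)   # 0.25, i.e. 25% lower with days present
```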
Regulatory impact analyses (RIAs) weigh the benefits of regulations against the burdens they impose and are invaluable tools for informing decision makers. We offer 10 tips for nonspecialist policymakers and interested stakeholders who will be reading RIAs as consumers.
1. Core problem: Determine whether the RIA identifies the core problem (compelling public need) the regulation is intended to address.
2. Alternatives: Look for an objective, policy-neutral evaluation of the relative merits of reasonable alternatives.
3. Baseline: Check whether the RIA presents a reasonable “counterfactual” against which benefits and costs are measured.
4. Increments: Evaluate whether totals and averages obscure relevant distinctions and trade-offs.
5. Uncertainty: Recognize that all estimates involve uncertainty, and ask what effect key assumptions, data, and models have on those estimates.
6. Transparency: Look for transparency and objectivity of analytical inputs.
7. Benefits: Examine how projected benefits relate to stated objectives.
8. Costs: Understand what costs are included.
9. Distribution: Consider how benefits and costs are distributed.
10. Symmetrical treatment: Ensure that benefits and costs are presented symmetrically.
To evaluate the impact of multidrug-resistant gram-negative rod (MDR-GNR) infections on mortality and healthcare resource utilization in community hospitals.
Two matched case-control analyses.
Six community hospitals participating in the Duke Infection Control Outreach Network from January 1, 2010, through December 31, 2012.
Adult patients admitted to study hospitals during the study period.
Patients with MDR-GNR bloodstream and urinary tract infections were compared with 2 groups: (1) patients with infections due to non-MDR-GNR and (2) control patients representative of the non-psychiatric, non-obstetric hospitalized population. Four outcomes were assessed: mortality, direct cost of hospitalization, length of stay, and 30-day readmission rates. Multivariable regression models were created to estimate the effect of MDR status on each outcome measure.
No mortality difference was seen in either analysis. Patients with MDR-GNR infections had 2.03 times the odds of 30-day readmission compared with patients with non-MDR-GNR infections (95% CI, 1.04–3.97; P=.04). There was no difference in hospital direct costs between patients with MDR-GNR infections and patients with non-MDR-GNR infections. Hospitalizations for patients with MDR-GNR infections cost $5,320.03 more (95% CI, $2,366.02–$8,274.05; P<.001) and lasted 3.40 days longer (95% CI, 1.41–5.40; P<.001) than hospitalizations for control patients.
Our study provides novel data regarding the clinical and financial impact of MDR gram-negative bacterial infections in community hospitals. There was no difference in mortality between patients with MDR-GNR infections and patients with non-MDR-GNR infections or control patients.
Archaeological projects that are described as orphaned or legacy collections are generally older materials that do not meet modern “best practice” curation standards and require considerable resources to be preserved for future research. Rehabilitation and curation of these projects allows for better inventory control of the artifacts, and accompanying documentation ensures that cultural heritage is preserved and plays an important part in the repatriation process. Procedures and methods for rehousing archaeological legacy collections are outlined. Using the 1984–1987 Arizona Archaeological and Historical Society (AAHS) volunteer excavations at Redtail Village (AZ AA:12:149 [ASM]) as a case study, we propose a process for rehabilitating legacy collections and offer solutions for preserving important archaeological resources for future research.
To describe the epidemiology of complex surgical site infection (SSI) following commonly performed surgical procedures in community hospitals and to characterize trends of SSI prevalence rates over time for methicillin-resistant Staphylococcus aureus (MRSA) and other common pathogens.
We prospectively collected SSI data at 29 community hospitals in the southeastern United States from 2008 through 2012. We determined the overall prevalence rates of SSI for commonly performed procedures during this 5-year study period. For each year of the study, we then calculated prevalence rates of SSI stratified by causative organism. We created log-binomial regression models to analyze trends of SSI prevalence over time for all pathogens combined and specifically for MRSA.
A total of 3,988 complex SSIs occurred following 532,694 procedures (prevalence rate, 0.7 infections per 100 procedures). SSIs occurred most frequently after small bowel surgery, peripheral vascular bypass surgery, and colon surgery. Staphylococcus aureus was the most common pathogen. The prevalence rate of SSI decreased from 0.76 infections per 100 procedures in 2008 to 0.69 infections per 100 procedures in 2012 (prevalence rate ratio [PRR], 0.90; 95% confidence interval [CI], 0.82–1.00). A more substantial decrease in MRSA SSI (PRR, 0.69; 95% CI, 0.54–0.89) was largely responsible for this overall trend.
The prevalence of MRSA SSI decreased from 2008 to 2012 in our network of community hospitals. This decrease in MRSA SSI prevalence led to an overall decrease in SSI prevalence over the study period.
To determine the association (1) between shorter operative duration and surgical site infection (SSI) and (2) between surgeon median operative duration and SSI risk among first-time hip and knee arthroplasties.
Retrospective cohort study.
A total of 43 community hospitals located in the southeastern United States.
Adults who developed SSIs according to National Healthcare Safety Network criteria within 365 days of first-time knee or hip arthroplasties performed between January 1, 2008, and December 31, 2012.
Log-binomial regression models estimated the association (1) between operative duration and SSI outcome and (2) between surgeon median operative duration and SSI outcome. Hip and knee arthroplasties were evaluated in separate models. Each model was adjusted for American Society of Anesthesiology score and patient age.
A total of 25,531 hip arthroplasties and 42,187 knee arthroplasties were included in the study. The risk of SSI in knee arthroplasties with an operative duration shorter than the 25th percentile was 0.40 times the risk of SSI in knee arthroplasties with an operative duration between the 25th and 75th percentile (risk ratio [RR], 0.40; 95% confidence interval [CI], 0.38–0.56; P<.01). Short operative duration did not demonstrate significant association with SSI for hip arthroplasties (RR, 1.04; 95% CI, 0.79–1.37; P=.36). Knee arthroplasty surgeons with shorter median operative durations had a lower risk of SSI than surgeons with typical median operative durations (RR, 0.52; 95% CI, 0.43–0.64; P<.01).
Short operative durations were not associated with a higher SSI risk for knee or hip arthroplasty procedures in our analysis.
Infect. Control Hosp. Epidemiol. 2015;36(12):1431–1436