Surgical antimicrobial prophylaxis (SAP) is commonly administered in orthopedic procedures. Research regarding SAP appropriateness for specific orthopedic procedures is limited and is required to facilitate targeted orthopedic prescriber behavior change.
To describe SAP prescribing and appropriateness for orthopedic procedures in Australian hospitals.
Design, setting, and participants:
Multicenter, national quality improvement study with retrospective analysis of data collected from Australian hospitals via Surgical National Antimicrobial Prescribing Survey (Surgical NAPS) audits from January 1, 2016, to April 15, 2019.
Logistic regression identified hospital, patient, and surgical factors associated with appropriateness. Adjusted appropriateness was calculated from the multivariable model. Additional subanalyses were conducted on smaller subsets to calculate the adjusted appropriateness for specific orthopedic procedures.
In total, 140 facilities contributed to orthopedic audits in the Surgical NAPS, including 4,032 orthopedic surgical episodes and 6,709 prescribed doses. Overall appropriateness was low, 58.0% (n = 3,894). This differed for prescribed procedural (n = 3,978, 64.7%) and postprocedural doses (n = 2,731, 48.3%). The most common reasons for inappropriateness, when prophylaxis was required, were timing for procedural doses (50.9%) and duration for postprocedural prescriptions (49.8%). The adjusted appropriateness of each orthopedic procedure group was low for procedural SAP (from knee surgery, 54.1%, to total knee joint replacement, 74.1%). The adjusted appropriateness for postprocedural prescription was also low (from hand surgery, 40.7%, to closed reduction fractures, 68.7%).
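The dose-level figures above are internally consistent; a minimal Python sketch, using only the numbers quoted in this abstract, verifies the overall appropriateness percentage and cross-checks the dose-type breakdown:

```python
# Figures quoted in the abstract (prescribed doses and appropriateness).
procedural_n, procedural_appropriate_pct = 3978, 64.7
postproc_n, postproc_appropriate_pct = 2731, 48.3
total_appropriate, total_n = 3894, 6709

# The two dose types partition the total number of prescribed doses.
assert procedural_n + postproc_n == total_n

# Overall appropriateness: 3,894 / 6,709.
overall_pct = round(100 * total_appropriate / total_n, 1)
print(overall_pct)  # 58.0

# Cross-check: the weighted dose-type percentages reproduce ~3,894.
recovered = (procedural_appropriate_pct / 100 * procedural_n
             + postproc_appropriate_pct / 100 * postproc_n)
print(round(recovered))  # 3893 (rounding in the quoted percentages)
```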
Orthopedic surgical specialties demonstrated differences across procedural and postprocedural appropriateness. The metric of appropriateness identifies targets for quality improvement and is meaningful for clinicians. Targeted quality improvement projects for orthopedic specialties need to be developed to support optimization of antimicrobial use.
The steep rise in the rate of psychiatric hospital detentions in England is poorly understood.
To identify explanations for the rise in detentions in England since 1983; to test their plausibility and support from evidence; to develop an explanatory model for the rise in detentions.
Hypotheses to explain the rise in detentions were identified from previous literature and stakeholder consultation. We explored associations between national indicators for potential explanatory variables and detention rates in an ecological study. Relevant research was scoped and the plausibility of each hypothesis was rated. Finally, a logic model was developed to illustrate likely contributory factors and pathways to the increase in detentions.
Seventeen hypotheses related to social, service, legal and data-quality factors. Hypotheses supported by available evidence were: changes in legal approaches to patients without decision-making capacity but not actively objecting to admission; demographic changes; increasing psychiatric morbidity. Reductions in the availability or quality of community mental health services and changes in police practice may have contributed to the rise in detentions. Hypothesised factors not supported by evidence were: changes in community crisis care, compulsory community treatment and prescribing practice. Evidence was ambiguous or lacking for other explanations, including the impact of austerity measures and reductions in National Health Service in-patient bed numbers.
Better data are needed about the characteristics and service contexts of those detained. Our logic model highlights likely contributory factors to the rise in detentions in England, priorities for future research and potential policy targets for reducing detentions.
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. The examination of food combinations provides scope to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of studying nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. In order to derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, including data from eighteen case–control and one nested case–cohort study, comprising 8320 BC cases out of 31 551 participants. Dietary data, on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status) were available. Primarily, five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; and meats and meat products were associated with BC risk. Since these food groups correspond to previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
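C5.0 itself is a proprietary algorithm, but the C4.5/C5.0 family ranks candidate split features by entropy reduction (information gain). A minimal stdlib sketch of that criterion, on invented toy records (the intake levels and labels below are placeholders, not study data):

```python
import math

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def information_gain(records, labels, feature):
    """Entropy reduction from splitting on one categorical feature."""
    groups = {}
    for rec, lab in zip(records, labels):
        groups.setdefault(rec[feature], []).append(lab)
    split_entropy = sum(len(g) / len(labels) * entropy(g)
                        for g in groups.values())
    return entropy(labels) - split_entropy

# Invented toy records for illustration only.
records = [
    {"beverages": "high", "smoking": "ever"},
    {"beverages": "high", "smoking": "never"},
    {"beverages": "low",  "smoking": "ever"},
    {"beverages": "low",  "smoking": "never"},
]
labels = ["case", "case", "control", "control"]

# "beverages" perfectly separates these toy labels; "smoking" does not,
# so a C4.5-style tree would split on "beverages" first.
print(information_gain(records, labels, "beverages"))  # 1.0
print(information_gain(records, labels, "smoking"))    # 0.0
```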
Serotonin and sympathomimetic toxicity (SST) after ingestion of amphetamine-based drugs can lead to severe morbidity and death. There have been few published evaluations of the safety and efficacy of on-site treatment protocols for SST at music festivals.
The study aimed to examine the safety and efficacy of treating patients with SST on-site at a music festival using a protocol adapted from hospital-based treatment of SST.
The study is an audit of presentations with SST over a one-year period. The primary outcome was need for ambulance transport to hospital. The threshold for safety was prospectively defined as less than 10% of patients requiring ambulance transport to hospital.
The protocol suggested patients be treated with a combination of benzodiazepines; cold intravenous (IV) fluid; specific therapies (cyproheptadine, chlorpromazine, and clonidine); rapid sequence intubation; and cooling with ice, misted water, and convection techniques.
One of 13 patients (7.7%) with mild or moderate SST required ambulance transport to hospital. Two of a further seven patients with severe SST required transport to hospital.
On-site treatment may be a safe, efficacious, and efficient alternative to urgent transport to hospital for patients with mild and moderate SST. The keys to success of the protocol tested included inclusive and clear education of staff at all levels of the organization, robust referral pathways to senior clinical staff, and the rapid delivery of therapies aimed at rapidly lowering body temperature. Further collaborative research is required to define the optimal approach to patients with SST at music festivals.
Multispectral imaging – the acquisition of spatially contiguous imaging data in a modest number (~3–16) of spectral bandpasses – has proven to be a powerful technique for augmenting panchromatic imaging observations on Mars focused on geologic and/or atmospheric context. Specifically, multispectral imaging using modern digital CCD photodetectors and narrowband filters in the 400–1100 nm wavelength region on the Mars Pathfinder, Mars Exploration Rover, Phoenix, and Mars Science Laboratory missions has provided new information on the composition and mineralogy of fine-grained regolith components (dust, soils, sand, spherules, coatings), rocky surface regions (cobbles, pebbles, boulders, outcrops, and fracture-filling veins), meteorites, and airborne dust and other aerosols. Here we review recent scientific results from Mars surface-based multispectral imaging investigations, including the ways that these observations have been used in concert with other kinds of measurements to enhance the overall scientific return from Mars surface missions.
Mexican Americans suffer from a disproportionate burden of modifiable risk factors, which may contribute to the health disparities in mild cognitive impairment (MCI) and Alzheimer’s disease (AD).
The purpose of this study was to elucidate the impact of comorbid depression and diabetes on proteomic outcomes among community-dwelling Mexican American adults and elders.
Data from participants enrolled in the Health and Aging Brain among Latino Elders study were utilized. Participants were 50 or older and identified as Mexican American (N = 514). Cognition was assessed via neuropsychological test battery and diagnoses of MCI and AD adjudicated by consensus review. The sample was stratified into four groups: Depression only, Neither depression nor diabetes, Diabetes only, and Comorbid depression and diabetes. Proteomic profiles were created via support vector machine analyses.
In Mexican Americans, the proteomic profile of MCI may change based upon the presence of diabetes. The profile has a strong inflammatory component and diabetes increases metabolic markers in the profile.
Medical comorbidities may impact the proteomics of MCI and AD, which lends support for a precision medicine approach to treating these diseases.
For patients with possible Staphylococcus aureus infection, providers must decide whether to treat empirically for methicillin-resistant S. aureus (MRSA). Nares MRSA colonization screening tests could inform decisions regarding empiric MRSA-active antibiotic use.1,2
Among 469 US military veterans with an Escherichia coli clinical isolate (2012–2013), we explored healthcare and non-healthcare risk factors for having E. coli sequence type 131 and its H30 subclone (ST131-H30). Overall, 66 (14%) isolates were ST131; 51 (77%) of these were ST131-H30. After adjustment for healthcare-associated factors, ST131 remained positively associated with medical lines and nursing home residence. After adjustment for environmental factors, ST131 remained associated with wild animal contact (positive), meat consumption (negative) and pet cat exposure (negative). Thus, ST131 was associated predominantly with healthcare-associated exposures, while non-ST131 E. coli were associated with some environmental exposures.
We developed a decision analytic model to evaluate the impact of a preoperative Staphylococcus aureus decolonization bundle on surgical site infections (SSIs), health-care–associated costs (HCACs), and deaths due to SSI.
Our model population comprised US adults undergoing elective surgery. We evaluated 3 self-administered preoperative strategies: (1) the standard of care (SOC), consisting of 2 disinfectant soap showers; (2) the “test-and-treat” strategy, consisting of the decolonization bundle (chlorhexidine gluconate [CHG] soap, CHG mouth rinse, and mupirocin nasal ointment for 5 days) if S. aureus was found at any of 4 screened sites (nasal, throat, axillary, perianal area), otherwise the SOC; and (3) the “treat-all” strategy, consisting of the decolonization bundle for all patients, without S. aureus screening. Model parameters were derived primarily from a randomized controlled trial that measured the efficacy of the decolonization bundle for eradicating S. aureus.
Under base-case assumptions, the treat-all strategy yielded the fewest SSIs and the lowest HCACs, followed by the test-and-treat strategy. In contrast, the SOC yielded the most SSIs and the highest HCACs. Consequently, relative to the SOC, the average savings per operation was $217 for the treat-all strategy and $123 for the test-and-treat strategy, and the average savings per SSI prevented was $21,929 for the treat-all strategy and $15,166 for the test-and-treat strategy. All strategies were sensitive to the probability of acquiring an SSI and the increased risk of SSI if the patient was colonized with SA.
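The decision-analytic arithmetic can be sketched as an expected-cost comparison. Every parameter value below is an invented placeholder for illustration, not a model input from the study; under these placeholders the strategy ranking happens to run in the same direction as the reported result:

```python
# All numbers are invented placeholders, NOT the study's model inputs.
P_COLONIZED = 0.30    # prevalence of S. aureus carriage
P_SSI_COL = 0.04      # SSI risk if colonized
P_SSI_UNCOL = 0.02    # SSI risk if not colonized
EFFICACY = 0.70       # probability the bundle clears carriage
SENSITIVITY = 0.80    # probability screening detects a carrier
COST_SSI = 30_000.0   # cost of treating one SSI
COST_BUNDLE = 30.0    # decolonization bundle, per patient
COST_SCREEN = 20.0    # 4-site screening, per patient
COST_SOC = 5.0        # standard-of-care disinfectant showers

def population_ssi_risk(treated_frac):
    """Population SSI risk when a fraction of carriers gets the bundle."""
    treated_risk = EFFICACY * P_SSI_UNCOL + (1 - EFFICACY) * P_SSI_COL
    carrier_risk = (treated_frac * treated_risk
                    + (1 - treated_frac) * P_SSI_COL)
    return P_COLONIZED * carrier_risk + (1 - P_COLONIZED) * P_SSI_UNCOL

def expected_cost(strategy):
    """Upfront strategy cost plus expected downstream SSI cost."""
    if strategy == "soc":
        upfront, treated_frac = COST_SOC, 0.0
    elif strategy == "test-and-treat":
        upfront = COST_SCREEN + P_COLONIZED * SENSITIVITY * COST_BUNDLE
        treated_frac = SENSITIVITY
    else:  # treat-all
        upfront, treated_frac = COST_BUNDLE, 1.0
    return upfront + COST_SSI * population_ssi_risk(treated_frac)

costs = {s: expected_cost(s) for s in ("soc", "test-and-treat", "treat-all")}
# Under these placeholders: treat-all < test-and-treat < SOC.
print(costs)
```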
We predict that the treat-all strategy would be the most effective and cost-saving strategy for preventing SSIs. However, because this strategy might select more extensively for mupirocin-resistant S. aureus and cause more medication adverse effects than the test-and-treat approach or the SOC, additional studies are needed to define its comparative benefits and harms.
To determine the efficacy of a 5-day preoperative decolonization bundle, compared with 2 disinfectant soap showers, in eradicating Staphylococcus aureus (SA) carriage, with both regimens self-administered at home.
Open label, single-center, randomized clinical trial.
Ambulatory orthopedic, urologic, neurologic, colorectal, cardiovascular, and general surgery clinics at a tertiary-care referral center in the United States.
Patients at the University of Minnesota Medical Center planning to have elective surgery and not on antibiotics.
Consenting participants were screened for SA colonization using nasal, throat, axillary, and perianal swab cultures. Carriers of SA were randomized, stratified by methicillin resistance status, to a decolonization bundle group (5 days of nasal mupirocin, chlorhexidine gluconate [CHG] bathing, and CHG mouthwash) or control group (2 preoperative showers with antiseptic soap). Colonization status was reassessed preoperatively. The primary endpoint was absence of SA at all 4 screened body sites.
Of 427 participants screened between August 31, 2011, and August 9, 2016, 127 participants (29.7%) were SA carriers. Of these, 121 were randomized and 110 were eligible for efficacy analysis (57 decolonization bundle group, 53 control group). Overall, 90% of evaluable participants had methicillin-susceptible SA strains. Eradication of SA at all body sites was achieved for 41 of 57 participants (71.9%) in the decolonization bundle group and for 13 of 53 participants (24.5%) in the control group, a difference of 47.4% (95% confidence interval [CI], 29.1%–65.7%; P<.0001).
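The 47.4% between-group difference reproduces directly from the counts quoted above. A short sketch also computes a Wald (normal-approximation) 95% CI for the difference; note this approximation will not exactly match the interval reported in the abstract, which presumably comes from a different (e.g. exact) method:

```python
from math import sqrt

# Eradication counts from the two trial arms quoted above.
bundle_success, bundle_n = 41, 57
control_success, control_n = 13, 53

p1 = bundle_success / bundle_n    # 71.9%
p2 = control_success / control_n  # 24.5%
diff = p1 - p2
print(round(100 * diff, 1))  # 47.4

# Wald 95% CI for a difference of proportions (approximation only).
se = sqrt(p1 * (1 - p1) / bundle_n + p2 * (1 - p2) / control_n)
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(round(100 * lo, 1), round(100 * hi, 1))
```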
An outpatient preoperative antiseptic decolonization bundle aimed at 4 body sites was significantly more effective in eradicating SA than the usual disinfectant showers (ie, the control).
We surveyed resident physicians at 2 academic medical centers regarding urinary testing and treatment as they progressed through training. Demographics and self-reported confidence were compared to overall knowledge using clinical vignette-based questions. Overall knowledge was 40% in 2011 and increased to 48%, 55%, and 63% in subsequent years (P<.001).
Just war, international law, and world order are all historically conditioned realities that interrelate with one another in complex ways. This paper explores their historical development and current status while critically examining their interrelationship. It begins with exploring just war as a basic frame for analysis and interconnection with the other two realities. Just war is not an abstract body of moral thought but instead a practically informed morality of war rooted in Christian thought and law, Roman law, and the practice of statecraft. The essay notes the importance of the ideas of jus gentium and jus naturale in just war's fundamental formation, as well as the parallel between its three basic features—sovereign authority, just cause, and the end of peace—and the three goods or ends of politics as classically defined, namely, order, justice, and peace. The essay then moves out to explore the historical and thematic relations between just war tradition and international law, especially the law of war, arguing that these together define a moral and legal structure that is normative for world order. The final section of the paper considers the functioning of the institutions of world order in the context of challenges from rival cultural understandings of war, law, and world order and from the rise of nonstate actors in the international sphere, arguing for dialogical efforts aimed at strengthening both the moral and legal basis for world order against contemporary threats to that order.
In the latter half of the twentieth century, lasting memories of two world wars and astonishment over the power of nuclear weapons left both policymakers and scholars of war largely preoccupied with the possibility of a catastrophic World War III. Instead, however, the face of war since 1945 has been that of regionally limited small wars and insurgencies fought with conventional weapons. Many of these conflicts began as armed rebellions against colonial regimes, but often later evolved into armed conflicts between and among various subgroups seeking control of state government. Such conflicts have usually been asymmetrical, with the party holding the reins of state power using aircraft, artillery, and armored vehicles, while those fighting against the regime have been limited to weapons that individuals can carry, such as automatic rifles, mortars, rocket-propelled grenade launchers, and improvised weapons of various sorts. The asymmetries have also typically gone deeper, with the fighters on the former side wearing uniforms and those on the latter often not; those on the former side making use of fortified bases and those on the latter side protecting themselves by blending in with the civilian population. Further, there have frequently been asymmetries in how each side has fought, with the militarily weaker side relying on stealth tactics, deception, and attacks against nonmilitary targets of more general public value, including direct attacks on people protected as noncombatants under the laws of war. The particular range of tactics classified as terrorism begins at this point, with the specific, direct, and intentional targeting of noncombatants. Such attacks not only have been the means of choice for transnational nonstate actors, including al-Qaeda and the self-styled Islamic State, but have also been used to considerable effect in local civil wars.
Local food systems are frequently touted as economic development strategies for rural communities. In this study, we estimated the local economic impacts of local compared with conventionally produced and marketed food in two regions of Missouri and one region in Nebraska. We found that local food systems generated substantial increases in value added for their local economies.
This study explored the combined impact of depression and inflammation on memory functioning among Mexican-American adults and elders.
Data were analyzed from 381 participants of the Health and Aging Brain study among Latino Elders (HABLE). Fasting serum samples were collected and assayed in duplicate using electrochemiluminescence on the SECTOR Imager 2400A from Meso Scale Discovery. Positive DepE (depression endophenotype) was codified as any score >1 on a five-point scale based on the GDS-30. Inflammation was determined by TNFα levels and categorized by tertiles (1st, 2nd, 3rd). WMS-III LMI and LMII as well as CERAD were utilized as measures of memory. ANOVAs examined group differences between positive DepE and inflammation tertiles with neuropsychological scale scores as outcome variables. Logistic regressions were used to examine level of inflammation and DepE positive status on the risk for MCI.
Positive DepE and higher inflammation were each independently associated with lower memory scores. Among DepE-positive participants, those high in inflammation (3rd tertile) performed significantly worse on WMS-III LM I (F = 4.75, p = 0.003), WMS-III LM II (F = 8.18, p < 0.001), and CERAD List Learning (F = 17.37, p < 0.001) when compared to those low in inflammation (1st tertile). The combination of positive DepE and the highest tertile of inflammation was associated with increased risk for MCI diagnosis (OR = 6.06; 95% CI = 3.9–11.2, p < 0.001).
Presence of elevated inflammation and positive DepE scores increased risk for worse memory among Mexican-American older adults. Additionally, the combination of DepE and high inflammation was associated with increased risk for MCI diagnosis. This work suggests that depression and inflammation are independently associated with worse memory among Mexican-American adults and elders; however, the combination of both increases risk for poorer memory beyond either alone.
The American Heart Association (AHA; Dallas, Texas USA) and European Resuscitation Council (Niel, Belgium) cardiac arrest (CA) guidelines recommend the intraosseous (IO) route when intravenous (IV) access cannot be obtained. Vasopressin has been used as an alternative to epinephrine to treat ventricular fibrillation (VF).
Limited data exist on the pharmacokinetics and resuscitative effects of vasopressin administered by the humeral IO (HIO) route for treatment of VF. The purpose of this study was to evaluate the effects of HIO and IV vasopressin, on the occurrence, odds, and time of return of spontaneous circulation (ROSC) and pharmacokinetic measures in a swine model of VF.
Twenty-seven Yorkshire-cross swine (60 to 80 kg) were assigned randomly to three groups: HIO (n=9), IV (n=9), and a control group (n=9). Ventricular fibrillation was induced and untreated for two minutes. Chest compressions began at two minutes post-arrest and vasopressin (40 U) administered at four minutes post-arrest. Serial blood specimens were collected for four minutes, then the swine were resuscitated until ROSC or 29 post-arrest minutes elapsed.
Fisher’s exact test determined ROSC was significantly higher in the HIO 5/7 (71.5%) and IV 8/11 (72.7%) groups compared to the control 0/9 (0.0%; P=.001). Odds ratios of ROSC indicated no significant difference between the treatment groups (P=.68) but significant differences between the HIO and control, and the IV and control groups (P=.03 and .01, respectively). Analysis of variance (ANOVA) indicated the mean time to ROSC for HIO and IV was 621.20 seconds (SD=204.21 seconds) and 554.50 seconds (SD=213.96 seconds), respectively, with no significant difference between the groups (U=11; P=.22). Multivariate analysis of variance (MANOVA) revealed the maximum plasma concentration (Cmax) and time to maximum concentration (Tmax) of vasopressin in the HIO and IV groups were 71,753.9 pg/mL (SD=26,744.58 pg/mL) and 61,853.7 pg/mL (SD=22,745.04 pg/mL), and 111.42 seconds (SD=51.3 seconds) and 114.55 seconds (SD=55.02 seconds), respectively. Repeated measures ANOVA indicated no significant difference in plasma vasopressin concentrations between the treatment groups over four minutes (P=.48).
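The ROSC comparisons can be reproduced from the quoted counts with a one-sided Fisher exact test built on the hypergeometric distribution, using only the stdlib. This is a sketch; the paper's exact test variant and sidedness may differ, so the p-values below are illustrative rather than a replication:

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """P(first-row successes >= a) under the hypergeometric null
    for the 2x2 table [[a, b], [c, d]] (successes, failures per row)."""
    n1, k_total, n = a + b, a + c, a + b + c + d
    denom = comb(n, n1)
    return sum(comb(k_total, k) * comb(n - k_total, n1 - k)
               for k in range(a, min(n1, k_total) + 1)) / denom

# ROSC counts quoted above: HIO 5/7, IV 8/11, control 0/9.
p_hio_vs_control = fisher_one_sided(5, 2, 0, 9)  # clearly significant
p_hio_vs_iv = fisher_one_sided(5, 2, 8, 3)       # not significant
print(round(p_hio_vs_control, 4))  # 0.0048
print(round(p_hio_vs_iv, 3))       # 0.728
```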
The HIO route delivered vasopressin effectively in a swine model of VF. Occurrence, time, and odds of ROSC, as well as pharmacokinetic measurements of HIO vasopressin, were comparable to IV.
Burgert JM, Johnson AD, Garcia-Blanco J, Fulton LV, Loughren MJ. The Resuscitative and Pharmacokinetic Effects of Humeral Intraosseous Vasopressin in a Swine Model of Ventricular Fibrillation. Prehosp Disaster Med. 2017;32(3):305–310.
The Numeniini is a tribe of 13 wader species (Scolopacidae, Charadriiformes) of which seven are Near Threatened or globally threatened, including two Critically Endangered. To help inform conservation management and policy responses, we present the results of an expert assessment of the threats that members of this taxonomic group face across migratory flyways. Most threats are increasing in intensity, particularly in non-breeding areas, where habitat loss resulting from residential and commercial development, aquaculture, mining, transport, disturbance, problematic invasive species, pollution and climate change were regarded as having the greatest detrimental impact. Fewer threats (mining, disturbance, problematic native species and climate change) were identified as widely affecting breeding areas. Numeniini populations face the greatest number of non-breeding threats in the East Asian-Australasian Flyway, especially those associated with coastal reclamation; related threats were also identified across the Central and Atlantic Americas, and East Atlantic flyways. Threats on the breeding grounds were greatest in Central and Atlantic Americas, East Atlantic and West Asian flyways. Three priority actions were associated with monitoring and research: to monitor breeding population trends (which for species breeding in remote areas may best be achieved through surveys at key non-breeding sites), to deploy tracking technologies to identify migratory connectivity, and to monitor land-cover change across breeding and non-breeding areas. Two priority actions were focused on conservation and policy responses: to identify and effectively protect key non-breeding sites across all flyways (particularly in the East Asian-Australasian Flyway), and to implement successful conservation interventions at a sufficient scale across human-dominated landscapes for species’ recovery to be achieved.
If implemented urgently, these measures in combination have the potential to alter the current population declines of many Numeniini species and provide a template for the conservation of other groups of threatened species.
Inequalities in college access are a major concern for policymakers in both developed and developing countries. Policymakers in China have largely tried to address these inequalities by helping disadvantaged students successfully transition from high school to college. However, they have paid less attention to the possibility that inequalities in college access may also arise earlier in the pathway to college. The purpose of this paper is to understand where inequalities emerge along the pathway to college in China, focusing on three major milestones after junior high. By analysing administrative data on over 300,000 students from one region of China, we find that the largest inequalities in college access emerge at the first post-compulsory milestone along the pathway to college: when students transition from junior high to high school. In particular, only 60 per cent of students from poor counties take the high school entrance exam (compared to nearly 100 per cent of students from non-poor counties). Furthermore, students from poor counties are about one and a half times less likely to attend academic high school and elite academic high school than students from non-poor counties.