Venovenous collaterals are abnormal connections between the systemic and pulmonary venous systems. They are commonly seen in the Fontan circulation and may lead to significant hypoxaemia. Transcatheter closure of venovenous collaterals is a potential but controversial treatment, as its long-term benefits and outcomes are not well understood.
Methods:
This retrospective cohort study utilised data from the Australian and New Zealand Fontan Registry. Patients who underwent transcatheter venovenous collateral occlusion for hypoxaemia from the year 2000 onwards were included. Atriopulmonary and Kawashima-type Fontan circulations were excluded to reflect a more contemporary Fontan cohort.
Results:
Nineteen patients (age 19.3 ± 7.8 years, 53% female) underwent transcatheter venovenous collateral occlusion. Compared to baseline, mean oxygen saturation was improved at latest follow-up (90.5% vs 87.0%; p = 0.003). Nine patients achieved a clinically significant response (defined as an increase of at least 5% to a saturation of 90% or greater), and this was associated with lower baseline Fontan pressures (12.9 vs 15.6 mmHg; p = 0.02). No heart failure hospitalisations, arrhythmias, transplant referrals, or deaths were observed during the median follow-up period of 4 years. Two patients experienced thromboembolic events and five patients underwent re-intervention.
Conclusion:
Transcatheter occlusion of venovenous collaterals in Fontan patients with chronic hypoxaemia resulted in a modest increase in oxygenation over a median follow-up of 4 years and longer-term prognosis did not appear to be adversely affected. Lower Fontan pressures at baseline were associated with a greater improvement in oxygenation.
To characterize the relationship between chlorhexidine gluconate (CHG) skin concentration and skin microbial colonization.
Design:
Serial cross-sectional study.
Setting/participants:
Adult patients in medical intensive care units (ICUs) from 7 hospitals; from 1 hospital, additional patients colonized with carbapenemase-producing Enterobacterales (CPE) from both ICU and non-ICU settings. All hospitals performed routine CHG bathing in the ICU.
Methods:
Skin swab samples were collected from adjacent areas of the neck, axilla, and inguinal region for microbial culture and CHG skin concentration measurement using a semiquantitative colorimetric assay. We used linear mixed effects multilevel models to analyze the relationship between CHG concentration and microbial detection. We explored threshold effects using additional models.
Results:
We collected samples from 736 of 759 (97%) eligible ICU patients and 68 patients colonized with CPE. On skin, gram-positive bacteria were cultured most frequently (93% of patients), followed by Candida species (26%) and gram-negative bacteria (20%). Per twofold increase in CHG skin concentration, the adjusted odds ratio for microbial recovery was 0.84 (95% CI, 0.80–0.87; P < .001) for gram-positive bacteria, 0.93 (95% CI, 0.89–0.98; P = .008) for Candida species, 0.96 (95% CI, 0.91–1.02; P = .17) for gram-negative bacteria, and 0.94 (95% CI, 0.84–1.06; P = .33) for CPE. A threshold CHG skin concentration for reduced microbial detection was not observed.
Conclusions:
On a cross-sectional basis, higher CHG skin concentrations were associated with less detection of gram-positive bacteria and Candida species on the skin, but not gram-negative bacteria, including CPE. For infection prevention, targeting higher CHG skin concentrations may improve control of certain pathogens.
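The dose-response relationship reported above (an adjusted odds ratio per twofold increase in CHG skin concentration) can be illustrated by log2-transforming the exposure, so that the exponentiated coefficient is the odds ratio per doubling. The sketch below is a minimal, hypothetical example using an ordinary logistic regression; the study itself used multilevel mixed-effects models with covariate adjustment, and all data and column names here are invented.

```python
# Minimal sketch (hypothetical data): odds ratio for microbial recovery per
# twofold increase in CHG skin concentration. The actual study fitted
# multilevel mixed-effects models; this simplified version ignores clustering
# and covariate adjustment.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "detected":  [1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0],              # organism cultured (1) or not (0)
    "chg_ug_ml": [4, 128, 16, 32, 8, 512, 16, 64, 2, 256, 128, 8],  # CHG skin concentration (illustrative units)
})

# log2-transform the exposure so exp(coefficient) is the OR per doubling
df["log2_chg"] = np.log2(df["chg_ug_ml"])

model = smf.logit("detected ~ log2_chg", data=df).fit(disp=False)
or_per_doubling = np.exp(model.params["log2_chg"])
print(f"Odds ratio per twofold increase in CHG concentration: {or_per_doubling:.2f}")
```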
OBJECTIVES/GOALS: Contingency management (CM) procedures yield measurable reductions in cocaine use. This poster describes a trial aimed at using CM as a vehicle to demonstrate the biopsychosocial health benefits of reduced use, rather than total abstinence, which is the currently accepted metric of treatment efficacy.

METHODS/STUDY POPULATION: In this 12-week, randomized controlled trial, CM was used to reduce cocaine use and to evaluate associated improvements in cardiovascular, immune, and psychosocial well-being. Adults aged 18 and older who sought treatment for cocaine use (N = 127) were randomized 1:1:1 to High Value ($55) or Low Value ($13) CM incentives for cocaine-negative urine samples, or to a non-contingent control group. Participants attended outpatient sessions three days per week across the 12-week intervention period, totaling 36 clinic visits, plus four post-treatment follow-up visits. At each visit, participants provided observed urine samples and completed several assessments of biopsychosocial health.

RESULTS/ANTICIPATED RESULTS: Preliminary findings from generalized linear mixed-effects modeling demonstrate the feasibility of the CM platform. Abstinence rates from cocaine use were significantly greater in the High Value group (47% negative; OR = 2.80; p = 0.01) than in the Low Value (23% negative) and Control (24% negative) groups. In the planned primary analysis, the level of cocaine use reduction, based on cocaine-negative urine samples, will serve as the primary predictor of cardiovascular (e.g., endothelin-1 levels), immune (e.g., IL-10 levels), and psychosocial (e.g., Addiction Severity Index) outcomes using results from the fitted models.

DISCUSSION/SIGNIFICANCE: This research will advance the field by prospectively and comprehensively demonstrating the beneficial effects of reduced cocaine use. These outcomes can, in turn, support the adoption of reduced cocaine use as a viable alternative endpoint in cocaine treatment trials.
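The group comparison of cocaine-negative urine rates in the results above involves repeated binary outcomes per participant. As a rough, hypothetical illustration, the sketch below fits a GEE logistic model with an exchangeable working correlation; the trial itself used generalized linear mixed-effects models, and every variable name and value here is simulated rather than drawn from the study.

```python
# Hypothetical sketch: odds of a cocaine-negative urine sample by incentive
# group, with repeated samples per participant. GEE with an exchangeable
# working correlation stands in for the trial's generalized linear
# mixed-effects models; all data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subjects, n_visits = 30, 36
group_labels = rng.choice(["high", "low", "control"], size=n_subjects)
target_p = {"high": 0.47, "low": 0.23, "control": 0.24}  # rough per-sample negative rates

rows = []
for subject_id, group in enumerate(group_labels):
    negatives = rng.binomial(1, target_p[group], size=n_visits)
    rows += [{"subject": subject_id, "group": group, "negative": neg} for neg in negatives]
df = pd.DataFrame(rows)

model = smf.gee(
    "negative ~ C(group, Treatment(reference='control'))",
    groups="subject",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
# Exponentiating the 'high' coefficient gives the odds ratio vs the control group
print(np.exp(model.params))
```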
To assess whether measurement and feedback of chlorhexidine gluconate (CHG) skin concentrations can improve CHG bathing practice across multiple intensive care units (ICUs).
Design:
A before-and-after quality improvement study measuring patient CHG skin concentrations during 6 point-prevalence surveys (3 surveys each during baseline and intervention periods).
Setting:
The study was conducted across 7 geographically diverse ICUs with routine CHG bathing.
Participants:
Adult patients in the medical ICU.
Methods:
CHG skin concentrations were measured at the neck, axilla, and inguinal region using a semiquantitative colorimetric assay. Aggregate unit-level CHG skin concentration measurements from the baseline period and each intervention period survey were reported back to ICU leadership, which then used routine education and quality improvement activities to improve CHG bathing practice. We used multilevel linear models to assess the impact of intervention on CHG skin concentrations.
Results:
We enrolled 681 (93%) of 736 eligible patients; 92% received a CHG bath prior to survey. At baseline, CHG skin concentrations were lowest on the neck, compared to axillary or inguinal regions (P < .001). CHG was not detected on 33% of necks, 19% of axillae, and 18% of inguinal regions (P < .001 for differences in body sites). During the intervention period, ICUs that used CHG-impregnated cloths had a 3-fold increase in patient CHG skin concentrations as compared to baseline (P < .001).
Conclusions:
Routine CHG bathing performance in the ICU varied across multiple hospitals. Measurement and feedback of CHG skin concentrations can be an important tool to improve CHG bathing practice.
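A minimal sketch of the kind of multilevel model described in the Methods above is shown here: log2-transformed CHG skin concentration as the outcome, study period as a fixed effect, and a random intercept per ICU. Everything in the snippet (ICU identifiers, effect sizes, column names) is simulated for illustration and is not taken from the study data.

```python
# Hypothetical sketch: intervention effect on CHG skin concentration using a
# linear mixed model with a random intercept per ICU. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
icu_ids = [f"icu_{i}" for i in range(7)]
icu_effect = dict(zip(icu_ids, rng.normal(0, 0.5, size=7)))  # unit-level variation

df = pd.DataFrame({
    "icu": rng.choice(icu_ids, size=n),
    "period": rng.choice(["baseline", "intervention"], size=n),
})
# Simulate higher concentrations during the intervention period; a change of
# log2(3) ~= 1.58 on this scale corresponds to a 3-fold increase.
df["log2_chg"] = (
    4.0
    + 1.58 * (df["period"] == "intervention")
    + df["icu"].map(icu_effect)
    + rng.normal(0, 1.0, size=n)
)

model = smf.mixedlm("log2_chg ~ period", data=df, groups=df["icu"]).fit()
print(model.summary())  # 'period[T.intervention]' is the mean change in log2 CHG
```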
To determine the proportion of hospitals that implemented 6 leading practices in their antimicrobial stewardship programs (ASPs).
Design:
Cross-sectional observational survey.
Setting:
Acute-care hospitals.
Participants:
ASP leaders.
Methods:
Advance letters and electronic questionnaires were initiated in February 2020. Primary outcomes were the percentages of hospitals that (1) implemented facility-specific treatment guidelines (FSTG); (2) performed interactive prospective audit and feedback (PAF), either face-to-face or by telephone; (3) optimized diagnostic testing; (4) measured antibiotic utilization; (5) measured C. difficile infection (CDI); and (6) measured adherence to FSTGs.
Results:
Of 948 hospitals invited, 288 (30.4%) completed the questionnaire. Among them, 82 (28.5%) had <99 beds, 162 (56.3%) had 100–399 beds, and 44 (15.2%) had ≥400 beds. Also, 230 (79.9%) were healthcare system members. Moreover, 161 hospitals (54.8%) reported implementing FSTGs; 214 (72.4%) performed interactive PAF; 105 (34.9%) implemented procedures to optimize diagnostic testing; 235 (79.8%) measured antibiotic utilization; 258 (88.2%) measured CDI; and 110 (37.1%) measured FSTG adherence. Small hospitals were less likely to perform interactive PAF (61.0%; P = .0018). Small and nonsystem hospitals were less likely to optimize diagnostic testing: 25.2% (P = .030) and 21.0% (P = .0077), respectively. Small hospitals were also less likely to measure antibiotic utilization (67.8%; P = .0010) and CDI (80.3%; P = .0038). Nonsystem hospitals were less likely to implement FSTGs (34.3%; P < .001).
Conclusions:
Significant variation exists in the adoption of ASP leading practices. A minority of hospitals have taken action to optimize diagnostic testing and measure adherence to FSTGs. Additional efforts are needed to expand adoption of leading practices across all acute-care hospitals with the greatest need in smaller hospitals.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations set out herein. They are based on empirical evidence and expert opinion, and this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice and forms the basis for the rational use of these drugs, particularly because it improves and updates knowledge and corrects the various misconceptions that have hitherto been prominent in the literature, partly owing to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature) and prior to electroconvulsive therapy, while taking patient preference into account. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
Airway management is a controversial topic in modern Emergency Medical Services (EMS) systems. Among many concerns regarding endotracheal intubation (ETI), unrecognized esophageal intubation and observations of unfavorable neurologic outcomes in some studies raise the question of whether alternative airway techniques should be first-line in EMS airway management protocols. Supraglottic airway devices (SADs) are simpler to use, provide reliable oxygenation and ventilation, and may thus be an alternative first-line airway device for paramedics. In 2019, Alachua County Fire Rescue (ACFR; Alachua, Florida USA) introduced a novel protocol for advanced airway management emphasizing first-line use of a second-generation SAD (i-gel) for patients requiring medication-facilitated airway management (referred to as “rapid sequence airway” [RSA] protocol).
Study Objective:
This was a one-year quality assurance review of care provided under the RSA protocol looking at compliance and first-pass success rate of first-line SAD use.
Methods:
Records were obtained from the agency’s electronic medical record (EMR), searching for the use of the RSA protocol, advanced airway devices, or either ketamine or rocuronium. If available, hospital follow-up data regarding patient condition and emergency department (ED) airway exchange were obtained.
Results:
During the first year, 33 advanced airway attempts were made under the protocol by 23 paramedics. Overall, compliance with the airway device sequence specified in the protocol was 72.7%. When ETI was used non-compliantly as the first-line airway device, the first-pass success rate was 44.4%, compared to 87.5% with adherence to first-line SAD use. All prehospital SADs were exchanged in the ED in a delayed fashion, almost exclusively per physician preference alone. In no case was the SAD exchanged for suspected dislodgement, as would be evidenced by loss of capnography.
Conclusion:
First-line use of a SAD was associated with a high first-pass attempt success rate in a real-life cohort of prehospital advanced airway encounters. No SAD required emergent exchange upon hospital arrival.
We present a 53-year-old male with the rare constellation of stress cardiomyopathy, dextrocardia with situs inversus and anomalous coronary anatomy. This case highlights the difficulties faced when managing patients with uncommon disorders and demonstrates a rare overlap of acquired and CHD.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Dust particles in an ice core from East Rongbuk Glacier on the northern slope of Qomolangma (Mount Everest; 28°01′ N, 86°58′ E; 6518 m a.s.l.), central Himalaya, have been identified as mica using a combination of scanning electron microscope-based techniques and energy-dispersive X-ray spectroscopy to identify the elements present, and electron backscatter diffraction to identify the crystal type. This technique for identifying individual crystalline dust particles in samples of glacial ice could be especially useful in the future for identifying water-soluble crystals in ice, for studying the strain history (glaciotectonics) of basal ice or in studies of ice–mica composites used as analogs of quartz-mica rocks.
Bahiagrass is used for roadsides, pastures, and lawns in the southeastern United States, mainly because of its drought and nematode tolerance. Metsulfuron is a sulfonylurea herbicide that selectively controls bahiagrass in bermudagrass. Certain cultivars of bahiagrass were observed to be tolerant of recommended rates of metsulfuron. Therefore, research was conducted to investigate the susceptibility of five major bahiagrass cultivars to metsulfuron applied at increasing rates up to 42 g ai/ha. Five bahiagrass cultivars were evaluated: ‘Pensacola’, ‘Tifton-9’, ‘Argentine’, ‘Common’, and ‘Paraguayan’. The Argentine, Common, and Paraguayan cultivars showed a four- to fivefold greater tolerance to metsulfuron compared with Pensacola. Because of yearly inconsistencies, results for Tifton-9 were inconclusive.
A fossil pecan, Carya illinoensis (Wang.) K. Koch, from floodplain sediments of the Mississippi River near Muscatine, Iowa, was accelerator-dated at 7280 ± 120 yr B.P. This discovery indicates that pecan was at or near its present northern limit by that time. Carya pollen profiles from the Mississippi River Trench indicate that hickory pollen percentages were much higher in the valley than at upland locations during the early Holocene. Pecan, the hickory with the most restricted riparian habitat, is the likely candidate for producing these peaks in Carya pollen percentages. Therefore, pecan may have reached its northern limit as early as 10,300 yr B.P. Its abundance in Early Archaic archaeological sites and the co-occurrence of early Holocene Carya pollen peaks with the arrival of the Dalton artifact complex in the Upper Mississippi Valley suggest that humans may have played a role in the early dispersal of pecan.
To describe the epidemiology of complex surgical site infection (SSI) following commonly performed surgical procedures in community hospitals and to characterize trends in SSI prevalence rates over time for methicillin-resistant Staphylococcus aureus (MRSA) and other common pathogens.
METHODS
We prospectively collected SSI data at 29 community hospitals in the southeastern United States from 2008 through 2012. We determined the overall prevalence rates of SSI for commonly performed procedures during this 5-year study period. For each year of the study, we then calculated prevalence rates of SSI stratified by causative organism. We created log-binomial regression models to analyze trends of SSI prevalence over time for all pathogens combined and specifically for MRSA.
RESULTS
A total of 3,988 complex SSIs occurred following 532,694 procedures (prevalence rate, 0.7 infections per 100 procedures). SSIs occurred most frequently after small bowel surgery, peripheral vascular bypass surgery, and colon surgery. Staphylococcus aureus was the most common pathogen. The prevalence rate of SSI decreased from 0.76 infections per 100 procedures in 2008 to 0.69 infections per 100 procedures in 2012 (prevalence rate ratio [PRR], 0.90; 95% confidence interval [CI], 0.82–1.00). A more substantial decrease in MRSA SSI (PRR, 0.69; 95% CI, 0.54–0.89) was largely responsible for this overall trend.
CONCLUSIONS
The prevalence of MRSA SSI decreased from 2008 to 2012 in our network of community hospitals. This decrease in MRSA SSI prevalence led to an overall decrease in SSI prevalence over the study period.
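The prevalence rate ratios above come from log-binomial regression, that is, a binomial GLM with a log link so that the exponentiated coefficient for calendar year is the PRR per year. The sketch below shows that model on simulated data; nothing in it reproduces the surveillance network's dataset, and the column names are invented.

```python
# Hypothetical sketch of a log-binomial trend model: SSI (0/1) regressed on
# calendar year with a log link; exp(year coefficient) is the prevalence rate
# ratio (PRR) per year. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 50_000
year = rng.integers(0, 5, size=n)       # 0 = 2008, ..., 4 = 2012
p = 0.0076 * (0.975 ** year)            # prevalence drifting slowly downward
df = pd.DataFrame({"year": year, "ssi": rng.binomial(1, p)})

model = smf.glm(
    "ssi ~ year",
    data=df,
    # links.Log() in recent statsmodels; older releases spell it links.log()
    family=sm.families.Binomial(link=sm.families.links.Log()),
).fit()
print(f"PRR per year: {np.exp(model.params['year']):.3f}")
```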
To determine the association (1) between shorter operative duration and surgical site infection (SSI) and (2) between surgeon median operative duration and SSI risk among first-time hip and knee arthroplasties.
DESIGN
Retrospective cohort study.
SETTING
A total of 43 community hospitals located in the southeastern United States.
PATIENTS
Adults who developed SSIs according to National Healthcare Safety Network criteria within 365 days of first-time knee or hip arthroplasties performed between January 1, 2008 and December 31, 2012.
METHODS
Log-binomial regression models estimated the association (1) between operative duration and SSI outcome and (2) between surgeon median operative duration and SSI outcome. Hip and knee arthroplasties were evaluated in separate models. Each model was adjusted for American Society of Anesthesiology score and patient age.
RESULTS
A total of 25,531 hip arthroplasties and 42,187 knee arthroplasties were included in the study. The risk of SSI in knee arthroplasties with an operative duration shorter than the 25th percentile was 0.40 times the risk of SSI in knee arthroplasties with an operative duration between the 25th and 75th percentile (risk ratio [RR], 0.40; 95% confidence interval [CI], 0.38–0.56; P<.01). Short operative duration did not demonstrate significant association with SSI for hip arthroplasties (RR, 1.04; 95% CI, 0.79–1.37; P=.36). Knee arthroplasty surgeons with shorter median operative durations had a lower risk of SSI than surgeons with typical median operative durations (RR, 0.52; 95% CI, 0.43–0.64; P<.01).
CONCLUSIONS
Short operative durations were not associated with a higher SSI risk for knee or hip arthroplasty procedures in our analysis.
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins presented up to 1.7% greater height and 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to analyze systematically (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) to study the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
Long-acting injectable formulations of antipsychotics are treatment alternatives to oral agents.
Aims
To assess the efficacy of aripiprazole once-monthly compared with oral aripiprazole for maintenance treatment of schizophrenia.
Method
A 38-week, double-blind, active-controlled, non-inferiority study; randomisation (2:2:1) to aripiprazole once-monthly 400 mg, oral aripiprazole (10–30 mg/day) or aripiprazole once-monthly 50 mg (a subtherapeutic dose included for assay sensitivity). (Trial registration: clinicaltrials.gov, NCT00706654.)
Results
A total of 1118 patients were screened, and 662 responders to oral aripiprazole were randomised. Kaplan–Meier estimated impending relapse rates at week 26 were 7.12% for aripiprazole once-monthly 400 mg and 7.76% for oral aripiprazole. This difference (−0.64%, 95% CI −5.26 to 3.99) excluded the predefined non-inferiority margin of 11.5%. Both treatments were superior to aripiprazole once-monthly 50 mg (21.80%, P ≤ 0.001).
Conclusions
Aripiprazole once-monthly 400 mg was non-inferior to oral aripiprazole, and the reduction in Kaplan–Meier estimated impending relapse rate at week 26 was statistically significant v. aripiprazole once-monthly 50 mg.
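As a worked restatement of the non-inferiority logic in the results above (standard framing; the trial's prespecified analysis may differ in detail), non-inferiority holds when the upper bound of the two-sided 95% CI for the relapse-rate difference lies below the margin:

```latex
% Difference in Kaplan-Meier estimated relapse rates at week 26
\Delta = \hat{r}_{400\,\mathrm{mg}} - \hat{r}_{\mathrm{oral}} = 7.12\% - 7.76\% = -0.64\%

% Two-sided 95% confidence interval for the difference
\mathrm{CI}_{95\%}(\Delta) = (-5.26\%,\ 3.99\%)

% Non-inferiority criterion with margin \delta = 11.5\%
\text{upper CI bound} < \delta \;\Longrightarrow\; 3.99\% < 11.5\%
\quad \text{(non-inferiority established)}
```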