We sought to determine the value of an audit-and-feedback monitoring method in facilitating meaningful practice changes to improve vancomycin dosing and monitoring.
The study was conducted in 7 not-for-profit, acute-care hospitals within a health system in southern Florida.
Methods:
The preimplementation period (September 1, 2019, through August 31, 2020) was compared to the postimplementation period (September 1, 2020, through May 31, 2022). All vancomycin serum-level results were screened for inclusion. The primary end point was the rate of fallout, defined as vancomycin serum level ≥25 µg/mL with acute kidney injury (AKI) and off-protocol dosing and monitoring. Secondary end points included the rate of fallout with respect to AKI severity, rate of vancomycin serum levels ≥25 µg/mL, and average number of serum-level evaluations per unique vancomycin patient.
Results:
In total, 27,611 vancomycin levels were analyzed from 13,910 unique patients. There were 2,209 vancomycin serum levels ≥25 µg/mL (8%) among 1,652 unique patients (11.9%). AKI was identified in 379 unique patients (23%) with vancomycin levels ≥25 µg/mL. In total, 60 fallouts (35.2%) occurred in the 12-month preimplementation period (∼5 per month) and 41 fallouts (19.6%) occurred in the 21-month postimplementation period (∼2 per month; P = .0006). Failure was the most common AKI severity in both periods (risk: 35% vs 24.3%, P = .25; injury: 28.3% vs 19.5%, P = .30; failure: 36.7% vs 56%, P = .053). Overall, the number of evaluations of vancomycin serum levels per unique patient remained consistent throughout both periods (2 vs 2; P = .53).
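The per-month fallout rates quoted above follow directly from the fallout counts and the period lengths; a minimal arithmetic check (variable names are illustrative):

```python
# Fallout counts and period lengths from the Results: 60 fallouts over
# the 12-month preimplementation period versus 41 over the 21-month
# postimplementation period.
pre_fallouts, pre_months = 60, 12
post_fallouts, post_months = 41, 21

pre_per_month = pre_fallouts / pre_months     # 5.0, i.e. "~5 per month"
post_per_month = post_fallouts / post_months  # ~1.95, i.e. "~2 per month"
```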
Conclusions:
Implementation of a monthly quality assurance tool for elevated outlier vancomycin levels can improve dosing and monitoring practices, resulting in enhanced patient safety.
Illicit substance use is dangerous in both acute and chronic forms, frequently resulting in lethal poisoning, addiction, and other negative consequences. Similar to research in other psychiatric conditions, whose ultimate goal is to enable effective prevention and treatment, studies of substance use are focused on factors elevating the risk for the disorder. The rapid growth of the substance use problem despite the effort invested in fighting it, however, suggests a need to change the research approach. Instead of attempting to identify risk factors, whose neutralization is often infeasible if not impossible, it may be more promising to systematically reverse the perspective to the factors enhancing the aspect of liability to disorder that shares the same dimension but is opposite to risk, that is, resistance to substance use. Resistance factors, which enable the majority of the population to remain unaffected despite the ubiquity of psychoactive substances, may be more amenable to translation. While the resistance aspect of liability is symmetric to risk, the resistance approach requires substantial changes in sampling (high-resistance rather than high-risk) and the use of quantitative indices of liability. This article provides an overview of, and a practical approach to, research on resistance to substance use/addiction, currently implemented in an NIH-funded project. The project benefits from unique opportunities afforded by data originating from two longitudinal twin studies, the Virginia Twin Study of Adolescent Behavioral Development and the Minnesota Twin Family Study. The methodology described is also applicable to other psychiatric disorders.
The Residual Lesion Score is a novel tool for assessing the achievement of surgical objectives in congenital heart surgery based on widely available clinical and echocardiographic characteristics. This article describes the methodology used to develop the Residual Lesion Score from the previously developed Technical Performance Score for five common congenital cardiac procedures using the RAND Delphi methodology.
Methods:
A panel of 11 experts from the field of paediatric and congenital cardiology and cardiac surgery, 2 co-chairs, and a consultant were assembled to review and comment on validity and feasibility of measuring the sub-components of intraoperative and discharge Residual Lesion Score for five congenital cardiac procedures. In the first email round, the panel reviewed and commented on the Residual Lesion Score and provided validity and feasibility scores for sub-components of each of the five procedures. In the second in-person round, email comments and scores were reviewed and the Residual Lesion Score revised. The modified Residual Lesion Score was scored independently by each panellist for validity and feasibility and used to develop the “final” Residual Lesion Score.
Results:
The Residual Lesion Score sub-components with a median validity score of ≥7 and median feasibility score of ≥4 that were scored without disagreement and with low absolute deviation from the median were included in the “final” Residual Lesion Score.
Conclusion:
Using the RAND Delphi methodology, we were able to develop Residual Lesion Score modules for five important congenital cardiac procedures for the Pediatric Heart Network’s Residual Lesion Score study.
We compared the effectiveness of 4 sampling methods to recover Staphylococcus aureus, Klebsiella pneumoniae and Clostridioides difficile from contaminated environmental surfaces: cotton swabs, RODAC culture plates, sponge sticks with manual agitation, and sponge sticks with a stomacher. Organism type was the most important factor in bacterial recovery.
We developed an agent-based model using a trial emulation approach to quantify effect measure modification of spillover effects of pre-exposure prophylaxis (PrEP) for HIV among men who have sex with men (MSM) in the Atlanta-Sandy Springs-Roswell metropolitan area, Georgia. PrEP may impact not only the individual prescribed, but also their partners and beyond, known as spillover. We simulated a two-stage randomised trial with eligible components (≥3 agents with ≥1 HIV+ agent) first randomised to intervention or control (no PrEP). Within intervention components, agents were randomised to PrEP with coverage of 70%, providing insight into a high PrEP coverage strategy. We evaluated effect modification by component-level characteristics and estimated spillover effects on HIV incidence using an extension of randomisation-based estimators. We observed an attenuation of the spillover effect when agents were in components with a higher prevalence of either drug use or bridging potential (if an agent acts as a mediator between ≥2 connected groups of agents). The estimated spillover effects were larger in magnitude among components with either higher HIV prevalence or greater density (number of existing partnerships compared to all possible partnerships). Consideration of effect modification is important when evaluating the spillover of PrEP among MSM.
This chapter presents a comprehensive review of the interaction between circum-Caribbean indigenous peoples and nonhuman primates before and at early European contact. It fills significant gaps in contemporary scholarly literature by providing an updated archaeological history of the social and symbolic roles of monkeys in this region. We begin by describing the zooarchaeological record of primates in the insular and coastal circum-Caribbean Ceramic period archaeological sites. Drawing from the latest archaeological investigations that use novel methods and techniques, we also review other biological evidence of the presence of monkeys. In addition, we compile a list of indigenously crafted portable material imagery and review rock art that allegedly depicts primates in the Caribbean. Our investigation is supplemented by the inclusion of written documentary sources, specifically, ethnoprimatological information derived from early ethnohistorical sources on the multifarious interactions between humans and monkeys in early colonial societies. Finally, we illustrate certain patterns that may have characterized interactions between humans and monkeys in past societies of the circum-Caribbean region (300–1500 CE), opening avenues for future investigations of this topic.
Keywords:
Archaeoprimatology, Ceramic period, Greater and Lesser Antilles, Island and coastal archaeology, Saladoid, Taíno, Trinidad, Venezuela
The COVID-19 pandemic resulted in millions of deaths worldwide and is considered a significant mass-casualty disaster (MCD). The surge of patients and scarcity of resources negatively impacted hospitals, patients, and medical practice. We hypothesized that ICUs during this MCD had higher acuity of illness and, subsequently, increased lengths of stay (LOS), complication rates, death rates, and costs of care. The purpose of this study was to investigate those outcomes.
Methods:
This was a multicenter, retrospective study that compared intensive care admissions in 2020 to those in 2019 to evaluate patient outcomes and cost of care. Data were obtained from the Vizient Clinical Data Base/Resource Manager (Vizient Inc., Irvine, Texas, USA).
Results:
Data included the number of ICU admissions, patient outcomes, case mix index, and summary of cost reports. Quality outcomes were also collected, and a total of 1,304,981 patients from 333 hospitals were included. For all medical centers, there was a significant increase in LOS index, ICU LOS, complication rate, case mix index, total cost, and direct cost index.
Conclusion:
The MCD caused by COVID-19 was associated with increased adverse outcomes and cost-of-care for ICU patients.
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high-energy neutrinos (corresponding to neutrino energies in excess of 0.01 joules, or $10^{17}$ electron volts). During the summer of 2021, and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland, to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency, with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies $\nu \in [145, 350]$ MHz.
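The reported linear fit can be evaluated directly. The sketch below (the function name is ours) computes the average attenuation length at the band edges using the best-fit central values, ignoring the quoted uncertainties:

```python
# Best-fit average field attenuation length for the upper 1500 m of
# ice, <L_alpha> = (1154 - 0.81 * nu) m, reported for nu in [145, 350] MHz.
def attenuation_length_m(freq_mhz):
    """Central-value attenuation length (m) at frequency freq_mhz (MHz)."""
    if not 145 <= freq_mhz <= 350:
        raise ValueError("fit reported only for 145-350 MHz")
    return 1154.0 - 0.81 * freq_mhz

# Attenuation length shrinks with increasing frequency across the band:
low_edge = attenuation_length_m(145.0)   # ~1037 m at 145 MHz
high_edge = attenuation_length_m(350.0)  # ~870 m at 350 MHz
```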
One in six nursing home residents and staff with positive SARS-CoV-2 tests ≥90 days after initial infection had specimen cycle thresholds (Ct) <30. Individuals with specimen Ct<30 were more likely to report symptoms but were not different from individuals with high Ct value specimens by other clinical and testing data.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
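As a toy illustration of the multistate idea (the states and probabilities below are hypothetical, not the study's fitted estimates), daily transitions can be encoded as a stochastic matrix whose rows are next-day state distributions:

```python
# Hypothetical daily transition probabilities among illustrative
# severity states; absorbing states carry self-loops of 1.0.
P = {
    "moderate":  {"moderate": 0.80, "severe": 0.10, "recovered": 0.10},
    "severe":    {"moderate": 0.15, "severe": 0.70, "critical": 0.15},
    "critical":  {"severe": 0.10, "critical": 0.75, "dead": 0.15},
    "recovered": {"recovered": 1.0},
    "dead":      {"dead": 1.0},
}

def step(dist, P):
    """Propagate a distribution over states one day forward."""
    out = {}
    for state, prob in dist.items():
        for nxt, p in P[state].items():
            out[nxt] = out.get(nxt, 0.0) + prob * p
    return out

# Distribution one day after admission in the "moderate" state.
day1 = step({"moderate": 1.0}, P)
```

Fitting such a model to daily assessments yields the time-varying transition intensities that cross-sectional or time-to-event summaries discard.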
Bloodstream infections (BSIs) are a frequent cause of morbidity in patients with acute myeloid leukemia (AML), due in part to the presence of central venous access devices (CVADs) required to deliver therapy.
Objective:
To determine the differential risk of bacterial BSI during neutropenia by CVAD type in pediatric patients with AML.
Methods:
We performed a secondary analysis in a cohort of 560 pediatric patients (1,828 chemotherapy courses) receiving frontline AML chemotherapy at 17 US centers. The exposure was CVAD type at course start: tunneled externalized catheter (TEC), peripherally inserted central catheter (PICC), or totally implanted catheter (TIC). The primary outcome was course-specific incident bacterial BSI; secondary outcomes included mucosal barrier injury (MBI)-BSI and non-MBI BSI. Poisson regression was used to compute adjusted rate ratios comparing BSI occurrence during neutropenia by line type, controlling for demographic, clinical, and hospital-level characteristics.
Results:
The rate of BSI did not differ by CVAD type: 11 BSIs per 1,000 neutropenic days for TECs, 13.7 for PICCs, and 10.7 for TICs. After adjustment, there was no statistically significant association between CVAD type and BSI: PICC incident rate ratio [IRR] = 1.00 (95% confidence interval [CI], 0.75–1.32) and TIC IRR = 0.83 (95% CI, 0.49–1.41) compared to TEC. When MBI and non-MBI were examined separately, results were similar.
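The adjusted rate ratios above come from Poisson regression; as a sketch of the unadjusted comparison, the crude incident rate ratios implied by the reported per-1,000-day rates can be computed directly (the adjusted estimates differ because they control for demographic, clinical, and hospital-level covariates):

```python
# BSI rates per 1,000 neutropenic days, from the Results.
rates_per_1000_days = {"TEC": 11.0, "PICC": 13.7, "TIC": 10.7}

def crude_irr(exposed, reference, rates=rates_per_1000_days):
    """Unadjusted incident rate ratio of exposed vs. reference line type."""
    return rates[exposed] / rates[reference]

picc_vs_tec = crude_irr("PICC", "TEC")  # ~1.25 before adjustment
tic_vs_tec = crude_irr("TIC", "TEC")    # ~0.97 before adjustment
```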
Conclusions:
In this large, multicenter cohort of pediatric AML patients, we found no difference in the rate of BSI during neutropenia by CVAD type. This may be due to a risk profile for BSI that is unique to AML patients.
To determine associations of alcohol use with cognitive aging among middle-aged men.
Method:
1,608 male twins (mean 57 years at baseline) participated in up to three visits over 12 years, from 2003–2007 to 2016–2019. Participants were classified into six groups based on current and past self-reported alcohol use: lifetime abstainers, former drinkers, very light (1–4 drinks in past 14 days), light (5–14 drinks), moderate (15–28 drinks), and at-risk drinkers (>28 drinks in past 14 days). Linear mixed-effects regressions modeled cognitive trajectories by alcohol group, with time-based models evaluating rate of decline as a function of baseline alcohol use, and age-based models evaluating age-related differences in performance by current alcohol use. Analyses used standardized cognitive domain factor scores and adjusted for sociodemographic and health-related factors.
Results:
Performance decreased over time in all domains. Relative to very light drinkers, former drinkers showed worse verbal fluency performance, by –0.21 SD (95% CI –0.35, –0.07), and at-risk drinkers showed faster working memory decline, by 0.14 SD (95% CI 0.02, –0.20) per decade. There was no evidence of protective associations of light/moderate drinking on rate of decline. In age-based models, light drinkers displayed better memory performance at advanced ages than very light drinkers (+0.14 SD per 10 years of older age; 95% CI 0.02, 0.20), likely attributable to residual confounding or reverse association.
Conclusions:
Alcohol consumption showed minimal associations with cognitive aging among middle-aged men. Stronger associations of alcohol with cognitive aging may become apparent at older ages, when cognitive abilities decline more rapidly.
Disruptive behavior disorders (DBD) are heterogeneous at the clinical and the biological level. Therefore, the aims were to dissect the heterogeneous neurodevelopmental deviations of the affective brain circuitry and provide an integration of these differences across modalities.
Methods
We combined two novel approaches. First, normative modeling to map deviations from the typical age-related pattern at the level of the individual of (i) activity during emotion matching and (ii) of anatomical images derived from DBD cases (n = 77) and controls (n = 52) aged 8–18 years from the EU-funded Aggressotype and MATRICS consortia. Second, linked independent component analysis to integrate subject-specific deviations from both modalities.
Results
While cases exhibited, on average, higher activity during face processing than would be expected for their age in regions such as the amygdala when compared to controls, these positive deviations were widespread at the individual level. A multimodal integration of all functional and anatomical deviations explained 23% of the variance in the clinical DBD phenotype. Most notably, the top marker, encompassing the default mode network (DMN) and subcortical regions such as the amygdala and the striatum, was related to aggression across the whole sample.
Conclusions
Overall, increased age-related deviations in the amygdala in DBD suggest a maturational delay, which has to be further validated in future studies. Further, the integration of individual deviation patterns from multiple imaging modalities allowed us to dissect some of the heterogeneity of DBD and identified the DMN, the striatum, and the amygdala as neural signatures associated with aggression.
OBJECTIVES/GOALS: MiaA is a highly conserved prenyl transferase that catalyzes synthesis of the i6A37 tRNA modification in E. coli. While transcriptional regulation of MiaA is well characterized, there is no information on its post-transcriptional regulation. The aim of this study is to characterize the post-transcriptional regulation of the MiaA gene in E. coli. METHODS/STUDY POPULATION: To characterize the post-transcriptional regulation of miaA, we performed a targeted genetic screen of an E. coli small RNA library on a miaA-lacZ translational reporter fusion strain to identify small RNAs (sRNAs) that modulate MiaA translation or transcription termination. We also measured MiaA mRNA levels and miaA-lacZ activity in the absence or over-expression of candidate sRNA regulators of MiaA, as well as MiaA mRNA levels in the absence of RNase E and PNPase, two enzymes involved in mRNA turnover. Finally, we measured the ability of purified recombinant CsrA to bind the MiaA mRNA transcript in vitro. RESULTS/ANTICIPATED RESULTS: We identified the carbon-sensing sRNA CsrB and its cognate protein interaction partner, CsrA, as potential post-transcriptional regulators of MiaA. Over-expression of CsrB fully repressed miaA-lacZ activity and MiaA mRNA levels. The absence of CsrA resulted in defective miaA-lacZ activity and a 10-fold decrease in MiaA mRNA levels. We also identified an increase in the MiaA mRNA half-life, particularly in the absence of RNase E. Our results demonstrate an additional layer of regulation for the miaA operon by the CsrA/CsrB protein-sRNA system. DISCUSSION/SIGNIFICANCE: MiaA is a highly conserved bacterial protein. Our data may represent phenomena in an array of bacteria that could be targeted by novel antibiotics. The human MiaA homologue, TRIT1, plays a role in mitochondrial disorders. We anticipate that information garnered from MiaA studies will elucidate TRIT1 function and its role in mitochondrial disorders.
Copy number variants (CNVs) have been associated with the risk of schizophrenia, autism and intellectual disability. However, little is known about their spectrum of psychopathology in adulthood.
Methods
We investigated the psychiatric phenotypes of adult CNV carriers and compared probands, who were ascertained through clinical genetics services, with carriers who were not. One hundred twenty-four adult participants (age 18–76), each bearing one of 15 rare CNVs, were recruited through a variety of sources including clinical genetics services, charities for carriers of genetic variants, and online advertising. A battery of psychiatric assessments was used to determine psychopathology.
Results
The frequencies of psychopathology were consistently higher in the CNV group than general population rates. We found particularly high rates of neurodevelopmental disorders (NDDs) (48%), mood disorders (42%), anxiety disorders (47%), and personality disorders (73%), as well as high rates of psychiatric multimorbidity (median number of diagnoses: 2 in non-probands, 3 in probands). NDDs (odds ratio [OR] = 4.67, 95% confidence interval [CI] 1.32–16.51; p = 0.017) and psychotic disorders (OR = 6.8, 95% CI 1.3–36.3; p = 0.025) occurred significantly more frequently in probands (N = 45; NDD: 39 [87%]; psychosis: 8 [18%]) than non-probands (N = 79; NDD: 20 [25%]; psychosis: 3 [4%]). Participants also had somatic diagnoses pertaining to all organ systems, particularly conotruncal cardiac malformations (in individuals with 22q11.2 deletion syndrome specifically), and musculoskeletal, immunological, and endocrine diseases.
Conclusions
Adult CNV carriers had a markedly increased rate of anxiety and personality disorders not previously reported and high rates of psychiatric multimorbidity. Our findings support in-depth psychiatric and medical assessments of carriers of CNVs and the establishment of multidisciplinary clinical services.
Many short gamma-ray bursts (GRBs) originate from binary neutron star mergers, and several theories predict the production of coherent, prompt radio signals either prior to, during, or shortly following the merger, as well as persistent pulsar-like emission from the spin-down of a magnetar remnant. Here we present a low-frequency (170–200 MHz) search for coherent radio emission associated with nine short GRBs detected by the Swift and/or Fermi satellites using the Murchison Widefield Array (MWA) rapid-response observing mode. The MWA began observing these events within 30–60 s of their high-energy detection, enabling us to capture any dispersion-delayed signals emitted by short GRBs for a typical range of redshifts. We conducted transient searches at the GRB positions on timescales of 5 s, 30 s, and 2 min, resulting in the most constraining flux density limits on any associated transient of 0.42, 0.29, and 0.084 Jy, respectively. We also searched for dispersed signals at a temporal and spectral resolution of 0.5 s and 1.28 MHz, but none were detected. However, the fluence limit of 80–100 Jy ms derived for GRB 190627A is the most stringent to date for a short GRB. Assuming the formation of a stable magnetar for this GRB, we compared the fluence and persistent emission limits to short GRB coherent emission models, placing constraints on key parameters, including the radio emission efficiency of the nearly merged neutron stars ($\epsilon_r\lesssim10^{-4}$), the fraction of magnetic energy in the GRB jet ($\epsilon_B\lesssim2\times10^{-4}$), and the radio emission efficiency of the magnetar remnant ($\epsilon_r\lesssim10^{-3}$). Comparing the limits derived for our full GRB sample (along with those in the literature) to the same emission models, we demonstrate that our fluence limits place only weak constraints on the prompt emission predicted from the interaction between the relativistic GRB jet and the interstellar medium for a subset of magnetar parameters. However, the 30-min flux density limits were sensitive enough to theoretically detect the persistent radio emission from magnetar remnants up to a redshift of $z\sim0.6$. Our non-detection of this emission could imply that some GRBs in the sample were not genuinely short or did not result from a binary neutron star merger, that the GRBs were at high redshifts, that these mergers formed atypical magnetars, that the radiation beams of the magnetar remnants were pointing away from Earth, or that the majority did not form magnetars but instead collapsed directly into black holes.
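The dispersion delay that makes this rapid-response strategy viable can be estimated with the standard cold-plasma delay formula; the dispersion measure used below is an illustrative assumption, not a value from the paper:

```python
# Dispersion delay between two observing frequencies for a signal of
# dispersion measure DM (pc cm^-3), with frequencies in MHz:
#   delay (s) = 4.149e3 * DM * (nu_low^-2 - nu_high^-2)
def dispersion_delay_s(dm_pc_cm3, freq_low_mhz, freq_high_mhz):
    return 4.149e3 * dm_pc_cm3 * (freq_low_mhz**-2 - freq_high_mhz**-2)

# For a hypothetical DM of 500 pc cm^-3, the delay across the MWA's
# 170-200 MHz band is ~20 s, comparable to the 30-60 s response time.
delay = dispersion_delay_s(500.0, 170.0, 200.0)
```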
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, there was a positive correlation between individual weed plant biomass and delayed weed seed–shattering rates during harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals of plants within the same population that shatter seed before harvest pose a risk of escaping early-season management and HWSC.