We compared the effectiveness of four sampling methods (cotton swabs, RODAC culture plates, sponge sticks with manual agitation, and sponge sticks with a stomacher) for recovering Staphylococcus aureus, Klebsiella pneumoniae, and Clostridioides difficile from contaminated environmental surfaces. Organism type was the most important factor in bacterial recovery.
Serial position scores on verbal memory tests are sensitive to early Alzheimer’s disease (AD)-related neuropathological changes that occur in the entorhinal cortex and hippocampus. The current study examines longitudinal change in serial position scores as markers of subtle cognitive decline in older adults who may be in preclinical or at-risk states for AD.
This study uses longitudinal data from the Religious Orders Study and the Rush Memory and Aging Project. Participants (n = 141) were included if they did not have dementia at enrollment, completed follow-up assessments, and died and were classified as Braak stage I or II. Memory tests were used to calculate serial position (primacy, recency), total recall, and episodic memory composite scores. A neuropathological evaluation quantified AD, vascular, and Lewy body pathologies. Mixed effects models were used to examine change in memory scores. Neuropathologies and covariates (age, sex, education, APOE e4) were examined as moderators.
Primacy scores declined (β = −.032, p < .001), whereas recency scores increased (β = .021, p = .012). No change was observed in standard memory measures. Greater neurofibrillary tangle density and atherosclerosis explained 10.4% of the variance in primacy decline. Neuropathologies were not associated with recency change.
In older adults with hippocampal neuropathologies, primacy score decline may be a sensitive marker of early AD-related changes. Tangle density and atherosclerosis had additive effects on decline. Recency improvement may reflect a compensatory mechanism. Monitoring for changes in serial position scores may be a useful in vivo method of tracking incipient AD.
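For readers unfamiliar with serial position scoring, the computation can be sketched in a few lines. The scoring convention below (recall rate over the first and last k items of the study list) and the word list are illustrative assumptions, not the study's actual protocol.

```python
def serial_position_scores(study_list, recalled, k=3):
    # Primacy = proportion of the first k study items recalled;
    # recency = proportion of the last k study items recalled;
    # total = overall recall rate across the whole list.
    recalled_set = set(recalled)
    primacy = sum(w in recalled_set for w in study_list[:k]) / k
    recency = sum(w in recalled_set for w in study_list[-k:]) / k
    total = sum(w in recalled_set for w in study_list) / len(study_list)
    return primacy, recency, total

# Hypothetical 10-word study list and a partial recall sequence.
words = ["ticket", "cabin", "butter", "shore", "engine",
         "pillow", "candle", "garden", "marble", "anchor"]
p, r, t = serial_position_scores(words, ["ticket", "cabin", "anchor", "garden"], k=3)
# p = 2/3 (ticket, cabin recalled from the first three items),
# r = 2/3 (garden, anchor recalled from the last three items)
```

A decline in the primacy component with a stable or rising recency component is the pattern the abstract describes.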
Several Miscanthus species are cultivated in the U.S. Midwest and Northeast, and feral populations can displace the native plant community and potentially negatively affect ecosystem processes. The monetary cost of eradicating feral Miscanthus populations is unknown, but quantifying eradication costs will inform decisions on whether eradication is a feasible goal and should be considered when totaling the economic damage of invasive species. We managed experimental populations of eulaliagrass (Miscanthus sinensis Andersson) and the giant Miscanthus hybrid (Miscanthus × giganteus J.M. Greef & Deuter ex Hodkinson & Renvoize) in three floodplain forest and three old field sites in central Illinois with the goal of eradication. We recorded the time invested in eradication efforts and tracked survival of Miscanthus plants over a 5-yr period, then estimated the costs associated with eradicating these Miscanthus populations. Finally, we used these estimates to predict the total monetary costs of eradicating existing M. sinensis populations reported on EDDMapS. Miscanthus populations in the old field sites were harder to eradicate, resulting in an average of 290% greater estimated eradication costs compared with the floodplain forest sites. However, the cost and time needed to eradicate Miscanthus populations were similar between Miscanthus species. On-site eradication costs ranged from $390 to $3,316 per site (or $1.3 to $11 m−2) in the old field sites, compared with only $85 to $547 (or $0.92 to $1.82 m−2) to eradicate populations within the floodplain forests, with labor comprising the largest share of these costs. Using our M. sinensis eradication cost estimates in Illinois, we predict that the potential costs to eradicate populations reported on EDDMapS would range from $10 to $37 million, with a median predicted cost of $22 million. 
The monetary costs of eradicating feral Miscanthus populations should be weighed against the benefits of cultivating these species to provide a comprehensive picture of the relative costs and benefits of adding these species to our landscapes.
Racial/ethnic differences in mental health outcomes after a traumatic event have been reported. Less is known about factors that explain these differences. We examined whether pre-, peri-, and post-trauma risk factors explained racial/ethnic differences in acute and longer-term posttraumatic stress disorder (PTSD), depression, and anxiety symptoms in patients hospitalized following traumatic injury or illness.
PTSD, depression, and anxiety symptoms were assessed during hospitalization and 2 and 6 months later among 1310 adult patients (6.95% Asian, 14.96% Latinx, 23.66% Black, 4.58% multiracial, and 49.85% White). Individual growth curve models examined racial/ethnic differences in PTSD, depression, and anxiety symptoms at each time point and in their rate of change over time, and whether pre-, peri-, and post-trauma risk factors explained these differences.
Latinx, Black, and multiracial patients had higher acute PTSD symptoms than White patients, which remained higher 2 and 6 months post-hospitalization for Black and multiracial patients. PTSD symptoms were also found to improve faster among Latinx than White patients. Risk factors accounted for most racial/ethnic differences, although Latinx patients showed lower 6-month PTSD symptoms and Black patients lower acute and 2-month depression and anxiety symptoms after accounting for risk factors. Everyday discrimination, financial stress, past mental health problems, and social constraints were related to these differences.
Racial/ethnic differences in risk factors explained most differences in acute and longer-term PTSD, depression, and anxiety symptoms. Understanding how these risk factors relate to posttraumatic symptoms could help reduce disparities by facilitating early identification of patients at risk for mental health problems.
The rapid spread of coronavirus disease 2019 (COVID-19) required swift preparation to protect healthcare personnel (HCP) and patients, especially considering shortages of personal protective equipment (PPE). Due to the lack of a pre-existing biocontainment unit, we needed to develop a novel approach to placing patients in isolation cohorts while working with the pre-existing physical space.
To prevent disease transmission to non–COVID-19 patients and HCP caring for COVID-19 patients, to optimize PPE usage, and to provide a comfortable and safe working environment.
An interdisciplinary workgroup developed a combination of approaches to convert existing spaces into COVID-19 containment units with high-risk zones (HRZs). We developed standard workflow and visual management in conjunction with updated staff training and workflows. The infection prevention team created PPE standard practices for ease of use, conservation, and staff safety.
The interventions resulted in 1 possible case of patient-to-HCP transmission and zero cases of patient-to-patient transmission. PPE usage decreased with the HRZ model while maintaining a safe environment of care. Staff on the COVID-19 units were extremely satisfied with PPE availability (76.7%) and efforts to protect them from COVID-19 (72.7%). Moreover, 54.8% of HCP working in the COVID-19 unit agreed that PPE monitors played an essential role in staff safety.
The HRZ model of containment unit is an effective method to prevent the spread of COVID-19 with several benefits. It is easily implemented and scaled to accommodate census changes. Our experience suggests that other institutions do not need to modify existing physical structures to create similarly protective spaces.
The COVID-19 pandemic resulted in millions of deaths worldwide and is considered a significant mass-casualty disaster (MCD). The surge of patients and scarcity of resources negatively impacted hospitals, patients, and medical practice. We hypothesized that ICUs during this MCD had a higher acuity of illness and, consequently, increased lengths of stay (LOS), complication rates, death rates, and costs of care. The purpose of this study was to investigate those outcomes.
This was a multicenter, retrospective study that compared intensive care admissions in 2020 to those in 2019 to evaluate patient outcomes and cost of care. Data were obtained from the Vizient Clinical Data Base/Resource Manager (Vizient Inc., Irvine, Texas, USA).
Data included the number of ICU admissions, patient outcomes, case mix index, and summary of cost reports. Quality outcomes were also collected, and a total of 1,304,981 patients from 333 hospitals were included. For all medical centers, there was a significant increase in LOS index, ICU LOS, complication rate, case mix index, total cost, and direct cost index.
The MCD caused by COVID-19 was associated with increased adverse outcomes and cost-of-care for ICU patients.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
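The core of a multistate transition model can be sketched with a toy example: count transitions between daily clinical states across patient trajectories and row-normalize into maximum-likelihood transition probabilities. The state coding and trajectories below are invented for illustration; the study's actual model is fitted to clinical data from the nine hospitals.

```python
# Hypothetical daily states: 0 = ward, 1 = ICU,
# 2 = discharged, 3 = deceased (absorbing states).
trajectories = [
    [0, 0, 1, 1, 0, 2],
    [0, 1, 1, 1, 3],
    [0, 0, 0, 2],
]

def transition_matrix(trajs, n_states):
    # Count observed day-to-day transitions.
    counts = [[0] * n_states for _ in range(n_states)]
    for traj in trajs:
        for a, b in zip(traj, traj[1:]):
            counts[a][b] += 1
    # Row-normalize counts into transition probabilities;
    # states with no observed exits remain absorbing.
    P = [[1.0 if i == j else 0.0 for j in range(n_states)]
         for i in range(n_states)]
    for i in range(n_states):
        total = sum(counts[i])
        if total:
            P[i] = [c / total for c in counts[i]]
    return P

P = transition_matrix(trajectories, 4)
```

Each row of P is a probability distribution over next-day states, which is what makes longitudinal daily assessments more informative than a single cross-sectional endpoint.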
Disruptive behavior disorders (DBD) are heterogeneous at both the clinical and the biological level. The aims of this study were therefore to dissect the heterogeneous neurodevelopmental deviations of the affective brain circuitry in DBD and to integrate these differences across modalities.
We combined two novel approaches. First, normative modeling was used to map individual-level deviations from the typical age-related pattern in (i) brain activity during emotion matching and (ii) anatomical images, derived from DBD cases (n = 77) and controls (n = 52) aged 8–18 years from the EU-funded Aggressotype and MATRICS consortia. Second, linked independent component analysis was used to integrate subject-specific deviations from both modalities.
While cases, compared with controls, exhibited on average higher activity during face processing than would be expected for their age in regions such as the amygdala, these positive deviations were widespread at the individual level. A multimodal integration of all functional and anatomical deviations explained 23% of the variance in the clinical DBD phenotype. Most notably, the top marker, encompassing the default mode network (DMN) and subcortical regions such as the amygdala and the striatum, was related to aggression across the whole sample.
Overall, increased age-related deviations in the amygdala in DBD suggest a maturational delay, which remains to be validated in future studies. Furthermore, integrating individual deviation patterns from multiple imaging modalities allowed us to dissect some of the heterogeneity of DBD and identified the DMN, the striatum, and the amygdala as neural signatures associated with aggression.
CHD is an important phenotypic feature of chromosome 22q11.2 copy number variants. Biventricular repair is usually possible; however, there are rare reports of patients with chromosome 22q copy number variants and functional single ventricle cardiac disease.
This is a single centre retrospective review of patients with chromosome 22q copy number variants who underwent staged single ventricle reconstructive surgery between 1 July, 1984 and 31 December, 2020.
Seventeen patients met inclusion criteria. The most common diagnosis was hypoplastic left heart syndrome (n = 8), and vascular anomalies were present in 13 patients. A microdeletion of the chromosome 22 A-D low-copy repeat was present in 13 patients, and the remaining patients had a duplication. About half of the patients had documented craniofacial abnormalities and/or hypocalcaemia, and developmental delay was very common. Fifteen patients had a Norwood operation, 10 had a superior cavopulmonary anastomosis, and 7 had a Fontan. Two patients had cardiac transplantation after Fontan. Overall survival was 64% at 1 year and 58% at 5 and 10 years. Most deaths occurred following the Norwood operation (n = 5).
CHD necessitating single ventricle reconstruction associated with chromosome 22q copy number variants is not common, but typically occurs as a variant of hypoplastic left heart syndrome with the usual cytogenetic microdeletion. The most common neonatal surgical intervention performed is the Norwood, where most of the mortality burden occurs. Associated anomalies and medical issues may cause additional morbidity after cardiac surgery, but survival is similar to infants with other types of single ventricle disease.
Copy number variants (CNVs) have been associated with the risk of schizophrenia, autism and intellectual disability. However, little is known about their spectrum of psychopathology in adulthood.
We investigated the psychiatric phenotypes of adult CNV carriers and compared probands, who were ascertained through clinical genetics services, with carriers who were not. One hundred twenty-four adult participants (age 18–76), each bearing one of 15 rare CNVs, were recruited through a variety of sources including clinical genetics services, charities for carriers of genetic variants, and online advertising. A battery of psychiatric assessments was used to determine psychopathology.
The frequencies of psychopathology were consistently higher in the CNV group than general population rates. We found particularly high rates of neurodevelopmental disorders (NDDs) (48%), mood disorders (42%), anxiety disorders (47%), and personality disorders (73%), as well as high rates of psychiatric multimorbidity (median number of diagnoses: 2 in non-probands, 3 in probands). NDDs [odds ratio (OR) = 4.67, 95% confidence interval (CI) 1.32–16.51; p = 0.017] and psychotic disorders (OR = 6.8, 95% CI 1.3–36.3; p = 0.025) occurred significantly more frequently in probands (N = 45; NDD: 39 [87%]; psychosis: 8 [18%]) than in non-probands (N = 79; NDD: 20 [25%]; psychosis: 3 [4%]). Participants also had somatic diagnoses pertaining to all organ systems, particularly conotruncal cardiac malformations (in individuals with 22q11.2 deletion syndrome specifically), musculoskeletal, immunological, and endocrine diseases.
Adult CNV carriers had a markedly increased rate of anxiety and personality disorders not previously reported and high rates of psychiatric multimorbidity. Our findings support in-depth psychiatric and medical assessments of carriers of CNVs and the establishment of multidisciplinary clinical services.
A national survey characterized training and career development for translational researchers through Clinical and Translational Science Award (CTSA) T32/TL1 programs. This report summarizes program goals, trainee characteristics, and mentorship practices.
A web link to a voluntary survey was emailed to 51 active TL1 program directors and administrators. Descriptive analyses were performed on aggregate data. Qualitative data analysis used open coding of text followed by an axial coding strategy based on the grounded theory approach.
Fifty of the 51 invited CTSA hubs (98%) responded. Training program goals were aligned with the CTSA mission. The trainee population consisted of predoctoral students (50%), postdoctoral fellows (30%), and health professional students in short-term (11%) or year-out (9%) research training. Forty percent of TL1 programs supported both predoctoral and postdoctoral trainees. Trainees were academically diverse, drawn mostly from medicine, engineering, public health, non-health sciences, pharmacy, and nursing. Mentor training was offered by most programs but was mandatory at fewer than one-third of them. Most mentoring teams consisted of two or more mentors.
CTSA TL1 programs are distinct from other NIH-funded training programs in their focus on clinical and translational research, cross-disciplinary approaches, emphasis on team science, and integration of multiple trainee types. Trainees in nearly all TL1 programs were engaged in all phases of translational research (preclinical, clinical, implementation, public health), suggesting that the CTSA TL1 program is meeting the mandate of NCATS to provide training to develop the clinical and translational research workforce.
Well-defined reconstruction parameters are essential to quantify the size, shape, and distribution of nanoscale features in atom probe tomography (APT) datasets. However, the reconstruction parameters of many minerals are difficult to estimate because intrinsic spatial markers, such as crystallographic planes, are not usually present within the datasets themselves. Using transmission and/or scanning electron microscopy imaging of needle-shaped specimens before and after atom probe analysis, we test various approaches to provide best-fit reconstruction parameters for voltage-based APT reconstructions. The results demonstrate that the length measurement of evaporated material, constrained by overlaying pre- and post-analysis images, yields more consistent reconstruction parameters than the measurement of final tip radius. Using this approach, we provide standardized parameters that may be used in APT reconstructions of 11 minerals. The adoption of standardized reconstruction parameters by the geoscience APT community will alleviate potential problems in the measurement of nanoscale features (e.g., clusters and interfaces) caused by the use of inappropriate parameters.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
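For orientation, a polygenic score (PGS) of the kind used above is simply a weighted sum of risk-allele dosages. The variants, effect sizes, and genotypes below are invented for illustration and bear no relation to the study's fitted values.

```python
# Invented per-variant GWAS effect sizes (betas).
betas = [0.12, -0.05, 0.08]

# Risk-allele dosages (0, 1, or 2 copies) for two hypothetical people.
genotypes = [[2, 1, 0],   # person 1
             [0, 2, 1]]   # person 2

def polygenic_score(dosages, weights):
    # PGS = sum over variants of (dosage x effect size).
    return sum(d * w for d, w in zip(dosages, weights))

scores = [polygenic_score(g, betas) for g in genotypes]
# person 1: 2*0.12 + 1*(-0.05) + 0*0.08 = 0.19
# person 2: 0*0.12 + 2*(-0.05) + 1*0.08 = -0.02
```

In practice scores are computed over many thousands of variants and standardized before being used as predictors of outcomes such as age at onset.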
Environmental DNA (eDNA) surveying has the potential to become a powerful tool for sustainable parasite control. As trematode parasites require an intermediate snail host that is often aquatic or amphibious to fulfil their lifecycle, water-based eDNA analyses can be used to screen habitats for the presence of snail hosts and identify trematode infection risk areas. The aim of this study was to identify climatic and environmental factors associated with the detection of Galba truncatula eDNA. Fourteen potential G. truncatula habitats on two farms were surveyed over a 9-month period, with eDNA detected using a filter capture, extraction, and PCR protocol and the data analysed using a generalized estimating equation. The probability of detecting G. truncatula eDNA increased in habitats where snails were visually detected, as temperature increased, and as water pH decreased (P < 0.05). Rainfall was positively associated with eDNA detection in watercourse habitats on farm A, but negatively associated with eDNA detection in watercourse habitats on farm B (P < 0.001), which may be explained by differences in watercourse gradient. This study is the first to identify factors associated with trematode intermediate snail host eDNA detection. These factors should be considered in standardized protocols to evaluate the results of future eDNA surveys.
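The direction of the reported effects can be illustrated with a minimal logistic sketch of detection probability as a function of temperature and pH. The coefficients here are invented for illustration only (chosen so that detection rises with temperature and falls with pH, as reported); they are not the study's fitted estimates, and the actual analysis used a generalized estimating equation to handle repeated measures.

```python
import math

def detection_prob(temp_c, ph, b0=-8.0, b_temp=0.45, b_ph=-0.6):
    # Logistic (inverse-logit) model: probability of a positive
    # eDNA detection given temperature (deg C) and water pH.
    eta = b0 + b_temp * temp_c + b_ph * ph
    return 1 / (1 + math.exp(-eta))

# Warmer water raises the predicted detection probability;
# higher pH lowers it, matching the signs reported above.
```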
The 2020 update of the Canadian Stroke Best Practice Recommendations (CSBPR) for the Secondary Prevention of Stroke includes current evidence-based recommendations and expert opinions intended for use by clinicians across a broad range of settings. They provide guidance for the prevention of ischemic stroke recurrence through the identification and management of modifiable vascular risk factors. Recommendations address triage, diagnostic testing, lifestyle behaviors, vaping, hypertension, hyperlipidemia, diabetes, atrial fibrillation, other cardiac conditions, antiplatelet and anticoagulant therapies, and carotid and vertebral artery disease. This update of the previous 2017 guideline contains several new or revised recommendations. Recommendations regarding triage and initial assessment of acute transient ischemic attack (TIA) and minor stroke have been simplified, and selected aspects of the etiological stroke workup are revised. Updated treatment recommendations based on new evidence have been made for dual antiplatelet therapy for TIA and minor stroke; anticoagulant therapy for atrial fibrillation; embolic strokes of undetermined source; low-density lipoprotein lowering; hypertriglyceridemia; diabetes treatment; and patent foramen ovale management. A new section has been added to provide practical guidance regarding temporary interruption of antithrombotic therapy for surgical procedures. Cancer-associated ischemic stroke is addressed. A section on virtual care delivery of secondary stroke prevention services is included to highlight a shifting paradigm of care delivery made more urgent by the global pandemic. In addition, where appropriate, sex differences as they pertain to treatments have been addressed. The CSBPR include supporting materials such as implementation resources to facilitate the adoption of evidence into practice and performance measures to enable monitoring of uptake and effectiveness of recommendations.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Salt marshes have been lost or degraded as the intensity of human impacts on coastal landscapes has increased due to agriculture, transportation, urban and industrial development, and climate change. Because salt marshes have limited distribution and embody a variety of ecological functions that are important to humans (see ecosystem services, Chapter 15), many societies have recognized the need to preserve remaining marshes, restore those that have been degraded, and create new marshes in areas where they have been lost. An emerging and critical threat to tidal marshes across the globe is increasing rates of sea level rise and other aspects of climate change, which complicates but also heightens the urgency for restoration. By restoration we mean re-establishing natural conditions and the processes needed to support their functions, especially self-maintenance (see Box 17.1). Typically, salt marshes are self-maintaining, with salt tolerant plants, mineral sediments, and tidal flooding interacting to maintain elevation and ecological functions under dynamic conditions (Chapters 4, 7, 8).
ABSTRACT IMPACT: A machine learning approach using electronic health records can combine descriptive, population-level factors of pressure injury outcomes. OBJECTIVES/GOALS: Pressure injuries cause 60,000 deaths and cost $26 billion annually in the US, but prevention is laborious. We used clinical data to develop a machine learning algorithm for predicting pressure injury risk and prescribe the timing of intervention to help clinicians balance competing priorities. METHODS/STUDY POPULATION: We obtained 94,745 electronic health records with 7,000 predictors to calibrate a predictive algorithm of pressure injury risk. Machine learning was used to mine features predicting changes in pressure injury risk; random forests outperformed neural networks, boosting and bagging in feature selection. These features were fit to multilevel ordered logistic regression to create an algorithm that generated empirical Bayes estimates informing a decision-rule for follow-up based on individual risk trajectories over time. We used cross-validation to verify predictive validity, and constrained optimization to select a best-fit algorithm that reduced the time required to trigger patient follow-up. RESULTS/ANTICIPATED RESULTS: The algorithm significantly improved prediction of pressure injury risk (p<0.001) with an area under the ROC curve of 0.60 compared to the Braden Scale, a traditional clinician instrument of pressure injury risk. At a specificity of 0.50, the model achieved a sensitivity of 0.63 within 2.5 patient-days. Machine learning identified categorical increases in risk when patients were prescribed vasopressors (OR=16.4, p<0.001), beta-blockers (OR=4.8, p<0.001), erythropoietin stimulating agents (OR=3.0, p<0.001), or were ordered a urinalysis screen (OR=9.1, p<0.001), lipid panel (OR=5.7, p<0.001) or pre-albumin panel (OR=2.0, p<0.001). 
DISCUSSION/SIGNIFICANCE OF FINDINGS: This algorithm could help hospitals conserve resources within a critical period of patient vulnerability for pressure injury not reimbursed by Medicare. Savings generated by this approach could justify investment in machine learning to develop electronic warning systems for many iatrogenic injuries.
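The headline metric in the results above, area under the ROC curve, can be computed directly via the rank-based (Mann-Whitney) formulation: the probability that a randomly chosen positive case receives a higher risk score than a randomly chosen negative case. The risk scores and labels below are toy values for illustration, not data from the study.

```python
def auc(scores, labels):
    # Rank-based AUC: fraction of (positive, negative) pairs in which
    # the positive case outranks the negative; ties count as half.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predicted risks and true pressure-injury outcomes.
risk = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
truth = [1, 0, 1, 1, 0, 0]
# auc(risk, truth) = 7/9: of the 9 positive-negative pairs,
# the positive case outranks the negative in 7.
```

An AUC of 0.5 corresponds to chance ranking, which is why the reported 0.60 represents a modest but significant improvement over the comparator instrument.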
ABSTRACT IMPACT: Leverage community engagement to continue moving translational science and research forward. OBJECTIVES/GOALS: Engaging the community in translational research improves innovation and speeds the movement of evidence into practice. Yet it is unclear how the community is engaged across the translational research spectrum or what degree of community engagement is used. We conducted a scoping review to fill this gap. METHODS/STUDY POPULATION: We used the PRISMA model search strategy with a range of databases (e.g., PubMed/Medline, Scopus) to identify articles published between January 2008 and November 2018 (n=167) and eliminated studies that did not use any level of community engagement (n=102). Studies were coded for translational stage, corresponding to T0 (basic science), T1 (basic science to clinical research in humans; n=6), T2 (clinical efficacy and effectiveness research, n=45), T3 (dissemination and implementation research, n=95), and T4 (population health, n=21), as well as for the degree of community engagement from least to most intensive (i.e., outreach, consultation, involvement, collaboration, shared leadership). RESULTS/ANTICIPATED RESULTS: The final number of eligible articles was 65. There was a relatively balanced distribution of articles across levels of community engagement (i.e., outreach, n=14; consultation, n=13; involvement, n=7; collaboration, n=15; shared leadership, n=16). Within these articles, the depth of community engagement varied, with higher engagement typically occurring at later stages of translational research (T3 and T4), and more specifically in the dissemination and implementation science stage (T3). However, shared leadership, the most intensive form of engagement, was found in T2, T3, and T4 studies, suggesting the value of community engagement across the translational research spectrum.
DISCUSSION/SIGNIFICANCE OF FINDINGS: A strong understanding of how various levels of community engagement are used in translational research, and the outcomes they produce, may expedite the translation of knowledge into practice and enable practice-based needs to inform policy.