OBJECTIVES/GOALS: Glioblastomas (GBMs) are heterogeneous, treatment-resistant tumors that are driven by populations of cancer stem cells (CSCs). In this study, we perform an epigenetic-focused functional genomics screen in GBM organoids and identify WDR5 as an essential epigenetic regulator in the SOX2-enriched, therapy-resistant cancer stem cell niche. METHODS/STUDY POPULATION: Despite their importance for tumor growth, few molecular mechanisms critical for CSC population maintenance have been exploited for therapeutic development. We developed a spatially resolved loss-of-function screen in GBM patient-derived organoids to identify essential epigenetic regulators in the SOX2-enriched, therapy-resistant niche. Our niche-specific screens identified WDR5, an H3K4 histone methyltransferase responsible for activating specific gene expression, as indispensable for GBM CSC growth and survival. RESULTS/ANTICIPATED RESULTS: In GBM CSC models, WDR5 inhibitors blocked WRAD complex assembly and reduced H3K4 trimethylation and expression of genes involved in CSC-relevant oncogenic pathways. H3K4me3 peaks lost with WDR5 inhibitor treatment occurred disproportionately on POU transcription factor motifs, which are required for stem cell maintenance and include the POU5F1(OCT4)::SOX2 motif. We incorporated a SOX2/OCT4 motif-driven GFP reporter system into our CSC models and found that WDR5 inhibitor treatment resulted in dose-dependent silencing of stem cell reporter activity. Further, WDR5 inhibitor treatment altered the stem cell state, disrupting CSC in vitro growth and self-renewal as well as in vivo tumor growth. DISCUSSION/SIGNIFICANCE: Our results unveil the role of WDR5 in maintaining the CSC state in GBM and provide a rationale for therapeutic development of WDR5 inhibitors for GBM and other advanced cancers.
This conceptual and experimental framework can be applied to many cancers, and can unmask unique microenvironmental biology and rationally designed combination therapies.
The objective of this study was to determine antibiotic appropriateness based on Loeb minimum criteria (LMC) in patients with and without altered mental status (AMS).
Retrospective, quasi-experimental study assessing pooled data from 3 periods pertaining to the implementation of a UTI management guideline.
Academic medical center in Lexington, Kentucky.
Adult patients aged ≥18 years with a collected urinalysis receiving antimicrobial therapy for a UTI indication.
Appropriateness of UTI management was assessed in patients prior to an institutional UTI guideline, after guideline introduction and education, and after implementation of a prospective audit-and-feedback stewardship intervention from September to November 2017–2019. Patient data were pooled and compared between patients noted to have AMS versus those with classic UTI symptoms. Loeb minimum criteria were used to determine whether UTI diagnosis and treatment were warranted.
In total, 600 patients were included in the study. AMS was one of the most common indications for testing across the 3 periods (19%–30.5%). Among those with AMS, 25 patients (16.7%) met LMC, significantly fewer than the 151 patients (33.6%) without AMS (P < .001).
Patients with AMS are prescribed antibiotic therapy without symptoms indicative of UTI at a higher rate than those without AMS, according to LMC. Further antimicrobial stewardship efforts should focus on prescriber education and development of clearly defined criteria for patients with and without AMS.
Research among non-industrial societies suggests that body kinematics adopted during running vary between groups according to the cultural importance of running. Among groups in which running is common and an important part of cultural identity, runners tend to adopt what exercise scientists and coaches consider to be good technique for avoiding injury and maximising performance. In contrast, among groups in which running is not particularly culturally important, people tend to adopt suboptimal technique. This paper begins by describing key elements of good running technique, including landing with a forefoot or midfoot strike pattern and the leg oriented roughly vertically. Next, we review evidence from non-industrial societies that cultural attitudes about running are associated with variation in running techniques. Then, we present new data from Tsimane forager–horticulturalists in Bolivia. Our findings suggest that running is neither a common activity among the Tsimane nor is it considered an important part of cultural identity. We also demonstrate that when Tsimane do run, they tend to use suboptimal technique, specifically landing with a rearfoot strike pattern and the leg protracted ahead of the knee (called overstriding). Finally, we discuss processes by which culture might influence variation in running techniques among non-industrial societies, including self-optimisation and social learning.
In the USA, as many as 20 % of recruits sustain stress fractures during basic training. In addition, approximately one-third of female recruits develop Fe deficiency upon completion of training. Fe is a cofactor in bone collagen formation and vitamin D activation; thus, we hypothesised that Fe deficiency may contribute to altered bone microarchitecture and mechanics during 12 weeks of increased mechanical loading. Three-week-old female Sprague Dawley rats were assigned to one of four groups: Fe-adequate sedentary, Fe-deficient sedentary, Fe-adequate exercise and Fe-deficient exercise. Exercise consisted of high-intensity treadmill running (54 min, 3×/week). After 12 weeks, serum bone turnover markers, femoral geometry and microarchitecture, mechanical properties and fracture toughness, and tibial mineral composition and morphometry were measured. Fe deficiency increased the bone resorption markers C-terminal telopeptide of type I collagen and tartrate-resistant acid phosphatase 5b (TRAcP 5b). In exercised rats, Fe deficiency further increased bone TRAcP 5b, while in Fe-adequate rats, exercise increased the bone formation marker procollagen type I N-terminal propeptide. In the femur, exercise increased cortical thickness and maximum load. In the tibia, Fe deficiency increased the rate of bone formation, mineral apposition and Zn content. These data show that femoral and tibial structure and mechanical properties are not negatively impacted by Fe deficiency, despite a decrease in tibial Fe content and an increase in serum bone resorption markers, during 12 weeks of high-intensity running in young growing female rats.
Community research advisory councils (C-RAC) bring together community members with interest in research to support design, evaluation, and dissemination of research in the communities they represent. There are few ways for early career researchers, such as TL1 trainees, to develop skills in community-engaged research, and there are limited opportunities for C-RAC members to influence early career researchers. In our novel training collaboration, TL1 trainees presented their research projects to C-RAC members who provided feedback. We present on initial evidence of student learning and summarize lessons learned that TL1 programs and C-RACs can incorporate into future collaborations.
The schizophrenia polygenic risk score (SCZ-PRS) is an emerging tool in psychiatry.
We aimed to evaluate the utility of SCZ-PRS in a young, transdiagnostic, clinical cohort.
SCZ-PRSs were calculated for young people who presented to early-intervention youth mental health clinics, including 158 patients of European ancestry, 113 of whom had longitudinal outcome data. We examined associations between SCZ-PRS and diagnosis, clinical stage and functioning at initial assessment, and new-onset psychotic disorder, clinical stage transition and functional course over time in contact with services.
Compared with a control group, patients had elevated PRSs for schizophrenia, bipolar disorder and depression, but not for any non-psychiatric phenotype (for example, cardiovascular disease). SCZ-PRSs were also elevated in participants with psychotic, bipolar, depressive, anxiety and other disorders. At initial assessment, overall SCZ-PRSs were associated with psychotic disorder (odds ratio (OR) per s.d. increase in SCZ-PRS was 1.68, 95% CI 1.08–2.59, P = 0.020), but not assignment as clinical stage 2+ (i.e. discrete, persistent or recurrent disorder) (OR = 0.90, 95% CI 0.64–1.26, P = 0.53) or functioning (R = 0.03, P = 0.76). Longitudinally, overall SCZ-PRSs were not significantly associated with new-onset psychotic disorder (OR = 0.84, 95% CI 0.34–2.03, P = 0.69), clinical stage transition (OR = 1.02, 95% CI 0.70–1.48, P = 0.92) or persistent functional impairment (OR = 0.84, 95% CI 0.52–1.38, P = 0.50).
In this preliminary study, SCZ-PRSs were associated with psychotic disorder at initial assessment in a young, transdiagnostic, clinical cohort accessing early-intervention services. Larger clinical studies are needed to further evaluate the clinical utility of SCZ-PRSs, especially among individuals with high SCZ-PRS burden.
To assess potential transmission of antibiotic-resistant organisms (AROs) using surrogate markers and bacterial cultures.
A 1,260-bed tertiary-care academic medical center.
The study included 25 patients (17 of whom were on contact precautions for AROs) and 77 healthcare personnel (HCP).
Fluorescent powder (FP) and MS2 bacteriophage were applied in patient rooms. HCP visits to each room were observed for 2–4 hours; hand hygiene (HH) compliance was recorded. Surfaces inside and outside the room and HCP skin and clothing were assessed for fluorescence, and swabs were collected for MS2 detection by polymerase chain reaction (PCR) and selective bacterial cultures.
Transfer of FP was observed for 20 rooms (80%) and 26 HCP (34%). Transfer of MS2 was detected for 10 rooms (40%) and 15 HCP (19%). Bacterial cultures were positive for 1 room and 8 HCP (10%). Interactions with patients on contact precautions resulted in fewer FP detections than interactions with patients not on precautions (P < .001); MS2 detections did not differ by patient isolation status. Fluorescent powder detections did not differ by HCP type, but MS2 was recovered more frequently from physicians than from nurses (P = .03). Overall, HH compliance was better among HCP caring for patients on contact precautions than among HCP caring for patients not on precautions (P = .003), among nurses than among other nonphysician HCP at room entry (P = .002), and among nurses than among physicians at room exit (P = .03). Moreover, HCP who performed HH prior to assessment had fewer fluorescence detections (P = .008).
Contact precautions were associated with greater HCP HH compliance and reduced detection of FP and MS2.
The number of medical mobile phone applications continues to grow. Although otorhinolaryngology-specific applications represent a small proportion, there are exciting innovations emerging for the specialty. This article will assess the number of applications available and review how they may be used in clinical practice.
The application stores of the two most popular mobile phone platforms, Apple and Android, were searched using multiple search terms.
A total of 107 ENT applications were identified and categorised according to intended use. Eight applications were reviewed in more detail and assessed on whether a doctor or allied health professional was involved in their design and whether they were evidence-based.
There are a number of ENT-specific smartphone applications currently available. As the technology progresses, their scope has extended beyond being purely for reference. Nevertheless, it remains difficult to assess the validity and security of these applications.
Sub-acute ruminal acidosis (SARA) can reduce the production efficiency and impair the welfare of cattle, potentially in all production systems. The aim of this study was to characterise measurable postmortem observations from divergently managed intensive beef finishing farms with high rates of concentrate feeding. At the time of slaughter, we obtained samples from 19–20 animals on each of 6 beef finishing units (119 animals in total) with diverse feeding practices, which had been subjectively classified as being high risk (three farms) or low risk (three farms) for SARA on the basis of the proportions of barley, silage and straw in the ration. We measured the concentrations of histamine, lipopolysaccharide (LPS), lactate and other short-chain fatty acids (SCFAs) in ruminal fluid, and of LPS and SCFAs in caecal fluid. We also took samples of the ventral blind sac of the rumen for histopathology, immunohistopathology and gene expression. Subjective assessments were made of the presence of lesions on the ruminal wall, the colour of the lining of the ruminal wall and the shape of the ruminal papillae. Almost all variables differed significantly and substantially among farms. Very few pathological changes were detected in any of the rumens examined. The animals on the high-risk diets had lower concentrations of SCFAs and higher concentrations of lactate and LPS in the ruminal fluid. Higher LPS concentrations were found in the caecum than the rumen but were not related to the risk status of the farm. The diameters of the stratum granulosum, stratum corneum and vasculature of the papillae, and the expression of the gene TLR4 in the ruminal epithelium, were all increased on the high-risk farms. The expression of IFN-γ and IL-1β and the counts of cluster of differentiation 3 (CD3)-positive and major histocompatibility complex class II-positive cells were lower on the high-risk farms.
High among-farm variation and the unbalanced design inherent in this type of study in the field prevented confident assignment of variation in the dependent variables to individual dietary components; however, the CP percentage of the total mixed ration DM was the factor that was most consistently associated with the variables of interest. Despite the strong effect of farm on the measured variables, there was wide inter-animal variation.
This study evaluated in a rigorous 18-month randomized controlled trial the efficacy of an enhanced vocational intervention for helping individuals with a recent first schizophrenia episode to return to and remain in competitive work or regular schooling.
Individual Placement and Support (IPS) was adapted to serve individuals whose goals might involve either employment or schooling. IPS was combined with a Workplace Fundamentals Module (WFM) for an enhanced, outpatient, vocational intervention. Random assignment to the enhanced integrated rehabilitation program (N = 46) was contrasted with equally intensive clinical treatment at UCLA, including social skills training groups, and conventional vocational rehabilitation by state agencies (N = 23). All patients were provided case management and psychiatric services by the same clinical team and received oral atypical antipsychotic medication.
The IPS–WFM combination led to 83% of patients participating in competitive employment or school in the first 6 months of intensive treatment, compared with 41% in the comparison group (p < 0.005). During the subsequent year, IPS–WFM continued to yield higher rates of schooling/employment (92% v. 60%, p < 0.03). Cumulative number of weeks of schooling and/or employment was also substantially greater with the IPS–WFM intervention (45 v. 26 weeks, p < 0.004).
The results clearly support the efficacy of an enhanced intervention focused on recovery of participation in normative work and school settings in the initial phase of schizophrenia, suggesting potential for prevention of disability.
Corrugated mats of palygorskite occurring in regolith overlying a fissured limestone were examined by X-ray diffraction, infrared spectroscopy, and electron optical techniques after heating at 50°C intervals to 700°C. Palygorskite ‘anhydride’ formed at 400°C and sillimanite formed at 500°C. The palygorskite is believed to have formed in joints in limestone of Upper Oligocene age (Duntroonian) prior to uplift and subsequent weathering. It is highly crystalline, and appears not to have altered or weathered since precipitation.
Root-knot nematodes represent a serious threat to world coffee production, especially Meloidogyne incognita and M. paranaensis. Most cultivars of Coffea arabica are highly susceptible to these parasites and cultivation in infested areas has only been possible with the use of resistant C. canephora rootstocks. In this research, three elite clones of C. canephora, selected in areas infested by M. incognita and M. paranaensis, were evaluated in controlled conditions to assess levels of resistance against two populations of M. paranaensis, four populations of M. incognita and a mixed population of both species. The three clones were resistant to both species, but CcK1 and CcR2 were considered most promising because their vegetative growth was not impaired by nematodes.
Cover crop–based, organic rotational no-till (CCORNT) corn and soybean systems have been developed in the mid-Atlantic region to build soil health, increase management flexibility, and reduce labor. In this system, a roller-crimped cover crop mulch provides within-season weed suppression in no-till corn and soybean. A cropping system experiment was conducted in Pennsylvania, Maryland, and Delaware to test the cumulative effects of a multitactic weed management approach in a 3-yr hairy vetch/triticale–corn–cereal rye–soybean–winter wheat CCORNT rotation. Treatments included delayed planting dates (early, intermediate, late) and supplemental weed control using high-residue (HR) cultivation in no-till corn and soybean phases. In the no-till corn phase, HR cultivation decreased weed biomass relative to the uncultivated control by 58%, 23%, and 62% in Delaware, Maryland, and Pennsylvania, respectively. In the no-till soybean phase, HR cultivation decreased weed biomass relative to the uncultivated treatment planted in narrow rows (19 to 38 cm) by 20%, 41%, and 78% in Delaware, Maryland, and Pennsylvania, respectively. Common ragweed was more dominant in soybean (39% of total biomass) compared with corn (10% of total biomass), whereas giant foxtail and smooth pigweed were more dominant in corn, comprising 46% and 22% of total biomass, respectively. Common ragweed became less abundant as corn and soybean planting dates were delayed, whereas giant foxtail and smooth pigweed increased as a percentage of total biomass as planting dates were delayed. At the Pennsylvania location, inconsistent termination of cover crops with the roller-crimper resulted in volunteer cover crops in other phases of the rotation. Our results indicate that HR cultivation is necessary to achieve adequate weed control in CCORNT systems. Integration of winter grain or perennial forages into CCORNT systems will also be an important management tactic for truncating weed seedbank population increases.
To evaluate healthcare worker (HCW) risk of self-contamination when donning and doffing personal protective equipment (PPE) using fluorescence and MS2 bacteriophage.
Prospective pilot study.
A total of 36 HCWs were included in this study: 18 donned/doffed contact precaution (CP) PPE and 18 donned/doffed Ebola virus disease (EVD) PPE.
HCWs donned PPE according to standard protocols. Fluorescent liquid and MS2 bacteriophage were applied to HCWs. HCWs then doffed their PPE. After doffing, HCWs were scanned for fluorescence and swabbed for MS2. MS2 detection was performed using reverse transcriptase PCR. The donning and doffing processes were videotaped, and protocol deviations were recorded.
Overall, 27% of EVD PPE HCWs and 50% of CP PPE HCWs made ≥1 protocol deviation while donning, and 100% of EVD PPE HCWs and 67% of CP PPE HCWs made ≥1 protocol deviation while doffing (P=.02). The median number of doffing protocol deviations among EVD PPE HCWs was 4, versus 1 among CP PPE HCWs. Also, 15 EVD PPE protocol deviations were committed by doffing assistants and/or trained observers. Fluorescence was detected on 8 EVD PPE HCWs (44%) and 5 CP PPE HCWs (28%), most commonly on hands. MS2 was recovered from 2 EVD PPE HCWs (11%) and 3 CP PPE HCWs (17%).
Protocol deviations were common during both EVD and CP PPE doffing, and some deviations during EVD PPE doffing were committed by the HCW doffing assistant and/or the trained observer. Self-contamination was common. PPE donning/doffing are complex and deserve additional study.
Oxyfluorfen (0.28, 0.42, 0.56, and 0.84 kg ai ha−1) under clear polyethylene film was evaluated for weed control, crop injury, and effects on yields in transplanted muskmelon, cucumber, and summer squash. Numerous narrowleaf and broadleaf weeds were effectively suppressed by 0.42 kg ai ha−1 of oxyfluorfen. Crop injury, occurring soon after transplanting, was reported in New York and North Carolina. Injury was usually transient, and injured crops frequently grew more vigorously than those grown on untreated black polyethylene mulch. Muskmelons were consistently the most tolerant of the three crops. At high rates, yields of squash and cucumber in 1988 were reduced in New York and North Carolina, respectively. In greenhouse studies, positioning the cotyledons under the polyethylene film caused greater injury in all three crops than when cotyledons remained above the plastic.