Increased computing capacity and the spread of computational knowledge have generated the expectation that organizations and municipalities use large quantities of data to drive decision making. However, municipalities may lack the resources to meaningfully use their data for decision making. Relatedly, political science and public administration programs face the challenge of training students for success in this environment. We believe one remedy is the adoption of coproduction as a pedagogical strategy. This article presents a case study of a partnership between a university research team and a municipal emergency communications center as a demonstration of how coproduction can be harnessed as a teaching tool. Findings from this project were presented at the Southern Political Science Association Annual Meeting, January 8–11, 2020, in San Juan, Puerto Rico.
Clostridioides difficile infection (CDI) is one of the leading causes of hospital-onset infections. Clinically distinguishing true CDI from colonization with C. difficile is challenging and often requires reliable and rapid molecular testing methods. At our academic center, we implemented a 2-step testing algorithm to help identify true CDI cases. The University of Mississippi Medical Center is a 700+ bed academic facility located in Jackson, Mississippi. Hospital-onset (HO) CDI was defined based on NHSN Laboratory Identified (LabID) event as the last positive C. difficile test result performed on a specimen using a multistep testing algorithm collected >3 calendar days after admission to the facility. HO-CDI data were collected from all inpatient units except the NICU and newborn nursery. HO-CDI outcomes were assessed based on standardized infection ratio (SIR) data. In May 2020, we implemented a 2-step testing algorithm (Figure 1). All patients with diarrhea underwent C. difficile PCR testing. Those with a positive C. difficile PCR test were reflexed to undergo enzyme immunoassay (EIA) glutamate dehydrogenase antigen (Ag) testing and toxin A and B testing. The final results were reported as colonization (C. difficile PCR+/EIA Ag+/Toxin A/B−), true CDI case (C. difficile PCR+/EIA Ag+/Toxin A/B+), or negative (C. difficile PCR−). All patients with colonization or true infection were placed under contact isolation precautions until 48 hours after diarrhea resolution. During the preintervention period (October 2019–April 2020), 25 HO-CDI cases were reported compared to 8 cases in the postintervention period (June 2020–December 2020). A reduction in CDI SIR occurred in the postintervention period (Q3 2020–Q4 2020, SIR 0.265) compared with the preintervention period (Q4 2019–Q1 2020, SIR 0.338) (Figure 2). We successfully reduced our NHSN HO-CDI SIR below the national average after implementing a 2-step testing algorithm for CDI.
The 2-step testing algorithm was useful for antimicrobial stewardship to guide appropriate CDI treatment for true cases and for infection prevention to continue isolation of infected and colonized cases to reduce the spread of C. difficile spores.
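The reflex logic of the 2-step algorithm can be sketched as a simple classifier. This is an illustrative sketch only, not the institution's implementation; the function name and the "indeterminate" label for discordant results (which the abstract does not address) are assumptions.

```python
def classify_cdi(pcr_positive, eia_ag_positive=None, toxin_ab_positive=None):
    """Classify a stool specimen per the reflex algorithm described above:
    PCR- -> negative; PCR+/EIA Ag+/Toxin A/B- -> colonization;
    PCR+/EIA Ag+/Toxin A/B+ -> true CDI."""
    if not pcr_positive:
        return "negative"
    # PCR-positive specimens reflex to EIA GDH antigen and toxin A/B testing
    if eia_ag_positive and toxin_ab_positive:
        return "true CDI"
    if eia_ag_positive and not toxin_ab_positive:
        return "colonization"
    # Discordant patterns (e.g., PCR+/Ag-) are not specified in the abstract;
    # labeling them indeterminate here is an assumption.
    return "indeterminate"
```

Note that, per the protocol above, both "colonization" and "true CDI" results trigger contact isolation precautions.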
To compare the risk of hospitalization for adult Medicaid beneficiaries with bipolar I disorder (BPD-I) when treated with lurasidone compared to other atypical antipsychotics (AAPs) as monotherapy.
Using the IBM MarketScan Multi-State Medicaid Claims database, a retrospective cohort study was conducted on adult BPD-I patients who initiated an AAP (index date) between January 1, 2014 and June 30, 2019. Patients were required to be continuously enrolled during the 12-month pre-index and 24-month post-index periods. Marginal structural models were performed to estimate the risk of hospitalization (all-cause, BPD-I-related, and psychiatric-related) associated with each AAP and the average length of stay.
The analysis included 8262 adult BPD-I patients, whose AAP use was distributed among lurasidone (14%), aripiprazole (17%), olanzapine (8%), quetiapine (29%), risperidone (10%), no/minimal (1%), or other (21%) during each month of the post-index period. The adjusted odds ratios (aORs) for all-cause hospitalization were significantly higher for olanzapine (aOR=1.60, 95% CI=1.09–2.10) and quetiapine (aOR=1.54, 95% CI=1.18–1.89), compared to lurasidone. The aORs for BPD-I-related hospitalization were significantly higher for quetiapine (aOR=1.57, 95% CI=1.10–2.04) and risperidone (aOR=1.80, 95% CI=1.04–2.56) compared to lurasidone. The average length of hospital stay was more than twice as high for quetiapine compared to lurasidone (aRR=2.12, 95% CI=1.32–2.92). The risk of psychiatric-related hospitalization was numerically lower for lurasidone compared to all other AAPs.
Over a 24-month follow-up period, lurasidone-treated adult BPD-I patients had significantly lower risk of all-cause hospitalization than those treated with olanzapine and quetiapine, lower risk of BPD-I-related hospitalization than quetiapine and risperidone, and fewer hospital days than quetiapine in a Medicaid population.
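The adjusted odds ratios above come from marginal structural models, whose code is not shown. As a generic illustration only, an odds ratio and its Wald 95% CI can be recovered from a log-scale coefficient and standard error; the beta and SE values below are invented for the example and are not the study's estimates.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a log-odds coefficient and its Wald CI bounds.

    beta: estimated log-odds coefficient; se: its standard error.
    Returns (odds ratio, lower bound, upper bound)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical inputs: a coefficient equal to ln(1.60) with SE 0.14
or_, lo, hi = odds_ratio_ci(math.log(1.60), 0.14)
```

The interval is symmetric on the log scale, so the exponentiated bounds are asymmetric around the point estimate, as is typical for odds ratios.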
In this study we report a new record of a cryptogenic polychaete from southern Africa. The species was found inhabiting sand tubes in intertidal sand flats in the Knysna Estuary on the southern coast of South Africa. Morphological comparisons using light and scanning electron microscopy showed extensive taxonomic similarities with Dipolydora socialis described from other localities and from museum vouchers. In addition, 18S rRNA and COI barcodes were generated for the species. Genetic analysis of the assembled polydorid dataset corroborated the morphological data in delineating the species as a taxonomic unit with >99% genetic similarity to available sequences of D. socialis in the GenBank database. Dipolydora socialis has been reported as having a widespread distribution, and since it can reside within tubes associated with fouling communities or as a shell borer, several vectors may have been responsible for its global spread and introduction to southern Africa. Finally, considering the many cryptic complexes that are currently being uncovered within polychaetes, including spionids, future taxonomic studies should incorporate additional genetic data from other regions of the world to determine whether D. socialis may also be part of a larger species complex.
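The >99% genetic similarity reported above reflects pairwise comparison of aligned barcode sequences. A minimal sketch of percent identity over an ungapped alignment follows; this is a toy illustration, not the study's pipeline, which would use alignment software and distance models appropriate to 18S rRNA and COI data.

```python
def percent_identity(seq1, seq2):
    """Percent of matching positions between two equal-length aligned sequences."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq1, seq2))
    return 100.0 * matches / len(seq1)
```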
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed retention by the target weed species at crop maturity, enabling seed collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes, and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Existing peer-reviewed literature describing emergency medical technician (EMT) acquisition and transmission of 12-lead electrocardiograms (12L-ECGs), in the absence of a paramedic, is largely limited to feasibility studies.
The objective of this retrospective observational study was to describe the impact of EMT-acquired 12L-ECGs in Suffolk County, New York (USA), both in terms of the diagnostic quality of the transmitted 12L-ECGs and the number of prehospital percutaneous coronary intervention (PCI)-center notifications made as a result of transmitted 12L-ECGs demonstrating an ST-elevation myocardial infarction (STEMI).
A pre-existing database was queried for Emergency Medical Services (EMS) calls on which an EMT acquired a 12L-ECG from program initiation (January 2017) through December 31, 2019. Scanned copies of the 12L-ECGs were requested in order to be reviewed by a blinded emergency physician.
Of the 665 calls, 99 had no 12L-ECG available within the database. For 543 (96%) of the available 12L-ECGs, the quality was sufficient to diagnose the presence or absence of a STEMI. Eighteen notifications were made to PCI-centers about a concern for STEMI. The median times spent on scene and in transport to the hospital were 18 and 11 minutes, respectively. The median time from PCI-center notification to EMS arrival at the emergency department (ED) was seven minutes (IQR, 5–14).
After a limited educational intervention, and when a cardiac monitor is available, EMTs are capable of acquiring a diagnostically useful 12L-ECG and transmitting it to a remote medical control physician for interpretation. This allows for prehospital PCI-center activation based on a 12L-ECG concerning for STEMI when a paramedic is not available to care for the patient.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
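Cumulative percent seed shatter, as reported in the grass and broadleaf studies above, is a simple proportion of shattered seeds over total seed production at a census date. A sketch follows, under the assumption that collection trays and plant censuses yield counts of shattered and retained seeds; the function name is illustrative.

```python
def cumulative_shatter_percent(shattered, retained):
    """Percent of total seed production that has shattered by the census date.

    shattered: seeds recovered from collection trays (already shed)
    retained: seeds still held on the plant at the census date"""
    total = shattered + retained
    if total == 0:
        raise ValueError("no seeds counted")
    return 100.0 * shattered / total
```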
Psychodermatology is an emerging field at the interface between psychiatry, psychology and dermatology. There is a strong bidirectional relationship between a number of dermatological disorders and psychiatric disorders. This article provides an overview of psychiatric disorders with dermatological symptoms, and dermatological disorders with secondary psychophysiological consequences. The principles of management and our insights into establishing a psychodermatology service in the UK are discussed.
Clinical and Translational Science Award (CTSA) TL1 trainees and KL2 scholars were surveyed to determine the immediate impact of the COVID-19 pandemic on training and career development. The most negative impacts were lack of access to research facilities, clinics, and human subjects; KL2 scholars also cited lack of access to team members and the need for homeschooling. TL1 trainees reported having more time to think and write. Common strategies to maintain research productivity involved time management, virtual connections with colleagues, and shifting to research activities not requiring laboratory/clinic settings. Strategies for mitigating the impact of the COVID-19 pandemic on training and career development are described.
Knowledge of crop–weed interference effects on weed biology along with yield penalties can be used for the development of integrated weed management (IWM) tactics. Nevertheless, little is known about the beneficial effects of soybean [Glycine max (L.) Merr.] density, an important aspect of IWM, on late Palmer amaranth (Amaranthus palmeri S. Watson) establishment time. Two field experiments were conducted in 2014 and 2015 to investigate how various soybean densities and A. palmeri establishment timings in weeks after crop emergence (WAE) affect height, biomass, and seed production of the weed but also crop yield in drill-seeded soybean. Soybean density had a significant impact on dry weight and seed production of A. palmeri that established within the first 2 wk of crop emergence, but not for establishment timings of the weed 4 wk and later in relation to crop emergence. Differential performance of A. palmeri gender was observed, with greater biomass production by female than male plants under crop presence, and merits further investigation. Grain yield reductions were recorded at earlier A. palmeri establishment timings (i.e., 0 and 1 WAE) compared with the 8 WAE establishment timing in 2014 and 2015. High soybean densities resulted in greater soybean yields compared with low soybean density, but no grain yield benefits were observed between medium and high soybean densities. Crop budget analysis revealed the benefits of a moderate seeding rate (i.e., 250,000 seeds ha−1) for crop revenue, net income returns, and breakeven price in comparison with lower (i.e., 125,000 seeds ha−1) or higher (i.e., 400,000 seeds ha−1) rates. Earlier A. palmeri establishment timings (i.e., 0, 1, and 2 WAE) resulted in lower crop revenue and net income returns compared with later establishment timings of the weed.
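The crop-budget comparison of seeding rates above amounts to partial budgeting: revenue minus seed cost and other variable costs per hectare. A hypothetical sketch follows; all prices and costs in the example are invented for illustration and are not the study's figures.

```python
def net_return(yield_kg_ha, grain_price_per_kg, seeding_rate,
               seed_cost_per_1000, other_costs=0.0):
    """Net income per hectare = grain revenue - seed cost - other variable costs.

    seeding_rate is in seeds per hectare; seed_cost_per_1000 is the
    price per 1,000 seeds (both hypothetical units for this sketch)."""
    revenue = yield_kg_ha * grain_price_per_kg
    seed_cost = (seeding_rate / 1000.0) * seed_cost_per_1000
    return revenue - seed_cost - other_costs

# Hypothetical comparison at the moderate seeding rate from the abstract
moderate = net_return(3000, 0.4, 250_000, 0.5)
```

Comparing such net returns across the 125,000, 250,000, and 400,000 seeds ha−1 rates is the kind of calculation underlying the breakeven analysis described above.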
Reconstructions of prehistoric vegetation composition help establish natural baselines, variability, and trajectories of forest dynamics before and during the emergence of intensive anthropogenic land use. Pollen–vegetation models (PVMs) enable such reconstructions from fossil pollen assemblages using process-based representations of taxon-specific pollen production and dispersal. However, several PVMs and variants now exist, and the sensitivity of vegetation inferences to PVM selection, variant, and calibration domain is poorly understood. Here, we compare the reconstructions, parameter estimates, and structure of a Bayesian hierarchical PVM, STEPPS, both to observations and to REVEALS, a widely used PVM, for the pre–Euro-American settlement-era vegetation in the northeastern United States (NEUS). We also compare NEUS-based STEPPS parameter estimates to those for the upper midwestern United States (UMW). Both PVMs predict the observed macroscale patterns of vegetation composition in the NEUS; however, reconstructions of minor taxa are less accurate and predictions for some taxa differ between PVMs. These differences can be attributed to intermodel differences in structure and parameter estimates. Estimates of pollen productivity from STEPPS broadly agree with estimates produced for use in REVEALS, while comparison between pollen dispersal parameter estimates shows no significant relationship. STEPPS parameter estimates are similar between the UMW and NEUS, suggesting that STEPPS parameter estimates are transferable between floristically similar regions and scales.
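At their core, PVMs such as REVEALS and STEPPS relate vegetation composition to pollen composition through taxon-specific pollen productivity (plus dispersal and deposition terms omitted here). A toy forward step is sketched below; the taxa and productivity values are hypothetical, and real PVMs add dispersal kernels and, for STEPPS, a Bayesian hierarchical structure.

```python
def pollen_proportions(veg, productivity):
    """Productivity-weighted vegetation composition, renormalized to pollen shares.

    veg: dict of taxon -> vegetation proportion (sums to 1)
    productivity: dict of taxon -> relative pollen productivity
    Dispersal/deposition terms of real PVMs are omitted in this sketch."""
    loads = {t: veg[t] * productivity[t] for t in veg}
    total = sum(loads.values())
    return {t: load / total for t, load in loads.items()}

# Hypothetical example: pine over-produces pollen relative to oak,
# so equal vegetation cover yields unequal pollen shares
veg = {"pine": 0.5, "oak": 0.5}
alpha = {"pine": 2.0, "oak": 1.0}
```

Inverting this relationship, i.e., estimating `veg` from observed pollen assemblages given calibrated productivities, is what the reconstructions described above accomplish.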
Older adults often have atypical presentation of illness and are particularly vulnerable to influenza and its sequelae, making the validity of influenza case definitions particularly relevant. We sought to assess the performance of influenza-like illness (ILI) and severe acute respiratory illness (SARI) criteria in hospitalized older adults.
This prospective cohort study used data from the Serious Outcomes Surveillance Network of the Canadian Immunization Research Network, which undertakes active surveillance for influenza among hospitalized adults. Data were pooled from 3 influenza seasons: 2011/12, 2012/13, and 2013/14. The ILI and SARI criteria were defined clinically, and influenza was laboratory confirmed. Frailty was measured using a validated frailty index.
Of 11,379 adult inpatients (7,254 aged ≥65 years), 4,942 (2,948 aged ≥65 years) had laboratory-confirmed influenza. Their median age was 72 years (interquartile range [IQR], 58–82) and 52.6% were women. The sensitivity of ILI criteria was 51.1% (95% confidence interval [CI], 49.6–52.6) for younger adults versus 44.6% (95% CI, 43.6–45.8) for older adults. SARI criteria were met by 64.1% (95% CI, 62.7–65.6) of younger adults versus 57.1% (95% CI, 55.9–58.2) of older adults with laboratory-confirmed influenza. Patients with influenza who were prefrail or frail were less likely to meet ILI and SARI case definitions.
A substantial proportion of older adults, particularly those who are frail, are missed by standard ILI and SARI case definitions. Surveillance using these case definitions is biased toward identifying younger cases, and does not capture the true burden of influenza. Because of the substantial fraction of cases missed, surveillance definitions should not be used to guide diagnosis and clinical management of influenza.
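Case-definition sensitivity here is the share of laboratory-confirmed influenza cases that met the clinical criteria. A minimal sketch follows; the counts in the example are hypothetical (chosen to approximate the younger-adult ILI figure above), and the study's confidence intervals were likely computed by a different method.

```python
def sensitivity(meeting_criteria, confirmed_cases):
    """Sensitivity (%) of a case definition among lab-confirmed cases:
    the proportion of true cases the clinical criteria capture."""
    if confirmed_cases == 0:
        raise ValueError("no confirmed cases")
    return 100.0 * meeting_criteria / confirmed_cases

# Hypothetical counts: 1,019 of 1,994 confirmed cases met ILI criteria
ili_sens = sensitivity(1019, 1994)
```

Low sensitivity in frail older adults is exactly the gap the abstract highlights: the denominator (lab-confirmed cases) includes many patients the clinical criteria miss.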
Mono-2-ethylhexyl phthalate (MEHP) is the primary metabolite of the ubiquitous plasticizer and toxicant, di-2-ethylhexyl phthalate. MEHP exposure has been linked to abnormal development, increased oxidative stress, and metabolic syndrome in vertebrates. Nuclear factor erythroid 2-like 2 (Nrf2) is a transcription factor that regulates gene expression in response to oxidative stress. We investigated the role of Nrf2a in larval steatosis following embryonic exposure to MEHP. Wild-type and nrf2a mutant (m) zebrafish embryos were exposed to 0 or 200 μg/l MEHP from 6 to either 96 (histology) or 120 hours post fertilization (hpf). At 120 hpf, exposures were ceased and fish were maintained in clean conditions until 15 days post fertilization (dpf). At 15 dpf, fish lengths and lipid content were examined, and the expression of genes involved in the antioxidant response and lipid processing was quantified. At 96 hpf, a subset of animals treated with MEHP had vacuolization in the liver. At 15 dpf, deficient Nrf2a signaling attenuated fish length by 7.7%. MEHP exposure increased hepatic steatosis and increased expression of the peroxisome proliferator-activated receptor alpha target fabp1a1. Cumulatively, these data indicate that developmental exposure alone to MEHP may increase risk for hepatic steatosis and that Nrf2a does not play a major role in this phenotype.
Organic grain producers are interested in interseeding cover crops into corn (Zea mays L.) in regions that have a narrow growing season window for post-harvest establishment of cover crops. A field experiment was replicated across 2 years on three commercial organic farms in Pennsylvania to compare the effects of drill- and broadcast-interseeding to standard grower practices, which included post-harvest seeding cereal rye (Secale cereale L.) at the more southern location and winter fallow at the more northern locations. Drill- and broadcast-interseeding treatments occurred just after last cultivation and used a cover crop mixture of annual ryegrass [Lolium perenne L. ssp. multiflorum (Lam.) Husnot] + orchardgrass (Dactylis glomerata L.) + forage radish (Raphanus sativus L. ssp. longipinnatus). Higher mean fall cover crop biomass and forage radish abundance (% of total) were observed in drill-interseeding treatments compared with broadcast-interseeding. However, corn grain yield, weed suppression, and N retention in late fall and spring were similar among interseeding treatments, which suggests that broadcast-interseeding at last cultivation has the potential to produce similar production and conservation benefits at lower labor and equipment costs in organic systems. Post-harvest seeding cereal rye resulted in greater spring biomass production and N retention compared with interseeded cover crops at the southern location, whereas variable interseeding establishment success and dominance of winter-killed forage radish produced conditions that increased the likelihood of N loss at more northern locations. Additional research is needed to contrast conservation benefits and management tradeoffs between interseeding and post-harvest establishment methods.
Clonal Mycobacterium mucogenicum isolates (determined by molecular typing) were recovered from 19 bronchoscopic specimens from 15 patients. None of these patients had evidence of mycobacterial infection. Laboratory culture materials and bronchoscopes were negative for mycobacteria. This pseudo-outbreak was caused by contaminated ice used to provide bronchoscopic lavage. Control was achieved by transitioning to sterile ice.
This study used repeated measures data to identify developmental profiles of elevated risk for ADHD (i.e., six or more inattentive and/or hyperactive-impulsive symptoms), with an interest in the age at which ADHD risk first emerged. Risk factors that were measured across the first 3 years of life were used to predict profile membership. Participants included 1,173 children who were drawn from the Family Life Project, an ongoing longitudinal study of children's development in low-income, nonmetropolitan communities. Four heuristic profiles of ADHD risk were identified. Approximately two thirds of children never exhibited elevated risk for ADHD. The remaining children were characterized by early childhood onset and persistent risk (5%), early childhood limited risk (10%), and middle childhood onset risk (19%). Pregnancy and delivery complications and harsh-intrusive caregiving behaviors operated as general risk for all ADHD profiles. Parental history of ADHD was uniquely predictive of early onset and persistent ADHD risk, and low primary caregiver education was uniquely predictive of early childhood limited ADHD risk. Results are discussed with respect to how changes to the age of onset criterion for ADHD in DSM-5 may affect etiological research and the need for developmental models of ADHD that inform ADHD symptom persistence and desistance.