Background: Contact tracing alone is often inadequate to determine the source of healthcare personnel (HCP) COVID-19 when SARS-CoV-2 is widespread in the community. We combined whole-genome sequencing (WGS) with traditional epidemiologic analysis to investigate the frequency with which patients or other HCP with symptomatic COVID-19 acted as the source of HCP infection at a large tertiary-care center early in the pandemic. Methods: Cohort samples were selected from patients and HCP with PCR-positive SARS-CoV-2 infection during a period with complete retention of samples (March 14–April 10, 2020) at Rush University Medical Center, a 664-bed hospital in Chicago, Illinois. During this period, testing was limited to symptomatic patients and HCP. Recommended respiratory protection for HCP evolved with prevailing guidance, including a 19-day period during which medical face masks were recommended for COVID-19 care except during aerosol-generating procedures. Viral RNA was extracted and sequenced (NovaSeq, Illumina) from remnant nasopharyngeal swab samples in M4RT viral transport medium. Genomes with >90% coverage underwent cluster detection using a 2-single-nucleotide-variant (SNV) genetic distance cutoff. Genomic clusters were independently evaluated for valid epidemiologic links by 2 infectious diseases physicians (with a third adjudicator) using metadata extracted from the electronic medical record and according to predetermined criteria (Table 1). Results: In total, 1,031 SARS-CoV-2 sequences were analyzed, identifying 49 genomic clusters that included HCP (median, 8 members per cluster; range, 2–43; total, 268 patients and 115 HCP) (Fig. 1). In addition, 20,190 flowsheet activities were documented for cohort HCP and patient interactions, including 686 instances in which a cohort HCP contributed to a cohort patient’s chart. Most HCP infections were considered not healthcare associated (88 of 115, 76.5%). We did not identify any strong linkages for patient-to-HCP transmission.
Thirteen HCP cases (11.3%) were attributed to a patient source (all weak linkages), and 14 HCP cases (12.2%) were attributed to an HCP source (11 strong and 3 weak linkages). Weak linkages generally reflected a lack of epidemiologic data on HCP location, particularly for nonclinical staff (eg, an environmental services worker who lacked location documentation to rule out patient-specific contact). Agreement on epidemiologic linkage between the 2 evaluators was high (κ, 0.91). Conclusions: Using genomic and epidemiologic data, we found that most HCP COVID-19 infections were not healthcare associated. We found weak evidence to support symptomatic patient-to-HCP transmission of SARS-CoV-2 and stronger evidence for HCP-to-HCP transmission. Large genomic clusters without plausible epidemiologic links were identified, reflecting the limited utility of genomic surveillance alone for characterizing chains of SARS-CoV-2 transmission during extensive community spread.
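The cluster-detection step described in the methods above, grouping genomes that fall within a 2-SNV genetic distance, can be sketched as single-linkage clustering over pairwise SNV distances. This is an illustrative reconstruction, not the study's actual pipeline: the union-find approach and all function names are assumptions, and real WGS clustering operates on aligned consensus genomes with problematic sites masked.

```python
from itertools import combinations

def snv_distance(seq_a, seq_b):
    """Count single-nucleotide differences between two equal-length aligned genomes."""
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

def cluster_genomes(genomes, cutoff=2):
    """Single-linkage clustering: any pair of genomes within `cutoff` SNVs
    is linked, and linked genomes merge into one cluster (union-find)."""
    parent = list(range(len(genomes)))

    def find(i):
        # Path-halving find
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for i, j in combinations(range(len(genomes)), 2):
        if snv_distance(genomes[i], genomes[j]) <= cutoff:
            union(i, j)

    clusters = {}
    for i in range(len(genomes)):
        clusters.setdefault(find(i), []).append(i)
    return [sorted(members) for members in clusters.values()]
```

Single-linkage means any chain of ≤2-SNV links merges genomes into one cluster, which is consistent with how large genomic clusters can arise without every pair of members being genetically close.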
OBJECTIVES/GOALS: We aim to determine whether non-neuronal, non-synaptic glutamate signaling mechanisms can be targeted to produce highly specific, narrow changes in brain function that would benefit CNS disorders. To do this, we investigated cognitive changes produced by manipulating the activity of the astrocytic glutamate release mechanism system xc- (Sxc). METHODS/STUDY POPULATION: Sxc activity was eliminated by mutating the gene Slc7a11 through pronuclear injection of zinc-finger nucleases into Sprague Dawley rat embryos, creating a line of rats lacking Sxc (MSxc rats). To confirm the loss of Sxc activity, we verified that tissue from MSxc rats completely lacked xCT, the regulatory subunit of Sxc encoded by Slc7a11, and that astrocyte cultures generated from MSxc tissue lacked cystine-evoked glutamate release. Next, we measured development (body weight), CNS regulation of metabolism, and other indicators of generalized, non-specific brain function, as well as behaviors reliant on executive function, such as cognitive flexibility, impulse control, decision-making, and response inhibition. RESULTS/ANTICIPATED RESULTS: Eliminating Sxc was not lethal and did not impair development or produce the widespread changes in brain function commonly observed when other glutamate mechanisms are deleted. MSxc rats did not differ from wildtype rats in growth rate; central regulation of metabolism, as reflected by absolute or diurnal changes in core body temperature; locomotor activity in a familiar or novel environment; or simple forms of cognition such as novel object recognition and operant responding (food- and cocaine-reinforced). In contrast, behaviors that rely on executive function were impaired: MSxc rats displayed deficits in cocaine reinstatement and attentional set-shifting.
We anticipate that MSxc rats will also show impairments in decision-making in the rat gambling task and in response inhibition in the stop-signal reaction time task. DISCUSSION/SIGNIFICANCE: Eliminating Sxc activity in rats produced deficits in behaviors reliant on executive function without impacting development or simple brain function. These results highlight the potential of targeting Sxc to enhance cognition without generating the therapeutically limiting adverse effects that result from non-specific changes in brain function.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated the seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Weather conditions had no consistent impact on weed seed shatter; however, greater individual plant biomass was associated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
As elementary and secondary school educators increasingly adopt digital games to teach content in a range of subjects, and as education and game scholars turn their attention to ‘serious games’, it is worth noting that serious games are nothing new to Shakespeare classrooms. Non-digital games and playful performance practices have long been a standard part of teaching the dramas of Shakespeare. Indeed, the use of physical, play-based methods of teaching Shakespeare – or what we shall call ‘playful pedagogy’ – has become something of an industry in the world of Shakespeare education. Theatrical games and dramatic playfulness are central to the teacher-training programmes touted by Education departments in many well-established Shakespeare theatres. The Royal Shakespeare Company calls their programme ‘rehearsal room pedagogy’, Shakespeare’s Globe has its ‘Globe Strategies’, Chicago Shakespeare has its ‘drama-based strategies’, and there are similar initiatives at other theatres, including the American Shakespeare Center in Virginia and the Folger Shakespeare Library. Education departments of these and other Shakespeare theatres offer specialized workshops that train teachers to use playful pedagogy in their classrooms.
Language documentation faces a persistent and pervasive problem: How much material is enough to represent a language fully? How much text would we need to sample the full phoneme inventory of a language? In the phonetic/phonemic domain, what proportion of the phoneme inventory can we expect to sample in a text of a given length? Answering these questions in a quantifiable way is tricky, but asking them is necessary. The cumulative collection of Illustrative Texts published in the Illustration series in this journal over more than four decades (mostly renditions of the ‘North Wind and the Sun’) gives us an ideal dataset for pursuing these questions. Here we investigate a tractable subset of the above questions, namely: What proportion of a language’s phoneme inventory do these texts enable us to recover, in the minimal sense of having at least one allophone of each phoneme? We find that, even with this low bar, only three languages (Modern Greek, Shipibo and the Treger dialect of Breton) attest all phonemes in these texts. Unsurprisingly, these languages sit at the low end of phoneme inventory sizes (respectively 23, 24 and 36 phonemes). We then estimate the rate at which phonemes are sampled in the Illustrative Texts and extrapolate to see how much text it might take to display a language’s full inventory. Finally, we discuss the implications of these findings for linguistics in its quest to represent the world’s phonetic diversity, and for JIPA in its design requirements for Illustrations and in particular whether supplementary panphonic texts should be included.
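The extrapolation described above, asking how much text is needed before every phoneme is attested, can be framed with a simple independence assumption: if phoneme i occurs with per-token probability p_i, the expected number of phonemes attested in n tokens is the sum over i of 1 − (1 − p_i)^n. A minimal sketch under that assumption (which real running text, with its positional and collocational structure, violates); function names are illustrative:

```python
import math

def expected_attested(freqs, n):
    """Expected number of distinct phonemes attested in a text of n tokens,
    if phoneme i occurs with independent per-token probability freqs[i]."""
    return sum(1 - (1 - p) ** n for p in freqs)

def text_length_for_rarest(p_min, confidence=0.95):
    """Tokens needed for the rarest phoneme (per-token probability p_min)
    to appear at least once with the given probability."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_min))
```

For example, a phoneme carried by roughly one token in a thousand needs on the order of three thousand tokens to appear with 95% confidence, which is why the rare phonemes, not the average ones, set the length a fully panphonic text must reach.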
This research communication reports the results of questionnaires used to identify the impact of recent research into the disinfection of cattle foot-trimming equipment to prevent bovine digital dermatitis (BDD) transmission on (a) the biosecurity knowledge and (b) the hygiene practice of foot health professionals. An initial questionnaire found that more than half of participating farmers, veterinary surgeons and commercial foot-trimmers were not considering hand or hoof-knife hygiene in their working practices. The following year, after the release of a foot-trimming hygiene protocol and a comprehensive knowledge exchange programme by the University of Liverpool, a second survey showed that 35/80 (43.8%) of the farmers, veterinary surgeons and commercial foot-trimmers sampled considered themselves more aware of the risk of spreading BDD during foot-trimming. Furthermore, 36/80 (45.0%) had enhanced their hygiene practice in the last year, impacting an estimated 1383 farms and 5130 cows trimmed each week. Participants who reported having seen both the foot-trimming hygiene protocol we developed with AHDB Dairy and other articles about foot-trimming hygiene in the farming and veterinary press were significantly more likely to have changed their working practices. Difficulty accessing water and cleaning facilities on farms was identified as the greatest barrier to improving biosecurity practices. Participants' preferred priority for future research was the continued collection of evidence for the importance and efficacy of good foot-trimming hygiene practices.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Background: Central-line–associated bloodstream infections (CLABSIs) result in increased patient morbidity. Guidelines recommend against peripheral venous catheters when access is required for longer than 6 days, often leading to central venous catheter (CVC) placement. To improve vascular access device choice and reduce the potential risk of CLABSI, we implemented a quality improvement initiative comprising a vascular access algorithm and the introduction of a midline vascular access device (MVAD). We report complications associated with MVAD use, including deep vein thrombosis (DVT), thrombophlebitis, and bloodstream infection (BSI). Methods: We conducted a prospective quality improvement assessment from October 2017 through March 2018. All MVADs were monitored for DVT, thrombophlebitis, and BSI. Insertion time and removal of MVADs were tracked, as was the presence of other vascular access devices. Results: From October 2017 through March 2018, 858 MVADs were inserted in 726 patients, yielding 3,588 MVAD days. In total, 6 primary BSIs occurred in patients with MVADs. In patients with only an MVAD, the rate was 0.72 BSI per 1,000 MVAD days, whereas patients with both an MVAD and a CVC had a rate of 1.98 per 1,000 MVAD days. The overall CLABSI rate at the institution during this period was 1.24 per 1,000 CVC days. In addition, 29 cases of thrombophlebitis occurred, for a rate of 3.84 per 1,000 catheter days in patients with only an MVAD compared with 4.63 per 1,000 catheter days in patients with an MVAD and a CVC. Twenty-five DVTs occurred during this time, yielding a rate of 2.88 per 1,000 catheter days in patients with only an MVAD and 4.63 per 1,000 catheter days in patients with multiple vascular-access devices. A significant correlation was noted between MVAD indwell time and both BSI (P = .0021) and thrombophlebitis (P = .0041).
The median indwell time for patients experiencing BSI was 16.17 ± 8.04 days, whereas the median indwell time for patients experiencing thrombophlebitis was 9.24 ± 7.99 days. Conclusions: Implementation of a vascular-access algorithm including an MVAD may effectively reduce CVC insertions and BSIs. The rate of BSI with MVADs was below the institutional CLABSI rate during the assessment period. Known complications associated with MVADs include DVT and thrombophlebitis, which correlate with the duration of catheterization, and these risks appear to be further compounded in patients requiring multiple devices for vascular access. Further research comparing the vascular access risks of MVADs with those of CVCs is warranted.
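The device-infection rates quoted above all follow the standard "events per 1,000 device-days" convention. A minimal sketch of the arithmetic; note that pooling all 6 BSIs over all 3,588 MVAD days, as in the usage line below, is for illustration only, since the abstract's subgroup rates use subgroup denominators:

```python
def rate_per_1000_device_days(events, device_days):
    """Standard device-associated infection rate: events per 1,000 device-days."""
    if device_days <= 0:
        raise ValueError("device_days must be positive")
    return events / device_days * 1000

# Pooling all 6 BSIs over all 3,588 MVAD days (illustration only):
pooled_bsi_rate = rate_per_1000_device_days(6, 3588)  # ~1.67 per 1,000 MVAD days
```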
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting that genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in the AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). These results suggest that common variants of large effect do not influence septic shock susceptibility, mortality, and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
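A polygenic risk score of the kind used in this analysis is, at its core, a weighted sum of risk-allele dosages, with per-variant weights taken from GWAS effect sizes for the reference trait (here hematocrit, granulocyte count, or C-reactive protein). A minimal sketch; the dosages and weights below are invented for illustration, and real pipelines add allele harmonization, linkage-disequilibrium pruning, and score standardization:

```python
def polygenic_risk_score(dosages, weights):
    """PRS as a weighted sum of risk-allele dosages (0, 1, or 2 per variant).
    `weights` are per-variant effect sizes, e.g. GWAS betas for the trait."""
    if len(dosages) != len(weights):
        raise ValueError("one dosage per variant required")
    return sum(d * w for d, w in zip(dosages, weights))

# Three hypothetical variants with dosages 0, 1, 2 and betas 0.1, 0.2, 0.3:
score = polygenic_risk_score([0, 1, 2], [0.1, 0.2, 0.3])  # 0.8
```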
The evidence of funerary archaeology, historical sources and poetry has been used to define a ‘heroic warrior ethos’ across Northern Europe during the first millennium AD. In northern Britain, burials of later prehistoric to early medieval date are limited, as are historical and literary sources. There is, however, a rich sculptural corpus, to which a newly discovered monolith with an image of a warrior can now be added. Comparative analysis reveals a materialisation of a martial ideology on carved stone monuments, probably associated with elite cemeteries, highlighting a regional expression of the warrior ethos in late Roman and post-Roman Europe.
Physical attraction is an important dimension of both romantic and companionate relationships between partners. This article presents a comprehensive cross-cultural validation of the short version of the Physical Attraction Scale (PAS-S), the first and only multidimensional measure of physical attraction available for research and practice. The initial development of the scale was completed in a multisite study conducted with a large sample of university students, largely from the Midwest and Southeast of the United States. Results demonstrated a two-dimensional factor structure, excellent reliability, and evidence of content, convergent and discriminant validity. Subsequent cross-cultural studies using the PAS-S confirmed its robust factor structure, validity and reliability in samples from 10 cultural regions in six countries. Therefore, this short version of the scale can be recommended for cross-cultural practice and research. Versions of the scale in English, French, Portuguese, Russian and Georgian are provided in the appendices. Based on the results of the cross-cultural validation, the authors recommend the PAS-S for research purposes and for practical use in counselling and therapy. The scale provides a short and informative measure of (1) how attracted a person feels to their partner in a close relationship and (2) which aspects of attraction are problematic.
Personal names appear in almost all texts about early medieval Insular societies, but it is more common to study the people behind the names or consider individual names on a case-by-case basis than to consider naming practices more broadly. For early medieval Scotland, we have literary sources such as saints’ Lives and poetry, and histories, most notably Bede's ‘Ecclesiastical History’, but these do not provide names covering the whole period. The genealogies of important kindreds of the Gaelic world provide a massive corpus of names, with an often impressive degree of coverage for Ireland, less for Scotland, and evidence for relationships between people. However, as these genealogies do not survive in early medieval manuscripts, they were also subject to later manipulation and fabrication. In addition, they have a major drawback: the genealogical genre tries to provide every generation of a person's ancestry, so they tend to state that individuals were the sons or fathers of others, usually either in the form ‘X son of Y son of Z’, or ‘these are the sons of X, that is Y and Z’. As a result, apart from an individual's own first name, these sources do not allow us to understand well how people were actually called by contemporaries. In medieval societies, where ancestry was significant, people could be identified not only by their parentage, but also by their grandparents or other ancestors, their kindred, or by a place, practices hidden by the form of the genealogical genre.
Such practices are, however, visible in the Gaelic chronicles, which contain the names of hundreds of individuals each century, with names comprising a substantial proportion of these texts overall. While the form of the personal names is affected by the nature of the event and how the annalist wanted to present it, as well as the overall tendency towards brevity in this genre, the chronicles display considerable variety in the form of personal names. The result is that these chronicles are major sources for personal names and naming practices in Scotland and Ireland in the period before A.D. 1100, before the growth of administrative documents, such as charters, produces a transformation in the evidence available for study.
Significant experimental evidence supports fat as a taste modality; however, the associated peripheral mechanisms are not well established. Several candidate taste receptors have been identified, but their expression patterns and potential functions in human fungiform papillae remain unknown. The aim of this study was to identify the candidate fat taste receptors and ion channels expressed in human fungiform taste buds and their association with oral sensory perception of fatty acids. For the expression analysis, quantitative RT-PCR (qRT-PCR) of RNA extracted from human fungiform papillae samples was used to determine the expression of candidate fatty acid receptors and ion channels. Western blotting was used to confirm the presence of the corresponding proteins in fungiform papillae, and immunohistochemistry was used to localise the expressed receptors and ion channels in the taste buds of fungiform papillae. Correlations were then analysed between the qRT-PCR expression levels of the detected receptors and ion channels and fat taste threshold, liking of fatty food and fat intake. qRT-PCR and western blotting indicated that mRNA and protein of CD36, FFAR4, FFAR2, GPR84 and delayed rectifying K+ channels are expressed in human fungiform taste buds. The expression level of CD36 was associated with the liking difference score (R=−0·567, β=−0·04, P=0·04) between high-fat and low-fat food, and that of FFAR2 was associated with total fat intake (ρ=−0·535, β=−0·01, P=0·003) and saturated fat intake (ρ=−0·641, β=−0·02, P=0·008).
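The ρ values reported above appear to be Spearman rank correlations between expression level and intake measures (an assumption; the abstract does not name the statistic). A self-contained sketch of the statistic, computed as the Pearson correlation of rank vectors; in practice one would use scipy.stats.spearmanr, and the data in the tests are invented:

```python
def rank(values):
    """1-based average ranks; ties receive the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over a run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: the Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```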
We present a multi-frequency study of the intermediate spiral SAB(r)bc type galaxy NGC 6744, using available data from the Chandra X-ray telescope, radio continuum data from the Australia Telescope Compact Array and the Murchison Widefield Array, and Wide-field Infrared Survey Explorer infrared observations. We identify 117 X-ray sources and 280 radio sources. Of these, we find nine sources in common between the X-ray and radio catalogues, one of which is a faint central black hole with a bolometric radio luminosity similar to that of the Milky Way’s central black hole. We classify 5 objects as supernova remnant (SNR) candidates, 2 objects as likely SNRs, 17 as H II regions and 1 source as an AGN; the remaining 255 radio sources are categorised as background objects, and one X-ray source is classified as a foreground star. We find the star-formation rate (SFR) of NGC 6744 to be in the range 2.8–4.7 M⊙ yr⁻¹, signifying that the galaxy is still actively forming stars. The specific SFR of NGC 6744 is greater than that of late-type spirals such as the Milky Way, but considerably less than that of a typical starburst galaxy.