Prisons are susceptible to outbreaks. Control measures focusing on isolation and cohorting negatively affect wellbeing. We present an outbreak of coronavirus disease 2019 (COVID-19) in a large male prison in Wales, UK, October 2020 to April 2021, and discuss control measures.
We gathered case information, including demographics, staff residence postcodes, resident cell numbers, work areas/dates, test results, staff interview dates/notes and resident prison-transfer dates. Epidemiological curves were mapped by prison location. Control measures included isolation (exclusion from work or cell-isolation), cohorting (new admissions and work-area groups), asymptomatic testing (case-finding), removal of communal dining and movement restrictions. Facemask use and enhanced hygiene were already in place. Whole-genome sequencing (WGS) and interviews determined the genetic relationship between cases and the plausibility of transmission.
Of 453 cases, 53% (n = 242) were staff, most aged 25–34 years (11.5% females, 27.15% males) and symptomatic (64%). Crude attack-rate was higher in staff (29%, 95% CI 26–64%) than in residents (12%, 95% CI 9–15%).
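As a worked check of the crude attack rates, a minimal sketch follows: the abstract gives case counts and rates but not denominators, so the population sizes below are back-calculated assumptions for illustration, not figures from the paper.

```python
# Illustrative back-calculation of crude attack rates from the abstract's
# figures. Case counts are from the abstract; denominators are ASSUMED
# (derived from the reported rates: ~242/0.29 and ~211/0.12).
from math import sqrt

def attack_rate_ci(cases: int, population: int, z: float = 1.96):
    """Crude attack rate with a normal-approximation 95% CI."""
    p = cases / population
    se = sqrt(p * (1 - p) / population)
    return p, max(p - z * se, 0.0), min(p + z * se, 1.0)

staff_cases, resident_cases = 242, 453 - 242   # from the abstract
staff_n, resident_n = 835, 1760                # assumed denominators

for label, cases, n in [("staff", staff_cases, staff_n),
                        ("residents", resident_cases, resident_n)]:
    p, lo, hi = attack_rate_ci(cases, n)
    print(f"{label}: {p:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```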
Whole-genome sequencing can help differentiate multiple introductions from person-to-person transmission in prisons. It should be introduced alongside asymptomatic testing as soon as possible to control prison outbreaks. Timely epidemiological investigation, including data visualisation, allowed dynamic risk assessment and proportionate control measures, minimising the reduction in resident welfare.
Archaeological compliance is defined by state and federal legislation and the constrained, precise language in which it is written. Rules and policies operationalize the law but provide some flexibility in its interpretation and implementation. The pronounced use of “legal” and “scientific” language in archaeological compliance can come across as insensitive or offensive to some tribal members when discussing the disposition and care of the remains and belongings of their ancestors. The language we use constructs our reality and defines how we interpret our interactions and lived experience. It is therefore necessary to revise the language employed in archaeological compliance to ensure that it reflects the values of the communities whose ancestral remains and belongings these laws govern. This article describes and encourages the use of a respectful terminology, developed in conjunction with compliance professionals and tribal representatives, to restructure the language we use and to redefine our interactions as more considerate of tribal concerns for repatriation.
Among EvergreenHealth Home Care Service professionals, no coronavirus disease 2019 (COVID-19) cases were reported when they were instructed to use standard, contact, and droplet precautions with eye protection while providing home health care to patients with laboratory-confirmed severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection. These precautions might provide some level of protection against COVID-19 among home healthcare personnel.
The mental health impact of the initial years of military service is an under-researched area. This study is the first to explore mental health trajectories and associated predictors in military members across the first 3–4 years of their career to provide evidence to inform early interventions.
Methods
This prospective cohort study surveyed Australian Defence personnel (n = 5329) at four time-points across their early military career. Core outcomes were psychological distress (K10+) and posttraumatic stress symptoms [four-item PTSD Checklist (PCL-4)] with intra-individual, organizational and event-related trajectory predictors. Latent class growth analyses (LCGAs) identified subgroups within the sample that followed similar longitudinal trajectories for these outcomes, while conditional LCGAs examined the variables that influenced patterns of mental health.
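LCGA itself is usually fitted as a finite mixture of growth curves in specialized software (e.g. Mplus, or the lcmm package in R). Purely as a hedged illustration of the underlying idea — grouping people whose outcome follows a similar trajectory over the four survey waves — the sketch below clusters per-person intercept/slope estimates from simulated distress scores. All numbers are invented, and k-means is a crude stand-in for a proper mixture model.

```python
# Rough illustration of the idea behind latent class growth analysis (LCGA):
# group individuals whose outcome follows a similar trajectory over time.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n, waves = 500, 4                       # hypothetical cohort, 4 time-points
t = np.arange(waves)

# Simulate three latent patterns: resilient (flat low), worsening, recovery.
intercepts = rng.choice([10.0, 25.0], size=n, p=[0.94, 0.06])
slopes = np.where(intercepts == 25.0, -4.0,
                  np.where(rng.random(n) < 0.9, 0.0, 4.0))
scores = intercepts[:, None] + slopes[:, None] * t + rng.normal(0, 2, (n, waves))

# Summarise each person by a fitted intercept and slope, then cluster.
coefs = np.polynomial.polynomial.polyfit(t, scores.T, deg=1).T  # shape (n, 2)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coefs)
for k in range(3):
    print(f"class {k}: {np.mean(labels == k):.1%}, "
          f"mean trajectory {scores[labels == k].mean(axis=0).round(1)}")
```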
Results
Three clear trajectories emerged for psychological distress: resilient (84.0%), worsening (9.6%) and recovery (6.5%). Four trajectories emerged for post-traumatic stress, including resilient (82.5%), recovery (9.6%), worsening (5.8%) and chronic subthreshold (2.3%) trajectories. Across both outcomes, prior trauma exposure alongside modifiable factors, such as maladaptive coping styles, and increased anger and sleep difficulties were associated with the worsening and chronic subthreshold trajectories, whilst members in the resilient trajectories were more likely to be male, report increased social support from family/friends and Australian Defence Force (ADF) sources, and use adaptive coping styles.
Conclusions
The emergence of symptoms of mental health problems occurs early in the military lifecycle for a significant proportion of individuals. Modifiable factors associated with wellbeing identified in this study are ideal targets for intervention, and should be embedded and consolidated throughout the military career.
The Variables and Slow Transients Survey (VAST) on the Australian Square Kilometre Array Pathfinder (ASKAP) is designed to detect highly variable and transient radio sources on timescales from 5 s to ~5 yr. In this paper, we present the survey description, observation strategy and initial results from the VAST Phase I Pilot Survey. This pilot survey consists of ~162 h of observations conducted at a central frequency of 888 MHz between 2019 August and 2020 August, with a typical rms sensitivity of 0.24 mJy beam⁻¹ and angular resolution of 12–20 arcseconds. There are 113 fields, each of which was observed for a 12 min integration time, with between 5 and 13 repeats and cadences between 1 day and 8 months. The total area of the pilot survey footprint is 5 131 square degrees, covering six distinct regions of the sky. An initial search of two of these regions, totalling 1 646 square degrees, revealed 28 highly variable and/or transient sources. Seven of these are known pulsars, including the millisecond pulsar J2039–5617. Another seven are stars, four of which have no previously reported radio detection (SCR J0533–4257, LEHPM 2-783, UCAC3 89–412162 and 2MASS J22414436–6119311). Of the remaining 14 sources, two are active galactic nuclei, six are associated with galaxies and the other six have no multi-wavelength counterparts and are yet to be identified.
We present the data and initial results from the first pilot survey of the Evolutionary Map of the Universe (EMU), observed at 944 MHz with the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. The survey covers 270 deg² of an area covered by the Dark Energy Survey, reaching a depth of 25–30 µJy beam⁻¹ rms at a spatial resolution of ~11–18 arcsec, resulting in a catalogue of ~220 000 sources, of which ~180 000 are single-component sources. Here we present the catalogue of single-component sources, together with (where available) optical and infrared cross-identifications, classifications, and redshifts. This survey explores a new region of parameter space compared to previous surveys. Specifically, the EMU Pilot Survey has a high density of sources, and also a high sensitivity to low surface brightness emission. These properties result in the detection of types of sources that were rarely seen in or absent from previous surveys. We present some of these new results here.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of the IPCCC and ICD-11, the IPCCC ICD-11 Nomenclature, marks the first time that the clinical and administrative nomenclatures for paediatric and congenital cardiac care have been harmonized. The congenital cardiac component of ICD-11 grew from 29 congenital cardiac codes in ICD-9 and 73 in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added a further 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than ISNPCHD originally considered acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article therefore presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Cow’s milk is a naturally nutrient-dense foodstuff. A significant source of many essential nutrients, its inclusion as a component of a healthy balanced diet has long been recommended. Beyond milk’s nutritional value, an increasing body of evidence suggests that cow’s milk may confer numerous health benefits. Evidence from adult populations suggests that cow’s milk may have a role in overall dietary quality, appetite control, hydration and cognitive function. Although the evidence is limited compared with the adult literature, these benefits may be echoed in recent paediatric studies. This article therefore reviews the scientific literature to provide an evidence-based evaluation of the health benefits associated with cow’s milk consumption in primary-school-aged children (4–11 years). We focus on seven key areas related to nutrition and health, comprising nutritional status, hydration, dental and bone health, physical stature, cognitive function, and appetite control. The evidence consistently demonstrates that cow’s milk (plain and flavoured) improves nutritional status in primary-school-aged children. With some confidence, cow’s milk also appears beneficial for hydration and dental and bone health, and beneficial to neutral with respect to physical stature and appetite control. Owing to conflicting studies, it has proven difficult to reach a conclusion concerning cow’s milk and cognitive function; a level of caution should therefore be exercised when interpreting these results. All areas, however, would benefit from further robust investigation, especially in free-living school settings, to verify conclusions. Nonetheless, when the nutritional, physical and health-related impact of cow’s milk avoidance is considered, the evidence highlights the importance of increasing cow’s milk consumption.
The aim of this study was to provide insights learned from disaster research response (DR2) efforts following Hurricane Harvey in 2017 to launch DR2 activities following the Intercontinental Terminals Company (ITC) fire in Deer Park, Texas, in 2019.
Methods:
A multidisciplinary group of academic, community, and government partners launched a wide range of DR2 activities.
Results:
The DR2 response to Hurricane Harvey focused on enhancing environmental health literacy around clean-up efforts, measuring environmental contaminants in soil and water in impacted neighborhoods, and launching studies to evaluate the health impact of the disaster. The lessons learned after Harvey enabled rapid DR2 activities following the ITC fire, including air monitoring and administering surveys and in-depth interviews with affected residents.
Conclusions:
Embedding DR2 activities at academic institutions can enable rapid deployment of lessons learned from one disaster to enhance the response to subsequent disasters, even when those disasters are different. Our experience demonstrates the importance of academic institutions working with governmental and community partners to support timely disaster response efforts. Efforts enabled by such experience include providing health and safety training and consistent and reliable messaging, collecting time-sensitive and critical data in the wake of the event, and launching research to understand health impacts and improve resiliency.
Geomorphic mapping, landform and sediment analysis, and cosmogenic ¹⁰Be and ³⁶Cl ages from erratics, moraine boulders, and glacially polished bedrock help define the timing of the Wisconsinan glaciations in the Chugach Mountains of south-central Alaska. The maximum extent of glaciation in the Chugach Mountains during the last glacial period (marine isotope stages [MIS] 5d through 2) occurred at ~50 ka during MIS 3. In the Williwaw Lakes valley and Thompson Pass areas of the Chugach Mountains, moraines date to ~26.7 ± 2.4, 25.4 ± 2.4, 18.8 ± 1.6, 19.3 ± 1.7, and 17.3 ± 1.5 ka, representing times of glacial retreat. These data suggest that glaciers retreated later in the Chugach Mountains than in other regions of Alaska. Reconstructed equilibrium-line altitude depressions range from 400 to 430 m for late Wisconsinan glacial advances in the Chugach Mountains, representing a possible temperature depression of 2.1–2.3°C. These reconstructed temperature depressions suggest that during the global last glacial maximum, climate was warmer in this part of Alaska than in many other regions of Alaska and elsewhere in the world.
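The quoted temperature depression follows from scaling the equilibrium-line altitude (ELA) depression by an atmospheric lapse rate. As a worked check, assuming a lapse rate of ~5.3°C/km (an assumed value; the abstract does not state the lapse rate used):

```latex
\Delta T \approx \Gamma\,\Delta\mathrm{ELA}, \qquad
\Gamma \approx 5.3\ {}^{\circ}\mathrm{C\,km^{-1}}:\quad
5.3 \times 0.400\ \mathrm{km} \approx 2.1\ {}^{\circ}\mathrm{C}, \qquad
5.3 \times 0.430\ \mathrm{km} \approx 2.3\ {}^{\circ}\mathrm{C}.
```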
We introduce a weak Lefschetz-type result on Chow groups of complete intersections. As an application, we can reproduce some of the results in [P]. The purpose of this paper is not to reproduce all of [P] but rather to illustrate why the aforementioned weak Lefschetz result is an interesting idea worth exploiting in itself. We hope the reader agrees.
The Rapid ASKAP Continuum Survey (RACS) is the first large-area survey to be conducted with the full 36-antenna Australian Square Kilometre Array Pathfinder (ASKAP) telescope. RACS will provide a shallow model of the ASKAP sky that will aid the calibration of future deep ASKAP surveys. RACS will cover the whole sky visible from the ASKAP site in Western Australia and will cover the full ASKAP band of 700–1800 MHz. The RACS images are generally deeper than the existing NRAO VLA Sky Survey and Sydney University Molonglo Sky Survey radio surveys and have better spatial resolution. All RACS survey products will be public, including radio images (with ~15 arcsec resolution) and catalogues of about three million source components with spectral index and polarisation information. In this paper, we present a description of the RACS survey and the first data release of 903 images covering the sky south of declination +41° made over a 288-MHz band centred at 887.5 MHz.
Objective: To estimate the impact of California’s antimicrobial stewardship program (ASP) mandate on methicillin-resistant Staphylococcus aureus (MRSA) and Clostridioides difficile infection (CDI) rates in acute-care hospitals.
Population:
Centers for Medicare and Medicaid Services (CMS)–certified acute-care hospitals in the United States.
Data Sources:
2013–2017 data from the CMS Hospital Compare, Provider of Service File and Medicare Cost Reports.
Methods:
Difference-in-differences model with hospital fixed effects comparing California with all other states before and after the ASP mandate. The outcomes were standardized infection ratios (SIRs) for MRSA and CDI. We analyzed the following time-variant covariates: medical school affiliation, bed count, quality accreditation, number of changes in ownership, compliance with CMS requirements, percentage of intensive care unit beds, average length of stay, patient safety index, and 30-day readmission rate.
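A standard two-way fixed-effects difference-in-differences specification consistent with this description (notation mine, not the authors’; the year-specific estimates reported below correspond to replacing the single post-period indicator with year interactions) would be:

```latex
\mathrm{SIR}_{ht} = \alpha_h + \lambda_t
  + \beta\,(\mathrm{CA}_h \times \mathrm{Post}_t)
  + X_{ht}'\gamma + \varepsilon_{ht},
```

where α_h are hospital fixed effects, λ_t year effects, CA_h indicates California hospitals, Post_t the post-mandate period, X_ht the time-variant covariates listed above, and β the mandate effect of interest.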
Results:
In 2013, California hospitals had an average MRSA SIR of 0.79 versus 0.94 in other states, and an average CDI SIR of 1.01 versus 0.77 in other states. California hospitals had increases (P < .05) of 23%, 30%, and 20% in their MRSA SIRs in 2015, 2016, and 2017, respectively. California hospitals were associated with a 20% (P < .001) decrease in the CDI SIR only in 2017.
Conclusions:
The mandate was associated with a decrease in CDI SIR and an increase in MRSA SIR.
Background: Well-designed infection prevention programs include basic elements aimed at reducing the risk of transmission of infectious agents in healthcare settings. Although most acute-care facilities have robust infection prevention programs, data are sporadic and often lacking in other healthcare settings. Infection control assessment tools were developed by the CDC to assist health departments in assessing infection prevention preparedness across a wide spectrum of health care, including acute care, long-term care, outpatient care, and hemodialysis. Methods: The North Carolina Division of Public Health collaborated with the North Carolina Statewide Program for Infection Control and Epidemiology (SPICE) to conduct a targeted number of on-site assessments for each healthcare setting. Three experienced infection preventionists recruited facilities, conducted on-site assessments, provided detailed assessment findings, and developed educational resources. Results: The goal of 250 assessments was exceeded, with 277 on-site assessments completed across 75% of North Carolina counties (Table 1). Compliance with key observations varied by domain and type of care setting (Table 2). Conclusions: Comprehensive on-site assessments of infection prevention programs are an effective way to identify gaps or breaches in infection prevention practices. Gaps identified in acute care primarily related to competency validation; however, gaps presenting a threat to patient safety (ie, reuse of single-dose vials, noncompliance with sterilization and/or high-level disinfection processes) were identified in other care settings. Infection control assessment and response findings underscore the need for ongoing assessment, education, and collaboration among all healthcare settings.
We evaluated the impact of reflex urine culture screen results on antibiotic initiation. More patients with a positive urine screen but negative culture received antibiotics than those with a negative screen (30.5% vs 7.1%). Urine screen results may inappropriately influence antibiotic initiation in patients with a low likelihood of infection.
This study investigated metabolic, endocrine, appetite and mood responses to a maximal eating occasion in fourteen men (mean: age 28 (sd 5) years, body mass 77·2 (sd 6·6) kg and BMI 24·2 (sd 2·2) kg/m²) who completed two trials in a randomised crossover design. On each occasion, participants ate a homogenous mixed-macronutrient meal (pizza). On one occasion, they ate until ‘comfortably full’ (ad libitum) and on the other, until they ‘could not eat another bite’ (maximal). Mean energy intake was double in the maximal (13 024 (95 % CI 10 964, 15 084) kJ; 3113 (95 % CI 2620, 3605) kcal) compared with the ad libitum trial (6627 (95 % CI 5708, 7547) kJ; 1584 (95 % CI 1364, 1804) kcal). Serum insulin incremental AUC (iAUC) increased approximately 1·5-fold in the maximal compared with the ad libitum trial (mean: ad libitum 43·8 (95 % CI 28·3, 59·3) nmol/l × 240 min and maximal 67·7 (95 % CI 47·0, 88·5) nmol/l × 240 min, P < 0·01), but glucose iAUC did not differ between trials (ad libitum 94·3 (95 % CI 30·3, 158·2) mmol/l × 240 min and maximal 126·5 (95 % CI 76·9, 176·0) mmol/l × 240 min, P = 0·19). TAG iAUC was approximately 1·5-fold greater in the maximal v. ad libitum trial (ad libitum 98·6 (95 % CI 69·9, 127·2) mmol/l × 240 min and maximal 146·4 (95 % CI 88·6, 204·1) mmol/l × 240 min, P < 0·01). Total glucagon-like peptide-1, glucose-dependent insulinotropic peptide and peptide tyrosine–tyrosine iAUC were greater in the maximal compared with the ad libitum trial (P < 0·05). Total ghrelin concentrations decreased to a similar extent, but AUC was slightly lower in the maximal v. ad libitum trial (P = 0·02). There were marked differences in appetite and mood between trials; most notably, maximal eating caused a prolonged increase in lethargy. Healthy men have the capacity to eat twice the energy content required to achieve comfortable fullness at a single meal. Postprandial glycaemia is well regulated following initial overeating, with elevated postprandial insulinaemia probably contributing.
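The iAUC values quoted above are typically computed with the trapezoidal rule over the 240 min postprandial period, counting only area above the fasting baseline. A minimal sketch under that convention (the paper’s exact method may differ; the sample profile is invented):

```python
import numpy as np

def iauc(times_min, conc, baseline=None):
    """Incremental AUC by the trapezoidal rule, counting only area above
    the fasting baseline (one common convention; dips below baseline are
    ignored). Returns concentration-units x min."""
    t = np.asarray(times_min, dtype=float)
    y = np.asarray(conc, dtype=float)
    base = y[0] if baseline is None else baseline
    above = np.clip(y - base, 0.0, None)
    return float(np.sum((above[1:] + above[:-1]) / 2.0 * np.diff(t)))

# Hypothetical postprandial insulin profile (pmol/l) over 240 min.
times = [0, 30, 60, 120, 180, 240]
insulin = [60, 420, 380, 250, 140, 90]
print(f"insulin iAUC = {iauc(times, insulin) / 1000:.1f} nmol/l x 240 min")
```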
Daily use of high-potency cannabis has been reported to carry a high risk for developing a psychotic disorder. However, the evidence is mixed on whether any pattern of cannabis use is associated with a particular symptomatology in first-episode psychosis (FEP) patients.
Method
We analysed data from 901 FEP patients and 1235 controls recruited across six countries as part of the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions (EU-GEI) study. We used item response modelling to estimate two bifactor models, which included general and specific dimensions of psychotic symptoms in patients and psychotic experiences in controls. The associations between these dimensions and cannabis use were evaluated using linear mixed-effects models.
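As a minimal sketch of the bifactor structure (notation mine; the EU-GEI symptom items are likely ordinal, in which case a graded-response version applies), each item j loads on a general psychosis dimension plus exactly one specific dimension s(j):

```latex
\operatorname{logit} P(Y_{ij} = 1)
  = a_j^{G}\,\theta_i^{G} + a_j^{S}\,\theta_i^{s(j)} - b_j ,
```

where θ_i^G is person i’s general factor score, θ_i^{s(j)} the specific factor (e.g. positive, negative or depressive) to which item j belongs, and the factors are mutually orthogonal.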
Results
In patients, there was a linear relationship between the positive symptom dimension and the extent of lifetime exposure to cannabis, with daily users of high-potency cannabis having the highest score (B = 0.35; 95% CI 0.14–0.56). Moreover, negative symptoms were more common among patients who had never used cannabis than among those with any pattern of use (B = −0.22; 95% CI −0.37 to −0.07). In controls, psychotic experiences were associated with current use of cannabis but not with the extent of lifetime use. Neither patients nor controls showed differences in the depressive dimension related to cannabis use.
Conclusions
Our findings provide the first large-scale evidence that FEP patients with a history of daily use of high-potency cannabis present with more positive and less negative symptoms, compared with those who never used cannabis or used low-potency types.
A classic example of microbiome function is its role in nutrient assimilation in both plants and animals, but other, less obvious roles are becoming more apparent, particularly in driving infectious and non-infectious disease outcomes and influencing host behaviour. However, numerous biotic and abiotic factors influence the composition of these communities, and host microbiomes can be susceptible to environmental change. How microbial communities will be altered by, and mitigate, the rapid environmental change we can expect in the next few decades remains to be seen. That said, given the enormous range of functional diversity conferred by microbes, there is currently something of a revolution in microbial bioengineering and biotechnology aimed at real-world problems, including human and wildlife disease and crop and biofuel production. All of these concepts are explored in further detail throughout the book.