Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of the IPCCC and ICD-11, the IPCCC ICD-11 Nomenclature, marks the first time that the clinical nomenclature and the administrative nomenclature for paediatric and congenital cardiac care have been harmonized. The congenital cardiac component of ICD-11 grew from 29 congenital cardiac codes in ICD-9 and 73 in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added a further 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than ISNPCHD originally considered acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article therefore presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Cow’s milk is a naturally nutrient-dense foodstuff. A significant source of many essential nutrients, it has long been recommended as a component of a healthy balanced diet. Beyond milk’s nutritional value, an increasing body of evidence suggests that cow’s milk may confer numerous health benefits. Evidence from adult populations suggests that cow’s milk may have a role in overall dietary quality, appetite control, hydration and cognitive function. Although the evidence is limited compared with the adult literature, these benefits may be echoed in recent paediatric studies. This article, therefore, reviews the scientific literature to provide an evidence-based evaluation of the health benefits associated with cow’s milk consumption in primary-school-aged children (4–11 years). We focus on seven key areas related to nutrition and health: nutritional status, hydration, dental health, bone health, physical stature, cognitive function, and appetite control. The evidence consistently demonstrates that cow’s milk (plain and flavoured) improves nutritional status in primary-school-aged children. With some confidence, cow’s milk also appears beneficial for hydration and dental and bone health, and beneficial to neutral concerning physical stature and appetite. Because studies conflict, it has proven difficult to reach a conclusion concerning cow’s milk and cognitive function; a level of caution should therefore be exercised when interpreting these results. All areas, however, would benefit from further robust investigation, especially in free-living school settings, to verify conclusions. Nonetheless, when the nutritional, physical, and health-related impact of cow’s milk avoidance is considered, the evidence highlights the importance of increasing cow’s milk consumption.
The aim of this study was to describe how insights gained from disaster research response (DR2) efforts following Hurricane Harvey in 2017 were applied to launch DR2 activities following the Intercontinental Terminals Company (ITC) fire in Deer Park, Texas, in 2019.
A multidisciplinary group of academic, community, and government partners launched a wide range of DR2 activities.
The DR2 response to Hurricane Harvey focused on enhancing environmental health literacy around clean-up efforts, measuring environmental contaminants in soil and water in impacted neighborhoods, and launching studies to evaluate the health impact of the disaster. The lessons learned after Harvey enabled rapid DR2 activities following the ITC fire, including air monitoring and administering surveys and in-depth interviews with affected residents.
Embedding DR2 activities at academic institutions can enable rapid deployment of lessons learned from one disaster to enhance the response to subsequent disasters, even when those disasters are different. Our experience demonstrates the importance of academic institutions working with governmental and community partners to support timely disaster response efforts. Efforts enabled by such experience include providing health and safety training and consistent and reliable messaging, collecting time-sensitive and critical data in the wake of the event, and launching research to understand health impacts and improve resiliency.
Debate about the nature of climate and the magnitude of ecological change across Australia during the last glacial maximum (LGM; 26.5–19 ka) persists despite considerable research into the late Pleistocene. This is partly due to a lack of detailed paleoenvironmental records and reliable chronological frameworks. Geochemical and geochronological analyses of a 60 ka sedimentary record from Brown Lake, subtropical Queensland, are presented and considered in the context of climate-controlled environmental change. Optically stimulated luminescence dating of dune crests adjacent to prominent wetlands across North Stradbroke Island (Minjerribah) returned a mean age of 119.9 ± 10.6 ka, indicating relative dune stability soon after formation in Marine Isotope Stage 5. A synthesis of wetland sediment geochemistry across the island was used to identify dust accumulation and applied as an aridification proxy over the last glacial-interglacial cycle. Dust deposition increased from ca. 50 ka, with the highest influx occurring in the lead-up to the LGM. Local variation in the accumulation of exogenic material highlights the complexities of comparing sedimentary records and the need for robust age models. An inter-site comparison suggests enhanced moisture stress regionally during the last glaciation and throughout the LGM, returning to a more positive moisture balance ca. 8 ka.
Geomorphic mapping, landform and sediment analysis, and cosmogenic 10Be and 36Cl ages from erratics, moraine boulders, and glacially polished bedrock help define the timing of the Wisconsinan glaciations in the Chugach Mountains of south-central Alaska. The maximum extent of glaciation in the Chugach Mountains during the last glacial period (marine isotope stages [MIS] 5d through 2) occurred at ~50 ka during MIS 3. In the Williwaw Lakes valley and Thompson Pass areas of the Chugach Mountains, moraines date to ~26.7 ± 2.4, 25.4 ± 2.4, 18.8 ± 1.6, 19.3 ± 1.7, and 17.3 ± 1.5 ka, representing times of glacial retreat. These data suggest that glaciers retreated later in the Chugach Mountains than in other regions of Alaska. Reconstructed equilibrium-line altitude depressions range from 400 to 430 m for late Wisconsinan glacial advances in the Chugach Mountains, representing a possible temperature depression of 2.1–2.3°C. These reconstructed temperature depressions suggest that climate was warmer in this part of Alaska than in many other regions throughout Alaska and elsewhere in the world during the global last glacial maximum.
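The step from an equilibrium-line altitude (ELA) depression to a temperature depression is a lapse-rate calculation. As a back-of-envelope check only (the ~5.25°C/km near-surface lapse rate is our assumption, not a value stated above):

```python
# Temperature depression = lapse rate x ELA depression.
lapse_rate_c_per_km = 5.25           # assumed near-surface lapse rate
for d_ela_m in (400, 430):           # reconstructed ELA depressions (m)
    d_t = lapse_rate_c_per_km * d_ela_m / 1000
    print(f"ELA depression {d_ela_m} m -> ~{d_t:.1f} deg C cooler")
# -> ~2.1 and ~2.3 deg C, matching the quoted range
```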
We introduce a weak Lefschetz-type result on Chow groups of complete intersections. As an application, we can reproduce some of the results in [P]. The purpose of this paper is not to reproduce all of [P] but rather to illustrate why the aforementioned weak Lefschetz result is an interesting idea worth exploiting in itself. We hope the reader agrees.
The Rapid ASKAP Continuum Survey (RACS) is the first large-area survey to be conducted with the full 36-antenna Australian Square Kilometre Array Pathfinder (ASKAP) telescope. RACS will provide a shallow model of the ASKAP sky that will aid the calibration of future deep ASKAP surveys. RACS will cover the whole sky visible from the ASKAP site in Western Australia and will cover the full ASKAP band of 700–1800 MHz. The RACS images are generally deeper than the existing NRAO VLA Sky Survey and Sydney University Molonglo Sky Survey radio surveys and have better spatial resolution. All RACS survey products will be public, including radio images (with ∼15 arcsec resolution) and catalogues of about three million source components with spectral index and polarisation information. In this paper, we present a description of the RACS survey and the first data release of 903 images covering the sky south of declination +41° made over a 288-MHz band centred at 887.5 MHz.
To estimate the impact of California’s antimicrobial stewardship program (ASP) mandate on methicillin-resistant Staphylococcus aureus (MRSA) and Clostridioides difficile infection (CDI) rates in acute-care hospitals.
Centers for Medicare and Medicaid Services (CMS)–certified acute-care hospitals in the United States.
2013–2017 data from the CMS Hospital Compare, Provider of Service File and Medicare Cost Reports.
Difference-in-difference model with hospital fixed effects to compare California with all other states before and after the ASP mandate. The outcomes were standardized infection ratios (SIRs) for MRSA and CDI. We analyzed the following time-variant covariates: medical school affiliation, bed count, quality accreditation, number of changes in ownership, compliance with CMS requirements, percentage of intensive care unit beds, average length of stay, patient safety index, and 30-day readmission rate.
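As a rough illustration of this design, a hedged sketch of the difference-in-difference regression follows; the file and all column names (mrsa_sir, california, post_mandate, hospital_id, and the covariates) are hypothetical stand-ins, not the authors' data.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hospital_year_panel.csv")   # one row per hospital-year

# DiD term: California hospitals in post-mandate years.
df["treated_post"] = df["california"] * df["post_mandate"]

model = smf.ols(
    "mrsa_sir ~ treated_post + C(hospital_id) + C(year)"
    " + med_school + bed_count + pct_icu_beds + avg_los + readmit_30d",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["hospital_id"]})

print(model.params["treated_post"])           # DiD estimate on the SIR
```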
In 2013, California hospitals had an average MRSA SIR of 0.79 versus 0.94 in other states, and an average CDI SIR of 1.01 versus 0.77 in other states. California hospitals had increases (P < .05) of 23%, 30%, and 20% in their MRSA SIRs in 2015, 2016, and 2017, respectively. A 20% decrease (P < .001) in the CDI SIR among California hospitals was observed only in 2017.
The mandate was associated with a decrease in CDI SIR and an increase in MRSA SIR.
Background: Well-designed infection prevention programs include basic elements aimed at reducing the risk of transmission of infectious agents in healthcare settings. Although most acute-care facilities have robust infection prevention programs, data are sporadic and often lacking in other healthcare settings. Infection control assessment tools were developed by the CDC to assist health departments in assessing infection prevention preparedness across a wide spectrum of health care including acute care, long-term care, outpatient care, and hemodialysis. Methods: The North Carolina Division of Public Health collaborated with the North Carolina Statewide Program for Infection Control and Epidemiology (SPICE) to conduct a targeted number of on-site assessments for each healthcare setting. Three experienced infection preventionists recruited facilities, conducted on-site assessments, provided detailed assessment findings, and developed educational resources. Results: The goal of 250 assessments was exceeded, with 277 on-site assessments completed across 75% of North Carolina counties (Table 1). Compliance with key observations varied by domain and type of care setting (Table 2). Conclusions: Comprehensive on-site assessments of infection prevention programs are an effective way to identify gaps or breaches in infection prevention practices. Gaps identified in acute care primarily related to competency validation; however, gaps presenting a threat to patient safety (ie, reuse of single-dose vials, noncompliance with sterilization and/or high-level disinfection processes) were identified in other care settings. Infection control assessment and response findings underscore the need for ongoing assessment, education, and collaboration among all healthcare settings.
We evaluated the impact of reflex urine culture screen results on antibiotic initiation. More patients with a positive urine screen but negative culture received antibiotics than those with a negative screen (30.5% vs 7.1%). Urine screen results may inappropriately influence antibiotic initiation in patients with a low likelihood of infection.
This study investigated metabolic, endocrine, appetite and mood responses to a maximal eating occasion in fourteen men (mean: age 28 (sd 5) years, body mass 77·2 (sd 6·6) kg and BMI 24·2 (sd 2·2) kg/m2) who completed two trials in a randomised crossover design. On each occasion, participants ate a homogenous mixed-macronutrient meal (pizza). On one occasion, they ate until ‘comfortably full’ (ad libitum) and on the other, until they ‘could not eat another bite’ (maximal). Mean energy intake was double in the maximal (13 024 (95 % CI 10 964, 15 084) kJ; 3113 (95 % CI 2620, 3605) kcal) compared with the ad libitum trial (6627 (95 % CI 5708, 7547) kJ; 1584 (95 % CI 1364, 1804) kcal). Serum insulin incremental AUC (iAUC) increased approximately 1·5-fold in the maximal compared with ad libitum trial (mean: ad libitum 43·8 (95 % CI 28·3, 59·3) nmol/l × 240 min and maximal 67·7 (95 % CI 47·0, 88·5) nmol/l × 240 min, P < 0·01), but glucose iAUC did not differ between trials (ad libitum 94·3 (95 % CI 30·3, 158·2) mmol/l × 240 min and maximal 126·5 (95 % CI 76·9, 176·0) mmol/l × 240 min, P = 0·19). TAG iAUC was approximately 1·5-fold greater in the maximal v. ad libitum trial (ad libitum 98·6 (95 % CI 69·9, 127·2) mmol/l × 240 min and maximal 146·4 (95 % CI 88·6, 204·1) mmol/l × 240 min, P < 0·01). Total glucagon-like peptide-1, glucose-dependent insulinotropic peptide and peptide tyrosine–tyrosine iAUC were greater in the maximal compared with ad libitum trial (P < 0·05). Total ghrelin concentrations decreased to a similar extent, but AUC was slightly lower in the maximal v. ad libitum trial (P = 0·02). There were marked differences in appetite and mood between trials, most notably maximal eating caused a prolonged increase in lethargy. Healthy men have the capacity to eat twice the energy content required to achieve comfortable fullness at a single meal. Postprandial glycaemia is well regulated following initial overeating, with elevated postprandial insulinaemia probably contributing.
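The iAUC values reported above are conventionally computed as the area above the fasting baseline by the trapezoidal rule; a minimal sketch with invented sample values follows (the study's exact handling of dips below baseline is not stated here).

```python
import numpy as np

def incremental_auc(times_min, conc, baseline=None):
    """Area above the fasting baseline by the trapezoidal rule.
    Values below baseline are clipped to zero (a common convention;
    the study's exact rule is not stated in the abstract)."""
    conc = np.asarray(conc, dtype=float)
    base = conc[0] if baseline is None else baseline
    return np.trapz(np.clip(conc - base, 0.0, None), times_min)

# Invented serum insulin values (nmol/l) over 240 min postprandially.
t = [0, 30, 60, 120, 180, 240]
insulin = [0.05, 0.40, 0.55, 0.35, 0.20, 0.10]
print(incremental_auc(t, insulin))  # units: nmol/l x min
```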
Daily use of high-potency cannabis has been reported to carry a high risk for developing a psychotic disorder. However, the evidence is mixed on whether any pattern of cannabis use is associated with a particular symptomatology in first-episode psychosis (FEP) patients.
We analysed data from 901 FEP patients and 1235 controls recruited across six countries, as part of the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions (EU-GEI) study. We used item response modelling to estimate two bifactor models, which included general and specific dimensions of psychotic symptoms in patients and psychotic experiences in controls. The associations between these dimensions and cannabis use were evaluated using linear mixed-effects models.
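As a simplified sketch of the second step of this analysis (omitting the bifactor scoring), a linear mixed-effects model with recruitment site as a random intercept might look like the following; all column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("eu_gei_patients.csv")   # hypothetical extract

# Positive-symptom dimension score regressed on lifetime cannabis use,
# with a random intercept per recruitment site.
m = smf.mixedlm(
    "positive_dim ~ C(cannabis_use, Treatment('never')) + age + sex",
    data=df,
    groups=df["site"],
).fit()
print(m.summary())
```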
In patients, there was a linear relationship between the positive symptom dimension and the extent of lifetime exposure to cannabis, with daily users of high-potency cannabis having the highest score (B = 0.35; 95% CI 0.14–0.56). Moreover, negative symptoms were more common among patients who never used cannabis compared with those with any pattern of use (B = −0.22; 95% CI −0.37 to −0.07). In controls, psychotic experiences were associated with current use of cannabis but not with the extent of lifetime use. Neither patients nor controls presented differences in the depressive dimension related to cannabis use.
Our findings provide the first large-scale evidence that FEP patients with a history of daily use of high-potency cannabis present with more positive and less negative symptoms, compared with those who never used cannabis or used low-potency types.
A classic example of microbiome function is its role in nutrient assimilation in both plants and animals, but other less obvious roles are becoming more apparent, particularly in terms of driving infectious and non-infectious disease outcomes and influencing host behaviour. However, numerous biotic and abiotic factors influence the composition of these communities, and host microbiomes can be susceptible to environmental change. How microbial communities will be altered by, and mitigate, the rapid environmental change we can expect in the next few decades remains to be seen. That said, given the enormous range of functional diversity conferred by microbes, there is currently something of a revolution in microbial bioengineering and biotechnology aimed at addressing real-world problems, including human and wildlife disease and crop and biofuel production. All of these concepts are explored in further detail throughout the book.
Healthcare personnel (HCP) who perform invasive procedures and are living with HIV or hepatitis B have been required to self-notify the North Carolina state health department since 1992. State-coordinated review of HCP uses a panel of experts to evaluate transmission risk and recommend infection prevention measures. We describe how this practice balances HCP privacy with patient safety and health.
To measure the association between statewide adoption of the Centers for Disease Control and Prevention’s (CDC’s) Core Elements for Hospital Antimicrobial Stewardship Programs (Core Elements) and hospital-associated methicillin-resistant Staphylococcus aureus bacteremia (MRSA) and Clostridioides difficile infection (CDI) rates in the United States. We hypothesized that states with a higher percentage of reported compliance with the Core Elements have significantly lower MRSA and CDI rates.
All US states.
Observational longitudinal study.
We used 2014–2016 data from Hospital Compare, Provider of Service files, Medicare cost reports, and the CDC’s Patient Safety Atlas website. Outcomes were MRSA standardized infection ratio (SIR) and CDI SIR. The key explanatory variable was the percentage of hospitals that meet the Core Elements in each state. We estimated state and time fixed-effects models with time-variant controls, and we weighted our analyses for the number of hospitals in the state.
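A hedged sketch of a state-by-year fixed-effects regression weighted by hospital count is shown below; the file and column names are illustrative assumptions, not the study's actual variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("state_year_panel.csv")   # hypothetical state-year data

# State and year fixed effects, weighted by the number of hospitals.
m = smf.wls(
    "cdi_sir ~ pct_core_elements + C(state) + C(year)",
    data=panel,
    weights=panel["n_hospitals"],
).fit()
print(m.params["pct_core_elements"])
```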
The percentage of hospitals reporting compliance with the Core Elements between 2014 and 2016 increased in all states. A 1% increase in reported ASP compliance was associated with a 0.3% decrease (P < .01) in CDIs in 2016 relative to 2014. We did not find an association for MRSA infections.
Increasing documentation of the Core Elements may be associated with decreases in the CDI SIR. We did not find evidence of such an association for the MRSA SIR, probably due to the short study period and the variety of stewardship strategies that ASPs may encompass.
Although death by neurologic criteria (brain death) is legally recognized throughout the United States, state laws and clinical practice vary concerning three key issues: (1) the medical standards used to determine death by neurologic criteria, (2) management of family objections before determination of death by neurologic criteria, and (3) management of religious objections to declaration of death by neurologic criteria. The American Academy of Neurology and other medical stakeholder organizations involved in the determination of death by neurologic criteria have undertaken concerted action to address variation in clinical practice in order to ensure the integrity of brain death determination. To complement this effort, state policymakers must revise legislation on the use of neurologic criteria to declare death. We review the legal history and current laws regarding neurologic criteria to declare death and offer proposed revisions to the Uniform Determination of Death Act (UDDA) and the rationale for these recommendations.
It has been hypothesised that refugees have an increased risk of suicide.
To investigate whether risk of suicide is higher among refugees compared with non-refugee migrants from the same areas of origin and with the Swedish-born population, and to examine whether suicide rates among migrants converge to the Swedish-born population over time.
A population-based cohort design using linked national registers to follow 1 457 898 people born between 1 January 1970 and 31 December 1984, classified by migrant status as refugees, non-refugee migrants or Swedish-born. Participants were followed from their 16th birthday or date of arrival in Sweden until death, emigration or 31 December 2015, whichever came first. Cox regression models estimated adjusted hazard ratios for suicide by migrant status, controlling for age, gender, region of origin and income.
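A minimal sketch of such a Cox model using the lifelines package follows; the column names are hypothetical, and region-of-origin adjustment is omitted for brevity.

```python
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("cohort.csv")   # hypothetical register extract
cols = ["follow_up_years", "suicide", "refugee", "non_refugee_migrant",
        "age_at_entry", "female", "income_decile"]

# follow_up_years: 16th birthday/arrival to death, emigration, or
# 31 Dec 2015; suicide = 1 for death by suicide, 0 = censored.
cph = CoxPHFitter()
cph.fit(cohort[cols], duration_col="follow_up_years", event_col="suicide")
cph.print_summary()   # hazard ratios with 95% CIs
```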
There were no significant differences in suicide risk between refugee and non-refugee migrants (hazard ratio 1.28, 95% CI 0.93–1.76), and both groups had a lower risk of suicide than the Swedish-born population. During their first 5 years in Sweden, no migrants died by suicide; however, after 21–31 years their suicide risk was equivalent to that of the Swedish-born population (hazard ratio 0.94, 95% CI 0.79–1.22). After adjustment for income, this risk was significantly lower for migrants than for the Swedish-born population.
Being a refugee was not an additional risk factor for suicide. Our findings regarding temporal changes in suicide risk suggest that acculturation and socioeconomic deprivation may account for a convergence of suicide risk between migrants and the host population over time.
Point-of-care ultrasound (POCUS) is used increasingly during resuscitation. The aim of this study was to assess whether combining POCUS and electrocardiogram (ECG) rhythm findings better predicts outcomes during cardiopulmonary resuscitation in the emergency department (ED).
We completed a health records review on ED cardiac arrest patients who underwent POCUS. Primary outcome measurements included return of spontaneous circulation (ROSC), survival to hospital admission, and survival to hospital discharge.
POCUS was performed on 180 patients; 45 patients (25.0%; 19.2%–31.8%) demonstrated cardiac activity on initial ECG, and 21 (11.7%; 7.7%–17.2%) had cardiac activity on initial POCUS; 47 patients (26.1%; 20.2%–33.0%) achieved ROSC, 18 (10.0%; 6.3%–15.3%) survived to admission, and 3 (1.7%; 0.3%–5.0%) survived to hospital discharge. As a predictor of failure to achieve ROSC, ECG had a sensitivity of 82.7% (95% CI 75.2%–88.7%) and a specificity of 46.8% (32.1%–61.9%). Overall, POCUS had a higher sensitivity of 96.2% (91.4%–98.8%) but a similar specificity of 34.0% (20.9%–49.3%). In patients with ECG-asystole, POCUS had a sensitivity of 98.18% (93.59%–99.78%) and a specificity of 16.00% (4.54%–36.08%). In patients with pulseless electrical activity, POCUS had a sensitivity of 86.96% (66.41%–97.22%) and a specificity of 54.55% (32.21%–75.61%). Similar patterns were seen for survival to admission and discharge. Only 0.8% (0.0–4.7%) of patients with ECG-asystole and standstill on POCUS survived to hospital discharge.
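These sensitivity and specificity figures follow from a standard 2×2 table. The counts below are back-calculated from the quoted ECG figures (180 patients, 47 with ROSC, 45 with ECG activity) for illustration; they are our reconstruction, not reported data.

```python
def sens_spec(tp, fn, tn, fp):
    """Test = absence of cardiac activity; condition predicted =
    failure to achieve ROSC."""
    return tp / (tp + fn), tn / (tn + fp)

# Back-calculated counts: 133 patients without ROSC, 47 with ROSC.
sens, spec = sens_spec(tp=110, fn=23, tn=22, fp=25)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
# -> sensitivity 82.7%, specificity 46.8%, matching the ECG figures above
```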
The absence of cardiac activity on POCUS, or on both ECG and POCUS together, better predicts negative outcomes in cardiac arrest than ECG alone. No test reliably predicted survival.
Peripheral low-grade inflammation in depression is increasingly seen as a therapeutic target. We aimed to establish the prevalence of low-grade inflammation in depression, using different C-reactive protein (CRP) levels, through a systematic literature review and meta-analysis.
We searched the PubMed database from its inception to July 2018, and selected studies that assessed depression using a validated tool/scale, and allowed the calculation of the proportion of patients with low-grade inflammation (CRP >3 mg/L) or elevated CRP (>1 mg/L).
After quality assessment, 37 studies comprising 13 541 depressed patients and 155 728 controls were included. Based on the meta-analysis of 30 studies, the prevalence of low-grade inflammation (CRP >3 mg/L) in depression was 27% (95% CI 21–34%); this prevalence was not associated with sample source (inpatient, outpatient or population-based), antidepressant treatment, participant age, BMI or ethnicity. Based on the meta-analysis of 17 studies of depression and matched healthy controls, the odds ratio for low-grade inflammation in depression was 1.46 (95% CI 1.22–1.75). The prevalence of elevated CRP (>1 mg/L) in depression was 58% (95% CI 47–69%), and the meta-analytic odds ratio for elevated CRP in depression compared with controls was 1.47 (95% CI 1.18–1.82).
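Pooled prevalences of this kind typically come from a random-effects meta-analysis of logit-transformed proportions. Below is a minimal DerSimonian-Laird sketch with invented study counts; the authors' exact model is not specified here.

```python
import numpy as np

def pool_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooling of logit proportions."""
    e, n = np.asarray(events, float), np.asarray(totals, float)
    y = np.log(e / (n - e))                  # logit prevalence per study
    v = 1.0 / e + 1.0 / (n - e)              # approximate variance
    w = 1.0 / v
    mu_fixed = np.sum(w * y) / w.sum()
    q = np.sum(w * (y - mu_fixed) ** 2)
    c = w.sum() - np.sum(w ** 2) / w.sum()
    tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
    w_re = 1.0 / (v + tau2)
    mu = np.sum(w_re * y) / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    expit = lambda x: 1.0 / (1.0 + np.exp(-x))
    return expit(mu), expit(mu - 1.96 * se), expit(mu + 1.96 * se)

# Invented example: depressed patients with CRP >3 mg/L per study.
print(pool_prevalence(events=[30, 55, 41], totals=[120, 180, 150]))
```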
About a quarter of patients with depression show evidence of low-grade inflammation, and over half of patients show mildly elevated CRP levels. There are significant differences in the prevalence of low-grade inflammation between patients and matched healthy controls. These findings suggest that inflammation could be relevant to a large number of patients with depression.
Recent studies suggest psychotic and eating disorders can be comorbid and could have shared genetic liability. However, this comorbidity has been overlooked in the epidemiological literature.
To test whether polygenic risk scores (PRS) for schizophrenia are associated with disordered eating behaviours and body mass index (BMI) in the general population.
Using data from the Avon Longitudinal Study of Parents and Children and random-effects logistic and linear regression models, we investigated the association between PRS for schizophrenia and self-reported disordered eating behaviours (binge eating, purging, fasting and excessive exercise) and BMI at 14, 16 and 18 years.
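A simplified sketch of the per-s.d. PRS associations follows, using plain logistic and linear models in place of the random-effects models fitted across the three ages; the dataset and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("alspac_subset.csv")        # hypothetical extract
df["prs_z"] = (df["scz_prs"] - df["scz_prs"].mean()) / df["scz_prs"].std()

# Odds of binge eating per s.d. increase in schizophrenia PRS.
logit = smf.logit("binge_eating ~ prs_z + female", data=df).fit()
print(np.exp(logit.params["prs_z"]))         # odds ratio per s.d.

# Linear association between PRS and BMI.
ols = smf.ols("bmi ~ prs_z + female", data=df).fit()
print(ols.params["prs_z"])
```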
Of the 6920 children with available genetic data, 4473 (64.6%) and 5069 (73.3%) had at least one disordered eating and one BMI outcome measurement, respectively. An s.d. increase in PRS was associated with greater odds of having binge eating behaviours (odds ratio, 1.36; 95% CI 1.16–1.60) and lower BMI (coefficient, −0.03; 95% CI, −0.06 to −0.01).
Our findings suggest the presence of shared genetic risk between schizophrenia and binge eating behaviours. Intermediate phenotypes such as impaired social cognition and irritability, previously shown to be positively correlated in this sample with schizophrenia PRS, could represent risk factors for both phenotypes. Shared genetic liability between binge eating and schizophrenia could also explain higher rates of metabolic syndrome in individuals with schizophrenia, as binge eating could be a mediator of this association in drug-naïve individuals. The finding of an association between greater PRS and lower BMI, although consistent with existing epidemiological and genetic literature, requires further investigation.