Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within ICD-11. This unification of the IPCCC and ICD-11, the IPCCC ICD-11 Nomenclature, marks the first time that the clinical and administrative nomenclatures for paediatric and congenital cardiac care have been harmonized. The congenital cardiac component of ICD-11 grew from 29 codes in ICD-9 and 73 codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added a further 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than ISNPCHD originally thought acceptable. The total number of paediatric and congenital cardiac terms in ICD-11 is therefore 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article therefore presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD recognize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC published in 2017 has since changed and is now replaced by this 2021 version. ISNPCHD will publish further updated versions of the IPCCC as it continues to evolve.
Suicide accounts for 2.2% of all years of life lost worldwide. We aimed to establish whether infectious epidemics are associated with any changes in the incidence of suicide or the period prevalence of self-harm, or thoughts of suicide or self-harm, with a secondary objective of establishing the frequency of these outcomes.
In this systematic review and meta-analysis, MEDLINE, Embase, PsycINFO and AMED were searched from inception to 9 September 2020. Studies of infectious epidemics reporting outcomes of (a) death by suicide, (b) self-harm or (c) thoughts of suicide or self-harm were identified. A random-effects model meta-analysis for the period prevalence of thoughts of suicide or self-harm was conducted.
In total, 1354 studies were screened, with 57 meeting eligibility criteria; of these, 7 described death by suicide, 9 described self-harm, and 45 described thoughts of suicide or self-harm. The observation period ranged from 1910 to 2020 and included epidemics of Spanish Flu, severe acute respiratory syndrome (SARS), human monkeypox, Ebola virus disease and coronavirus disease 2019 (COVID-19). Regarding death by suicide, data with a clear longitudinal comparison group were available for only two epidemics: SARS in Hong Kong, finding an increase in suicides among the elderly, and COVID-19 in Japan, finding no change in suicides among children and adolescents. In terms of self-harm, five studies examined emergency department attendances in epidemic and non-epidemic periods, of which four found no difference and one showed a reduction during the epidemic. In studies of thoughts of suicide or self-harm, one large survey showed a substantial increase in period prevalence compared to non-epidemic periods, but smaller studies showed no difference. As a secondary objective, a meta-analysis of thoughts of suicide and self-harm found a pooled prevalence of 8.0% overall (95% confidence interval (CI) 5.2–12.0%; 14 820 of 99 238 cases in 24 studies) over time periods ranging from seven days to six months. The quality assessment found 42 studies were of low quality, nine of moderate quality and six of high quality.
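The pooled prevalence reported above comes from a random-effects meta-analysis of proportions. A minimal sketch of that kind of pooling (DerSimonian–Laird on the logit scale) is shown below; the study counts in the usage example are illustrative, not the review's actual data:

```python
import math

def pooled_prevalence(studies):
    """Random-effects (DerSimonian-Laird) pooling of proportions.

    studies: list of (events, n) tuples. Each proportion is logit-
    transformed, pooled with inverse-variance weights inflated by the
    between-study variance tau^2, then back-transformed.
    """
    logits, variances = [], []
    for events, n in studies:
        # logit transform; approximate variance is 1/events + 1/(n - events)
        logits.append(math.log(events / (n - events)))
        variances.append(1 / events + 1 / (n - events))

    # fixed-effect weights and pooled estimate (needed for Cochran's Q)
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, logits)) / sum(w)

    # Cochran's Q and between-study variance tau^2 (truncated at zero)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, logits))
    df = len(studies) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0

    # random-effects weights and pooled logit, back-transformed to a proportion
    w_re = [1 / (v + tau2) for v in variances]
    pooled_logit = sum(wi * yi for wi, yi in zip(w_re, logits)) / sum(w_re)
    return 1 / (1 + math.exp(-pooled_logit))

# illustrative studies: (cases reporting thoughts of suicide/self-harm, sample size)
print(pooled_prevalence([(80, 1000), (120, 1000), (50, 500)]))
```

Pooling on the logit scale keeps the estimate inside (0, 1); published meta-analyses often use the closely related Freeman–Tukey transformation instead.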
There is little robust evidence on the association of infectious epidemics with suicide, self-harm and thoughts of suicide or self-harm. There was an increase in suicides among the elderly in Hong Kong during SARS and no change in suicides among young people in Japan during COVID-19, but it is unclear how far these findings may be generalised. The development of up-to-date self-harm and suicide statistics to monitor the effect of the current pandemic is an urgent priority.
The purpose of this study was to investigate differences in the perception of disaster issues between disaster directors and general health care providers in Gyeonggi Province, South Korea.
The Gyeonggi provincial committee distributed a survey to acute care facility personnel. Survey topics included awareness of general disaster issues, hospital preparedness, and training priorities. The questionnaire comprised multiple choices and items scored on a 10-point Likert scale. We analyzed the discrepancies and characteristics of the responses.
Completed surveys were returned by 43 (67%) of 64 directors and 145 (55.6%) of 261 health care providers. In the field of general awareness, the topic of how to triage in disaster response showed the greatest discrepancies. In the domain of hospital-level disaster preparedness, individual opinions varied most within the topics of incident command and manual preparation. Responses on accepting additional patients in a disaster situation showed the biggest differences (>21 versus 6–10).
In this study, perceptions of disaster topics showed both discrepancies and concordances between disaster directors and general health care providers. This analysis provides baseline information for developing better training programs addressing the region-specific core competencies, knowledge, and skills required for an effective response.
Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. In order to decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but increased for subjects who had activation (48% versus 58%; 95% CI for the difference, 4.6% to 15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The intervention also decreased variability between medical centers. There was no associated change in average FMC2B.
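The headline statistics above, a positive predictive value and an odds ratio, can both be derived from a 2×2 table of activations versus confirmed PCI/CABG. A minimal sketch, using hypothetical counts rather than the study's data:

```python
import math

def ppv(tp, fp):
    """Positive predictive value: true activations / all activations."""
    return tp / (tp + fp)

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table [[a, b], [c, d]] with a
    Wald-style 95% CI on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical example: 20 of 100 post-intervention activations were true
# positives versus 10 of 100 pre-intervention activations
print(ppv(20, 80))                      # hypothetical post-period PPV
print(odds_ratio_ci(20, 80, 10, 90))    # (OR, lower, upper)
```

The study's reported OR of 1.4 likely comes from a model adjusting for confounders, so an unadjusted 2×2 calculation like this would not reproduce it exactly.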
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
This study investigated metabolic, endocrine, appetite and mood responses to a maximal eating occasion in fourteen men (mean: age 28 (sd 5) years, body mass 77·2 (sd 6·6) kg and BMI 24·2 (sd 2·2) kg/m²) who completed two trials in a randomised crossover design. On each occasion, participants ate a homogenous mixed-macronutrient meal (pizza). On one occasion, they ate until ‘comfortably full’ (ad libitum) and on the other, until they ‘could not eat another bite’ (maximal). Mean energy intake was double in the maximal (13 024 (95 % CI 10 964, 15 084) kJ; 3113 (95 % CI 2620, 3605) kcal) compared with the ad libitum trial (6627 (95 % CI 5708, 7547) kJ; 1584 (95 % CI 1364, 1804) kcal). Serum insulin incremental AUC (iAUC) increased approximately 1·5-fold in the maximal compared with the ad libitum trial (mean: ad libitum 43·8 (95 % CI 28·3, 59·3) nmol/l × 240 min and maximal 67·7 (95 % CI 47·0, 88·5) nmol/l × 240 min, P < 0·01), but glucose iAUC did not differ between trials (ad libitum 94·3 (95 % CI 30·3, 158·2) mmol/l × 240 min and maximal 126·5 (95 % CI 76·9, 176·0) mmol/l × 240 min, P = 0·19). TAG iAUC was approximately 1·5-fold greater in the maximal v. ad libitum trial (ad libitum 98·6 (95 % CI 69·9, 127·2) mmol/l × 240 min and maximal 146·4 (95 % CI 88·6, 204·1) mmol/l × 240 min, P < 0·01). Total glucagon-like peptide-1, glucose-dependent insulinotropic peptide and peptide tyrosine–tyrosine iAUC were greater in the maximal compared with the ad libitum trial (P < 0·05). Total ghrelin concentrations decreased to a similar extent, but AUC was slightly lower in the maximal v. ad libitum trial (P = 0·02). There were marked differences in appetite and mood between trials; most notably, maximal eating caused a prolonged increase in lethargy. Healthy men have the capacity to eat twice the energy content required to achieve comfortable fullness at a single meal.
Postprandial glycaemia is well regulated following initial overeating, with elevated postprandial insulinaemia probably contributing.
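The insulin, glucose and TAG responses above are summarized as incremental AUC (iAUC): the trapezoidal area under the concentration-time curve above the baseline value. A minimal sketch using the common convention of ignoring area below baseline (the sampling times and concentrations below are illustrative, not the study's data):

```python
def incremental_auc(times, values):
    """Incremental AUC: trapezoidal area above the baseline (first) value.

    Segments below baseline contribute zero, a common iAUC convention;
    units are concentration x time (e.g. nmol/l x min).
    """
    baseline = values[0]
    area = 0.0
    for (t0, v0), (t1, v1) in zip(zip(times, values), zip(times[1:], values[1:])):
        # clamp each endpoint's increment above baseline at zero
        y0 = max(v0 - baseline, 0.0)
        y1 = max(v1 - baseline, 0.0)
        area += (y0 + y1) / 2 * (t1 - t0)
    return area

# illustrative serum insulin samples at 0, 60 and 120 min post-meal
print(incremental_auc([0, 60, 120], [5, 7, 5]))  # prints 120.0
```

A more careful implementation would interpolate the exact baseline-crossing point within a segment that straddles baseline; over densely sampled data the difference is small.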
Low rates of bystander cardiopulmonary resuscitation (CPR) were identified as a shortcoming in the “chain of survival” for out-of-hospital cardiac arrest (OHCA) care in the Korean city of Ansan. This study sought to evaluate the effect of an initiative to increase bystander CPR and quality of out-of-hospital resuscitation on outcome from OHCA. The post-intervention data were used to determine the next quality improvement (QI) target as part of the “Plan-Do-Study-Act” (PDSA) model for QI.
The study hypothesis was that bystander CPR, return of spontaneous circulation (ROSC), and survival to discharge after OHCA would increase in the post-intervention period.
This was a retrospective pre/post study. The data from the pre-intervention period were abstracted from 2008–2011 and the post-intervention period from 2012–2013. The effect of the intervention on the odds of ROSC and survival to hospital discharge was determined using a generalized estimating equation to account for confounders and the effect of clustering within medical centers. The analysis was then used to identify other factors associated with outcomes to determine the next targets for intervention in the chain of survival for cardiac arrest in this community.
Rates of documented bystander CPR increased from 13% in the pre-intervention period to 37% in the post-intervention period. The overall rate of ROSC decreased from 18.4% to 14.3% (risk difference −4.1%; 95% CI, −7.1% to −1.0%), whereas survival to hospital discharge increased from 3.9% to 5.0% (risk difference 1.1%; 95% CI, −1.8% to 3.8%), and survival with good neurologic outcome increased from 0.8% to 1.6% (risk difference 0.8%; 95% CI, −0.8% to 2.4%). In multivariable analyses, there was no association between the intervention and the rate of ROSC or survival to hospital discharge. The designated level of the treating hospital was a significant predictor of both survival and ROSC.
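Risk differences with confidence intervals like those quoted above can be computed for any pair of proportions. A minimal Wald-interval sketch, using hypothetical event counts rather than the study's data:

```python
import math

def risk_difference(e1, n1, e0, n0, z=1.96):
    """Risk difference (post minus pre) with a Wald 95% CI.

    e1/n1: events and total in the post-intervention period;
    e0/n0: events and total in the pre-intervention period.
    """
    p1, p0 = e1 / n1, e0 / n0
    rd = p1 - p0
    # standard error of the difference of two independent proportions
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return rd, rd - z * se, rd + z * se

# hypothetical: survival to discharge in 50/1000 post vs 39/1000 pre
print(risk_difference(50, 1000, 39, 1000))  # (rd, lower, upper)
```

When the CI spans zero, as in two of the abstract's three outcomes, the unadjusted difference is not statistically significant at the 5% level.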
In this case study, there were no observed improvements in outcomes from OHCA after the targeted intervention to improve out-of-hospital CPR. However, utilizing the PDSA model for QI, the designated level of the treating hospital was found to be a significant predictor of survival in the post-period, identifying the next target for intervention.
With few exceptions, today's tidal trees near Washington's Pacific coast postdate an earthquake that lowered the region by 1 m or more. The earthquake, which occurred in A.D. 1700, is the most recent to have ruptured much of the plate boundary at this central part of the Cascadia subduction zone. Because of the coseismic subsidence, lowland forests became tidal flats where thousands of trees died. Most of the trees killed were Sitka spruce (Picea sitchensis). In the centuries since the earthquake, tidal deposits have built new land that has been colonized by new Sitka spruce. All but several tens of the region's tidal spruce consequently postdate 1700, as shown by counts of annual rings in 121 of the largest spruce in tidal forests at Copalis River, Grays Harbor, and Willapa Bay. Forests began to return to each of these estuaries in the early 1700s and spread seaward in the late 1700s and 1800s. Annual rings in the oldest of the trees thus record a large fraction of the earthquake-recurrence interval that began with the 1700 earthquake.
Naismith SL, Rogers NL, Lewis SJG, Diamond K, Terpening Z, Norrie L, Hickie IB. Sleep disturbance in mild cognitive impairment: differential effects of current and remitted depression.
Objective: Although patients with mild cognitive impairment (MCI) commonly report sleep disturbance, the extent to which depressive symptoms contribute to this relationship is unclear. This study sought to delineate the contribution of current and remitted major depression (MD) to sleep disturbance in MCI.
Methods: Seventy-seven patients meeting criteria for MCI (mean age = 66.6 ± 8.8 years) were grouped according to those with no history of depression (MCI, n = 33), those meeting criteria for current MD [mild cognitive impairment and meeting criteria for current major depression (DEP-C), n = 14] and those with remitted MD [mild cognitive impairment and remitted major depression (DEP-R), n = 30]. Additionally, 17 healthy controls (CON) participated. Sleep was patient-rated using the Pittsburgh Sleep Quality Index and included assessment of sleep quality, duration, efficiency, disturbances, medications, sleep onset latency and daytime dysfunction. Depression severity was clinician-rated using the Hamilton Depression Rating Scale.
Results: Overall sleep disturbance was significantly greater in the DEP-C and DEP-R groups in comparison to the CON and MCI groups (p < 0.001). Only 12% of CON reported sleep disturbance, compared to 30% of MCI, 63% of DEP-R and 86% of DEP-C. Sub-scale analysis showed that the sleep disturbance in depressive groups was most evident across the domains of sleep quality, sleep efficiency, sleep latency and daytime dysfunction.
Conclusion: Sleep disturbance in MCI is strongly associated with a current or past diagnosis of MD. The finding that sleep complaints remain prominent in those with remitted depression suggests that ‘trait’ markers exist that may reflect underlying neurobiological changes within the sleep–wake system.
Excavations at Tinney's Lane, Sherborne in 2002 uncovered extensive evidence for Late Bronze Age settlement and pottery production, dating from a short time period probably within the 12th or 11th century cal bc. Well-preserved deposits of burnt stone, broken vessels, and burnt sherds, together with resulting debris redeposited in associated pits, were accompanied by a series of post-hole structures interpreted as round-houses and four-post settings. Environmental evidence in the form of charcoal, charred plant remains, and molluscs has provided important information concerning sources of fuel and water for pottery production as well as allowing a reconstruction of the local vegetation. Finds of fired clay, metal, stone, shale, flint, and bone include items from distant sources, informing topics such as site status and exchange, and include many categories of tools and equipment that would have been used within the pottery-making processes. Analysis of the spatial distribution of these finds amongst the structures and surviving layers of burning has allowed the definition of a series of industrial activity areas, each comprising one or more round-houses, a four-post structure, bonfire bases or pits used for firing, and other pits with specific related functions. Altogether the site has provided some of the best evidence for pottery production within prehistoric Britain.
Let ℋ be a Hilbert space, let ℬ = ℬ(ℋ, ℋ) be the B*-algebra of bounded linear operators from ℋ to ℋ with the uniform operator topology, and let ℒ be the subset of ℬ consisting of the self-adjoint operators. This article is concerned with the second order self-adjoint differential equation
Alfred D. Chandler, Jr., has maintained that the persistence of the personally managed firm in Britain may be a cause of that nation's long-run industrial decline. This article contributes to the debate over decline through a detailed exploration of the business role of personally managed firms in a strategic sector of the Second Industrial Revolution: the metal and metal-making trades of Sheffield. Our study shows that the business strategies of Sheffield firms, based on quality production and flexible technology, had close similarities to those of American companies described by scholars such as Philip Scranton. Many of the Sheffield firms were not lacking in enterprise; they demonstrated tenacity and, in certain key segments of the metal trades, enjoyed a high degree of business success. Our examination of personal capitalism in Sheffield suggests that the terms of the debate over Britain's industrial decline may require further refinement.
It is during the Middle Bronze Age in southern Britain that archaeological field evidence for settlement and cultivation becomes readily available for study. This evidence comprises earthwork enclosures and lynchet systems. Childe regarded the apparent changes in agricultural practices indicated by these field remains as representative of an ‘agricultural revolution’. He evoked a comparison between an earlier state in which the ‘warrior-herdsman's wife’ had ‘tilled a little wheat and barley with the hoe’, and the emergence of ‘villages of a size and permanence hitherto unprecedented in Britain’ with their accompanying field systems (Childe 1947, 186–9).
The distinction between a prehistoric archaeology dominated by burial and ceremonial monuments, and one dominated by settlement sites and the earthwork remains of cultivation is still drawn today (Bradley 1984, 160). The explanation for the apparent transformation needs careful consideration. At base, this distinction is partly a matter of archaeological visibility. Settlement and cultivation have occurred in all the periods since the Neolithic, and whilst writers such as Childe and Curwen regarded Neolithic and Early Bronze Age settlement as non-permanent and shifting in character, associated with a heavily pastoral economy (Childe 1947; Curwen 1938), this view is at least questioned, if not totally rejected, today. We must be certain of the processes which render settlement and agriculture so visible in our later prehistory and then set about explaining those processes. This will be the main theme of this chapter.