The incidence of infections from extended-spectrum β-lactamase (ESBL)–producing Enterobacterales (ESBL-E) is increasing in the United States. We describe the epidemiology of ESBL-E at 5 Emerging Infections Program (EIP) sites.
Methods
During October–December 2017, we piloted active laboratory- and population-based (New York, New Mexico, Tennessee) or sentinel (Colorado, Georgia) ESBL-E surveillance. An incident case was defined as the first isolation in a 30-day period, from a normally sterile body site or urine of a surveillance-area resident, of Escherichia coli or Klebsiella pneumoniae/oxytoca resistant to ≥1 extended-spectrum cephalosporin and nonresistant to all carbapenems tested at a clinical laboratory. Demographic and clinical data were obtained from medical records. The Centers for Disease Control and Prevention (CDC) performed reference antimicrobial susceptibility testing and whole-genome sequencing on a convenience sample of case isolates.
Results
We identified 884 incident cases. The estimated annual incidence in sites conducting population-based surveillance was 199.7 per 100,000 population. Overall, 800 isolates (96%) were from urine, and 790 (89%) were E. coli. Also, 393 cases (47%) were community-associated. Among 136 isolates (15%) tested at the CDC, 122 (90%) met the surveillance definition phenotype; 114 (93%) of 122 were shown to be ESBL producers by clavulanate testing. In total, 111 (97%) of confirmed ESBL producers harbored a blaCTX-M gene. Among ESBL-producing E. coli isolates, 52 (54%) were ST131; 44% of these cases were community associated.
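The annual rate reported above is an annualization of a single quarter of surveillance. A minimal sketch of one way such an annualized rate per 100,000 can be computed, using hypothetical case counts and catchment populations rather than the study's data, is:

```python
# Illustrative sketch (not the study's code): annualizing a crude incidence rate
# from one quarter of population-based surveillance. The case count and catchment
# population below are hypothetical placeholders.
quarterly_cases = 500             # incident cases observed October-December (hypothetical)
catchment_population = 1_000_000  # residents of the surveillance area (hypothetical)

annualized_cases = quarterly_cases * 4  # scale one quarter of observation to a full year
incidence_per_100k = annualized_cases / catchment_population * 100_000
print(f"Estimated annual incidence: {incidence_per_100k:.1f} per 100,000 population")
```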
Conclusions
The burden of ESBL-E was high across surveillance sites, with nearly half of cases acquired in the community. EIP has implemented ongoing ESBL-E surveillance to inform prevention efforts, particularly in the community, and to watch for the emergence of new ESBL-E strains.
To assess preventability of hospital-onset bacteremia and fungemia (HOB), we developed and evaluated a structured rating guide accounting for intrinsic patient and extrinsic healthcare-related risks.
Design:
The HOB preventability rating guide was compared against a reference standard: an expert panel.
Participants:
A 10-member panel of clinical experts was assembled as the reference standard for preventability assessment, and 2 physician reviewers applied the rating guide for comparison.
Methods:
The expert panel independently rated 82 hypothetical HOB scenarios using a 6-point Likert scale collapsed into 3 categories: preventable, uncertain, or not preventable. Consensus was defined as concurrence on the same category among ≥70% of experts. Scenarios without consensus were deliberated, followed by a second round of rating.
Two reviewers independently applied the rating guide to adjudicate the same 82 scenarios in 2 rounds, with interim revisions. Interrater reliability was evaluated using the κ (kappa) statistic.
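As a brief illustration of the interrater statistic named above, the following sketch computes Cohen's kappa for two reviewers assigning the same three categories; the ratings shown are invented for illustration and are not study data.

```python
# Minimal sketch of Cohen's kappa for two raters over categorical judgments.
from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n   # observed agreement
    counts1, counts2 = Counter(rater1), Counter(rater2)
    expected = sum((counts1[c] / n) * (counts2[c] / n) for c in categories)  # chance agreement
    return (observed - expected) / (1 - expected)

# Hypothetical ratings, not study data.
r1 = ["preventable", "uncertain", "not preventable", "preventable", "uncertain"]
r2 = ["preventable", "preventable", "not preventable", "preventable", "uncertain"]
print(f"kappa = {cohens_kappa(r1, r2):.2f}")
```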
Results:
Expert panel consensus criteria were met for 52 scenarios (63%) after 2 rounds.
After 2 rounds, guide-based rating matched expert panel consensus in 40 of 52 (77%) and 39 of 52 (75%) cases for reviewers 1 and 2, respectively. Agreement rates between the 2 reviewers were 84% overall (κ, 0.76; 95% confidence interval [CI], 0.64–0.88) and 87% (κ, 0.79; 95% CI, 0.65–0.94) for the 52 scenarios with expert consensus.
Conclusions:
Preventability ratings of HOB scenarios by 2 reviewers using a rating guide matched expert consensus in most cases with moderately high interreviewer reliability. Although diversity of expert opinions and uncertainty of preventability merit further exploration, this is a step toward standardized assessment of HOB preventability.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and is the first time that the clinical nomenclature for paediatric and congenital cardiac care and the administrative nomenclature for paediatric and congenital cardiac care are harmonized. The resultant congenital cardiac component of ICD-11 was increased from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD thought was originally acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Attentional bias to threat has been implicated as a cognitive mechanism in anxiety disorders in youth. Yet, prior studies documenting this bias have largely relied on a method with questionable reliability (i.e. the dot-probe task) and small samples, few of which included adolescents. The current study sought to address such limitations by examining relations between anxiety – both clinically diagnosed and dimensionally rated – and attentional bias to threat.
Methods
The study included a community sample of adolescents and employed eye-tracking methodology intended to capture possible biases across the full range of both automatic (i.e. vigilance bias) and controlled attentional processes (i.e. avoidance bias, maintenance bias). We examined both dimensional anxiety (across the full sample; n = 215) and categorical anxiety in a subset case-control analysis (n = 100) as predictors of biases.
Results
Findings indicated that participants with an anxiety disorder oriented more slowly to angry faces than matched controls. Results did not suggest a greater likelihood of initial orienting to angry faces among our participants with anxiety disorders or those with higher dimensional ratings of anxiety. Greater anxiety severity was associated with greater dwell time to neutral faces.
Conclusions
This is the largest study to date examining eye-tracking metrics of attention to threat among healthy and anxious youth. Findings did not support the notion that anxiety is characterized by heightened vigilance or avoidance/maintenance of attention to threat. All effects detected were extremely small. Links between attention to threat and anxiety among adolescents may be subtle and highly dependent on experimental task dimensions.
For many years, archaeologists have relied on Munsell Soil Color Charts (MSCC) as tools for standardizing the recording of soil and sediment colors in the field and artifacts such as pottery in the lab. Users have identified multiple potential sources of discrepancy in results, such as differences in inter-operator perception, light source, or moisture content of samples. In recent years, researchers have developed inexpensive digital methods for color identification, but these typically cannot be done in real time. Now, a field-ready digital color-matching instrument is marketed to archaeologists as a replacement for MSCC, but the accuracy and overall suitability of this device for archaeological research have not been demonstrated. Through three separate field and laboratory trials, we found systematic mismatches in the results obtained via the device, including variable accuracy against standardized MSCC chips, which should represent ideal samples. At the same time, the instrument was consistent in its readings. This leads us to question whether using the “subjective” human eye or the “objective” digital eye is preferable for data recording of color. We discuss how project goals and limitations should be considered when deciding which color-recording method to employ in field and laboratory settings, and we identify optimal procedures.
Impairment in reciprocal social behavior (RSB), an essential component of early social competence, clinically defines autism spectrum disorder (ASD). However, the behavioral and genetic architecture of RSB in toddlerhood, when ASD first emerges, has not been fully characterized. We analyzed data from a quantitative video-referenced rating of RSB (vrRSB) in two toddler samples: a community-based volunteer research registry (n = 1,563) and an ethnically diverse, longitudinal twin sample ascertained from two state birth registries (n = 714). Variation in RSB was continuously distributed, temporally stable, significantly associated with ASD risk at age 18 months, and only modestly explained by sociodemographic and medical factors (r² = 9.4%). Five latent RSB factors were identified and corresponded to aspects of social communication or restricted repetitive behaviors, the two core ASD symptom domains. Quantitative genetic analyses indicated substantial heritability for all factors at age 24 months (h² ≥ .61). Genetic influences strongly overlapped across all factors, with a social motivation factor showing evidence of newly-emerging genetic influences between the ages of 18 and 24 months. RSB constitutes a heritable, trait-like competency whose factorial and genetic structure is generalized across diverse populations, demonstrating its role as an early, enduring dimension of inherited variation in human social behavior. Substantially overlapping RSB domains, measurable when core ASD features arise and consolidate, may serve as markers of specific pathways to autism and anchors to inform determinants of autism's heterogeneity.
Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC. Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines. Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. The proportion of isolates that did not confirm was 50.9% for MicroScan (235 of 462), 60.6% for BD Phoenix (249 of 411), and 48.5% for VITEK 2 (371 of 765). Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%, Fig. 1B). Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs. Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
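As a rough illustration of the confirmation step described above, the sketch below checks whether reference BMD confirms a clinical-laboratory call of carbapenem resistance. The breakpoints are illustrative CLSI-style values (they should be verified against the current M100 tables), and the isolate shown is hypothetical, not EIP data.

```python
# Illustrative sketch, not EIP code: does reference broth microdilution (BMD)
# confirm carbapenem resistance reported by a clinical-laboratory ATI?
# Breakpoints are illustrative CLSI-style values in ug/mL; verify against M100.
RESISTANT_AT_OR_ABOVE = {"ertapenem": 2.0, "imipenem": 4.0, "meropenem": 4.0, "doripenem": 4.0}

def is_resistant(drug, mic):
    return mic >= RESISTANT_AT_OR_ABOVE[drug]

def confirms_as_cre(mics):
    """True if the MIC panel shows resistance to at least one carbapenem tested."""
    return any(is_resistant(drug, mic) for drug, mic in mics.items())

# Hypothetical isolate: called ertapenem-resistant by the ATI but susceptible by BMD.
ati_mics = {"ertapenem": 2.0, "meropenem": 1.0}
bmd_mics = {"ertapenem": 0.5, "meropenem": 0.5}
print(confirms_as_cre(ati_mics), confirms_as_cre(bmd_mics))  # True False -> a nonconfirming isolate
```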
OBJECTIVES/GOALS: In 2017, new guidelines recommended multi-step algorithms for CDI diagnosis, and clinical centers rapidly implemented changes despite limited pediatric data. We assessed a multi-step algorithm using NAAT followed by EIA for its ability to differentiate symptomatic CDI from colonization in children. METHODS/STUDY POPULATION: We prospectively enrolled pediatric patients with cancer, cystic fibrosis, or inflammatory bowel disease who were not being tested or treated for CDI and obtained a stool sample for NAAT. If positive by NAAT (colonized), EIA was performed. Children with symptomatic CDI who tested positive by NAAT via the clinical laboratory were also enrolled, and EIA was performed on residual stool. In addition, a functional cell cytotoxicity neutralization assay (CCNA) was performed. RESULTS/ANTICIPATED RESULTS: Of the 138 asymptomatic children enrolled, 24 (17%) were colonized. An additional 37 children with symptomatic CDI were enrolled. Neither EIA positivity (41% versus 21%, P = 0.11) nor CCNA positivity (49% versus 46%, P = 0.84) differed significantly between symptomatic and colonized children. When both EIA and CCNA were positive, children were more commonly symptomatic than colonized (33% versus 13%, P = 0.04). DISCUSSION/SIGNIFICANCE OF IMPACT: A multi-step testing algorithm with NAAT and EIA failed to differentiate symptomatic CDI from colonization in our pediatric cohort. As multi-step algorithms are moved into clinical care, pediatric providers will need to be aware of the continued limitations in diagnostic testing.
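The two-step reporting logic assessed above can be summarized in a small sketch; the function name and output wording are ours, chosen for illustration only, and do not represent a clinical reporting standard.

```python
# Sketch of the two-step testing logic described above (NAAT reflexed to toxin EIA).
def interpret_cdi_tests(naat_positive, eia_positive=None):
    if not naat_positive:
        return "NAAT negative: C. difficile not detected"
    if eia_positive:
        return "NAAT+/EIA+: toxin detected, consistent with CDI in a symptomatic child"
    return "NAAT+/EIA-: organism detected without toxin; may represent colonization"

print(interpret_cdi_tests(True, False))
```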
Introduction: Emergency care serves as an important health resource for First Nations (FN) persons. Previous reporting shows that FN persons visit emergency departments at almost double the rate of non-FN persons. Working collaboratively with FN partners, academic researchers, and health authority staff, we investigated FN emergency care patient visit statistics in Alberta over a five-year period. Methods: Through a population-based retrospective cohort study for the period from April 1, 2012 to March 31, 2017, patient demographics and emergency care visit characteristics for status FN patients in Alberta were analyzed and compared to non-FN statistics. Frequencies and percentages (%) describe patients and visits by categorical variables (e.g., Canadian Triage and Acuity Scale (CTAS)). Means and standard deviations (medians and interquartile ranges (IQR)) describe continuous variables (e.g., distances) as appropriate for the data distribution. These descriptions are repeated for the FN and non-FN populations, separately. Results: The data set contains 11,686,288 emergency facility visits by 3,024,491 unique persons. FN people make up 4.8% of unique patients and 9.4% of emergency care visits. FN persons live further from emergency facilities than their non-FN counterparts (FN median 6 km, IQR 1-24; vs. non-FN median 4 km, IQR 2-8). FN visits arrive more often by ground ambulance (15.3% vs. 10%). FN visits are more commonly triaged as less acute (59% CTAS levels 4 and 5, compared to non-FN 50.4%). More FN visits end in leaving without completing treatment (6.7% vs. 3.6%). FN visits are more often in the evening – 4:01pm to 12:00am (43.6% vs. 38.1%). Conclusion: In a collaborative validation session, FN Elders and health directors contextualized emergency care presentation in evenings and receiving less acute triage scores as related to difficulties accessing primary care. They explained presentation in evenings, arrival by ambulance, and leaving without completing treatment in terms of issues accessing transport to and from emergency facilities. Many factors interact to determine FN patients’ emergency care visit characteristics and outcomes. Further research needs to separate the impact of FN identity from factors such as reasons for visiting emergency facilities, distance traveled to care, and the size of facility where care is provided.
We investigated the contribution of polymorphisms shown to moderate transcription of the serotonin transporter (5HTT) and monoamine oxidase A (MAOA) genes to the development of violence, and tested for gene × environment interactions. To do so, a cohort of 184 adult male volunteers referred for forensic assessment was assigned to a violent or non-violent group. Overall, 45% of violent, but only 30% of non-violent, individuals carried the low-activity, short MAOA allele. Carriers of low-function variants of 5HTT were found in 77% of the violent group, as compared to 59% of the non-violent group. Logistic regression was performed, and the best-fitting model revealed significant, independent effects of childhood environment and MAOA genotype. A significant influence of an interaction between childhood environment and 5HTT genotype was found (Fig. 1). MAOA thus appears to be independently associated with violent crime, while there is a relevant 5HTT × environment interaction.
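For readers who want to see the modelling approach concretely, the following is a minimal sketch, using simulated data rather than the forensic cohort, of a logistic regression with main effects and a gene × environment interaction term; the variable names are ours and purely illustrative.

```python
# Illustrative sketch (not the authors' analysis): logistic regression of group
# membership on genotype, childhood environment, and a 5HTT x environment interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "maoa_low": rng.integers(0, 2, n),     # carrier of low-activity MAOA allele (simulated)
    "httlpr_low": rng.integers(0, 2, n),   # carrier of low-function 5HTT variant (simulated)
    "adverse_env": rng.integers(0, 2, n),  # adverse childhood environment (simulated)
})
# Simulate an outcome with a main MAOA effect and a 5HTT x environment interaction.
logit_p = -1 + 0.8 * df.maoa_low + 1.2 * df.httlpr_low * df.adverse_env
df["violent"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("violent ~ maoa_low + httlpr_low * adverse_env", data=df)
print(model.fit(disp=0).summary())
```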
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, 60+
programs recorded 20 000 h producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The science of studying diamond inclusions for understanding Earth history has developed significantly over the past decades, with new instrumentation and techniques applied to diamond sample archives revealing the stories contained within diamond inclusions. This chapter reviews what diamonds can tell us about the deep carbon cycle over the course of Earth’s history. It describes how the geochemistry of diamonds and their inclusions informs us about the deep carbon cycle, the origin of the diamonds in Earth’s mantle, and the evolution of diamonds through time.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study located 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild pain/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly higher proportion of patients with moderate/severe pain (45.8%) demonstrated an improvement into mild/no pain, whereas 27.2% with mild/no pain demonstrated worsening into moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM and that improvements reach the MCIDs for VAS-AP and VAS-NP at 12 months.
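A minimal sketch of the MCID logic used above, computing the mean score change and the share of patients whose improvement meets the threshold; the VAS-NP scores are hypothetical, not CSORN data.

```python
# Illustrative sketch: mean change on a visual analogue scale and the proportion
# of patients reaching the MCID. Scores below are hypothetical examples.
import numpy as np

MCID_VAS_NP = 2.6  # neck-pain threshold, as reported above

baseline = np.array([5.6, 7.0, 4.2, 8.1, 6.3])   # hypothetical VAS-NP at baseline
month12  = np.array([3.8, 3.0, 4.0, 5.2, 2.1])   # hypothetical VAS-NP at 12 months

change = baseline - month12                       # positive values = improvement
print(f"mean improvement: {change.mean():.1f}")
print(f"proportion reaching MCID: {np.mean(change >= MCID_VAS_NP):.0%}")
```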
Measurements in the infrared wavelength domain allow direct assessment of the physical state and energy balance of cool matter in space, enabling the detailed study of the processes that govern the formation and evolution of stars and planetary systems in galaxies over cosmic time. Previous infrared missions revealed a great deal about the obscured Universe, but were hampered by limited sensitivity.
SPICA takes the next step in infrared observational capability by combining a large 2.5-meter diameter telescope, cooled to below 8 K, with instruments employing ultra-sensitive detectors. A combination of passive cooling and mechanical coolers will be used to cool both the telescope and the instruments. With mechanical coolers, the mission lifetime is not limited by the supply of cryogen. With the combination of a low telescope background and instruments with state-of-the-art detectors, SPICA provides a huge advance on the capabilities of previous missions.
SPICA instruments offer spectral resolving power ranging from R ~50 through 11 000 in the 17–230 μm domain and R ~28 000 spectroscopy between 12 and 18 μm. SPICA will provide efficient 30–37 μm broad-band mapping, and small-field spectroscopic and polarimetric imaging at 100, 200 and 350 μm. SPICA will provide infrared spectroscopy with an unprecedented sensitivity of ~5 × 10⁻²⁰ W m⁻² (5σ/1 h), over two orders of magnitude improvement over earlier missions. This exceptional performance leap will open entirely new domains in infrared astronomy: galaxy evolution and metal production over cosmic time, dust formation and evolution from very early epochs onwards, and the formation history of planetary systems.
Eudialyte-group minerals (EGM) represent the most important index minerals of persodic agpaitic systems. We present the results of a combined EPMA, Mössbauer spectroscopy and LA-ICP-MS study, comparing EGM that crystallized at various fractionation stages from different parental melts and mineral assemblages in silica-oversaturated and -undersaturated systems. Compositional variability is closely related to texture, allowing for reconstruction of locally acting magmatic to hydrothermal processes. Early-magmatic EGM are invariably dominated by Fe, whereas hydrothermal EGM can be virtually Fe-free and form pure Mn end-members. Hence, the Mn/Fe ratio is the most suitable fractionation indicator, although crystal-chemistry effects and co-crystallizing phases play a secondary role in the incorporation of Fe and Mn into EGM. Mössbauer spectroscopy of EGM from three selected occurrences indicates that the Fe³⁺/ΣFe ratio is governed by the hydration state of EGM rather than by the oxygen fugacity of the coexisting melt. Negative Eu anomalies are restricted to EGM that crystallized from alkali basaltic parental melts, while EGM from nephelinitic parental melts invariably lack negative Eu anomalies. Even after extensive differentiation intervals, EGM reflect properties of their respective parental melts and the fractionation of plagioclase and other minerals such as Fe-Ti oxides, amphibole and sulphides.
A novel, alloy-agnostic, nanofunctionalization process has been utilized to produce metal matrix composites (MMCs) via additive manufacturing, providing new geometric freedom for MMC design. MMCs were produced with the addition of tungsten carbide nanoparticles to commercially available AlSi10Mg alloy powder. Tungsten carbide was chosen due to the potential for coherent crystallographic phases that were identified utilizing a lattice-matching approach to promote wetting and increase dislocation interactions. Structures were produced with evenly distributed strengthening phases leading to tensile strengths >385 MPa and a 50% decrease in wear rate over the commercially available AlSi10Mg alloy at only 1 vol% loading of tungsten carbide.
Outpatient parenteral antimicrobial therapy (OPAT) programmes facilitate hospital discharge, but patients remain at risk of complications and consequent healthcare utilisation (HCU). Here we elucidated the incidence of and risk factors associated with HCU in OPAT patients. This was a retrospective, single-centre, case–control study of adult patients discharged on OPAT. Cases (n = 63) and controls (n = 126) were patients who did or did not, respectively, utilise the healthcare system within 60 days. Characteristics associated with HCU in bivariate analysis (P ≤ 0.2) were included in a multivariable logistic regression model. Variables were retained in the final model if they were independently (P < 0.05) associated with 60-day HCU. Among all study patients, the mean age was 55 ± 16 years, 65% were men, and wound infection (22%) and cellulitis (14%) were common diagnoses. The cumulative incidence of 60-day unplanned HCU was 27%, with a disproportionately higher incidence in the first 30 days (21%). A statin at discharge (adjusted odds ratio (aOR) 0.23, 95% confidence interval (CI) 0.09–0.57), number of prior admissions in the past 12 months (aOR 1.48, 95% CI 1.05–2.10), and a sepsis diagnosis (aOR 4.62, 95% CI 1.23–17.3) were independently associated with HCU. HCU was most commonly due to non-infection-related complications (44%) and worsening of the primary infection (31%). There are multiple risk factors for HCU in OPAT patients, and formal OPAT clinics may help to risk-stratify and target the highest-risk groups.
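A minimal sketch of the two-stage modelling strategy described above (bivariate screening at P ≤ 0.2, then a multivariable logistic model), using simulated data and hypothetical variable names rather than the study dataset:

```python
# Illustrative sketch of bivariate screening followed by multivariable logistic regression.
# Data are simulated; variable names and coefficients are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 189  # roughly the size of 63 cases + 126 controls
df = pd.DataFrame({
    "statin_at_discharge": rng.integers(0, 2, n),
    "prior_admissions": rng.poisson(1.0, n),
    "sepsis_diagnosis": rng.integers(0, 2, n),
})
# Simulated outcome loosely echoing the direction of the reported associations.
logit_p = -1.5 - 1.0 * df.statin_at_discharge + 0.4 * df.prior_admissions + 1.5 * df.sepsis_diagnosis
df["hcu_60d"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

retained = []
for var in ["statin_at_discharge", "prior_admissions", "sepsis_diagnosis"]:
    bivariate = smf.logit(f"hcu_60d ~ {var}", data=df).fit(disp=0)
    if bivariate.pvalues[var] <= 0.2:   # bivariate screening threshold
        retained.append(var)

formula = "hcu_60d ~ " + (" + ".join(retained) if retained else "1")
final = smf.logit(formula, data=df).fit(disp=0)
print(np.exp(final.params))             # exponentiated coefficients = adjusted odds ratios
```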
Exercise and physical training are known to affect gastrointestinal function and digestibility in horses and can lead to inaccurate estimates of nutrient and energy digestibility when markers are used. The effect of exercise on apparent nutrient digestibility and faecal recoveries of ADL and TiO2 was studied in six Welsh pony geldings subjected to either a low- (LI) or high-intensity (HI) exercise regime according to a cross-over design. Ponies performing LI exercise were walked once per day for 45 min in a horse walker (5 km/h) for 47 consecutive days. Ponies subjected to HI exercise were gradually trained for the same 47 days according to a standardized protocol. Throughout the experiment, the ponies received a fixed level of feed and the daily rations consisted of 4.7 kg DM of grass hay and 0.95 kg DM of concentrate. The diet was supplemented with minerals, vitamins and TiO2 (3.0 g Ti/day). Total tract digestibility of DM, organic matter (OM), CP, crude fat, NDF, ADF, starch, sugar and energy was determined with the total faeces collection (TFC) method. In addition, DM and OM digestibility was estimated using internal ADL and the externally supplemented Ti as markers. Urine was collected on the final 2 days of each experimental period. Exercise did not affect apparent digestibility of CP, crude fat, starch and sugar. Digestibility of DM (DMD), OM (OMD), ADF and NDF tended to be lower and DE was decreased when ponies received the HI exercise regime. For all treatments combined, mean faecal recoveries of ADL and Ti were 87.8±1.7% and 99.3±1.7%, respectively. Ti was not detected in the urine, indicating that intestinal integrity was maintained with exercise. Dry matter digestibility estimates obtained with the TFC, ADL and Ti methods for ponies subjected to LI exercise were 66.3%, 60.3% and 64.8%, respectively, while those for HI ponies were 64.2%, 60.3% and 65.2%, respectively. In conclusion, physical exercise has an influence on the GE digestibility of the feed in ponies provided with equivalent levels of feed intake. In addition, the two markers used for estimating apparent DMD and OMD indicate that externally supplemented Ti is a suitable marker to determine digestibility of nutrients in horses performing exercise, unlike dietary ADL.
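A minimal sketch of the marker-based digestibility arithmetic underlying the comparison above, alongside the total-collection estimate; the concentrations and intakes are hypothetical values chosen only to show the calculation, not measurements from the trial.

```python
# Illustrative sketch: apparent dry matter digestibility (DMD) from an indigestible
# marker versus from total faeces collection. All numbers are hypothetical.
def marker_dmd(marker_in_feed, marker_in_faeces):
    """Apparent DMD (%) from marker concentrations (e.g., g Ti per kg DM)."""
    return 100 * (1 - marker_in_feed / marker_in_faeces)

def total_collection_dmd(dm_intake_kg, faecal_dm_kg):
    """Apparent DMD (%) from total faeces collection."""
    return 100 * (1 - faecal_dm_kg / dm_intake_kg)

print(f"Ti-based DMD:  {marker_dmd(0.53, 1.50):.1f}%")        # hypothetical concentrations
print(f"TFC-based DMD: {total_collection_dmd(5.65, 1.90):.1f}%")  # hypothetical daily amounts
```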
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.