Neuropsychiatric symptoms are common after traumatic brain injury (TBI) and often resolve within 3 months post-injury. However, the degree to which individual patients follow this course is unknown. We characterized trajectories of neuropsychiatric symptoms over 12 months post-TBI. We hypothesized that a substantial proportion of individuals would display trajectories distinct from the group-average course, with some exhibiting less favorable courses.
Methods
Participants were level 1 trauma center patients with TBI (n = 1943), orthopedic trauma controls (n = 257), and non-injured friend controls (n = 300). Trajectories of six symptom dimensions (Depression, Anxiety, Fear, Sleep, Physical, and Pain) were identified using growth mixture modeling from 2 weeks to 12 months post-injury.
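Growth mixture models are typically estimated with specialized software (e.g., Mplus or the R lcmm package); the Python sketch below is only a rough conceptual approximation of the idea, fitting a per-participant intercept and slope and then clustering those growth parameters with a Gaussian mixture. The time points, sample size, class proportions, and variable names are illustrative assumptions, not the study's data.

```python
# Hedged sketch: approximate trajectory classes by (1) fitting a line per
# participant across assessment waves, then (2) clustering the resulting
# intercept/slope pairs with a Gaussian mixture. Real growth mixture models
# estimate both steps jointly; this is only a conceptual approximation.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
months = np.array([0.5, 3.0, 6.0, 12.0])     # assessment times post-injury (made-up)
n = 500                                      # hypothetical sample size

# Simulate three latent trajectory classes: stable-low, worsening, improving
labels = rng.choice([0, 1, 2], size=n, p=[0.87, 0.08, 0.05])
intercepts = np.where(labels == 2, 60, np.where(labels == 1, 45, 40))
slopes = np.where(labels == 2, -1.5, np.where(labels == 1, 1.0, 0.0))
scores = (intercepts[:, None] + slopes[:, None] * months
          + rng.normal(0, 3, size=(n, len(months))))

# Step 1: per-participant ordinary least squares slope and intercept
coefs = np.polyfit(months, scores.T, deg=1)   # shape (2, n): row 0 slope, row 1 intercept
features = coefs.T                            # one (slope, intercept) row per participant

# Step 2: cluster growth parameters into latent trajectory classes
gmm = GaussianMixture(n_components=3, random_state=0).fit(features)
classes = gmm.predict(features)
for k in range(3):
    print(f"class {k}: {np.mean(classes == k):.1%} of participants")
```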
Results
Depression, Anxiety, Fear, and Physical symptoms displayed three trajectories: Stable-Low (86.2–88.6%), Worsening (5.6–10.9%), and Improving (2.6–6.4%). Among the symptomatic trajectories (Worsening, Improving), lower-severity TBI was associated, relative to all other groups, with a higher prevalence of elevated symptoms at 2 weeks that steadily resolved over 12 months, whereas higher-severity TBI was associated with a higher prevalence of symptoms that gradually worsened from 3 to 12 months. Sleep and Pain displayed more variable recovery courses, and the most common trajectory entailed an average level of problems that remained stable over time (Stable-Average; 46.7–82.6%). Symptomatic Sleep and Pain trajectories (Stable-Average, Improving) were more common in the traumatically injured groups.
Conclusions
Findings illustrate the nature and rates of distinct neuropsychiatric symptom trajectories and their relationship to traumatic injuries. Providers may use these results as a referent for gauging typical v. atypical recovery in the first 12 months post-injury.
Blood-based biomarkers offer a more feasible alternative to current in vivo measures for Alzheimer’s disease (AD) detection, management, and study of disease mechanisms. Given their novelty, these plasma biomarkers must be validated against postmortem neuropathological outcomes. Research has shown utility for plasma markers of the proposed AT(N) framework; however, recent studies have stressed the importance of expanding this framework to include other pathways. Promising data support the usefulness of plasma glial fibrillary acidic protein (GFAP) in AD, but GFAP-to-autopsy studies are limited. Here, we tested the association between plasma GFAP and AD-related neuropathological outcomes in participants from the Boston University (BU) Alzheimer’s Disease Research Center (ADRC).
Participants and Methods:
This sample included 45 participants from the BU ADRC who had a plasma sample within 5 years of death and donated their brain for neuropathological examination. The most recent plasma samples were analyzed using the Simoa platform. Neuropathological examinations followed the National Alzheimer’s Coordinating Center procedures and diagnostic criteria. The NIA-Reagan Institute criteria were used for the neuropathological diagnosis of AD. Measures of GFAP were log-transformed. Binary logistic regression analyses tested the associations of plasma GFAP with autopsy-confirmed AD status and with semi-quantitative ratings of regional atrophy (none/mild versus moderate/severe). Ordinal logistic regression analyses tested the association between plasma GFAP and Braak stage and CERAD neuritic plaque score. Area under the curve (AUC) statistics from receiver operating characteristic (ROC) analyses, using predicted probabilities from the binary logistic regression, examined the ability of plasma GFAP to discriminate autopsy-confirmed AD status. All analyses controlled for sex, age at death, years between last blood draw and death, and APOE e4 status.
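As a hedged illustration of the analytic pipeline described above (covariate-adjusted binary logistic regression followed by ROC/AUC on predicted probabilities), the sketch below uses statsmodels and scikit-learn on simulated data; the column names and values are placeholders, not the BU ADRC variables.

```python
# Hedged sketch: logistic regression of autopsy-confirmed AD status on
# log-transformed plasma GFAP plus covariates, then ROC AUC from the
# model's predicted probabilities. All data and column names are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 45
df = pd.DataFrame({
    "gfap": rng.lognormal(mean=4.5, sigma=0.6, size=n),   # pg/mL, simulated
    "ad_autopsy": rng.integers(0, 2, size=n),
    "sex": rng.integers(0, 2, size=n),
    "age_at_death": rng.normal(81, 8, size=n),
    "years_blood_to_death": rng.uniform(0, 5, size=n),
    "apoe_e4": rng.integers(0, 2, size=n),
})
df["log_gfap"] = np.log(df["gfap"])

# Covariate-adjusted binary logistic regression
model = smf.logit(
    "ad_autopsy ~ log_gfap + sex + age_at_death + years_blood_to_death + apoe_e4",
    data=df,
).fit(disp=0)
print("OR per unit of log-GFAP:", np.exp(model.params["log_gfap"]))

# Discrimination: AUC from the model's predicted probabilities
auc = roc_auc_score(df["ad_autopsy"], model.predict(df))
print("AUC:", round(auc, 2))
```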
Results:
Of the 45 brain donors, 29 (64.4%) had autopsy-confirmed AD. The mean (SD) age of the sample at the time of blood draw was 80.76 (8.58) years, and there were 2.80 (1.16) years between the last blood draw and death. The sample included 20 (44.4%) females; 41 (91.1%) participants were White and 20 (44.4%) were APOE e4 carriers. Higher GFAP concentrations were associated with increased odds for having autopsy-confirmed AD (OR=14.12, 95% CI [2.00, 99.88], p=0.008). ROC analysis showed plasma GFAP accurately discriminated those with and without autopsy-confirmed AD on its own (AUC=0.75) and strengthened as the above covariates were added to the model (AUC=0.81). Increases in GFAP levels corresponded to increases in Braak stage (OR=2.39, 95% CI [0.71-4.07], p=0.005), but not CERAD ratings (OR=1.24, 95% CI [0.004, 2.49], p=0.051). Higher GFAP levels were associated with greater temporal lobe atrophy (OR=10.27, 95% CI [1.53,69.15], p=0.017), but this was not observed with any other regions.
Conclusions:
The current results show that antemortem plasma GFAP is associated with non-specific AD neuropathological changes at autopsy. Plasma GFAP could be a useful and practical biomarker for assisting in the detection of AD-related changes, as well as for study of disease mechanisms.
The inaugural data from the first systematic program of sea-ice observations in Kotzebue Sound, Alaska, in 2018 coincided with the first winter in living memory when the Sound was not choked with ice. The following winter of 2018–19 was even warmer and characterized by even less ice. Here we discuss the mass balance of landfast ice near Kotzebue (Qikiqtaġruk) during these two anomalously warm winters. We use in situ observations and a 1-D thermodynamic model to address three research questions developed in partnership with an Indigenous Advisory Council. In doing so, we improve our understanding of connections between landfast ice mass balance, marine mammals and subsistence hunting. Specifically, we show: (i) ice growth stopped unusually early due to strong vertical ocean heat flux, which also likely contributed to an early start to bearded seal hunting; (ii) unusually thin ice contributed to widespread surface flooding. The associated snow ice formation partly offset the reduced ice growth, but the flooding likely had a negative impact on ringed seal habitat; (iii) sea ice near Kotzebue during the winters of 2017–18 and 2018–19 was likely the thinnest since at least 1945, driven by a combination of warm air temperatures and a persistent ocean heat flux.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the Eleventh Revision of the International Classification of Diseases (ICD-11). This unification of the IPCCC and ICD-11, the IPCCC ICD-11 Nomenclature, marks the first time that the clinical and administrative nomenclatures for paediatric and congenital cardiac care have been harmonized. The congenital cardiac component of ICD-11 grew from 29 congenital cardiac codes in ICD-9 and 73 in ICD-10 to the 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added a further 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than ISNPCHD originally considered acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article therefore presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Nut-based products may aid low-glycaemic dietary strategies that are important for diabetes prevention in populations at increased risk of dysglycaemia, such as Asian Chinese. This randomised cross-over trial assessed the postprandial glycaemic response (0–120 min) of a higher-protein nut-based (HP-NB) snack formulation, in bar format (1009 kJ, Nutrient Profiling Score, NPS, −2), when compared with an iso-energetic higher-carbohydrate (CHO) cereal-based bar (HC-CB, 985 kJ, NPS +3). It also assessed the ability to suppress glucose response to a typical CHO-rich food (white bread, WB), when co-ingested. Ten overweight prediabetic Chinese adults (mean, sd: age 47⋅9, 15⋅7 years; BMI 25⋅5, 1⋅6 kg/m2), with total body fat plus ectopic pancreas and liver fat quantified using dual-energy X-ray absorptiometry and magnetic resonance imaging and spectroscopy, received the five meal treatments in random order: HP-NB, HC-CB, HP-NB + WB (50 g available CHO), HC-CB + WB and WB only. Compared with HC-CB, HP-NB induced a significantly lower 30–120 min glucose response (P < 0⋅05), with an approximately 10-fold lower incremental area under the glucose curve (iAUC0–120; P < 0⋅001). HP-NB also attenuated glucose response by approximately 25 % when co-ingested with WB (P < 0⋅05). Half of the cohort had elevated pancreas and/or liver fat, with 13–21 % greater suppression of iAUC0–120 glucose in the low v. high organ fat subgroups across all five treatments. A nut-based snack product may be a healthier alternative to an energy equivalent cereal-based product with evidence of both a lower postprandial glycaemic response and modulation of CHO-induced hyperglycaemia even in high-risk, overweight, pre-diabetic adults.
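For reference, the incremental area under the glucose curve (iAUC0–120) reported above is conventionally obtained with the trapezoidal rule applied to glucose rises above the fasting baseline, ignoring area below baseline. A minimal sketch with made-up readings follows; truncating negative increments to zero is a common simplification of the method.

```python
# Hedged sketch: incremental area under the curve (iAUC) over 0-120 min,
# computed with the trapezoidal rule on increments above the fasting
# (time 0) value, with negative increments truncated to zero.
import numpy as np

times = np.array([0, 15, 30, 45, 60, 90, 120])            # minutes
glucose = np.array([5.2, 6.1, 7.0, 6.8, 6.3, 5.9, 5.4])   # mmol/L, made-up readings

increments = np.clip(glucose - glucose[0], 0, None)        # rise above baseline
iauc = np.sum((increments[1:] + increments[:-1]) / 2 * np.diff(times))
print(f"iAUC(0-120) = {iauc:.1f} mmol/L*min")
```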
There is a continual need for invasive plant science to develop approaches for cost-effectively benefiting native over nonnative species in dynamic management and biophysical contexts, including within predominantly nonnative plant landscapes containing only small patches of native plants. Our objective was to test the effectiveness of a minimal-input strategy for enlarging native species patches within a nonnative plant matrix. In Pecos National Historical Park, New Mexico, USA, we identified 40 native perennial grass patches within a matrix of the nonnative annual forb kochia [Bassia scoparia (L.) A.J. Scott]. We mechanically cut B. scoparia in a 2-m-wide ring surrounding the perimeters of half the native grass patches (with the other half as uncut controls) and measured change in native grass patch size (relative to pretreatment) for 3 yr. Native grass patches around which B. scoparia was cut grew quickly the first posttreatment year and by the third year had increased in size four times more than control patches. Treated native grass patches expanded by an average of 25 m2, from 4 m2 in October 2015 before treatment to 29 m2 in October 2018. The experiment occurred during a dry period, conditions that should favor B. scoparia and contraction of the native grasses, suggesting that the observed increase in native grasses occurred despite suboptimal climatic conditions. Strategically treating around native patches to enlarge them over time showed promise as a minimal-input technique for increasing the proportion of the landscape dominated by native plants.
Ultrasound applications are widespread, and their utility in resource-limited environments is considerable. In disasters, the use of ultrasound can help reallocate resources by guiding decisions on management and transportation priorities. These interventions can occur on-scene, at triage collection points, during transport, and at the receiving medical facility. Literature related to this specific topic is limited. However, literature regarding prehospital use of ultrasound, ultrasound in combat situations, and some articles specific to disaster medicine point to the potential growth of ultrasound utilization in disaster response.
Aim:
To evaluate the utility of point-of-care ultrasound in a disaster response based on studies involving ultrasonography in resource-limited environments.
Methods:
A narrative review of MEDLINE, MEDLINE InProcess, EPub, and Embase found 20 articles for inclusion.
Results:
Experiences from past disasters, prehospital care, and combat experiences have demonstrated the value of ultrasound both as a diagnostic and interventional modality.
Discussion:
Current literature supports the use of ultrasound in disaster response as a real-time, portable, safe, reliable, repeatable, easy-to-use, and accurate tool. While both false positives and false negatives were reported in prehospital studies, these rates are comparable to the accepted false-positive and false-negative rates of standard in-hospital point-of-care ultrasound exams. Studies in austere environments demonstrate the ability to apply ultrasound in extreme conditions and to obtain high-quality images with only modest training and real-time remote guidance. Point-of-care ultrasound therefore has clear potential in the triage and management of mass casualty incidents. However, because these studies are heterogeneous and observational in nature, further research is needed on how to integrate ultrasound into the response and recovery phases.
Objectives: Prior research has identified numerous genetic (including sex), education, health, and lifestyle factors that predict cognitive decline. Traditional model selection approaches (e.g., backward or stepwise selection) attempt to find one model that best fits the observed data, risking interpretations that only the selected predictors are important. In reality, several predictor combinations may fit similarly well but result in different conclusions (e.g., about size and significance of parameter estimates). In this study, we describe an alternative method, Information-Theoretic (IT) model averaging, and apply it to characterize a set of complex interactions in a longitudinal study on cognitive decline. Methods: Here, we used longitudinal cognitive data from 1256 late–middle aged adults from the Wisconsin Registry for Alzheimer’s Prevention study to examine the effects of sex, apolipoprotein E (APOE) ɛ4 allele (non-modifiable factors), and literacy achievement (modifiable) on cognitive decline. For each outcome, we applied IT model averaging to a set of models with different combinations of interactions among sex, APOE, literacy, and age. Results: For a list-learning test, model-averaged results showed better performance for women versus men, with faster decline among men; increased literacy was associated with better performance, particularly among men. APOE had less of an association with cognitive performance in this age range (∼40–70 years). Conclusions: These results illustrate the utility of the IT approach and point to literacy as a potential modifier of cognitive decline. Whether the protective effect of literacy is due to educational attainment or intrinsic verbal intellectual ability is the topic of ongoing work. (JINS, 2019, 25, 119–133)
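A minimal, hedged illustration of information-theoretic model averaging is sketched below: fit a candidate set of models with different interaction terms, convert AIC differences into Akaike weights, and average a coefficient of interest across models. The formulas, simulated data, and effect sizes are placeholders, not the WRAP analysis.

```python
# Hedged sketch: Akaike-weight model averaging over a small candidate set of
# linear models with different interaction terms. AIC differences are turned
# into weights, and a coefficient is averaged across the models in which it
# appears (weights renormalized over those models).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "age": rng.uniform(40, 70, n),
    "sex": rng.integers(0, 2, n),
    "apoe4": rng.integers(0, 2, n),
    "literacy": rng.normal(0, 1, n),
})
df["cognition"] = (0.5 * df["literacy"] - 0.03 * df["age"]
                   - 0.2 * df["sex"] * df["age"] / 70 + rng.normal(0, 1, n))

candidates = [
    "cognition ~ age + sex + apoe4 + literacy",
    "cognition ~ age * sex + apoe4 + literacy",
    "cognition ~ age * literacy + sex + apoe4",
    "cognition ~ age * apoe4 + sex + literacy",
]
fits = [smf.ols(f, data=df).fit() for f in candidates]

aic = np.array([f.aic for f in fits])
delta = aic - aic.min()
weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()   # Akaike weights

def averaged_coef(name):
    """Model-averaged estimate of one coefficient across the candidate set."""
    present = [(w, f.params[name]) for w, f in zip(weights, fits) if name in f.params]
    w = np.array([p[0] for p in present])
    b = np.array([p[1] for p in present])
    return np.sum(w / w.sum() * b)

print("Akaike weights:", np.round(weights, 3))
print("model-averaged literacy effect:", round(averaged_coef("literacy"), 3))
```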
Objectives: A major challenge in cognitive aging is differentiating preclinical disease-related cognitive decline from changes associated with normal aging. Neuropsychological test authors typically publish single time-point norms, referred to here as unconditional reference values. However, detecting significant change requires longitudinal, or conditional reference values, created by modeling cognition as a function of prior performance. Our objectives were to create, depict, and examine preliminary validity of unconditional and conditional reference values for ages 40–75 years on neuropsychological tests. Method: We used quantile regression to create growth-curve–like models of performance on tests of memory and executive function using participants from the Wisconsin Registry for Alzheimer’s Prevention. Unconditional and conditional models accounted for age, sex, education, and verbal ability/literacy; conditional models also included past performance on and number of prior exposures to the test. Models were then used to estimate individuals’ unconditional and conditional percentile ranks for each test. We examined how low performance on each test (operationalized as <7th percentile) related to consensus-conference–determined cognitive statuses and subjective impairment. Results: Participants with low performance were more likely to receive an abnormal cognitive diagnosis at the current visit (but not later visits). Low performance was also linked to subjective and informant reports of worsening memory function. Conclusions: The percentile-based methods and single-test results described here show potential for detecting troublesome within-person cognitive change. Development of reference values for additional cognitive measures, investigation of alternative thresholds for abnormality (including multi-test criteria), and validation in samples with more clinical endpoints are needed. (JINS, 2019, 25, 1–14)
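A hedged sketch of the conditional reference-value idea follows: quantile regression models several percentiles of a current test score as a function of age and prior performance, and a new observation is flagged if it falls below the conditional 7th percentile. The variables and simulated data are placeholders, not the WRAP measures or models.

```python
# Hedged sketch: growth-curve-like quantile regression norms. Several
# conditional quantiles of a memory score are modeled on age and prior
# score, then a new observation is compared with the 7th-percentile cutoff.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1500
df = pd.DataFrame({"age": rng.uniform(40, 75, n)})
df["prior_score"] = 50 - 0.2 * df["age"] + rng.normal(0, 5, n)
df["score"] = 0.6 * df["prior_score"] - 0.1 * df["age"] + rng.normal(0, 4, n)

# Conditional models: current score given age and past performance
quantiles = [0.07, 0.25, 0.50, 0.75]
fits = {q: smf.quantreg("score ~ age + prior_score", df).fit(q=q)
        for q in quantiles}

new = pd.DataFrame({"age": [62.0], "prior_score": [40.0]})
cutoff_7th = np.asarray(fits[0.07].predict(new))[0]
print(f"conditional 7th-percentile cutoff: {cutoff_7th:.1f}")
print("flag performance as low if the observed score falls below this cutoff")
```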
Introduction: The use of personal mobile devices to record patient data appears to be increasing, but remains poorly studied. We sought to determine the magnitude and purposes for which Canadian emergency physicians (EPs) and residents use their personal mobile devices (PMDs) to record patient data in the emergency department (ED). Methods: An anonymous survey was distributed to EPs and residents in the Canadian Association of Emergency Physicians (CAEP) database between 27/02/17 and 23/03/17. The survey captured demographic information and information on frequency and purpose of PMD use in the ED, whether consent was obtained, how the information was secured, and any possible implications for patient care. Participants were also asked about knowledge of, and any perceived restrictions from, current regulations regarding the use of PMDs in healthcare settings. Results: The survey response rate was 23.1%. Of 415 respondents, 9 surveys were rejected for incomplete demographic data, resulting in 406 participants. A third (31.5%, 128/406, 95% CI 27.0-36.3) reported using PMDs to record patient data. Most (78.1%) reported doing so more than once a month and 7.0% reported doing so once every shift. 10.9% of participants indicated they did not obtain written or verbal consent. Reasons cited by participants for using PMDs to record patient data included a belief that doing so improves care provided by consultants (36.7%), expedites patient care (31.3%), and improves medical education (32.8%). 53.2% of participants were unaware of current regulations and 19.7% reported feeling restricted by them. Subgroup analysis suggested an increased frequency of PMD use to record patient data among younger physicians and physicians in rural settings. Conclusion: This is the first known Canadian study on the use of PMDs to record patient data in the ED. Our results suggest that this practice is common, and arises from a belief that doing so enhances patient care through better communication, efficiency, and education. Our findings also suggest current practices result in risk of both privacy and confidentiality breaches, and thus support arguments for both physician education and regulation reform.
The Farmers’ Market Fresh Fund Incentive Program is a policy, systems and environmental intervention to improve access to fresh produce for participants on governmental assistance in the USA. The current study examined factors associated with ongoing participation in this matched monetary incentive programme.
Design
Relationship of baseline factors with number of Fresh Fund visits was assessed using Poisson regression. Mixed-effects modelling was used to explore changes in consumption of fruits and vegetables and diet quality.
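As a hedged sketch of the visit-count analysis (Poisson regression of the number of Fresh Fund visits on baseline factors, reported as rate ratios), the code below uses simulated placeholder variables rather than the study data.

```python
# Hedged sketch: Poisson regression of Fresh Fund visit counts on baseline
# factors, with coefficients exponentiated to incidence rate ratios.
# Variables and data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "baseline_fv_servings": rng.poisson(3, n),   # daily F&V servings at baseline
    "snap": rng.integers(0, 2, n),               # 1 = SNAP-eligible
    "male": rng.integers(0, 2, n),
})
rate = np.exp(0.5 + 0.05 * df["baseline_fv_servings"] + 0.2 * df["snap"])
df["visits"] = rng.poisson(rate)

fit = smf.poisson("visits ~ baseline_fv_servings + snap + male", df).fit(disp=0)
print(np.exp(fit.params).rename("rate ratio"))
```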
Setting
San Diego, California.
Subjects
Recipients of Supplemental Nutrition Assistance Program (SNAP), Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) and Supplemental Security Income (SSI) who attended participating farmers’ markets from 2010 to 2012 (n 7298).
Results
Among those with participation for ≤6 months, factors associated with increased visits included reporting more daily servings of fruits and vegetables (F&V) at baseline, being Vietnamese or Asian/Pacific Islander, and eligibility because of SNAP/CalFresh or SSI (v. WIC). Among those who came for 6–12 months, being Asian/Pacific Islander, eligibility because of SNAP/CalFresh and enrolling in the autumn, winter or spring were associated with a greater number of Fresh Fund visits. Among those who came for >12 months, being male and eligibility because of SSI were associated with a greater number of visits. Overall, the odds of increasing number of servings of F&V consumed increased by 2 % per month, and the odds of improved perception of diet quality increased by 10 % per month.
Conclusions
Sustaining and increasing Fresh Fund-type programme operations should be a top priority for future policy decisions concerning farmers’ market use in low-income neighbourhoods.
We explored the middle-Holocene decline of Tsuga canadensis by measuring the diameters of pollen grains in two lake-sediment cores from New England. We hypothesized that a drop in pollen size at the time of the decline followed by an increase in pollen diameters as Tsuga recovered during the late Holocene might indicate reduced abundance of Tsuga in the vicinity of the lake during the decline, as smaller pollen grains travel farther than larger ones. To provide context for this hypothesis, we also measured the diameters of Tsuga pollen grains in the surface sediments of sites spanning the modern-day gradient of Tsuga in New England. Both fossil records exhibited a reduction in pollen size during the interval of the middle-Holocene decline, with diameters similar to those observed in the upper sediments of those sites, yet larger than Tsuga pollen grains in the surface sediments of coastal sites beyond the modern range of Tsuga. This pattern suggests that Tsuga persisted in scattered, low-density populations during the middle Holocene, as it has remained on the landscape since European settlement.
The temperature and albedo distributions of Arctic sea ice are calculated from images obtained from the AVHRR satellite sensor. The temperature estimate uses a split window correction incorporating regression coefficients appropriate for the arctic atmosphere. The albedo estimate is found assuming a clear and dry atmosphere. Both estimates are made with published correction techniques. Inherent errors due to the uncertainty of the atmospheric interference produced by humidity, aerosols, and diamond dust are judged to be 2–5°C in surface temperature and 0.10–0.20 in surface albedo. Cloudy regions are masked out manually using data from all five channels. The relationship between temperature and albedo is shown for a sample scene. A simple model of a surface composed of only cold, bright ice and warm, dark water is inadequate. Model calculations based on the surface energy balance allow us to relate albedo and temperature to ice thickness and snow-cover thickness and to further assess the accuracy of the surface estimates.
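The split-window correction referenced above combines AVHRR thermal channels 4 and 5 to compensate for atmospheric water-vapour effects; the sketch below shows only the generic form of such a retrieval, with illustrative coefficients that are assumptions rather than the Arctic-specific regression coefficients used in the study.

```python
# Hedged sketch of a generic split-window surface temperature retrieval from
# AVHRR channels 4 and 5. The default coefficients are illustrative
# placeholders, not the Arctic regression coefficients from the study.
def split_window_ts(t4_k, t5_k, a=0.0, b=1.0, c=2.1):
    """Estimate surface temperature (K) from brightness temperatures T4, T5 (K)."""
    return a + b * t4_k + c * (t4_k - t5_k)

# Example: made-up brightness temperatures over a cold ice surface
t4, t5 = 252.0, 251.2            # K
print(f"estimated surface temperature: {split_window_ts(t4, t5):.1f} K")
```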
Objectives: Intraindividual cognitive variability (IICV) has been shown to differentiate between groups with normal cognition, mild cognitive impairment (MCI), and dementia. This study examined whether baseline IICV predicted subsequent mild to moderate cognitive impairment in a cognitively normal baseline sample. Methods: Participants with 4 waves of cognitive assessment were drawn from the Wisconsin Registry for Alzheimer’s Prevention (WRAP; n=684; 53.6(6.6) baseline age; 9.1(1.0) years follow-up; 70% female; 74.6% parental history of Alzheimer’s disease). The primary outcome was Wave 4 cognitive status (“cognitively normal” vs. “impaired”) determined by consensus conference; “impaired” included early MCI (n=109), clinical MCI (n=11), or dementia (n=1). Primary predictors included two IICV variables, each based on the standard deviation of a set of scores: “6 Factor IICV” and “4 Test IICV”. Each IICV variable was tested in a series of logistic regression models to determine whether IICV predicted cognitive status. In exploratory analyses, distribution-based cutoffs incorporating memory, executive function, and IICV patterns were used to create and test an MCI risk variable. Results: Results were similar for the IICV variables: higher IICV was associated with greater risk of subsequent impairment after covariate adjustment. After adjusting for memory and executive functioning scores contributing to IICV, IICV was not significant. The MCI risk variable also predicted risk of impairment. Conclusions: While IICV in middle-age predicts subsequent impairment, it is a weaker risk indicator than the memory and executive function scores contributing to its calculation. Exploratory analyses suggest potential to incorporate IICV patterns into risk assessment in clinical settings. (JINS, 2016, 22, 1016–1025)
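A hedged sketch of the IICV construct as described (the within-person standard deviation of a set of standardized test scores, entered into logistic regressions predicting later impairment with and without adjustment for the component scores) is given below, using simulated placeholder data.

```python
# Hedged sketch: intraindividual cognitive variability (IICV) as the
# within-person standard deviation of z-scored test scores, used as a
# predictor of later impairment in logistic regressions. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n, n_tests = 684, 4
scores = rng.normal(0, 1, size=(n, n_tests))              # already z-scored

df = pd.DataFrame({
    "iicv": scores.std(axis=1, ddof=1),                    # within-person SD
    "memory": scores[:, 0],
    "exec_fn": scores[:, 1],
    "age": rng.normal(54, 7, n),
})
logit_p = -2.5 + 0.6 * df["iicv"] - 0.4 * df["memory"]
df["impaired"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# IICV alone, then adjusted for the component scores that feed into it
m1 = smf.logit("impaired ~ iicv + age", df).fit(disp=0)
m2 = smf.logit("impaired ~ iicv + memory + exec_fn + age", df).fit(disp=0)
print("IICV OR, unadjusted for components:", round(np.exp(m1.params["iicv"]), 2))
print("IICV OR, adjusted for components:  ", round(np.exp(m2.params["iicv"]), 2))
```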
The challenges presented by traumatic injuries in low-resource communities are especially relevant in South Sudan. This study was conducted to assess whether a 3-day wilderness first aid (WFA) training course taught in South Sudan improved first aid knowledge. Stonehearth Open Learning Opportunities (SOLO) Schools designed the course to teach people with limited medical knowledge to use materials from their environment to provide life-saving care in the event of an emergency.
Methods
A pre-test/post-test study design was used to assess first aid knowledge of 46 community members in Kit, South Sudan, according to a protocol approved by the University of New England Institutional Review Board. The course and assessments were administered in English and translated in real-time to Acholi and Arabic, the two primary languages spoken in the Kit region. Descriptive statistics, t-test, ANOVA, and correlation analyses were conducted.
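A hedged sketch of the core pre-test/post-test comparison (a paired t-test on knowledge scores) is shown below; the scores and sample size are made-up placeholders, not the study data.

```python
# Hedged sketch: paired t-test comparing first aid knowledge scores before
# and after the 3-day course. All values below are made-up placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 39                                   # participants with both assessments
pre = rng.normal(55, 12, n)              # percent correct, pre-course
post = pre + rng.normal(8, 10, n)        # percent correct, post-course

t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t({n - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
```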
Results
Results included a statistically significant improvement in first aid knowledge after the 3-day training course: t(38)=3.94; P<.001. Although men started with more health care knowledge (t(37)=2.79; P=.008), men and women demonstrated equal levels of knowledge upon course completion (t(37)=1.56; P=.88).
Conclusions
This research, which may be the first of its kind in South Sudan, provides evidence that a WFA training course in South Sudan is efficacious. These findings suggest that similar training opportunities could be used in other parts of the world to improve basic medical knowledge in communities with limited access to medical resources and varying levels of education and professional experiences.
Katona LB, Douglas WS, Lena SR, Ratner KG, Crothers D, Zondervan RL, Radis CD. Wilderness First Aid Training as a Tool for Improving Basic Medical Knowledge in South Sudan. Prehosp Disaster Med. 2015;30(6):574–578.
Hospital Ebola preparation is underway in the United States and other countries; however, the best approach and resources involved are unknown.
OBJECTIVE
To examine costs and challenges associated with hospital Ebola preparation by means of a survey of Society for Healthcare Epidemiology of America (SHEA) members.
DESIGN
Electronic survey of infection prevention experts.
RESULTS
A total of 257 members completed the survey (221 US, 36 international), representing institutions in 41 US states, the District of Columbia, and 18 countries. The 221 US respondents represented 158 (43.1%) of 367 major medical centers that have SHEA members and included 21 (60%) of 35 institutions recently defined by the US Centers for Disease Control and Prevention as Ebola virus disease treatment centers. From October 13 through October 19, 2014, Ebola consumed 80% of hospital epidemiology time, and only 30% of routine infection prevention activities were completed. Routine care was delayed in 27% of hospitals evaluating patients for Ebola.
LIMITATIONS
Convenience sample of SHEA members with a moderate response rate.
CONCLUSIONS
Hospital Ebola preparations required extraordinary resources, which were diverted from routine infection prevention activities. Patients being evaluated for Ebola faced delays and potential limitations in management of other diseases that are more common in travelers returning from West Africa.
In sheep production systems based on extensive grazing, neonatal mortality often reaches 15% to 20% of lambs born, and the mortality rate can be doubled in the case of multiple births. An important contributing factor is the nutrition of the mother because it affects the amount of colostrum available at birth. Ewes carrying multiple lambs have higher energy requirements than ewes carrying a single lamb, and this problem is compounded by limitations to voluntary feed intake as the gravid uterus compresses the rumen. This combination of factors means that the nutritional requirements of the ewe carrying multiple lambs can rarely be met by the supply of pasture alone. This problem can be overcome by supplementation with energy during the last week of pregnancy, a treatment that increases colostrum production and also reduces colostrum viscosity, making it easier for the neonatal lamb to suck. In addition, litter size and nutrition both accelerate the decline in concentration of circulating progesterone that, in turn, triggers the onsets of both birth and lactogenesis, and thus ensures the synchrony of these two events. Furthermore, the presence of colostrum in the gut of the lamb increases its ability to recognize its mother, and thus improves mother–young bonding. Most cereal grains that are rich in energy in the form of starch, when used as supplements in late pregnancy, will increase colostrum production by 90% to 185% above control (unsupplemented) values. Variation among types of cereal grain in the response they induce may be due to differences in the amount of starch digested post-ruminally. As a percentage of grain dry matter intake, the amount of starch entering the lower digestive tract is 14% for maize, 8.5% for barley and 2% for oats. Supplements of high-quality protein from legumes and oleiferous seeds can also increase colostrum production, but they are less effective than cereal grains. In conclusion, short-term supplementation before parturition, particularly with energy-rich concentrates, can improve colostrum production, help meet the energy and immunological requirements of new-born lambs, and improve lamb survival.
Since the publication of “A Compendium of Strategies to Prevent Healthcare-Associated Infections in Acute Care Hospitals” in 2008, prevention of healthcare-associated infections (HAIs) has become a national priority. Despite improvements, preventable HAIs continue to occur. The 2014 updates to the Compendium were created to provide acute care hospitals with up-to-date, practical, expert guidance to assist in prioritizing and implementing their HAI prevention efforts. They are the product of a highly collaborative effort led by the Society for Healthcare Epidemiology of America (SHEA), the Infectious Diseases Society of America (IDSA), the American Hospital Association (AHA), the Association for Professionals in Infection Control and Epidemiology (APIC), and The Joint Commission, with major contributions from representatives of a number of organizations and societies with content expertise, including the Centers for Disease Control and Prevention (CDC), the Institute for Healthcare Improvement (IHI), the Pediatric Infectious Diseases Society (PIDS), the Society for Critical Care Medicine (SCCM), the Society for Hospital Medicine (SHM), and the Surgical Infection Society (SIS).