Among 353 healthcare personnel in a longitudinal cohort in four hospitals in Atlanta, GA (May-June 2020), 23 (6.5%) had SARS-CoV-2 antibodies. Spending >50% of a typical shift at bedside (OR 3.4, 95% CI: 1.2–10.5) and Black race (OR 8.4, 95% CI: 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
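The unadjusted associations above are odds ratios with Wald-type 95% confidence intervals, which follow directly from a 2 × 2 exposure-by-outcome table. A minimal Python sketch of that calculation, using hypothetical counts rather than the study's data:

```python
# Odds ratio with a Wald 95% confidence interval from a 2x2 table.
# The counts below are hypothetical, for illustration only --
# they are NOT the study's data.
import math

def odds_ratio_ci(exposed_pos, exposed_neg, unexposed_pos, unexposed_neg, z=1.96):
    """Return (OR, lower, upper) for a 2x2 exposure/outcome table."""
    or_ = (exposed_pos * unexposed_neg) / (exposed_neg * unexposed_pos)
    # Standard error of log(OR) is the root of the summed reciprocal counts.
    se_log_or = math.sqrt(1/exposed_pos + 1/exposed_neg
                          + 1/unexposed_pos + 1/unexposed_neg)
    log_or = math.log(or_)
    return or_, math.exp(log_or - z * se_log_or), math.exp(log_or + z * se_log_or)

# Hypothetical: 15/120 seropositive among personnel spending >50% of a shift
# at the bedside vs. 8/233 among the rest.
or_, lo, hi = odds_ratio_ci(15, 105, 8, 225)
print(f"OR {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```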
The principal aim of this study was to optimize the diagnosis of canine neuroangiostrongyliasis (NA). In total, 92 cases were seen between 2010 and 2020. Dogs were aged from 7 weeks to 14 years (median 5 months), with 73/90 (81%) less than 6 months of age and 1.7 times as many males as females. The disease became more common over the study period. Most cases (86%) were seen between March and July. Cerebrospinal fluid (CSF) was obtained from the cisterna magna in 77 dogs, the lumbar cistern in 5, and both sites in 3. Nucleated cell counts for 84 specimens ranged from 1 to 146 150 cells μL⁻¹ (median 4500). Percentage eosinophils varied from 0 to 98% (median 83%). When both cisternal and lumbar CSF were collected, inflammation was more severe caudally. Seventy-three CSF specimens were subjected to enzyme-linked immunosorbent assay (ELISA) testing for antibodies against Angiostrongylus cantonensis; 61 (84%) tested positive, with titres ranging from <100 to ⩾12 800 (median 1600). Sixty-one CSF specimens were subjected to real-time quantitative polymerase chain reaction (qPCR) testing using a new protocol targeting a bioinformatically informed repetitive genetic target; 53/61 samples (87%) tested positive, with CT values ranging from 23.4 to 39.5 (median 30.0). For 57 dogs, it was possible to compare CSF ELISA serology and qPCR. ELISA and qPCR were both positive in 40 dogs; in 5 dogs the ELISA was positive while the qPCR was negative; in 9 dogs the qPCR was positive but the ELISA was negative; and in 3 dogs both the ELISA and qPCR were negative. NA is an emerging infectious disease of dogs in Sydney, Australia.
Background: Massive hemorrhage protocols (MHPs) streamline the complex logistics required for prompt care of the bleeding patient, but their uptake has been variable and few regions have a system to measure outcomes from these events. Aim Statement: We aim to implement a standardized MHP with uniform quality improvement (QI) metrics to increase uptake of evidence-based MHPs across 150 hospitals in Ontario between 2017 and 2021. Measures & Design: We performed ongoing PDSA cycles: 1) stakeholder analysis by surveying the Ontario Regional Blood Coordinating Network (ORBCoN); 2) problem characterization and Ishikawa analysis for key QI metrics, based on areas of MHP variability in 150 Ontario hospitals, using a web-based survey; 3) creation of a consensus MHP via a modified Delphi process; 4) problem characterization at ORBCoN for the design of a freely available toolkit for provincial implementation by expert working groups; 5) design of 8 key QI metrics by a modified Delphi process; and 6) identification of process measures for QI data collection by implementation metrics. Evaluation/Results: PDSA 1–2: 150 hospitals were surveyed; 33% lacked MHPs, mostly smaller sites. Major areas for QI related to activation criteria, hemostatic agents, protocolized hypothermia management, variable MHP naming, QI metrics and serial blood work requirements. PDSA 3: three Delphi rounds were held to reach 100% expert consensus on 42 statements and 8 CQI metrics. Major areas for modification were the protocol name, laboratory resuscitation targets, cooler configurations, and the role of factor VIIa. PDSA 4: an adaptable toolkit is under development by the steering committee and expert working groups; implementation is scheduled for Spring 2020. PDSA 5: the 8 CQI metrics are: TXA administration <1 h, RBC transfusion <15 min, call to transfer for definitive care <60 min, temperature >35°C at end of protocol, Hgb kept between 60 and 110 g/L, transition to group-specific RBC by 90 min, appropriate activation defined by ≥6 units RBC in the first 24 hours, and any blood component wastage. Discussion/Impact: MHP uptake, content, and tracking are variable. A standardized MHP that is adaptable to diverse settings decreases complexity, improves use of evidence-based practices, and provides a platform for continuous QI. PDSA 6 will occur after implementation; we will complete an implementation survey and design a pilot and feasibility study for prospective tracking of patient outcomes using existing prospectively collected inter-hospital and provincial databases.
To assess current monitoring standards for vital signs, the agents used in rapid tranquilization, and adverse events related to poor monitoring.
Retrospective audit. 136 physical restraints were reviewed and 92 case notes examined. Gold standard: all physical restraints requiring rapid tranquilization had immediate and regular monitoring every 5–10 minutes for the first hour, then every 30 minutes for the next two hours. In repeat rapid tranquilization, the same monitoring standards were examined. Adverse effects or clinical incidents were recorded, along with the agents used in rapid tranquilization.
Of 92 physical restraints, 62 required rapid tranquilization; of these 62 rapid tranquilizations, 12 were repeats. No rapid tranquilization had adequate monitoring. Adverse events were seen in 19% of cases, and of these, 40% occurred in repeat rapid tranquilization events. The most commonly used agents were a combination of a benzodiazepine and an antipsychotic (52%). Single-agent use was associated with a higher risk of repeat physical restraint and rapid tranquilization (32%) than combination use (18%).
Adequate monitoring of vital signs could have prevented many of the adverse events seen in this audit. Evidence suggests that training staff in both patient monitoring and de-escalation techniques can sometimes prevent the need for rapid tranquilization or, where it is required, ensure that it is carried out safely. Recommendations included the proposal of a document for vital sign monitoring, guidance on managing common adverse events, and training for all staff members in de-escalation techniques, monitoring equipment, resuscitation skills and equipment use.
Breakfast cereals are widely consumed in Ireland with over 80% of adults choosing ready-to-eat cereals or porridge. In terms of healthy eating, breakfast cereals are considered a nutritious choice and are not expected to contribute significantly to daily salt intakes. Since 2003, the Food Safety Authority of Ireland has coordinated a salt reduction programme to achieve voluntary reduction by the food industry in the salt content of processed foods available in Ireland. This study aims to examine whether salt levels of breakfast cereals are decreasing due to reformulation practices.
A random selection of breakfast cereals on the Irish market was sampled in 2003, 2007, 2011 and 2015 across the following categories: rice-based, bran-based, cornflake-type, biscuit-based, multigrain, muesli and no added salt/low salt varieties (muesli and no added salt/low salt varieties were not sampled again in 2015) (n = 687). Samples were analysed for sodium content using atomic emission spectrophotometry; sodium was converted to salt (g) per 100 g of the food product by multiplying by 2.54. Results were analysed using IBM SPSS (version 25). As the data were not normally distributed, median values (minimum and maximum) were investigated across breakfast cereal categories at the different time-points. Differences between the time-points were assessed using the Kruskal–Wallis test and Mann–Whitney U tests.
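The analysis pipeline described above reduces to a unit conversion plus non-parametric tests. A minimal Python sketch, assuming hypothetical sodium values; only the 2.54 sodium-to-salt factor and the choice of tests come from the text:

```python
# Sketch of the salt analysis described above. The conversion factor 2.54
# (sodium -> salt equivalent) is from the text; the sample data are hypothetical.
from scipy import stats

def sodium_to_salt(sodium_g_per_100g: float) -> float:
    """Convert sodium (g/100 g) to salt equivalent (g/100 g)."""
    return sodium_g_per_100g * 2.54

# Hypothetical sodium contents (g/100 g) for one cereal category at three time-points.
salt_2003 = [sodium_to_salt(x) for x in [0.70, 0.79, 0.75, 0.82]]
salt_2011 = [sodium_to_salt(x) for x in [0.30, 0.35, 0.28, 0.33]]
salt_2015 = [sodium_to_salt(x) for x in [0.31, 0.34, 0.30, 0.32]]

# Kruskal-Wallis across all time-points, then a pairwise Mann-Whitney U test.
h, p = stats.kruskal(salt_2003, salt_2011, salt_2015)
print(f"Kruskal-Wallis: H={h:.2f}, p={p:.4f}")
u, p_pair = stats.mannwhitneyu(salt_2003, salt_2011)
print(f"Mann-Whitney U (2003 vs 2011): U={u:.1f}, p={p_pair:.4f}")
```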
In 2003, salt levels were found to be highest in cornflake-type cereals and lowest in no added salt/low salt cereals (2.02 g (0.20–2.31) and 0.01 g (0.0–0.03) per 100 g, respectively). The salt content of rice-based, bran-based, cornflake-type, biscuit-based and multigrain varieties significantly decreased (by up to 65% in cornflake-type cereals) until 2011. No further reduction was achieved for rice-based, bran-based and cornflake-type varieties in 2015, and a significant increase in salt was observed for biscuit-based (p = 0.001) and multigrain products (p = 0.007). Between 2003 and 2011, no reduction in salt levels was observed for muesli or no added salt/low salt products.
This study revealed there has been a significant reduction in the salt content of breakfast cereals since 2003 – an important finding considering breakfast cereals are recommended for healthy eating. However, this work also shows that continuous salt monitoring is necessary to ensure this reduction in breakfast cereals is maintained. Future FSAI reformulation work will examine a range of nutrients in food products, as the food industry has committed to achieving a gradual reduction in fat, saturated fat and sugar, as well as salt, as part of the National Obesity Policy and Action Plan.
Over half of the Irish population is overweight or obese. The Obesity Policy and Action Plan 2016–2025 will set reformulation targets for fat, saturated fat and sugar in Ireland and review progress. In 2016, the Food Safety Authority of Ireland undertook a cross-sectional market scan of yoghurts to evaluate the energy, fat, saturated fat and sugar content based solely on declared nutrition labels. The aims of this 2018 study were to verify the accuracy of declared nutrition information on yoghurts and to confirm the suitability of declared nutrition labels for energy, fat, saturated fat and sugar reformulation monitoring.
Yoghurts identified in the 2016 market scan (n = 578) were weighted based on categorisation of manufacturer type (branded, own brand), product category (natural, flavoured and luxury) and declared nutrition content. Samples (n = 200) were randomly selected from these weighted groups and tested by a laboratory accredited for energy, fat, saturated fat, and sugar analysis. Data were analysed using IBM SPSS (version 25). As the data were not normally distributed, median values for declared and tested energy, fat, saturated fat and sugar content were compared using the Wilcoxon signed-rank test and Spearman rank-order correlation.
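A minimal Python sketch of the declared-versus-tested comparison described above; the paired values are hypothetical, and only the choice of tests (Wilcoxon signed-rank, Spearman rank-order correlation) comes from the text:

```python
# Sketch of the declared-vs-tested comparison described above.
# The paired values below are hypothetical, not the study's data.
from scipy import stats

declared_sugar = [9.9, 4.8, 15.0, 8.2, 12.1, 6.4]   # g per 100 g, from labels
tested_sugar   = [8.7, 3.4, 13.6, 7.9, 11.0, 6.1]   # g per 100 g, lab-measured

# Wilcoxon signed-rank test for a systematic declared-vs-tested difference.
w, p = stats.wilcoxon(declared_sugar, tested_sugar)
print(f"Wilcoxon signed-rank: W={w:.1f}, p={p:.4f}")

# Spearman rank-order correlation between the two sets of measurements.
rho, p_rho = stats.spearmanr(declared_sugar, tested_sugar)
print(f"Spearman: rho={rho:.2f}, p={p_rho:.4f}")
```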
Of the tested yoghurts, 3% (n = 6), 5% (n = 9) and 19% (n = 31) were outside the recommended European Commission (EC) labelling tolerance for fat, saturated fat and sugar, respectively. Tested nutrient content was consistently lower than declared. There was a statistically significant difference between declared and tested energy (87 kcal vs. 84 kcal, p = 0.03), fat (2.7 g vs. 2.5 g, p < 0.001), and sugar (9.9 g vs. 8.7 g, p < 0.001) content per 100 g yoghurt. The difference between declared and tested sugar content per 100 g yoghurt was statistically significant across all yoghurt types, including natural (4.8 g vs. 3.4 g, p < 0.001), flavoured (9.7 g vs. 8.6 g, p < 0.001) and luxury (15 g vs. 13.6 g, p = 0.002). There was a statistically significant difference between declared and tested fat (2.8 g vs. 2.5 g, p < 0.001) and saturated fat (1.9 g vs. 1.6 g, p = 0.017) content of own brand yoghurts per 100 g. There was a positive correlation between energy content and portion size (r = 0.2, p < 0.01).
There was a high level of agreement between declared vs. tested fat and saturated fat content of yoghurts, but a lower level of agreement between declared vs. tested sugar content of yoghurts. This indicates that declared nutrition labels are suitable for reformulation monitoring of fat and saturated fat, but may not be suitable for sugar. This finding will be further investigated and tested in future work planned for nutrition label verification of other food categories.
Habitat avoidance is an anti-parasite behaviour exhibited by at-risk hosts that can minimize exposure to parasites. Because environments are often heterogeneous, host decision-making with regard to habitat use may be affected by the presence of parasites and by habitat quality simultaneously. In this study we examine how the ovipositing behaviour of a cactiphilic fruit fly, Drosophila nigrospiracula, is affected by the presence of an ectoparasitic mite, Macrocheles subbadius, in conjunction with other environmental factors – specifically the presence or absence of conspecific eggs and host plant tissue. We hypothesized that the trade-off between site quality and parasite avoidance should favour ovipositing at mite-free sites even when they are of inferior quality. We found that although flies avoided mites in homogeneous environments (86% of eggs at mite-free sites), site quality overwhelmed mite avoidance: both conspecific eggs (65% of eggs at infested sites with other Drosophila eggs) and host plant tissue (78% of eggs at infested sites with cactus) overpowered mite avoidance. Our results elucidate the context-dependent decision-making of hosts in response to the presence of parasites in variable environments, and suggest how the ecology of fear and associated trade-offs may influence the relative investment in anti-parasite behaviour in susceptible hosts.
Prematurity impacts myocardial development and may determine long-term outcomes. The objective of this study was to test the hypothesis that preterm neonates develop right ventricle dysfunction and adaptive remodelling by 32 weeks post-menstrual age that persists through 1 year corrected age.
Materials and Methods:
A subset of 80 preterm infants (born <29 weeks) was selected retrospectively from a prospectively enrolled cohort and measures of right ventricle systolic function and morphology by two-dimensional echocardiography were assessed at 32 weeks post-menstrual age and at 1 year of corrected age. Comparisons were made to 50 term infants at 1 month and 1 year of age. Sub-analyses were performed in preterm-born infants with bronchopulmonary dysplasia and/or pulmonary hypertension.
In both term and preterm infants, right ventricle function and morphology measures increased over the first year (p < 0.01). The magnitudes of right ventricle function measures were lower in preterm-born infants at each time period (p < 0.01 for all), and right ventricle morphology indices were wider in all preterm infants by 1 year corrected age, irrespective of lung disease. In preterm infants with bronchopulmonary dysplasia and/or pulmonary hypertension, measures of right ventricle function were further decreased and morphology indices further increased through 1 year (p < 0.01).
Preterm infants exhibit abnormal right ventricle performance with remodelling at 32 weeks post-menstrual age that persists through 1 year corrected age, suggesting a less developed intrinsic myocardial function response following preterm birth. The development of bronchopulmonary dysplasia and pulmonary hypertension has a further negative impact on right ventricle mechanics over the first year of age.
Introduction: Emergency hospital admissions are a growing concern for patients and health systems globally. The objective of this study was to systematically review the evidence for diagnostic, medical, and surgical interventions that reduce emergency hospital admissions. Methods: We conducted a systematic review of systematic reviews by searching MEDLINE, PubMed, the Cochrane Database of Systematic Reviews, Google Scholar, and grey literature. Systematic reviews of any diagnostic, surgical, or medical interventions examining the effect on emergency hospital admissions among adults were included. The quality of reviews was assessed using AMSTAR and the quality of evidence was assessed using GRADE. The subsequent analysis was restricted to interventions with moderate- or high-quality evidence only. Results: 13 051 titles and abstracts and 1 791 full-text articles were screened, from which 42 systematic reviews were included. The reviews included an underlying evidence base of 215 randomized controlled trials with 135 282 patients. Of 20 unique diagnostic, medical, and surgical interventions identified, four had moderate- (n = 4) or high- (n = 0) quality evidence for significant reductions in hospital admissions in five patient populations. These were: cardiac resynchronization therapy for heart failure and atrial fibrillation, percutaneous aspiration for pneumothorax, early/routine coronary angiography for acute coronary syndrome (alone or comorbid with chronic kidney disease), and natriuretic peptide guided therapy for heart failure. Conclusion: We identified four interventions across five populations that, when optimized, may lead to reductions in emergency hospital admissions. These findings can therefore help guide the development of quality indicators, standards, or practice guidelines.
Clostridium difficile, the most common cause of hospital-associated diarrhoea in developed countries, presents major public health challenges. The high clinical and economic burden from C. difficile infection (CDI) relates to the high frequency of recurrent infections caused by either the same or different strains of C. difficile. An interval of 8 weeks after index infection is commonly used to classify recurrent CDI episodes. We assessed strains of C. difficile in a sample of patients with recurrent CDI in Western Australia from October 2011 to July 2017. The performance of different intervals between initial and subsequent episodes of CDI was investigated. Of 4612 patients with CDI, 1471 (32%) were identified with recurrence. PCR ribotyping data were available for initial and recurrent episodes for 551 patients. Relapse (recurrence with same ribotype (RT) as index episode) was found in 350 (64%) patients and reinfection (recurrence with new RT) in 201 (36%) patients. Our analysis indicates that 8- and 20-week intervals failed to adequately distinguish reinfection from relapse. In addition, living in a non-metropolitan area modified the effect of age on the risk of relapse. Where molecular epidemiological data are not available, we suggest that applying an 8-week interval to define recurrent CDI requires more consideration.
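The relapse/reinfection distinction above reduces to a simple rule: the same PCR ribotype as the index episode means relapse, a new ribotype means reinfection, with a fixed interval (commonly 8 weeks) as the non-molecular surrogate. A minimal Python sketch, with hypothetical episode data:

```python
# Sketch of classifying a recurrent CDI episode as relapse or reinfection,
# per the definitions above: same ribotype as the index episode = relapse,
# a new ribotype = reinfection. Episode data are hypothetical.
from datetime import date

def classify_recurrence(index_ribotype: str, recurrent_ribotype: str) -> str:
    """Molecular definition: relapse if the ribotypes match, else reinfection."""
    return "relapse" if recurrent_ribotype == index_ribotype else "reinfection"

def within_interval(index_date: date, recurrence_date: date, weeks: int = 8) -> bool:
    """Interval-based surrogate definition (e.g. the common 8-week window)."""
    return (recurrence_date - index_date).days <= weeks * 7

# Hypothetical episode pair: same ribotype, 10 weeks apart. The molecular rule
# calls this a relapse, but the 8-week window would miss it.
print(classify_recurrence("RT014", "RT014"))                 # relapse
print(within_interval(date(2016, 1, 4), date(2016, 3, 14)))  # False
```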
Salmonella enterica serovar Wangata (S. Wangata) is an important cause of endemic salmonellosis in Australia, with human infections occurring from undefined sources. This investigation sought to examine possible environmental and zoonotic sources of human infections with S. Wangata in north-eastern New South Wales (NSW), Australia. The investigation adopted a One Health approach and comprised three complementary components: a case–control study examining human risk factors; environmental and animal sampling; and genomic analysis of human, animal and environmental isolates. Forty-eight human S. Wangata cases were interviewed during a 6-month period from November 2016 to April 2017, together with 55 Salmonella Typhimurium (S. Typhimurium) controls and 130 neighbourhood controls. In multivariable analyses, illness was statistically associated with indirect contact with bats/flying foxes (adjusted odds ratio (aOR) 2.63, 95% confidence interval (CI) 1.06–6.48 vs. S. Typhimurium controls; aOR 8.33, 95% CI 2.58–26.83 vs. neighbourhood controls), wild frogs (aOR 3.65, 95% CI 1.32–10.07) and wild birds (aOR 6.93, 95% CI 2.29–21.00). S. Wangata was detected in dog faeces, wildlife scats and a compost specimen collected from the outdoor environments of cases’ residences. In addition, S. Wangata was detected in the faeces of wild birds and sea turtles in the investigation area. Genomic analysis revealed that S. Wangata isolates were relatively clonal. Our findings suggest that S. Wangata is present in the environment and may have a reservoir in wildlife populations in north-eastern NSW. Further investigation is required to better understand the occurrence of Salmonella in wildlife groups and to identify possible transmission pathways for human infections.
Evidence from animal models indicates that exposure to an obesogenic or hyperglycemic intrauterine environment adversely impacts offspring kidney development and renal function. However, evidence from human studies has not been evaluated systematically. Therefore, the aim of this systematic review was to synthesize current research in humans examining the relationship between gestational obesity and/or diabetes and offspring kidney structure and function. Systematic electronic searches were conducted of five relevant databases (CINAHL, Cochrane, EMBASE, MEDLINE and Scopus). Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines were followed, and screening of articles by two independent reviewers yielded nine eligible papers for inclusion. Six studies were assessed as being of ‘neutral’ quality, two of ‘negative’ quality and one of ‘positive’ quality. Observational studies suggest that offspring exposed to a hyperglycemic intrauterine environment are more likely to display markers of renal dysfunction and are at higher risk of end-stage renal disease. There was limited and inconsistent evidence for a link between exposure to an obesogenic intrauterine environment and offspring renal outcomes. Offspring renal outcome measures across studies were diverse, with a large variation in offspring age at follow-up, limiting comparability across studies. The collective current body of evidence suggests that intrauterine exposure to maternal obesity and/or diabetes adversely impacts renal programming in offspring, with an increased risk of kidney disease in adulthood. Further high-quality, longitudinal, prospective cohort studies that measure indicators of offspring renal development and function, including fetal kidney volume and albuminuria, at standardized follow-up time points are warranted.
A total of 592 people reported gastrointestinal illness following attendance at Street Spice, a food festival held in Newcastle-upon-Tyne, North East England, in February/March 2013. Epidemiological, microbiological and environmental investigations were undertaken to identify the source and prevent further cases. Several epidemiological analyses were conducted: a cohort study, a follow-up survey of cases, and capture–recapture to estimate the true burden of cases. Indistinguishable isolates of Salmonella Agona phage type 40 were identified in cases and on fresh curry leaves used in one of the accompaniments served at the event. Molecular testing indicated that enteroaggregative Escherichia coli and Shigella also contributed to the burden of illness. Analytical studies found strong associations between illness and eating food from a particular stall, and with food items including a coconut chutney which contained fresh curry leaves. Further investigation of the food supply chain and food preparation techniques identified a lack of clear instruction on the use of fresh uncooked curry leaves in finished dishes and uncertainty about their status as a ready-to-eat product. We describe the investigation of one of the largest outbreaks of food poisoning in England, involving several gastrointestinal pathogens including a strain of Salmonella Agona not previously seen in the UK.
In Sweden, leishmaniasis is an imported disease, and its epidemiology and incidence were not known until now. We conducted a retrospective, nationwide, epidemiological study covering 1993 to 2016. Probable cases were patients with leishmaniasis diagnoses reported to the Swedish Patient Registry, which has collected data on admitted patients in Swedish healthcare since 1993 and on out-patient visits since 2001. Confirmed cases were those with a positive laboratory test for leishmaniasis during 1993–2016. In total, 299 probable cases and 182 confirmed cases were identified. Annual incidence ranged from 0.023 to 0.35 per 100 000, with a rapid increase in the last 4 years. Of the 182 laboratory-verified cases, 96 were diagnosed from 2013 to 2016, and in this group almost half of the patients were children under 18 years. Patients presented in different healthcare settings in all regions of Sweden. Cutaneous leishmaniasis was the most common clinical manifestation, and the majority of infections were acquired in Asia including the Middle East, specifically Syria and Afghanistan. Leishmania tropica was responsible for the largest proportion of cases (42%). A combination of laboratory methods increased the sensitivity of diagnosis among confirmed cases. In 2016, one-tenth of the Swedish population was born in Leishmania-endemic countries, and many Swedes travel to these countries for work or vacation. Swedish residents who have spent time in Leishmania-endemic areas could be at risk of developing the disease at some time during their lives. Increased awareness and knowledge are needed for correct diagnosis and management of leishmaniasis in Sweden.