Foodborne salmonellosis causes approximately 1 million illnesses annually in the United States. In the summer of 2017, we investigated four multistate outbreaks of Salmonella infections associated with Maradol papayas imported from four Mexican farms. PulseNet initially identified a cluster of Salmonella Kiambu infections in June 2017, and early interviews identified papayas as an exposure of interest. Investigators from Maryland, Virginia and the Food and Drug Administration (FDA) collected papayas for testing. Several strains of Salmonella were isolated from papayas sourced from Mexican Farm A, including Salmonella Agona, Gaminara, Kiambu, Thompson and Senftenberg. Traceback from two points of service associated with illness sub-clusters in two states identified Farm A as a common source of papayas, and three voluntary recalls of Farm A papayas were issued. FDA sampling isolated four additional Salmonella strains from papayas sourced from Mexican Farms B, C and D. In total, four outbreaks were identified, resulting in 244 cases with illness onset dates from 20 December 2016 to 20 September 2017. The sampling of papayas and the collaborative work of investigative partners were instrumental in identifying the source of these outbreaks and preventing additional illnesses. Evaluating epidemiological, laboratory and traceback evidence together during investigations is critical to solving and stopping outbreaks.
In the present study, we aimed to compare anthropometric indicators as predictors of mortality in a community-based setting.
We conducted a population-based longitudinal study nested in a cluster-randomized trial. We assessed weight, height and mid-upper arm circumference (MUAC) on children 12 months after the trial began and used the trial’s annual census and monitoring visits to assess mortality over 2 years.
Participants were children aged 6–60 months during the study.
Of 1023 children included in the study at baseline, height-for-age Z-score, weight-for-age Z-score, weight-for-height Z-score and MUAC classified 777 (76·0 %), 630 (61·6 %), 131 (12·9 %) and eighty (7·8 %) children as moderately to severely malnourished, respectively. Over the 2-year study period, fifty-eight children (5·7 %) died. MUAC had the greatest area under the receiver-operating characteristic curve (AUC: 0·68, 95 % CI 0·61, 0·75) and the strongest association with mortality in this sample (hazard ratio = 2·21, 95 % CI 1·26, 3·89, P = 0·006).
MUAC appears to be a better predictor of mortality than other anthropometric indicators in this community-based, high-malnutrition setting in Niger.
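As an illustrative sketch of the kind of analysis reported above (not the study's actual code), the discrimination (AUC) and mortality association (Cox hazard ratio) of each anthropometric indicator could be computed as follows; the file and column names are hypothetical placeholders.

```python
# Hypothetical sketch: comparing anthropometric indicators as mortality predictors.
import pandas as pd
from sklearn.metrics import roc_auc_score
from lifelines import CoxPHFitter

df = pd.read_csv("anthropometry.csv")  # hypothetical dataset

# Discrimination: AUC of each indicator for death over follow-up.
# Scores are negated because lower values indicate worse nutritional status.
for indicator in ["muac", "whz", "waz", "haz"]:
    auc = roc_auc_score(df["died"], -df[indicator])
    print(f"{indicator}: AUC = {auc:.2f}")

# Strength of association: Cox proportional-hazards model for one indicator.
cph = CoxPHFitter()
cph.fit(df[["followup_days", "died", "muac"]],
        duration_col="followup_days", event_col="died")
cph.print_summary()  # hazard ratio = exp(coef)
```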
There is a national need to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of CBRNE science as a distinct competency, together with establishment of the CBRNE medical operations science support expert role, informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Children with congenital heart disease are at high risk for malnutrition. Standardisation of feeding protocols has shown promise in decreasing some of this risk. With little standardisation between institutions’ feeding protocols and no understanding of protocol adherence, it is important to analyse the efficacy of individual aspects of the protocols.
Adherence to and deviation from a feeding protocol in high-risk congenital heart disease patients between December 2015 and March 2017 were analysed. Associations between adherence to and deviation from the protocol and clinical outcomes were also assessed. The primary outcome was change in weight-for-age z score between time intervals.
Increased adherence to and decreased deviation from individual instructions of a feeding protocol improved patients' change in weight-for-age z score between birth and hospital discharge (p = 0.031). Secondary outcomes such as markers of clinical severity and nutritional delivery were not statistically different between groups with high or low adherence or deviation rates.
High-risk feeding protocol adherence and fewer deviations are associated with weight gain independent of their influence on nutritional delivery and caloric intake. Future studies assessing the efficacy of feeding protocols should include the measures of adherence and deviations that are not merely limited to caloric delivery and illness severity.
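For readers unfamiliar with the primary outcome above, weight-for-age z scores are conventionally derived with the LMS method underlying the WHO growth standards; the sketch below uses placeholder L, M, S parameters rather than real reference values, which depend on the child's age and sex.

```python
# Minimal sketch of the LMS method behind weight-for-age z scores (WAZ).
import math

def lms_z(x: float, L: float, M: float, S: float) -> float:
    """Convert a measurement x to a z score given LMS reference parameters."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Hypothetical example: change in WAZ between birth and discharge,
# with made-up LMS parameters standing in for WHO reference tables.
waz_birth = lms_z(3.2, L=0.35, M=3.3, S=0.14)
waz_discharge = lms_z(4.1, L=0.25, M=4.5, S=0.13)
print(f"delta WAZ = {waz_discharge - waz_birth:.2f}")
```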
Introduction: 9-1-1 telecommunicators receive minimal education on agonal breathing, often resulting in unrecognized out-of-hospital cardiac arrest (OHCA). We successfully piloted an educational intervention that significantly improved telecommunicators’ OHCA recognition and bystander CPR rates in Ottawa. We sought to better understand the operations of Canadian 9-1-1 communications centers (CC) in preparation for a multi-centre study of this intervention. Methods: We conducted a national survey of all Canadian CCs. Survey domains included information on organizational structure, dispatch system used, education curriculum, and performance monitoring. It was peer-reviewed, translated into French, pilot-tested, and distributed electronically using a modified Dillman method. We designated respondents in each CC before distribution and used targeted follow-up and small incentives to increase the response rate. Respondents also described the functioning of neighboring CCs if known. Results: We received information from 51/51 provincial and 1/25 territorial CCs, representing 99.7% of the Canadian population. CCs largely utilize the Medical Priority Dispatch System (MPDS) platform (93%), many are Province/Ministry regulated (50%) and most require a high school diploma as minimum entry-level education (78%). Telecommunicators receive initial in-class training (median 1.3 months, IQR 0.3-1.9; range 0.1-2.2), often followed by a preceptorship (84.4%) (median 1.0 months, IQR 0.7-1.7; range 0.4-6.0). Educational curricula include information on agonal breathing in 41% of CCs, without audio examples in 34%. Among responding CCs, over 39,000 suspected OHCA 9-1-1 calls are received annually. Few CCs maintain local performance statistics on OHCA recognition (25%), bystander CPR rates (25%) or survival rates (50%). Most (97%) expressed interest in future research collaborations. Conclusion: Most Canadian telecommunicators receive no or minimal education in recognizing agonal breathing. Further training and improved OHCA monitoring may assist recognition and enhance outcomes.
Understanding temporal patterns in biodiversity is an enduring question in paleontology. Compared with studies of taxonomic diversity, long-term perspectives on ecological diversity are rare, particularly in terrestrial systems. Yet ecological diversity is critical for the maintenance of biodiversity, especially during times of major perturbations. Here, we explore the ecological diversity of Cretaceous herbivorous dinosaurs leading up to the K-Pg extinction, using dental and jaw morphological disparity as a proxy. We test the hypothesis that a decline in ecological diversity could have facilitated their rapid extinction 66 Ma. We apply three disparity metrics that together capture different aspects of morphospace occupation and show how this approach is key to understanding patterns of morphological evolution. We find no evidence of declining disparity in herbivorous dinosaurs as a whole—suggesting that dinosaur ecological diversity remained high during the last 10 Myr of their existence. Clades show different disparity trends through the Cretaceous, but none except sauropods exhibits a long-term decline. Herbivorous dinosaurs show two disparity peaks characterized by different processes; in the Early Cretaceous by expansion in morphospace and in the Campanian by morphospace packing. These trends were only revealed by using a combination of disparity metrics, demonstrating how this approach can offer novel insights into macroevolutionary processes underlying patterns of disparity and ecological diversity.
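Since the argument above hinges on combining disparity metrics that capture different aspects of morphospace occupation, a minimal sketch of three commonly used metrics is given below; the specific metrics and the data are assumptions for illustration, not necessarily the paper's.

```python
# Hedged sketch of disparity metrics on a morphospace matrix
# (rows = taxa, columns = e.g. principal coordinate axes).
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
morphospace = rng.normal(size=(40, 5))  # hypothetical 40 taxa, 5 axes

sum_of_variances = morphospace.var(axis=0, ddof=1).sum()  # overall spread
sum_of_ranges = np.ptp(morphospace, axis=0).sum()         # extent of occupation
mean_pairwise_dist = pdist(morphospace).mean()            # packing vs. expansion

# Expansion grows the ranges; packing lowers mean pairwise distance
# while ranges stay roughly constant.
print(sum_of_variances, sum_of_ranges, mean_pairwise_dist)
```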
Among 39 pediatric hospitals, pediatric S. aureus hospitalizations decreased 36%, from 26.3 to 16.8 infections per 1,000 admissions, between 2009 and 2016, with methicillin-resistant S. aureus (MRSA) infections decreasing by 52% and methicillin-susceptible S. aureus infections decreasing by 17%. Similar decreases were observed in days of therapy of anti-MRSA antibiotics.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the first second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
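The fixed-effects combination step mentioned above follows the standard inverse-variance formula; the sketch below illustrates it with made-up per-cohort estimates, not the paper's cohort results.

```python
# Fixed-effects, inverse-variance meta-analysis on hypothetical estimates.
import numpy as np

betas = np.array([1.0, 1.2, 0.9, 1.3])  # hypothetical per-cohort slopes (ml per nmol/l)
ses = np.array([0.2, 0.3, 0.25, 0.4])   # hypothetical standard errors

w = 1 / ses**2                               # inverse-variance weights
beta_pooled = (w * betas).sum() / w.sum()    # weighted mean effect
se_pooled = np.sqrt(1 / w.sum())
ci_low = beta_pooled - 1.96 * se_pooled
ci_high = beta_pooled + 1.96 * se_pooled
print(f"pooled = {beta_pooled:.2f} ml (95% CI {ci_low:.2f}, {ci_high:.2f})")
```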
The triazines are one of the most widely used herbicide classes ever developed and are critical for managing weed populations that have developed herbicide resistance. These herbicides are traditionally valued for their residual weed control in more than 50 crops. Scientific literature suggests that atrazine, and perhaps other s-triazines, may no longer remain persistent in soils due to enhanced microbial degradation. Experiments examined the degradation rates of atrazine and two other triazine herbicides, simazine and metribuzin, in both atrazine-adapted (history) and non-history Corn Belt soils, with paired soils from each state used to compare potential triazine degradation. In three soils with no history of atrazine use, the half-life (t1/2) of atrazine was at least four times greater than in three soils with a history of atrazine use. Simazine degradation in the same three sets of soils was 2.4 to 15 times more rapid in history soils than in non-history soils. Metribuzin in history soils degraded at 0.6, 0.9, and 1.9 times the rate seen in the same three non-history soils. These results indicate enhanced degradation of the symmetrical triazine simazine, but not of the asymmetrical triazine metribuzin.
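Half-life comparisons like these typically assume first-order decay, where t1/2 = ln(2)/k; the sketch below shows how a rate constant and half-life could be fitted from illustrative concentration data (not the study's measurements).

```python
# First-order degradation sketch: enhanced microbial degradation appears as a
# larger rate constant k, hence a shorter half-life t1/2 = ln(2)/k.
import numpy as np

days = np.array([0, 7, 14, 28, 56])
conc = np.array([100, 62, 38, 15, 2.3])  # hypothetical % of applied atrazine remaining

# Fit ln(C) = ln(C0) - k*t by least squares; slope of the fit is -k.
k = -np.polyfit(days, np.log(conc), 1)[0]
print(f"k = {k:.3f} per day, t1/2 = {np.log(2) / k:.1f} days")
```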
In 785 mother–child (50% male) pairs from a longitudinal epidemiological birth cohort, we investigated associations between inflammation-related epigenetic polygenic risk scores (i-ePGS), environmental exposures, cognitive function, and child and adolescent internalizing and externalizing problems. We examined prenatal and postnatal effects. For externalizing problems, one prenatal effect was found: i-ePGS at birth associated with higher externalizing problems (ages 7–15) indirectly through lower cognitive function (age 7). For internalizing problems, we identified two effects. For a prenatal effect, i-ePGS at birth associated with higher internalizing symptoms via continuity in i-ePGS at age 7. For a postnatal effect, higher postnatal adversity exposure (birth through age 7) associated with higher internalizing problems (ages 7–15) via higher i-ePGS (age 7). Hence, externalizing problems were related mainly to prenatal effects involving lower cognitive function, whereas internalizing problems appeared related to both prenatal and postnatal effects. The present study supports a link between i-ePGS and child and adolescent mental health.
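The indirect effects described above are typically quantified with product-of-coefficients mediation; a simplified sketch follows, with hypothetical variable names and without the study's longitudinal structure or covariates.

```python
# Product-of-coefficients mediation sketch, e.g. i-ePGS -> cognition -> externalizing.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical dataset with hypothetical columns

# Path a: predictor -> mediator; path b: mediator -> outcome, adjusting for predictor.
a = smf.ols("cognition_age7 ~ iepgs_birth", data=df).fit().params["iepgs_birth"]
b = (smf.ols("externalizing ~ cognition_age7 + iepgs_birth", data=df)
        .fit().params["cognition_age7"])
print(f"indirect effect (a*b) = {a * b:.3f}")
```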
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest, given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. Greater seed damage plus loss averaged across seed origin, burial depth, and year was recorded for lots tested at Illinois (51.3% and 51.8%) followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%) for A. palmeri and A. tuberculatus, respectively. The site differences for seed persistence were probably due to higher volumetric water content at these sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo on the soil surface ranged from 4.1% to 4.3% compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater in the seeds placed on the soil surface compared with the buried seeds. The greatest influences on seed viability were burial conditions and time and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
Improving access to tuberculosis (TB) care and ensuring early diagnosis are two major aims of the WHO End TB strategy and the Collaborative TB Strategy for England. This study describes risk factors associated with diagnostic delay among TB cases in England. We conducted a retrospective cohort study of TB cases notified to the Enhanced TB Surveillance System in England between 2012 and 2015. Diagnostic delay was defined as more than 4 months between symptom onset and treatment start date. Multivariable logistic regression was used to identify demographic and clinical factors associated with diagnostic delay. Between 2012 and 2015, 22 422 TB cases were notified in England and included in the study. A third (7612) of TB cases had a diagnostic delay of more than 4 months. Being female, aged 45 years and older, residing outside of London and having extra-pulmonary TB disease were significantly associated with a diagnostic delay in the multivariable model (aOR = 1.2, 1.2, 1.2, 1.3, 1.8, respectively). This study identifies demographic and clinical factors associated with diagnostic delay, which will inform targeted interventions to improve access to care and early diagnosis among these groups, with the ultimate aim of helping reduce transmission and improve treatment outcomes for TB cases in England.
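A minimal sketch of the multivariable logistic regression described above is given below; the column names are hypothetical placeholders for the surveillance variables, and the 4-month cut-off is approximated as 120 days.

```python
# Hypothetical sketch of a multivariable logistic regression for diagnostic delay.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

tb = pd.read_csv("etss_cases.csv")  # hypothetical extract of notified cases
tb["delayed"] = (tb["days_onset_to_treatment"] > 120).astype(int)  # >4 months

model = smf.logit(
    "delayed ~ C(sex) + C(age_group) + C(outside_london) + C(extra_pulmonary)",
    data=tb,
).fit()
print(np.exp(model.params))  # adjusted odds ratios (each term adjusted for the others)
```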
Background: Despite advances in neonatal care, neonates with moderate to severe HIE are at high risk of mortality and morbidity. We report the impact of a dedicated NNCC team on short-term mortality and morbidities. Methods: A retrospective cohort study of neonates with moderate to severe HIE between July 1st 2008 and December 31st 2017. Primary outcome: a composite of death and/or brain injury on MRI. Secondary outcomes: rate of cooling, length of hospital stay, anti-seizure medication burden, and use of inotropes. A regression analysis was done adjusting for gestational age, birth weight, gender, out-born status, Apgar score at 10 minutes, cord blood pH, and HIE clinical staging. Results: 216 neonates were included, 109 before NNCC implementation and 107 thereafter. The NNCC program resulted in a reduction in the primary outcome (AOR: 0.28, CI: 0.14-0.54, p<0.001) and in brain injury (AOR: 0.28, CI: 0.14-0.55, p<0.001). It decreased the average length of stay per infant by 5 days (p=0.03), improved the cooling rate (from 73% to 93%, p<0.001), and reduced seizure misdiagnosis (from 71% to 23%, p<0.001), anti-seizure medication burden (p=0.001), and inotrope use (34% compared to 53%, p=0.004). Conclusions: The NNCC program decreased mortality and brain injury, shortened the length of hospital stay, and improved care of neonates with significant HIE.
Background: Continuous video-EEG (cvEEG) monitoring is the standard of care for diagnosis and management of neonatal seizures. However, it is labour-intensive. We aimed to establish consistency in monitoring of newborns utilising NICU nurses. Methods: Neonatal nurses were trained to apply scalp electrodes and troubleshoot technical issues. Guidelines, checklists and visual training modules were developed. A central network system allowed remote access to the cvEEGs by the epileptologist for timely interpretation and feedback. We compared 100 infants with moderate to severe HIE before and after the training program. Results: 192 cvEEGs were performed. Among the 100 infants compared, time to initiate brain monitoring decreased by an average of 31.5 hours, electrographic seizure detection increased (from 20% to 34%), clinical seizure misdiagnosis decreased (from 65% to 36%), and anti-seizure medication burden decreased. Conclusions: Training experienced NICU nurses to set up, start and monitor cvEEG can decrease the time to initiate cvEEG, which may lead to better seizure diagnosis and management.
Insomnia and depression are highly comorbid and mutually exacerbate clinical trajectories and outcomes. Cognitive behavioral therapy for insomnia (CBT-I) effectively reduces both insomnia and depression severity, and can be delivered digitally. This could substantially increase the accessibility to CBT-I, which could reduce the health disparities related to insomnia; however, the efficacy of digital CBT-I (dCBT-I) across a range of demographic groups has not yet been adequately examined. This randomized placebo-controlled trial examined the efficacy of dCBT-I in reducing both insomnia and depression across a wide range of demographic groups.
Of 1358 individuals with insomnia randomized, a final sample of 358 was retained in the dCBT-I condition and 300 in the online sleep education condition. Insomnia and depression severity were examined as dependent variables. Race, socioeconomic status (SES; household income and education), gender, and age were also tested as independent moderators of treatment effects.
The dCBT-I condition yielded greater reductions in both insomnia and depression severity than sleep education, with significantly higher rates of remission following treatment. Demographic variables (i.e. income, race, sex, age, education) were not significant moderators of the treatment effects, suggesting that dCBT-I is comparably efficacious across a wide range of demographic groups. Furthermore, while differences in attrition were found based on SES, attrition did not differ between white and black participants.
Results provide evidence that the wide dissemination of dCBT-I may effectively target both insomnia and comorbid depression across a wide spectrum of the population.
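Moderation analyses like those reported above are commonly implemented as treatment-by-moderator interaction terms; the sketch below illustrates this under assumed column names, not the trial's actual analysis code.

```python
# Hypothetical sketch: testing moderation of treatment effect by a demographic variable.
import pandas as pd
import statsmodels.formula.api as smf

trial = pd.read_csv("trial.csv")  # hypothetical trial dataset

# A non-significant interaction coefficient is consistent with the reported
# finding that dCBT-I was comparably efficacious across demographic groups.
model = smf.ols(
    "isi_change ~ C(condition) * C(income_group) + isi_baseline",
    data=trial,
).fit()
print(model.summary().tables[1])  # coefficient table, including interaction terms
```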