Crises such as the global COVID-19 (coronavirus) pandemic elicit a range of individual and societal responses that adversely affect physical and emotional well-being. This article provides an overview of responses elicited by COVID-19 and their impact on immunity, physical health, mental health and well-being. Certain groups, such as individuals with mental illness, are especially vulnerable, so it is important to maximise the supports available to this population and their families during the pandemic. More broadly, the World Health Organization recommends ‘Psychological First Aid’ as a useful technique that can help many people in a time of crisis.
In 2019, a 42-year-old African man who works as an Ebola virus disease (EVD) researcher traveled from the Democratic Republic of Congo (DRC), near an ongoing EVD epidemic, to Philadelphia and presented to the Hospital of the University of Pennsylvania Emergency Department with altered mental status, vomiting, diarrhea, and fever. He was classified as a “wet” person under investigation for EVD, and his arrival activated our hospital emergency management command center and bioresponse teams. He was found to be in septic shock with multisystem organ dysfunction, including circulatory dysfunction, encephalopathy, lactic acidosis, acute kidney injury, acute liver injury, and disseminated intravascular coagulation. Critical care was delivered within high-risk pathogen isolation in the ED and in our Special Treatment Unit until a diagnosis of severe cerebral malaria was confirmed and EVD was definitively excluded.
This report discusses our experience activating a longitudinal preparedness program designed for rare, resource-intensive events at hospitals physically remote from any active epidemic but serving a high-volume international air travel port-of-entry.
Many institutions are attempting to implement patient-reported outcome (PRO) measures. Because PROs often change clinical workflows significantly for patients and providers, implementation choices can have a major impact. While various implementation guides exist, none offers a stepwise list of decision points covering the full implementation process and drawing explicitly on a sociotechnical conceptual framework.
Methods:
To facilitate real-world implementation of PROs in electronic health records (EHRs) for use in clinical practice, members of the EHR Access to Seamless Integration of Patient-Reported Outcomes Measurement Information System (PROMIS) Consortium developed structured PRO implementation planning tools. Each institution pilot tested the tools. Joint meetings led to the identification of critical sociotechnical success factors.
Results:
Three tools were developed and tested: (1) a PRO Planning Guide summarizes the empirical knowledge and guidance about PRO implementation in routine clinical care; (2) a Decision Log allows decision tracking; and (3) an Implementation Plan Template simplifies creation of a sharable implementation plan. Seven lessons learned during implementation underscore the iterative nature of planning and the importance of the clinician champion, as well as the need to understand aims, manage implementation barriers, minimize disruption, provide ample discussion time, and continuously engage key stakeholders.
Conclusions:
Highly structured planning tools, informed by a sociotechnical perspective, enabled the construction of clear, clinic-specific plans. By developing and testing three reusable tools (freely available for immediate use), our project addressed the need for consolidated guidance and created new materials for PRO implementation planning. We identified seven important lessons that, while common to technology implementation, are especially critical in PRO implementation.
Executive functions (EF) drive health and educational outcomes and therefore are increasingly common treatment targets. Most treatment trials rely on questionnaires to capture meaningful change because ecologically valid, pediatric performance-based EF tasks are lacking. The Executive Function Challenge Task (EFCT) is a standardized, treatment-sensitive, objective measure that assesses flexibility and planning in the context of provocative social interactions, making it a “hot” EF task.
Method:
We investigated the structure, reliability, and validity of the EFCT in youth with autism spectrum disorder (n = 129) or attention-deficit/hyperactivity disorder with flexibility problems (n = 93), and in typically developing (TD) youth (n = 52).
Results:
The EFCT can be coded reliably, has a two-factor structure (flexibility and planning), and shows adequate internal consistency and cross-form consistency. Unlike a traditional performance-based EF task (verbal fluency), it shows significant correlations with parent-reported EF, indicating ecological validity. EFCT performance distinguishes youth with known EF problems from TD youth and is not significantly related to visual pattern recognition or to social communication/understanding in autistic children.
Conclusions:
The EFCT demonstrates adequate reliability and validity and may provide developmentally appropriate, treatment-sensitive, and ecologically valid assessment of “hot” EF in youth. It can be administered in controlled settings by masked administrators.
The mechanism through which in utero exposure to maternal overweight/obesity programs offspring overweight/obesity is unknown but may operate through biologic pathways involving offspring anthropometry at birth. Thus, we sought to examine to what extent the association between in utero exposure to maternal overweight/obesity and childhood overweight/obesity is mediated by birth anthropometry. Analyses were conducted on a retrospective cohort with data obtained from one hospital system. A natural effects model framework was used to estimate the natural direct effect and natural indirect effect of birth anthropometry (weight, length, head circumference, ponderal index, and small-for-gestational age [SGA] or large-for-gestational age [LGA]) for the association between pre-pregnancy maternal body mass index (BMI) category (overweight/obese vs normal weight) and offspring overweight/obesity in childhood. Models were adjusted for maternal and child socio-demographics. Three thousand nine hundred and fifty mother–child dyads were included in analyses (1467 [57.8%] of mothers and 913 [34.4%] of children were overweight/obese). Results suggest that a small percentage of the effect of maternal pre-pregnancy BMI overweight/obesity on offspring overweight/obesity operated through offspring anthropometry at birth (weight: 15.5%, length: 5.2%, head circumference: 8.5%, ponderal index: 2.2%, SGA: 2.9%, and LGA: 4.2%). There was a small increase in the percentage mediated when gestational diabetes or hypertensive disorders were added to the models. Our study suggests that some measures of birth anthropometry mediate the association between maternal pre-pregnancy overweight/obesity and offspring overweight/obesity in childhood and that the size of this mediated effect is small.
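The "percentage mediated" figures above follow the standard natural-effects arithmetic: the total effect splits into a natural direct effect (NDE) and a natural indirect effect (NIE), and the proportion mediated is the indirect share of the total. A minimal sketch, using hypothetical effect sizes rather than the study's actual estimates:

```python
# Illustrative natural-effects decomposition: the total effect of maternal
# BMI category on offspring overweight/obesity splits into a direct path
# and an indirect path through a birth-anthropometry mediator.
def proportion_mediated(nde: float, nie: float) -> float:
    """Share of the total effect carried by the mediator: NIE / (NDE + NIE)."""
    total = nde + nie
    return nie / total

# Hypothetical effect magnitudes (not the study's values), chosen so the
# indirect path carries 15.5% of the total effect:
print(round(proportion_mediated(nde=0.169, nie=0.031), 3))  # -> 0.155
```

In practice the NDE and NIE are estimated jointly (e.g., via a natural effects model on an expanded dataset), but the proportion-mediated summary reported in the abstract reduces to this ratio.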
Methods to stimulate appetite in the sick or elderly remain a challenge, with few safe therapeutic options. Ghrelin is an orexigenic hormone, increasing appetite and subsequent food intake. It has received considerable attention as a therapeutic target to stimulate food intake in patients with anorexia. The identification of food-grade bioactives with proven orexigenic effects would mark significant progress in the treatment of disease-related malnutrition. This study therefore investigated the effects of two milk-derived ghrelinergic peptides on appetite and energy intake in healthy humans.
A single-blind, placebo-controlled, 3-arm (placebo, casein bioactive MF1145 and whey bioactive UL-2-141) cross-over trial was conducted in healthy male volunteers. Participants received 26 mg/kg of each bioactive and of placebo. The main outcome measures were energy and protein intake from a set breakfast and ad libitum lunch, and subjective appetite sensations as assessed by visual analogue scale (VAS). Basal and postprandial levels of active ghrelin (AG) were measured. Dietary intakes were analysed using Nutritics software. Statistical analyses were performed in R.
Overall, 22 male participants (mean age 27 years; mean BMI 24.6 kg/m2, range 19.8–30.2 kg/m2) were included. Mean energy and protein intakes at lunch when treated with placebo were 1343 kcal (95% CI: 1215–1471 kcal) and 74 g (95% CI: 66–81 g), respectively. Energy and protein intakes were not significantly different from placebo for either treatment (p = 0.918 and p = 0.319 for UL-2-141; p = 0.889 and p = 0.959 for MF1145). Similarly, appetite, hunger and satiety responses on VAS were not significantly different from placebo for either treatment. AG peak post-lunch on placebo was 653 pg/ml (95% CI: 511–794 pg/ml). Treatment with UL-2-141 resulted in a 139 pg/ml reduction in post-prandial AG compared to placebo, and treatment with MF1145 resulted in a 114 pg/ml reduction compared to placebo. This pattern was significant for both treatments (p = 0.021 and p = 0.045, respectively); however, when controlling for fasting AG, the pattern was no longer significant (p = 0.590 and p = 0.877, respectively). Pre-prandial AG peaks were not significantly different across treatments.
While these peptides have previously demonstrated ghrelinergic effects in rats, no effect on appetite or food intake in humans was identified by this study. This may be attributable to the small sample size or low dose. However, since healthy adults are often not in tune with their own physiological hunger, they may not respond strongly to simple physiological modulators and repeating the study in subjects with established anorexia may be prudent.
Culture-based studies, which focus on individual organisms, have implicated stethoscopes as potential vectors of nosocomial bacterial transmission. However, the full bacterial communities that contaminate in-use stethoscopes have not been investigated.
Methods
We used bacterial 16S rRNA gene deep-sequencing, analysis, and quantification to profile entire bacterial populations on stethoscopes in use in an intensive care unit (ICU), including practitioner stethoscopes, individual-use patient-room stethoscopes, and clean unused individual-use stethoscopes. Two additional sets of practitioner stethoscopes were sampled before and after cleaning using standardized or practitioner-preferred methods.
Results
Bacterial contamination levels were highest on practitioner stethoscopes, followed by patient-room stethoscopes, whereas clean stethoscopes were indistinguishable from background controls. Bacterial communities on stethoscopes were complex, and community analysis by weighted UniFrac showed that physician and patient-room stethoscopes were indistinguishable and significantly different from clean stethoscopes and background controls. Genera relevant to healthcare-associated infections (HAIs) were common on practitioner stethoscopes, among which Staphylococcus was ubiquitous and had the highest relative abundance (6.8%–14% of contaminating bacterial sequences). Other HAI-related genera were also widespread although lower in abundance. Cleaning of practitioner stethoscopes resulted in a significant reduction in bacterial contamination levels, but these levels reached those of clean stethoscopes in only a few cases with either standardized or practitioner-preferred methods, and bacterial community composition did not significantly change.
Conclusions
Stethoscopes used in an ICU carry bacterial DNA reflecting complex microbial communities that include nosocomially important taxa. Commonly used cleaning practices reduce contamination but are only partially successful at modifying or eliminating these communities.
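The relative abundances quoted in the results above (e.g., Staphylococcus at 6.8%–14% of contaminating sequences) are simply a genus's read count divided by the sample's total reads. A minimal sketch with entirely made-up counts for a single hypothetical stethoscope sample:

```python
# Hypothetical 16S rRNA read counts for one stethoscope swab (illustrative
# only; genus names and counts are not from the study's data).
counts = {
    "Staphylococcus": 680,
    "Pseudomonas": 310,
    "Acinetobacter": 150,
    "other": 8860,
}

total_reads = sum(counts.values())
rel_abundance = {genus: n / total_reads for genus, n in counts.items()}

# Relative abundance of Staphylococcus in this sample:
print(f"{rel_abundance['Staphylococcus']:.1%}")  # -> 6.8%
```

Community-level comparisons such as weighted UniFrac additionally weight these abundances by phylogenetic distance between taxa, which is why two samples can share genera yet still differ significantly in community composition.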
On August 25, 2017, Hurricane Harvey made landfall near Corpus Christi, Texas. The ensuing unprecedented flooding throughout the Texas coastal region affected millions of individuals.1 The statewide response in Texas included the sheltering of thousands of individuals at considerable distances from their homes. The Dallas area established large-scale general population sheltering as the number of evacuees arriving in the area grew. The Dallas area is historically familiar with “mega-sheltering,” beginning with the response to Hurricane Katrina in 2005.2 Through continued efforts and development, the Dallas area had been readying a plan for the largest general population shelter in Texas. (Disaster Med Public Health Preparedness. 2019;13:33–37)
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins, respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Fathers of DZ twins had somewhat longer education than fathers of MZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]). The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
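The core comparison described above is a linear regression of education years on a zygosity indicator, and with a single binary regressor the OLS slope is just the difference in group means. A toy sketch on simulated data, assuming hypothetical variable names and effect sizes rather than the study's:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Simulated zygosity indicator (1 = MZ, 0 = DZ) and education years,
# with a built-in 0.26-year MZ advantage (hypothetical, for illustration):
is_mz = rng.integers(0, 2, n)
edu_years = 12.0 + 0.26 * is_mz + rng.normal(0, 2.5, n)

# With one binary regressor, the fitted OLS coefficient equals the gap
# between the two group means:
mz_dz_gap = edu_years[is_mz == 1].mean() - edu_years[is_mz == 0].mean()
print(round(mz_dz_gap, 2))  # close to the simulated 0.26-year difference
```

The study's analysis additionally stratifies by sex and birth cohort, which amounts to fitting this same model within subgroups (or adding interaction terms).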
Chondrichthyan teeth from a new locality in the Scottish Borders supply additional evidence of Early Carboniferous chondrichthyans in the UK. The interbedded dolostones and siltstones of the Ballagan Formation exposed along Whitrope Burn are interpreted as representing a restricted lagoonal environment that received significant amounts of land-derived sediment. This site is palynologically dated to the latest Tournaisian–early Viséan. The diverse dental fauna documented here is dominated by large crushing holocephalan toothplates, with very few, small non-crushing chondrichthyan teeth. Two new taxa are named and described. Our samples are consistent with worldwide evidence that chondrichthyan crushing faunas are common following the Hangenberg extinction event. The lagoonal habitat represented by Whitrope Burn may represent a temporary refugium that was host to a near-relict fauna dominated by large holocephalan chondrichthyans with crushing dentitions. Many of these had already become scarce in other localities by the Viséan and became extinct later in the Carboniferous. This fauna provides evidence of early endemism or niche separation within European chondrichthyan faunas at this time. This evidence points to a complex picture in which the diversity of durophagous chondrichthyans is controlled by narrow spatial shifts in niche availability over time.
We analyzed birth order differences in means and variances of height and body mass index (BMI) in monozygotic (MZ) and dizygotic (DZ) twins from infancy to old age. The data were derived from the international CODATwins database. The total number of height and BMI measures from 0.5 to 79.5 years of age was 397,466. As expected, first-born twins had greater birth weight than second-born twins. With respect to height, first-born twins were slightly taller than second-born twins in childhood. After adjusting the results for birth weight, the birth order differences decreased and were no longer statistically significant. First-born twins had greater BMI than the second-born twins over childhood and adolescence. After adjusting the results for birth weight, birth order was still associated with BMI until 12 years of age. No interaction effect between birth order and zygosity was found. Only limited evidence was found that birth order influenced variances of height or BMI. The results were similar among boys and girls and also in MZ and DZ twins. Overall, the differences in height and BMI between first- and second-born twins were modest even in early childhood, while adjustment for birth weight reduced the birth order differences but did not remove them for BMI.
In western Canada, more money is spent on herbicides targeting wild oat than on herbicides for any other weed, and wild oat resistance to herbicides is the most widespread resistance issue. A direct-seeded field experiment was conducted from 2010 to 2014 at eight Canadian sites to determine the combined effects of crop life cycle, crop species, crop seeding rate, crop usage, and herbicide rate on wild oat management and canola yield. Combining 2× seeding rates of early-cut barley silage with 2× seeding rates of winter cereals and excluding wild oat herbicides for 3 of 5 yr (2011 to 2013) often led to wild oat density, aboveground wild oat biomass, wild oat seed density in the soil, and canola yield similar to those of a repeated canola–wheat rotation under a full wild oat herbicide rate regime. Wild oat was similarly well managed after 3 yr of perennial alfalfa without wild oat herbicides. Forgoing wild oat herbicides in only 2 of 5 yr from exclusively summer annual crop rotations resulted in higher wild oat density, biomass, and seed banks. Management systems that effectively combine diverse and optimal cultural practices against weeds, and limit herbicide use, reduce selection pressure for weed resistance to herbicides and prolong the utility of threatened herbicide tools.
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins were up to 1.7% taller and had up to 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project (1) to analyze systematically the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) to study the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
OBJECTIVE
To observe patient care across hemodialysis facilities enrolled in the National Opportunity to Improve Infection Control in ESRD (end-stage renal disease) (NOTICE) project in order to evaluate adherence to evidence-based practices aimed at prevention of infection.
SETTING AND PARTICIPANTS
Thirty-four hemodialysis facilities were randomly selected from among 772 facilities in 4 end-stage renal disease participating networks. Facility selection was stratified on dialysis organization affiliation, size, socioeconomic status, and urban/rural status.
MEASUREMENTS
Trained infection control evaluators used an infection control worksheet to observe 73 distinct infection control practices at the hemodialysis facilities, from October 1, 2011, through January 31, 2012.
RESULTS
There was considerable variation in infection control practices across enrolled facilities. Overall adherence to recommended practices was 68% (range, 45%–92%) across all facilities. Overall adherence to expected hand hygiene practice was 72% (range, 10%–100%). Compliance with hand hygiene before and after procedures was high; however, hand hygiene compliance during procedures averaged 58%. Use of chlorhexidine as the specific agent for exit site care was 19% overall but varied from 0% to 35% by facility type. Across the 8 checklists, the frequency of perfect performance (meeting every item on the checklist) ranged from 0% for disinfection practices to 22% for arteriovenous access practices at initiation.
CONCLUSIONS
Our findings suggest that there are many areas for improvement in hand hygiene and other infection prevention practices in end-stage renal disease. These NOTICE project findings will help inform the development of a larger quality improvement initiative at dialysis facilities.