Carfentrazone-ethyl is one of the few herbicides labeled for control of silvery-thread moss (STM) in golf course putting greens, but common use rates are up to three times higher than those for broadleaf weeds. Our objective was to determine the efficacy of a single postemergence application of carfentrazone-ethyl for STM control in greenhouse and field dose-response studies. In the greenhouse, carfentrazone-ethyl was applied at 0, 14, 28, 56, 112, and 224 g ai ha−1 to pots containing established STM and creeping bentgrass. Percent gametophyte injury was visually estimated at 14, 28, 49, and 77 d after treatment (DAT). Shoot viability was determined by excising shoots from treated pots and plating them in petri dishes containing sand. The 28 and 49 DAT ED90 values (dose required to cause 90% gametophyte injury) were 26.8 and 54.3 g ha−1, respectively; both doses are substantially lower than the label rates for long- and short-term control. All doses reduced the viability of transplanted shoots at 10 DAT compared to untreated STM; however, regrowth occurred in all petri dishes by 17 DAT. Field studies were initiated in Manhattan, Kansas and San Luis Obispo, California to corroborate the greenhouse results. Averaged across locations, carfentrazone-ethyl applied at 56 and 112 g ha−1 caused 76% and 84% STM injury at 14 DAT, but injury declined to 45% and 48%, respectively, by 28 DAT. In both greenhouse and field studies, STM recovery did not begin until 2 wk after treatment (WAT), which indicates the label-stipulated application interval of 2 wk is too short. Our research suggests 56 g ha−1 can provide burndown control of STM similar to that of the highest label rate (112 g ha−1), and turfgrass managers should consider extending the reapplication interval to 3 or 4 wk when moss recovery is observed.
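The dose-response analysis described above can be sketched in code. The following Python snippet fits a three-parameter log-logistic curve to hypothetical injury data (the doses match the greenhouse study, but the injury values and the curve form are illustrative assumptions, not the study's data or model) and derives an ED90 from the fitted parameters.

```python
# Illustrative log-logistic dose-response fit; injury values are invented.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, upper, ed50, slope):
    """Gametophyte injury (%) as a log-logistic function of dose (g ai/ha)."""
    return upper / (1.0 + (dose / ed50) ** (-slope))

doses = np.array([14, 28, 56, 112, 224], dtype=float)  # g ai/ha
injury = np.array([35, 62, 88, 95, 98], dtype=float)   # % injury (hypothetical)

(upper, ed50, slope), _ = curve_fit(log_logistic, doses, injury,
                                    p0=[100.0, 30.0, 2.0])

# ED90: the dose giving 90% of the fitted upper asymptote.
# From upper / (1 + (d/ed50)^-b) = 0.9 * upper  =>  d = ed50 * 9^(1/b).
ed90 = ed50 * (0.90 / (1.0 - 0.90)) ** (1.0 / slope)
print(f"ED50 = {ed50:.1f} g/ha, ED90 = {ed90:.1f} g/ha")
```

Because the ED90 sits on the flattening upper portion of the curve, its estimate is sensitive to the fitted slope; the values reported in the abstract came from the study's own data and model choice.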
The coronavirus disease (COVID-19) crisis provoked an organizational ethics dilemma: how to develop ethical pandemic policy while upholding our organizational mission to deliver relationship- and patient-centered care. Tasked with producing a recommendation about whether healthcare workers and essential personnel should receive priority access to limited medical resources during the pandemic, the bioethics department and survey and interview methodologists at our institution implemented a deliberative approach that included the perspectives of healthcare professionals and patient stakeholders in the policy development process. Involving the community more, not less, during a crisis required balancing the need to act quickly to garner stakeholder perspectives, uncertainty about the extent and duration of the pandemic, and disagreement among ethicists about the most ethically supportable way to allocate scarce resources. This article explains the process undertaken to garner stakeholder input as it relates to organizational ethics, recounts the stakeholder perspectives shared and how they informed the triage policy developed, and offers suggestions for how other organizations may integrate stakeholder involvement in ethical decision-making as well as directions for future research and public health work.
The stable chromium (Cr) isotope system has emerged over the past decade as a new tool to track changes in the amount of oxygen in Earth's ocean-atmosphere system. Much of the initial foundation for using Cr isotopes (δ53Cr) as a paleoredox proxy has required recent revision. However, the basic idea behind using Cr isotopes as redox tracers is straightforward—the largest isotope fractionations are redox-dependent and occur during partial reduction of Cr(VI). As such, Cr isotopic signatures can provide novel insights into Cr redox cycling in both marine and terrestrial settings. Critically, the Cr isotope system—unlike many other trace metal proxies—can respond to short-term redox perturbations (e.g., on timescales characteristic of Pleistocene glacial-interglacial cycles). The Cr isotope system can also be used to probe Earth's long-term atmospheric oxygenation, pointing towards low but likely dynamic oxygen levels for the majority of Earth's history.
To use Internet search data to compare duration of compliance for various diets.
Using a passive surveillance digital epidemiological approach, we estimated the average duration of diet compliance by examining monthly Internet searches for recipes related to popular diets. We fit a mathematical model to these data to estimate the time spent on a diet by new January dieters (NJD) and to estimate the percentage of dieters dropping out during the American winter holiday season between Thanksgiving and the end of December.
Internet searches in the USA for recipes related to popular diets over a 15-year period from 2004 to 2019.
Individuals in the USA performing Internet searches for recipes related to popular diets.
All diets exhibited significant seasonality in recipe-related Internet searches, with sharp spikes every January followed by a decline in the number of searches and a further decline in the winter holiday season. The Paleo diet had the longest average compliance times among NJD (5.32 ± 0.68 weeks) and the lowest dropout during the winter holiday season (only 14 ± 3 % dropping out in December). The South Beach diet had the shortest compliance time among NJD (3.12 ± 0.64 weeks) and the highest dropout during the holiday season (33 ± 7 % dropping out in December).
The current study is the first of its kind to use passive surveillance data to compare the duration of adherence with different diets and underscores the potential usefulness of digital epidemiological approaches to understanding health behaviours.
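As a rough illustration of the model-fitting approach described above, the Python sketch below assumes that recipe searches attributable to new January dieters decay exponentially, so the mean time on the diet is the reciprocal of the decay rate. The weekly volumes are synthetic and noise-free; the study's actual model and data were more elaborate.

```python
# Toy version of estimating mean diet duration from search-volume decay.
import numpy as np
from scipy.optimize import curve_fit

def decay(week, n0, rate):
    """Recipe-search volume attributable to new January dieters."""
    return n0 * np.exp(-rate * week)

weeks = np.arange(0.0, 12.0)         # weeks after January 1
volume = decay(weeks, 1000.0, 0.25)  # synthetic, noise-free data

(n0, rate), _ = curve_fit(decay, weeks, volume, p0=[800.0, 0.1])
mean_weeks = 1.0 / rate              # mean of an exponential dwell time
print(f"Estimated mean compliance: {mean_weeks:.1f} weeks")  # → 4.0 weeks
```

With real search data, the January spike, seasonal baseline, and holiday dropout would all need to be modelled before a compliance time could be read off the decay rate.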
Intentional facial disfigurement is documented in archaeological contexts around the world. Here, the authors present the first archaeological evidence for intentional facial mutilation from Anglo-Saxon England—comprising the removal of the nose, upper lip and possible scalping—inflicted upon a young adult female. The injuries are consistent with documented punishments for female offenders. Although such mutilations do not appear in the written record until the tenth century AD, the instance reported here suggests that the practice may have emerged a century earlier. This case is examined in the context of a wider consideration of the motivations and significance of facial disfigurement in past societies.
Carbapenem-resistant Enterobacteriaceae (CRE) are an important cause of healthcare-associated infections (HAIs) in human hospitals. The Philadelphia Department of Public Health (PDPH) made CRE reportable in April 2018. In May 2019, the Matthew J. Ryan Veterinary Hospital (MJRVH) reported an NDM-5 Escherichia coli cluster in companion animals to the PDPH. In total, 15 infected animals (14 dogs and 1 cat) were reported between July 2018 and June 2019, with no new infections after June 2019. Limited literature is available on the prevalence of CRE in companion animals, and recommendations for dealing with CRE infections currently target human healthcare settings. Methods: A collaborative containment response included assessing interspecies transmission to veterinary staff and a comprehensive evaluation of the infection control program at MJRVH. MJRVH notified all owners of affected animals verbally and via notification letters with PDPH recommendations for CRE colonization screening of high-risk individuals. CRE screening of exposed high-risk employees was conducted by the University of Pennsylvania Occupational Health service and PDPH. Human rectal swabs were analyzed at the Antibiotic Resistance Laboratory Network (ARLN) Maryland Laboratory. The PDPH was invited to conduct an onsite infection control assessment and to suggest improvements. Results: No pet owners self-identified in high-risk groups to be screened. In total, 10 high-risk staff were screened, and no colonized individuals were detected. Recommendations made by the PDPH to MJRVH included improvement of infection prevention and control policies (eg, consolidation of the infection control manual and identification of a lead staff member), improvement in hand hygiene (HH) compliance (eg, increasing the amount of HH supplies), improvement of the environment of care (eg, decluttering and evaluation of the mulched animal relief area), and improvement of respiratory care processes (eg, standardization of care policies).
MJRVH made substantial improvements across recommendation areas, including revision of the infection control manual, creation of a full-time infection preventionist position, individual alcohol hand sanitizers for patient cages, and environmental decluttering and decontamination. PDPH and MJRVH maintained frequent communication about infection control improvements. Conclusions: The absence of transmission to high-risk staff members suggests that, as in human healthcare facilities, transmission of CRE to caretakers may not be a common event. Stronger communication and collaboration are required from Departments of Public Health (DPH) to the veterinary profession regarding the reporting requirements of emerging pathogens such as CRE. Veterinary facilities should view DPH as a valuable resource for recommendations to fill gaps that exist in infection control “best practices,” particularly for novel pathogens in veterinary settings.
Disclosures: Jane M. Gould reports that her spouse receives salary from Incyte.
The Expanded Program for Immunization Consortium – Human Immunology Project Consortium study aims to employ systems biology to identify and characterize vaccine-induced biomarkers that predict immunogenicity in newborns. Key to this effort is the establishment of the Data Management Core (DMC) to provide reliable data and bioinformatic infrastructure for centralized curation, storage, and analysis of multiple de-identified “omic” datasets. The DMC established a cloud-based architecture using Amazon Web Services to track, store, and share data according to National Institutes of Health standards. The DMC tracks biological samples during collection, shipping, and processing while capturing sample metadata and associated clinical data. Multi-omic datasets are stored in access-controlled Amazon Simple Storage Service (S3) for data security and file version control. All data undergo quality control processes at the generating site followed by DMC validation for quality assurance. The DMC maintains a controlled computing environment for data analysis and integration. Upon publication, the DMC deposits finalized datasets to public repositories. The DMC architecture provides resources and scientific expertise to accelerate translational discovery. Robust operations allow rapid sharing of results across the project team. Maintenance of data quality standards and public data deposition will further benefit the scientific community.
We report key learning from the public health management of the first two confirmed cases of COVID-19 identified in the UK. The first case was imported, and the second was associated with probable person-to-person transmission within the UK. Contact tracing was complex and fast-moving. Potential exposures for both cases were reviewed, and 52 contacts were identified. No further confirmed COVID-19 cases have been linked epidemiologically to these two cases. As steps are made to enhance contact tracing across the UK, the lessons learned from earlier contact tracing during the country's containment phase are particularly important and timely.
We describe 14 yr of public data from the Parkes Pulsar Timing Array (PPTA), an ongoing project that is producing precise measurements of pulse times of arrival from 26 millisecond pulsars using the 64-m Parkes radio telescope with a cadence of approximately 3 weeks in three observing bands. A comprehensive description of the pulsar observing systems employed at the telescope since 2004 is provided, including the calibration methodology and an analysis of the stability of system components. We attempt to provide a full accounting of the reduction from the raw measured Stokes parameters to pulse times of arrival to aid third parties in reproducing our results. This conversion is encapsulated in a processing pipeline designed to track provenance. Our data products include pulse times of arrival for each of the pulsars along with an initial set of pulsar parameters and noise models. The calibrated pulse profiles and timing template profiles are also available. These data represent almost 21 000 h of recorded data spanning over 14 yr. After accounting for processes that induce time-correlated noise, 22 of the pulsars have weighted root-mean-square timing residuals of
in at least one radio band. The data should allow end users to quickly undertake their own gravitational wave analyses, for example, without having to understand the intricacies of pulsar polarisation calibration or attain a mastery of radio frequency interference mitigation as is required when analysing raw data files.
Schizophrenia is an aetiologically complex disorder associated with significant familial risk. It is also marked by reductions in whole brain, grey and possibly white matter volumes. How these pathological abnormalities are influenced by schizophrenia's genetic and environmental risk remains uncertain.
We investigated the relationship between familial and environmental risk and brain volume in twin pairs varying in their zygosity and concordance for schizophrenia, and in healthy control twins, using a variety of complementary imaging strategies. These included region-of-interest volumes, automated tissue segmentation volumes and voxel-based morphometry (VBM).
We found that whole brain, grey, white, frontal and right hippocampal volumes were smaller in probands with schizophrenia compared to healthy controls. Well co-twins from MZ discordant pairs showed a trend towards lower white matter volume compared to the healthy controls. Well co-twins from DZ discordant pairs had smaller hippocampal volumes compared to the healthy controls. The patients with schizophrenia and their well co-twins from MZ discordant pairs differed in the superior frontal cortex using both region of interest and VBM techniques. Lower birth weight and hypoxia were both associated with lower whole brain volumes, and with lower white and grey matter volumes respectively.
Our data suggest that total brain and grey matter volume reductions in schizophrenia, possibly focused in part in the frontal cortex, are related primarily to unique environmental factors, including perinatal complications. The white matter and local hippocampal volume reductions suggest an additional vulnerability to genetic risk effects.
The social environment has a critical impact on health. This influence is proposed to occur via stimulation of downstream pathways, from the central nervous system to the periphery, which subsequently alter cells’ gene expression and transcription, particularly affecting the immune system. Stressors such as childhood adversity and mental health problems have, both separately and together, been identified as having a crucial impact on inflammatory and immune genes. We intend to investigate how depression alone, and depression in combination with childhood adversity, as markers of adverse life events, alters the immune system towards dysregulation and increases the risk of developing immune-related pathologies such as autoimmune, cardiovascular and neurodegenerative diseases. We hypothesize that depression has a negative modulating effect on the immune system and thus increases the risk of autoimmune disease, severe infection and cancer, and that childhood adversity sensitizes (i.e. primes) the immune system to develop pathological dysregulation when subsequently exposed to stressors later in life. Furthermore, we predict a dose-response relationship between the severity and number of stressors and the risk of dysregulation. This study links two nationwide population-based registers, the Danish Psychiatric Central Register and the National Hospital Register, to create a longitudinal cohort study. Rate ratios and accompanying 95% confidence intervals will be obtained. Accordingly, this work yields additional knowledge about how the social environment, specifically adverse life events, affects the risk of immune-related diseases. This can improve understanding of the interplay between mental disorders and immune-related diseases, and subsequently establish a foundation for future research and possibilities for treatment and prevention.
Since its 1960s origins, the Haddon matrix has served as a tool to understand and prevent diverse mechanisms of injury and to promote safety. The matrix holds further potential for broadened application and innovation in disaster preparedness. Hospital functionality and efficiency are particularly important components of community vulnerability in developed and developing nations alike. Given the Haddon matrix’s user-friendly approach to integrating current engineering concepts, behavioral sciences, and policy dimensions, we seek to apply it in the context of hospital earthquake preparedness and response. The matrix’s framework lends itself to interdisciplinary planning and collaboration between the social and physical sciences, paving the way for a systems-oriented reduction in vulnerabilities. Here, an associative approach that integrates seemingly disparate social and physical science disciplines yields innovative insights about hospital disaster preparedness for earthquakes. We illustrate detailed examples of pre-event, event, and post-event engineering, behavioral science, and policy factors that hospital planners should evaluate, given the complex nature, rapid onset, and broad variation in impact and outcomes of earthquakes. This contextual examination of the Haddon matrix can enhance critical infrastructure disaster preparedness across the epidemiologic triad by applying essential principles of the behavioral sciences, policy, law, and engineering to earthquake preparedness.
The Coronavirus (Covid-19) pandemic is exerting unprecedented pressure on NHS Health and Social Care provisions, with frontline staff, such as those of critical care units, encountering vast practical and emotional challenges on a daily basis. Although staff are being supported through organisational provisions, facilitated by those in leadership roles, the emergence of mental health difficulties or the exacerbation of existing ones amongst these members of staff is a cause for concern. Acknowledging this, academics and healthcare professionals alike are calling for psychological support for frontline staff, which not only addresses distress during the initial phases of the outbreak but also over the months, if not years, that follow. Fortunately, mental health services and psychology professional bodies across the United Kingdom have issued guidance to meet these needs. An attempt has been made to translate these sets of guidance into clinical provisions via the recently established Homerton Covid Psychological Support (HCPS) pathway delivered by Talk Changes (Hackney & City IAPT). This article describes the phased, stepped-care and evidence-based approach that has been adopted by the service to support local frontline NHS staff. We wish to share our service design and pathway of care with other Improving Access to Psychological Therapies (IAPT) services who may also seek to support hospital frontline staff within their associated NHS Trusts and in doing so, lay the foundations of a coordinated response.
Key learning aims
(1) To understand the ways staff can be psychologically and emotionally impacted by working on the frontline of disease outbreaks.
(2) To understand the ways in which IAPT services have previously supported populations exposed to crises.
(3) To learn ways of delivering psychological support and interventions during a pandemic context based on existing guidance and research.
Three-dimensional printing is a revolutionary technology that is disrupting the status quo in surgery. It has been rapidly adopted by otolaryngology as a tool in surgical simulation for high-risk, low-frequency procedures. This systematic review comprehensively evaluates the contemporary usage of three-dimensional printed otolaryngology simulators.
A systematic review of the literature was performed with narrative synthesis.
Twenty-two articles were identified for inclusion, describing models that span a range of surgical tasks (temporal bone dissection, airway procedures, functional endoscopic sinus surgery and endoscopic ear surgery). Thirty-six per cent of articles assessed construct validity (objective measures); the other 64 per cent only assessed face and content validity (subjective measures). Most studies demonstrated positive feedback and high confidence in the models’ value as additions to the curriculum.
Whilst further studies supported with objective metrics are merited, the role of three-dimensional printed otolaryngology simulators is poised to expand in surgical training given the enthusiastic reception from trainees and experts alike.
To investigate changes in socio-economic inequalities in growth in height, weight, BMI and grip strength in children born during 1955–1993 in Guatemala, a period of marked socio-economic-political change.
We modelled longitudinal data on height, weight, BMI and hand grip strength using SuperImposition by Translation And Rotation (SITAR). Internal Z-scores summarising growth size, timing and intensity (peak growth velocity, e.g. cm/year) were created to investigate inequalities by socio-economic position (SEP; measured by school attended). Interactions of SEP with date of birth were investigated to capture secular changes in inequalities.
Urban and peri-urban schools in the region of Guatemala City, Guatemala.
Participants were 40 484 children and adolescents aged 3–19 years of Ladino and Maya ancestry (nobservations 157 067).
The difference in height (SITAR size) between lowest and highest SEP decreased from −2·0 (95 % CI −2·2, −1·9) sd to −1·4 (95 % CI −1·5, −1·3) sd in males, and from −2·0 (95 % CI −2·1, −1·9) sd to −1·2 (95 % CI −1·3, −1·2) sd in females over the study period. Inequalities also reduced for weight, BMI and grip strength, due to greater secular increases in lowest-SEP groups. The puberty period was earlier and shorter in higher-SEP individuals (earlier SITAR timing and higher SITAR intensity). All SEP groups showed increases in BMI intensity over time.
Inequality narrowed between the 1960s and 1990s. The lowest-SEP groups were still >1 sd shorter than the highest. Risks remain for reduced human capital and poorer population health for urban Guatemalans.
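For readers unfamiliar with SITAR, the core idea is that each individual's growth curve is the population mean curve shifted in size (α), shifted in timing (β) and scaled in intensity (γ): y_i(t) = α_i + h(e^{γ_i}(t − β_i)). The toy Python sketch below illustrates the transformation with an invented logistic height curve standing in for the fitted spline h; it is not the study's fitted model.

```python
# Toy SITAR-style transformation of a mean growth curve (not a fitted spline).
import numpy as np

def mean_curve(age):
    """Invented mean height curve (cm) with a pubertal rise around age 12."""
    return 100.0 + 70.0 / (1.0 + np.exp(-(age - 12.0)))

def sitar_curve(age, alpha, beta, gamma):
    """Individual curve: size shift alpha, timing shift beta, intensity gamma."""
    return alpha + mean_curve(np.exp(gamma) * (age - beta))

ages = np.linspace(3.0, 19.0, 9)
# A child 5 cm taller (size), maturing 1 yr later (timing), ~10% faster (intensity):
child = sitar_curve(ages, alpha=5.0, beta=1.0, gamma=0.1)
```

In the study itself, α, β and γ are estimated per child as random effects and standardised into the internal Z-scores for size, timing and intensity compared across SEP groups.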
A large and growing body of literature has studied consumer willingness to pay (WTP) for local foods in the United States. However, these studies implicitly assume that consumers perceive local foods to be of superior quality to nonlocal foods. Little is known about WTP for local foods when taking into account differences in consumer perception of food quality between local and nonlocal foods. In this article, we conduct an economic experiment to assess the effect of locally grown information on consumer WTP and quality perceptions of three broccoli varieties (one commercial variety grown in California and two newly developed local varieties). Our results show that consumers rate both the appearance and the taste of the two local broccoli varieties lower than the California variety when evaluating food quality blindly. However, consumers’ evaluations of the two local varieties improve substantially after being told the two varieties are locally grown. Results also indicate that consumers are willing to pay a price premium for the two local varieties after being told that they are locally grown. Our results provide evidence that locally grown information has a positive effect on both consumer WTP and quality perception of local foods.
BMI z (BMIz) score based on the Centers for Disease Control and Prevention growth charts is widely used, but it is inaccurate above the 97th percentile. We explored the performance of alternative metrics based on the absolute distance or % distance of a child’s BMI from the median BMI for sex and age. We used longitudinal data from 5628 children who were first examined <12 years to compare the tracking of three BMI metrics: distance from median, % distance from median and % distance from median on a log scale. We also explored the effects of adjusting these metrics for age differences in the distribution of BMI. The intraclass correlation coefficient (ICC) was used to compare tracking of the metrics. Metrics based on % distance (whether on the original or log scale) yielded higher ICCs compared with distance from median. The ICCs of the age-adjusted metrics were higher than those of the unadjusted metrics, particularly among children who were (1) overweight or had obesity, (2) younger and (3) followed for >3 years. The ICCs of the age-adjusted metrics were also higher compared with that of BMIz among children who were overweight or had obesity. Unlike BMIz, these alternative metrics do not have an upper limit and can be used for assessing BMI in all children, even those with very high BMIs. The age-adjusted % from median (on a log or linear scale) works well for all ages, while the unadjusted % from median is better limited to older children or short follow-up periods.
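The three alternative metrics discussed above are simple to compute once the median BMI for the child's sex and age is known (in practice taken from the CDC growth-chart reference; the value below is invented purely for illustration).

```python
# The three BMI metrics, computed for a hypothetical child; the median
# BMI would normally come from the CDC growth-chart reference.
import math

def bmi_metrics(bmi, median_bmi):
    dist = bmi - median_bmi                         # absolute distance (kg/m^2)
    pct = 100.0 * (bmi - median_bmi) / median_bmi   # % distance from median
    log_pct = 100.0 * math.log(bmi / median_bmi)    # % distance on a log scale
    return dist, pct, log_pct

# A very high BMI still maps to finite, interpretable values (unlike BMIz):
dist, pct, log_pct = bmi_metrics(bmi=30.0, median_bmi=16.0)
print(dist, pct, log_pct)
```

The age adjustment the authors describe (rescaling for age differences in the BMI distribution) is a further step not shown in this sketch.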
Childhood maltreatment (CM) plays an important role in the development of major depressive disorder (MDD). The aim of this study was to examine whether CM severity and type are associated with MDD-related brain alterations, and how they interact with sex and age.
Within the ENIGMA-MDD network, severity and subtypes of CM using the Childhood Trauma Questionnaire were assessed and structural magnetic resonance imaging data from patients with MDD and healthy controls were analyzed in a mega-analysis comprising a total of 3872 participants aged between 13 and 89 years. Cortical thickness and surface area were extracted at each site using FreeSurfer.
CM severity was associated with reduced cortical thickness in the banks of the superior temporal sulcus and supramarginal gyrus as well as with reduced surface area of the middle temporal lobe. Participants reporting both childhood neglect and abuse had a lower cortical thickness in the inferior parietal lobe, middle temporal lobe, and precuneus compared to participants not exposed to CM. In males only, regardless of diagnosis, CM severity was associated with higher cortical thickness of the rostral anterior cingulate cortex. Finally, a significant interaction between CM and age in predicting thickness was seen across several prefrontal, temporal, and temporo-parietal regions.
Severity and type of CM may impact cortical thickness and surface area. Importantly, CM may influence age-dependent brain maturation, particularly in regions related to the default mode network, perception, and theory of mind.
Introduction: Long-term immobility has detrimental effects for critically ill patients admitted to the intensive care unit (ICU) including ICU-acquired weakness. Early mobilization of patients admitted to ICU has been demonstrated to be a safe, feasible and effective strategy to improve patient outcomes. The optimal mobilization of trauma ICU patients has not been extensively studied. Our objective was to determine the impact of an early mobilization protocol on outcomes among trauma patients admitted to the ICU. Methods: We analyzed all adult trauma patients (>18 years old) admitted to ICU over a 2-year period prior to and following implementation of an early mobilization protocol, allowing for a 1-year transition period. Data were collected from the Nova Scotia Trauma Registry. We compared patient characteristics and outcomes (mortality, length of stay [LOS], ventilator days) between the pre- and post-implementation groups. Associations between early mobilization and clinical outcomes were estimated using binary and linear regression models. Results: Overall, there were 526 patients included in the analysis (292 pre-implementation, 234 post-implementation). The study population ranged in age from 18 to 92 years (mean age 49.0 ± 20.4 years) and 74.3% of all patients were male. The pre- and post-implementation groups were similar in age, sex, and injury severity. In-hospital mortality was reduced in the post-implementation group (25.3% vs. 17.5%; p = 0.031). In addition, there was a reduction in ICU mortality in the post-implementation group (21.6% vs. 12.8%; p = 0.009). We did not observe any difference in overall hospital LOS, ICU LOS, or ventilator days between the two groups. Compared to the pre-implementation period, trauma patients admitted to the ICU following protocol implementation were less likely to die in-hospital (OR = 0.52, 95% CI 0.30-0.91; p = 0.021) or in the ICU (OR = 0.40, 95% CI 0.21-0.76, p = 0.005).
Results were similar following a sensitivity analysis limited to patients with blunt or penetrating injuries. There was no difference between the pre- and post-implementation groups with respect to in-hospital LOS, ICU LOS, or the number of ventilator days. Conclusion: We found that trauma patients admitted to ICU during the post-implementation period had decreased odds of in-hospital mortality and ICU mortality. Ours is the first study to demonstrate a significant reduction in trauma mortality following implementation of an ICU mobility protocol.
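As a back-of-the-envelope check on the mortality figures reported above, the crude (unadjusted) odds ratio implied by the raw proportions can be computed directly; it differs from the study's regression-based estimate of 0.52, which presumably reflects covariate adjustment.

```python
# Crude odds ratio from the reported in-hospital mortality proportions
# (25.3% of 292 pre vs. 17.5% of 234 post); counts reconstructed by rounding.
deaths_pre = round(0.253 * 292)     # pre-implementation deaths
deaths_post = round(0.175 * 234)    # post-implementation deaths

odds_pre = deaths_pre / (292 - deaths_pre)
odds_post = deaths_post / (234 - deaths_post)
crude_or = odds_post / odds_pre
print(f"Crude in-hospital mortality OR = {crude_or:.2f}")  # → 0.63
```

The gap between the crude 0.63 and the reported 0.52 is consistent with the authors' use of regression models rather than a simple 2×2 comparison.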