Situated within the public will and political will framework, this paper explores frames to address the social issue of gender pay inequity. Specifically, the authors examine whether demographic characteristics affect the perceived acceptability of different frames describing gender pay inequity and perceptions of this social issue. First, the authors identified 26 terms used to discuss gender pay inequity; this list was narrowed to 12 terms representing four categories. Next, the authors solicited sentiment reactions to those frames and perceptions of gender pay inequity. Taken together, the results indicated that although respondents had consistently positive reactions to the frames ‘fair pay’, ‘equal pay’, and ‘pay fairness’, perceptions varied across demographic groups. The largest effects were consistently associated with political party-related variables. One frame, ‘strategic compensation practices’, emerged as a value-neutral frame that could potentially be used to reframe the issue and re-engage business and political stakeholders who do not perceive gender pay inequity as problematic.
Post-traumatic stress disorder occurs in parents of infants with congenital heart disease (CHD), contributing to psychological distress with detrimental effects on family functioning and well-being. We sought to determine the prevalence of and factors associated with post-traumatic stress disorder symptoms in parents whose infants underwent staged palliation for single ventricle heart disease.
Materials and methods:
A large longitudinal multi-centre cohort study evaluated 215 mothers and fathers for symptoms of post-traumatic stress disorder at three timepoints (post-Norwood, post-Stage II, and a final study timepoint when the child reached approximately 16 months of age), using the self-report questionnaire Impact of Event Scale – Revised.
Results:
The prevalence of probable post-traumatic stress disorder post-Norwood surgery was 50% of mothers and 39% of fathers, decreasing to 27% of mothers and 24% of fathers by final follow-up. Intrusive symptoms such as flashbacks and nightmares and hyperarousal symptoms such as poor concentration, irritability, and sudden physical symptoms of racing heart and difficulty breathing were particularly elevated in parents. Higher levels of anxiety, reduced coping, and decreased satisfaction with parenting were significantly associated with symptoms of post-traumatic stress disorder in parents. Demographic and clinical variables such as parent education, pre-natal diagnosis, medical complications, and length of hospital stay(s) were not significantly associated with symptoms of post-traumatic stress disorder.
Discussion:
Parents whose infants underwent staged palliation for single ventricle heart disease often reported symptoms of post-traumatic stress disorder. Symptoms persisted over time, and routine screening might help identify parents at risk and prompt referral to appropriate supports.
We explore the motion of an axisymmetric gravity current in an anisotropic porous medium in which the horizontal permeability is larger than the vertical permeability. It is well known that the classical axisymmetric gravity current supplied by a constant point source of fluid has an unphysical singularity near the origin. We address this by considering a pressure-dominated region near the origin which allows for vertical flow from the source, such that the current remains of finite depth, whilst beyond this region the flow is gravity dominated. At early times the inner pressure-driven region controls the spreading of the current, but at late times the inner region occupies a progressively smaller fraction of the current such that the radius increases as ${\sim }t^{3/7}$, while the depth near the origin increases approximately as ${\sim }t^{1/7}$. The presence of anisotropy highlights this phenomenon, since the vertical permeability maintains an effect on the flow at late times through the pressure-driven flow near the origin. Using these results we provide some quantitative insights into the dominant dynamics which controls CO$_2$ migration through permeable aquifers, as occurs in the context of carbon capture and storage.
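The quoted late-time exponents can be checked against global volume conservation: for a constant source flux, the current's volume grows linearly in time, and scaling the volume as radius² × depth forces the exponents to sum to one. A minimal sketch of that consistency check (purely illustrative; prefactors and the flux value play no role here):

```python
from fractions import Fraction

# Late-time power-law exponents reported for the current:
radius_exp = Fraction(3, 7)  # R ~ t^(3/7)
depth_exp = Fraction(1, 7)   # h ~ t^(1/7)

# For a constant point source of flux Q, the current's volume grows
# linearly in time, and it scales like R^2 * h, so the exponents must
# satisfy 2*radius_exp + depth_exp = 1.
volume_exp = 2 * radius_exp + depth_exp
print(volume_exp == 1)  # True: the quoted scalings are mutually consistent
```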
A general obligation to make aggregate research results available to participants has been widely supported in the bioethics literature. However, dementia research presents several challenges to this perspective, particularly because of the fear associated with developing dementia. The authors argue that considerations of respect for persons, beneficence, and justice fail to justify an obligation to make aggregate research results available to participants in dementia research. Nevertheless, there are positive reasons in favor of making aggregate research results available; when the decision is made to do so, it is critical that a clear strategy for communicating results is developed, including what support will be provided to participants receiving aggregate research results.
To explore communities’ perspectives on the factors in the social food environment that influence dietary behaviours in African cities.
Design:
A qualitative study using participatory photography (Photovoice). Participants took and discussed photographs representing factors in the social food environment that influence their dietary behaviours. Follow-up in-depth interviews allowed participants to tell the ‘stories’ of their photographs. Thematic analysis was conducted, using data-driven and theory-driven (based on the socio-ecological model) approaches.
Setting:
Three low-income areas of Nairobi (n 48) in Kenya and Accra (n 62) and Ho (n 32) in Ghana.
Participants:
Adolescents and adults, male and female, aged ≥13 years.
Results:
The ‘people’ who were most commonly reported as influencers of dietary behaviours within the social food environment included family members, friends, health workers and food vendors. They mainly influenced food purchase, preparation and consumption through (1) considerations for family members’ food preferences, (2) considerations for family members’ health and nutrition needs, (3) social support by family and friends, (4) provision of nutritional advice and modelling of food behaviour by parents and health professionals, and (5) food vendors’ services and social qualities.
Conclusions:
The family presents an opportunity for promoting healthy dietary behaviours among family members. Peer groups could be harnessed to promote healthy dietary behaviours among adolescents and youth. Empowering food vendors to provide healthier and safer food options could enhance healthier food sourcing, purchasing and consumption in African low-income urban communities.
Whole-genome sequencing (WGS) has traditionally been used in infection prevention to confirm or refute the presence of an outbreak after it has occurred. Due to decreasing costs of WGS, an increasing number of institutions have been utilizing WGS-based surveillance. Machine learning and statistical modeling have also been used to supplement infection prevention practice. We systematically reviewed the use of WGS surveillance and machine learning to detect and investigate outbreaks in healthcare settings.
Methods:
We performed a PubMed search using separate terms for WGS surveillance and/or machine-learning technologies for infection prevention through March 15, 2021.
Results:
Of 767 studies returned using the WGS search terms, 42 articles were included for review. Only 2 studies (4.8%) were performed in real time, and 39 (92.9%) studied only 1 pathogen. Nearly all studies (n = 41, 97.6%) found genetic relatedness between some isolates collected. Across all studies, 525 outbreaks were detected among 2,837 related isolates (average, 5.4 isolates per outbreak). Also, 35 studies (83.3%) only utilized geotemporal clustering to identify outbreak transmission routes. Of 21 studies identified using the machine-learning search terms, 4 were included for review. In each study, machine learning aided outbreak investigations by complementing methods to gather epidemiologic data and automating identification of transmission pathways.
Conclusions:
WGS surveillance is an emerging method that can enhance outbreak detection. Machine learning has the potential to identify novel routes of pathogen transmission. Broader incorporation of WGS surveillance into infection prevention practice has the potential to transform the detection and control of healthcare outbreaks.
Metabolites produced by microbial fermentation in the human intestine, especially short-chain fatty acids (SCFAs), are known to play important roles in colonic and systemic health. Our aim here was to advance our understanding of how and why their concentrations and proportions vary between individuals. We have analysed faecal concentrations of microbial fermentation acids from 10 human volunteer studies, involving 163 subjects, conducted at the Rowett Institute, Aberdeen, UK over a 7-year period. In baseline samples, the % butyrate was significantly higher, whilst % iso-butyrate and % iso-valerate were significantly lower, with increasing total SCFA concentration. The decreasing proportions of iso-butyrate and iso-valerate, derived from amino acid fermentation, suggest that fibre intake was mainly responsible for increased SCFA concentrations. We propose that the increase in % butyrate among faecal SCFA is largely driven by a decrease in colonic pH resulting from higher SCFA concentrations. Consistent with this, both total SCFA and % butyrate increased significantly with decreasing pH across five studies for which faecal pH measurements were available. Colonic pH influences butyrate production through altering the stoichiometry of butyrate formation by butyrate-producing species, resulting in increased acetate uptake and butyrate formation, and facilitating increased relative abundance of butyrate-producing species (notably Roseburia and Eubacterium rectale).
Relapse and recurrence of depression are common, contributing to the overall burden of depression globally. Accurate prediction of relapse or recurrence while patients are well would allow the identification of high-risk individuals and may effectively guide the allocation of interventions to prevent relapse and recurrence.
Aims
To review prognostic models developed to predict the risk of relapse, recurrence, sustained remission, or recovery in adults with remitted major depressive disorder.
Method
We searched the Cochrane Library (current issue); Ovid MEDLINE (1946 onwards); Ovid Embase (1980 onwards); Ovid PsycINFO (1806 onwards); and Web of Science (1900 onwards) up to May 2021. We included development and external validation studies of multivariable prognostic models. We assessed risk of bias of included studies using the Prediction model risk of bias assessment tool (PROBAST).
Results
We identified 12 eligible prognostic model studies (11 unique prognostic models): 8 model development-only studies, 3 model development and external validation studies, and 1 external validation-only study. Multiple estimates of the same performance measure were not available, so meta-analysis was not possible. Eleven of the 12 included studies were assessed as being at high overall risk of bias, and none examined clinical utility.
Conclusions
Due to the high risk of bias of the included studies, poor predictive performance and limited external validation of the identified models, presently available clinical prediction models for relapse and recurrence of depression are not yet sufficiently robust for deployment in clinical settings. There is a need for improved prognosis research in this clinical area, and future studies should conform to best-practice methodological and reporting guidelines.
The critical period for weed control (CPWC) adds value to integrated weed management by identifying the period during which weeds need to be controlled to avoid yield losses exceeding a defined threshold. However, the traditional application of the CPWC does not identify the timing of control needed for weeds that emerge late in the critical period. In this study, CPWC models were developed from field data in high-yielding cotton crops during three summer seasons from 2005 to 2008, using the mimic weed, common sunflower, at densities of two to 20 plants per square meter. Common sunflower plants were introduced at up to 450 growing degree days (GDD) after crop planting and removed at successive 200 GDD intervals after introduction. The CPWC models were described using extended Gompertz and logistic functions that included weed density, time of weed introduction, and time of weed removal (logistic function only) in the relationships. The resulting models defined the CPWC for late-emerging weeds, identifying a period after weed emergence before weed control was required to prevent yield loss exceeding the yield-loss threshold. When weeds emerged in sufficient numbers toward the end of the critical period, the model predicted that crop yield loss resulting from competition by these weeds would not exceed the yield-loss threshold until well after the end of the CPWC. These findings support the traditional practice of ensuring weeds are controlled before crop canopy closure, with later weed control inputs used as required.
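The Gompertz arm of a CPWC analysis typically expresses relative yield as a function of the weed-free period, with weed density extending the curve. A minimal sketch of that functional form follows; the function names and all parameter values (`a`, `b0`, `p`, `k`) are hypothetical placeholders, not the coefficients fitted in this study:

```python
import math

def relative_yield_weed_free(gdd_weed_free, weed_density,
                             a=100.0, b0=0.8, p=0.3, k=0.004):
    """Illustrative extended Gompertz curve for the weed-free-period arm
    of a CPWC analysis: relative yield (%) rises toward the weed-free
    asymptote `a` as the weed-free period (growing degree days) lengthens,
    and the yield penalty deepens with weed density. All parameter values
    are hypothetical placeholders, not fitted coefficients from the study.
    """
    b = b0 * weed_density ** p  # density scales the yield penalty
    return a * math.exp(-b * math.exp(-k * gdd_weed_free))

def critical_time(threshold_pct, weed_density, **kw):
    """Weed-free GDD needed so yield loss stays within `threshold_pct`%."""
    target = 100.0 - threshold_pct
    t = 0.0
    while relative_yield_weed_free(t, weed_density, **kw) < target:
        t += 1.0
    return t

# A denser weed stand needs a longer weed-free period at a 1% threshold:
print(critical_time(1.0, 2), critical_time(1.0, 20))
```

Reading the critical time off the fitted curve in this way is how a yield-loss threshold translates into a control window for growers.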
Recent X-ray observations by Jiang et al. have identified an active galactic nucleus (AGN) in the bulgeless spiral galaxy NGC 3319, located just $14.3\pm 1.1$ Mpc away, and suggest the presence of an intermediate-mass black hole (IMBH; $10^2\leq M_\bullet/\textrm{M}_{\odot}\leq 10^5$) if the Eddington ratio is as high as $3\times10^{-3}$. In an effort to refine the black hole mass for this (currently) rare class of object, we have explored multiple black hole mass scaling relations, such as those involving the (not previously used) velocity dispersion, logarithmic spiral arm pitch angle, total galaxy stellar mass, nuclear star cluster mass, rotational velocity, and colour of NGC 3319, to obtain 10 mass estimates of differing accuracy. We have calculated a mass of $3.14_{-2.20}^{+7.02}\times10^4\,\textrm{M}_\odot$, with a confidence of 84% that it is $\leq 10^5\,\textrm{M}_\odot$, based on the combined probability density function from seven of these individual estimates. Our conservative approach excluded two black hole mass estimates (via the nuclear star cluster mass and the fundamental plane of black hole activity, which only applies to black holes with low accretion rates) that were upper limits of ${\sim}10^5\,\textrm{M}_\odot$, and it did not use the $M_\bullet$–$L_{2\textrm{-}10\,\textrm{keV}}$ relation's prediction of ${\sim}10^5\,\textrm{M}_\odot$. This target provides an exceptional opportunity to study an IMBH in AGN mode and advance our demographic knowledge of black holes. Furthermore, we introduce our novel method of meta-analysis as a beneficial technique for identifying new IMBH candidates by quantifying the probability that a galaxy possesses an IMBH.
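The combined-estimate step described above can be sketched as follows: individual scaling-relation estimates, each treated as a Gaussian in log-mass, are merged into a single probability density from which the probability of an intermediate-mass black hole is read off. The estimate values, the uncertainties, and the choice to multiply the (assumed independent) densities are illustrative assumptions, not the paper's actual data or procedure:

```python
import math

def gaussian(x, mu, sigma):
    """Normal density; used as the PDF of one log-mass estimate."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical estimates of log10(M_bh / M_sun) with 1-sigma uncertainties,
# standing in for the seven scaling-relation estimates used in the paper.
estimates = [(4.3, 0.5), (4.6, 0.6), (4.4, 0.7), (4.7, 0.5),
             (4.2, 0.8), (4.5, 0.6), (4.6, 0.9)]

# Combined PDF on a log-mass grid: product of the individual densities
# (treating the estimates as independent), then renormalised.
step = 0.001
grid = [2.0 + step * i for i in range(4001)]  # log10 M from 2 to 6
combined = []
for x in grid:
    p = 1.0
    for mu, sigma in estimates:
        p *= gaussian(x, mu, sigma)
    combined.append(p)
norm = sum(combined) * step
combined = [p / norm for p in combined]

# Probability the black hole is intermediate mass (M_bh <= 1e5 M_sun,
# i.e. log10 M <= 5):
p_imbh = sum(p for x, p in zip(grid, combined) if x <= 5.0) * step
print(round(p_imbh, 3))
```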
The Subglacial Antarctic Lakes Scientific Access (SALSA) Project accessed Mercer Subglacial Lake using environmentally clean hot-water drilling to examine interactions among ice, water, sediment, rock, microbes and carbon reservoirs within the lake water column and underlying sediments. A ~0.4 m diameter borehole was melted through 1087 m of ice and maintained over ~10 days, allowing observation of ice properties and collection of water and sediment with various tools. Over this period, SALSA collected: 60 L of lake water and 10 L of deep borehole water; microbes >0.2 μm in diameter from in situ filtration of ~100 L of lake water; 10 multicores 0.32–0.49 m long; 1.0 and 1.76 m long gravity cores; three conductivity–temperature–depth profiles of borehole and lake water; five discrete-depth current meter measurements in the lake; and images of ice, the lake water–ice interface and lake sediments. Temperature and conductivity data showed the hydrodynamic character of water mixing between the borehole and lake after entry. Models simulating melting of the ~6 m thick basal accreted ice layer imply that debris fall-out through the ~15 m water column to the lake sediments from borehole melting had little effect on the stratigraphy of surficial sediment cores.
The primary aim of this study was to assess the epidemiology of carbapenem-resistant Acinetobacter baumannii (CRAB) for 9 months following a regional outbreak with this organism. We also aimed to determine the differential positivity rate from different body sites and characterize the longitudinal changes of surveillance test results among CRAB patients.
Design:
Observational study.
Setting:
A 607-bed tertiary-care teaching hospital in Milwaukee, Wisconsin.
Patients:
Any patient admitted from postacute care facilities and any patient housed in the same inpatient unit as a positive CRAB patient.
Methods:
Participants underwent CRAB surveillance cultures from tracheostomy secretions, skin, and stool from December 5, 2018, to September 6, 2019. Cultures were performed using a validated, qualitative culture method, and final bacterial identification was performed using mass spectrometry.
Results:
In total, 682 patients were tested for CRAB, of whom 16 (2.3%) were positive. Of the 16 CRAB-positive patients, 14 (87.5%) were residents of postacute care facilities and 11 (68.8%) were African American. Among positive patients, the positivity rates by body site were 38% (6 of 16) for tracheal aspirates, 56% (9 of 16) for skin, and 81% (13 of 16) for stool.
Conclusions:
Residents from postacute care facilities were more frequently colonized by CRAB than patients admitted from home. Stool had the highest yield for identification of CRAB.
Throughout the Ediacaran Period, variable water-column redox conditions persisted along productive ocean margins due to a complex interplay between nutrient supply and oceanographic restriction. These changing conditions are considered to have influenced early faunal evolution, with marine anoxia potentially inhibiting the development of the ecological niches necessary for aerobic life forms. To understand this link between oxygenation and evolution, the combined geochemical and palaeontological study of marine sediments is preferable. Located in the Yangtze Gorges region of southern China, lagoonal black shales at Miaohe preserve algae and putative metazoans, including Eoandromeda, a candidate total-group ctenophore, thereby providing one example of where integrated study is possible. We present a multi-proxy investigation into water-column redox variability during deposition of these shales (c. 560–551 Ma). For this interval, reactive iron partitioning indicates persistent water-column anoxia, while trace metal enrichments and other geochemical data suggest temporal fluctuations between ferruginous, euxinic and rare suboxic conditions. Although trace metal and total organic carbon values imply extensive basin restriction, sustained trace metal enrichment and δ15Nsed data indicate periodic access to open-ocean inventories across a shallow-marine sill. Lastly, δ13Corg values of between −35‰ and −40‰ allow at least partial correlation of the shales at Miaohe with Member IV of the Doushantuo Formation. This study provides evidence for fluctuating redox conditions in the lagoonal area of the Yangtze platform during late Ediacaran time. If these low-oxygen environments were regionally characteristic, then the restriction of aerobic fauna to isolated environments can be inferred.
Clarifying the relationship between depression symptoms and cardiometabolic and related health could clarify risk factors and treatment targets. The objective of this study was to assess whether depression symptoms in midlife are associated with the subsequent onset of cardiometabolic health problems.
Methods
The study sample comprised 787 male twin veterans with polygenic risk score data who participated in the Harvard Twin Study of Substance Abuse (‘baseline’) and the longitudinal Vietnam Era Twin Study of Aging (‘follow-up’). Depression symptoms were assessed at baseline [mean age 41.42 years (s.d. = 2.34)] using the Diagnostic Interview Schedule, Version III, Revised. The onset of eight cardiometabolic conditions (atrial fibrillation, diabetes, erectile dysfunction, hypercholesterolemia, hypertension, myocardial infarction, sleep apnea, and stroke) was assessed via self-reported doctor diagnosis at follow-up [mean age 67.59 years (s.d. = 2.41)].
Results
Total depression symptoms were longitudinally associated with incident diabetes (OR 1.29, 95% CI 1.07–1.57), erectile dysfunction (OR 1.32, 95% CI 1.10–1.59), hypercholesterolemia (OR 1.26, 95% CI 1.04–1.53), and sleep apnea (OR 1.40, 95% CI 1.13–1.74) over 27 years after controlling for age, alcohol consumption, smoking, body mass index, C-reactive protein, and polygenic risk for specific health conditions. In sensitivity analyses that excluded somatic depression symptoms, only the association with sleep apnea remained significant (OR 1.32, 95% CI 1.09–1.60).
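For readers interpreting the intervals above: a 95% CI for an odds ratio that excludes 1 implies a Wald z-statistic above 1.96 on the log-odds scale. A small sketch recovering the approximate z from a reported OR and CI, assuming the usual construction log(OR) ± 1.96·SE (the diabetes figures above are used as input; the derivation itself is illustrative, not taken from the paper):

```python
import math

def or_ci_to_z(or_point, ci_low, ci_high):
    """Recover the approximate Wald z-statistic from a reported odds
    ratio and 95% CI, assuming the interval was built on the log-odds
    scale as log(OR) +/- 1.96 * SE (the usual construction)."""
    log_or = math.log(or_point)
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)
    return log_or / se

# Diabetes: OR 1.29 (95% CI 1.07-1.57); the CI excludes 1, so z > 1.96
z = or_ci_to_z(1.29, 1.07, 1.57)
print(round(z, 2))  # ~2.6
```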
Conclusions
A history of depression symptoms by early midlife is associated with an elevated risk for subsequent development of several self-reported health conditions. When isolated, non-somatic depression symptoms are associated with incident self-reported sleep apnea. Depression symptom history may be a predictor or marker of cardiometabolic risk over decades.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves; these will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected event rate for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allows for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Errors inherent in self-reported measures of energy intake (EI) are substantial and well documented, but correlates of misreporting remain unclear. Therefore, potential predictors of misreporting were examined. In Study One, fifty-nine individuals (BMI = 26·1 (sd 3·8) kg/m2, age = 42·7 (sd 13·6) years, females = 29) completed a 14-d stay in a residential feeding behaviour suite where eating behaviour was continuously monitored. In Study Two, 182 individuals (BMI = 25·7 (sd 3·9) kg/m2, age = 42·4 (sd 12·2) years, females = 96) completed two consecutive days in a residential feeding suite and five consecutive days at home. Misreporting was directly quantified by comparing covertly measured laboratory weighed intakes (LWI) with self-reported EI (weighed dietary record (WDR), 24-h recall, 7-d diet history, FFQ). Personal (age, sex and %body fat) and psychological traits (personality, social desirability, body image, intelligence quotient and eating behaviour) were used as predictors of misreporting. In Study One, those with lower psychoticism (P = 0·009), openness to experience (P = 0·006) and higher agreeableness (P = 0·038) reduced EI to a greater extent on days when participants knew EI was being measured than on covert days. Isolated associations existed between personality traits (psychoticism and openness to experience), eating behaviour (emotional eating) and differences between the LWI and self-reported EI, but these were inconsistent between dietary assessment techniques and typically became non-significant after accounting for multiplicity of comparisons. In Study Two, sex was associated with differences between LWI and the WDR (P = 0·009), 24-h recall (P = 0·002) and diet history (P = 0·050) in the laboratory, but not in the home environment. The personal and psychological correlates of misreporting that were identified displayed no clear pattern across studies or dietary assessment techniques and had little utility in predicting misreporting.
The association between Clostridioides difficile colonization and C. difficile infection (CDI) is unknown in solid-organ transplant (SOT) patients. We examined C. difficile colonization and healthcare-associated exposures as risk factors for development of CDI in SOT patients.
Methods:
The retrospective study cohort included all consecutive SOT patients with at least 1 screening test between May 2017 and April 2018. CDI was defined as the presence of diarrhea (without laxatives), a positive C. difficile clinical test, and the use of C. difficile-directed antimicrobial therapy as ordered by managing clinicians. In addition to demographic variables, exposures to antimicrobials, immunosuppressants, and gastric acid suppressants were evaluated from the time of first screening test to the time of CDI, death, or final discharge.
Results:
Of the 348 SOT patients included in our study, 33 (9.5%) were colonized with toxigenic C. difficile. In total, 11 patients (3.2%) developed CDI. Only C. difficile colonization (odds ratio [OR], 13.52; 95% CI, 3.46–52.83; P = .0002), age (OR, 1.09; CI, 1.02–1.17; P = .0135), and hospital days (OR, 1.05; 95% CI, 1.02–1.08; P = .0017) were independently associated with CDI.
Conclusions:
Although CDI was more frequent in C. difficile colonized SOT patients, the overall incidence of CDI was low in this cohort.
Glyphosate-tolerant and glyphosate-resistant weeds are becoming increasingly problematic in cotton fields in Australia, necessitating a return from a glyphosate-dominated system to a more integrated approach to weed management. The development of an integrated weed management system can be facilitated by identifying the critical period for weed control (CPWC), a model that enables cotton growers to optimize the timing of their weed control inputs. Using data from field studies conducted from 2003 to 2015, CPWC models using extended functions, including weed biomass in the relationships, were developed for the mimic weeds, common sunflower and Japanese millet, in high-yielding, fully irrigated cotton. A multispecies CPWC model was developed after combining these data sets with data for mungbean in irrigated cotton, using weed height and weed biomass as descriptors in the models. Comparison of observed and predicted relative cotton-lint yields from the multispecies CPWC model demonstrated that the model reasonably described the competition from these three very different mimic weeds, opening the possibility for cotton growers to use a multispecies CPWC model in their production systems.