Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and could allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
The EAT–Lancet Commission promulgated a universal reference diet. Subsequently, researchers constructed an EAT–Lancet diet score (0–14 points), with minimum intake values for various dietary components set at 0 g/d, and reported inverse associations with risks of major health outcomes in a high-income population. We assessed associations between EAT–Lancet diet scores, without or with lower bound values, and the mean probability of micronutrient adequacy (MPA) among nutrition-insecure women of reproductive age (WRA) from low- and middle-income countries (LMIC). We analysed single 24-h diet recall data (n 1950) from studies in rural DRC, Ecuador, Kenya, Sri Lanka and Vietnam. Associations between EAT–Lancet diet scores and MPA were assessed by fitting linear mixed-effects models. Mean EAT–Lancet diet scores were 8·8 (SD 1·3) and 1·9 (SD 1·1) without or with minimum intake values, respectively. Pooled MPA was 0·58 (SD 0·22) and energy intake was 10·5 (SD 4·6) MJ/d. A one-point increase in the EAT–Lancet diet score, without minimum intake values, was associated with a decrease in MPA of 2·6 (SD 0·7) percentage points (P < 0·001). In contrast, the EAT–Lancet diet score, with minimum intake values, was associated with an increase in MPA of 2·4 (SD 1·3) percentage points (P = 0·07). Further analysis indicated positive associations between EAT–Lancet diet scores and MPA adjusted for energy intake (P < 0·05). Our findings indicate that the EAT–Lancet diet score requires minimum intake values for nutrient-dense dietary components to avoid positively scoring non-consumption of food groups, and subsequently predicting lower MPA of diets, when applied to rural WRA in LMIC.
Most oviposition by Helicoverpa zea (Boddie) occurs near the top of the canopy in soybean, Glycine max (L.) Merr., and larval abundance is influenced by the growth habit of plants. However, the vertical distribution of larvae within the canopy is not as well known. We evaluated the vertical distribution of H. zea larvae in determinate and indeterminate varieties, hypothesizing that larval distribution in the canopy would vary between these two growth habits and over time. We tested this hypothesis in a naturally infested replicated field experiment and two experimentally manipulated cage experiments. In the field experiment, flowering time was synchronized between the varieties by manipulating planting date, while infestation timing was manipulated in the cage experiments. Larvae were recovered using destructive sampling of individual soybean plants, and their vertical distribution by instar was recorded from three sampling points over time in each experiment. While larval population growth and development varied between the determinate and indeterminate varieties within and among experiments, we found little evidence that larvae have a preference for different vertical locations in the canopy. This study lends support to the hypothesis that larval movement and location within soybean canopies do not result entirely from oviposition location and nutritional requirements.
Helicoverpa zea (Boddie) is a damaging pest of many crops including soybean, Glycine max (L.), especially in the southern United States. Previous studies have concluded that oviposition and development of H. zea larvae mirror the phenology of soybean, with oviposition occurring during full bloom, younger larvae developing on blooms and leaves, intermediate aged larvae developing on varying tissue types, and older larvae developing on flowers and pods. In a field trial, we investigated the presence of natural infestations of H. zea larvae by instar in determinate and indeterminate soybean varieties. In complementary experiments, we artificially infested plants with H. zea and allowed them to oviposit within replicated cages (one with a determinate variety and two with an indeterminate variety). Plants were sampled weekly during the time larvae were present. In the natural infestation experiment, most larvae were found on blooms during R3 and were early to middle instars; by R4, most larvae were found on leaves and were middle to late instars. In contrast, in the cage study, most larvae were found on leaves regardless of soybean growth stage or larval stage. Determinate and indeterminate growth habit did not impact larval preference for different soybean tissue types. Our studies suggest H. zea larvae prefer specific tissue types, but also provide evidence that experimental design can influence the results. Finally, our finding of larval preference for leaves contrasts with findings from previous studies.
Southeastern Appalachian Ohio has more than double the national average of diabetes and a critical shortage of healthcare providers. Paradoxically, there is limited research focused on primary care providers’ experiences treating people with diabetes in this region. This study explored providers’ perceived barriers to and facilitators for treating patients with diabetes in southeastern Appalachian Ohio.
Methods:
We conducted in-depth interviews with healthcare providers who treat people with diabetes in rural southeastern Ohio. Interviews were transcribed, coded, and analyzed via content and thematic analyses using NVivo 12 software (QSR International, Chadstone, VIC, Australia).
Results:
Qualitative analysis revealed four themes: (1) Patients’ diabetes fatalism and helplessness: providers recounted story after story of patients believing that their diabetes was inevitable and that they were helpless to prevent or delay diabetes complications. (2) Comorbid psychosocial issues: providers described high rates of depression, anxiety, incest, abuse, and post-traumatic stress disorder among people with diabetes in this region. (3) Inter-connected social determinants interfering with diabetes care: providers identified major barriers including lack of access to providers, lack of access to transportation, food insecurity, housing insecurity, and financial insecurity. (4) Providers’ cultural understanding and recommendations: providers emphasized the importance of understanding the values central to Appalachian culture and gave culturally attuned clinical suggestions for how to use these values when working with this population.
Conclusions:
Evidence-based interventions tailored to Appalachian culture and training designed to increase the cultural competency and cultural humility of primary care providers may be effective approaches to reduce barriers to diabetes care in Appalachian Ohio.
Introduction: Prognostication and disposition among older Emergency Department (ED) patients with suspected infection remain challenging. Frailty is increasingly recognized as a predictor of poor prognosis among critically ill patients; however, its association with clinical outcomes among older ED patients with suspected infection is unknown. Methods: We conducted a multicentre prospective cohort study at two tertiary care EDs. We included older ED patients (≥ 75 years) presenting with suspected infection. Frailty at baseline (prior to index illness) was explicitly measured for all patients by the treating physicians using the Clinical Frailty Scale (CFS). We defined frailty as a CFS score of 5-8. The primary outcome was 30-day mortality. We used multivariable logistic regression to adjust for known confounders. We also compared the prognostic accuracy of frailty against the Systemic Inflammatory Response Syndrome (SIRS) and Quick Sequential Organ Failure Assessment (qSOFA) criteria. Results: We enrolled 203 patients, of whom 117 (57.6%) were frail. Frail patients were more likely to develop septic shock (adjusted odds ratio [aOR]: 1.83, 95% confidence interval [CI]: 1.08-2.51) and more likely to die within 30 days of ED presentation (aOR 2.05, 95% CI: 1.02-5.24). Sensitivity for mortality was highest for the CFS (73.1%, 95% CI: 52.2-88.4), as compared to SIRS ≥ 2 (65.4%, 95% CI: 44.3-82.8) or qSOFA ≥ 2 (38.4%, 95% CI: 20.2-59.4). Conclusion: Frailty is a highly prevalent prognostic factor that can be used to risk-stratify older ED patients with suspected infection. ED clinicians should consider screening for frailty in order to optimize disposition in this population.
Yukon Territory (YT) is a remote region in northern Canada with ongoing spread of tuberculosis (TB). To explore the utility of whole genome sequencing (WGS) for TB surveillance and monitoring in a setting with detailed contact tracing and interview data, we used a mixed-methods approach. Our analysis included all culture-confirmed cases in YT (2005–2014) and incorporated data from 24-locus Mycobacterial Interspersed Repetitive Units-Variable Number of Tandem Repeats (MIRU-VNTR) genotyping, WGS and contact tracing. We compared field-based (contact investigation (CI) data + MIRU-VNTR) and genomic-based (WGS + MIRU-VNTR + basic case data) investigations to identify the most likely source of each person's TB and assessed the knowledge, attitudes and practices of programme personnel around genotyping and genomics using online, multiple-choice surveys (n = 4) and an in-person group interview (n = 5). Field- and genomics-based approaches agreed for 26 of 32 (81%) cases on likely location of TB acquisition. There was less agreement in the identification of specific source cases (13/22 or 59% of cases). Single-locus MIRU-VNTR variants and limited genetic diversity complicated the analysis. Qualitative data indicated that participants viewed genomic epidemiology as a useful tool to streamline investigations, particularly in differentiating latent TB reactivation from recent transmission. Based on this, genomic data could be used to enhance CIs, focus resources, target interventions and aid in TB programme evaluation.
Heart disease is the leading cause of death in schizophrenia. However, there has been little research directly examining cardiac function in schizophrenia.
Aims
To investigate cardiac structure and function in individuals with schizophrenia using cardiac magnetic resonance imaging (CMR) after excluding medical and metabolic comorbidity.
Method
In total, 80 participants underwent CMR to determine biventricular volumes and function and measures of blood pressure, physical activity and glycated haemoglobin levels. Individuals with schizophrenia (‘patients’) and controls were matched for age, gender, ethnicity and body surface area.
Results
Patients had significantly smaller indexed left ventricular (LV) end-diastolic volume (effect size d = −0.82, P = 0.001), LV end-systolic volume (d = −0.58, P = 0.02), LV stroke volume (d = −0.85, P = 0.001), right ventricular (RV) end-diastolic volume (d = −0.79, P = 0.002), RV end-systolic volume (d = −0.58, P = 0.02), and RV stroke volume (d = −0.87, P = 0.001) but unaltered ejection fractions relative to controls. LV concentricity (d = 0.73, P = 0.003) and septal thickness (d = 1.13, P < 0.001) were significantly larger in the patients. Mean concentricity in patients was above the reference range. The findings were largely unchanged after adjusting for smoking and/or exercise levels and were independent of medication dose and duration.
Conclusions
Individuals with schizophrenia show evidence of concentric cardiac remodelling compared with healthy controls of a similar age, gender, ethnicity, body surface area and blood pressure, and independent of smoking and activity levels. This could be contributing to the excess cardiovascular mortality observed in schizophrenia. Future studies should investigate the contribution of antipsychotic medication to these changes.
The study provides a comprehensive insight into how an initial receiving hospital without adequate capacity adapted to cope with a mass casualty incident after the Formosa Fun Coast Dust Explosion (FFCDE).
Methods:
Data collection was via in-depth interviews with 11 key participants. This was combined with information from medical records of FFCDE patients and admission logs from the emergency department (ED) to build a detailed timeline of patient flow and ED workload changes. Process tracing analysis focused on how the ED and other units adapted to cope with the difficulties created by the patient surge.
Results:
The hospital treated 30 victims, with an average total body surface area burn of 36.3%, over more than 5 hours, alongside 35 non-FFCDE patients. Overwhelming demand resulted in the saturation of ED space and intensive care unit beds, exhaustion of critical materials, and near-saturation of clinicians. The hospital reconfigured human and physical resources differently from conventional drills. Graphical timelines illustrate anticipatory or reactive adaptations. The hospital’s ability to adapt was based on anticipation under uncertainty and coordination across roles and units to keep pace with varying demands.
Conclusion:
Adapting to an incident that exceeds surge capacity is essential for an effective disaster response. Building organizational support for effective adaptation is critical for disaster planning.
Few studies have used genomic epidemiology to understand tuberculosis (TB) transmission in rural and remote settings – regions often unique in history, geography and demographics. To improve our understanding of TB transmission dynamics in Yukon Territory (YT), a circumpolar Canadian territory, we conducted a retrospective analysis in which we combined epidemiological data collected through routine contact investigations with clinical and laboratory results. Mycobacterium tuberculosis isolates from all culture-confirmed TB cases in YT (2005–2014) were genotyped using 24-locus Mycobacterial Interspersed Repetitive Units-Variable Number of Tandem Repeats (MIRU-VNTR) and compared to each other and to those from the neighbouring province of British Columbia (BC). Whole genome sequencing (WGS) of genotypically clustered isolates revealed three sustained transmission networks within YT, two of which also involved BC isolates. While each network had distinct characteristics, all had at least one individual acting as the probable source of three or more culture-positive cases. Overall, WGS revealed that TB transmission dynamics in YT are distinct from patterns of spread in other, more remote Northern Canadian regions, and that the combination of WGS and epidemiological data can provide actionable information to local public health teams.
Drawing on a landscape analysis of existing data-sharing initiatives, in-depth interviews with expert stakeholders, and public deliberations with community advisory panels across the U.S., we describe features of the evolving medical information commons (MIC). We identify participant-centricity and trustworthiness as the most important features of an MIC and discuss the implications for those seeking to create a sustainable, useful, and widely available collection of linked resources for research and other purposes.
A 2011 National Academies of Sciences report called for an “Information Commons” and a “Knowledge Network” to revolutionize biomedical research and clinical care. We interviewed 41 expert stakeholders to examine governance, access, data collection, and privacy in the context of a medical information commons. Stakeholders' attitudes about MICs align with the NAS vision of an Information Commons; however, differences of opinion regarding clinical use and access warrant further research to explore policy and technological solutions.
Memory services have expanded significantly in the UK, but limited performance data have been published. The aim of this programme was to determine variation in London memory services and address this through service improvement projects. In 2016 London memory services were invited to participate in an audit consisting of case note reviews of at least 50 consecutively seen patients.
Results
Ten services participated in the audit, totalling 590 patients. Variation was noted in neuroimaging practice, neuropsychology referrals, diagnosis subtype, non-dementia diagnoses, waiting times and post-diagnostic support. Findings from the audit were used to initiate four service improvement projects.
Clinical Implications
Memory services should consider streamlining pathways to reduce waiting times, implementing pathways for patients who do not have dementia, monitoring appropriateness of neuroimaging, and working with commissioners and primary care to ensure that access to post-diagnostic interventions is consistent with the updated National Institute for Health and Care Excellence (NICE) dementia guideline.
Clostridium difficile, the most common cause of hospital-associated diarrhoea in developed countries, presents major public health challenges. The high clinical and economic burden from C. difficile infection (CDI) relates to the high frequency of recurrent infections caused by either the same or different strains of C. difficile. An interval of 8 weeks after index infection is commonly used to classify recurrent CDI episodes. We assessed strains of C. difficile in a sample of patients with recurrent CDI in Western Australia from October 2011 to July 2017. The performance of different intervals between initial and subsequent episodes of CDI was investigated. Of 4612 patients with CDI, 1471 (32%) were identified with recurrence. PCR ribotyping data were available for initial and recurrent episodes for 551 patients. Relapse (recurrence with same ribotype (RT) as index episode) was found in 350 (64%) patients and reinfection (recurrence with new RT) in 201 (36%) patients. Our analysis indicates that 8- and 20-week intervals failed to adequately distinguish reinfection from relapse. In addition, living in a non-metropolitan area modified the effect of age on the risk of relapse. Where molecular epidemiological data are not available, we suggest that applying an 8-week interval to define recurrent CDI requires more consideration.
Solvency II came into force on 1 January 2016 and included a transitional measure on technical provisions (“TMTP”) designed to smooth the capital impact of Solvency II over a 16-year period. The working party’s view is that the main intention of the TMTP is to mitigate the impact of the introduction of the risk margin, which significantly increases the technical provisions of firms, relative to their Solvency I Pillar 2 liabilities.
The majority of firms that hold a TMTP have now had at least one recalculation approved by the Prudential Regulation Authority (PRA), or are in the process of applying for a recalculation. Despite this large number of approved recalculations, there remains significant uncertainty in the industry around the approach and triggers for recalculation.
This paper considers aspects of TMTP recalculation for regulated UK life firms, for example practicalities of the calculation, asset and liability considerations, and communications/announcements.
In this paper, we outline the need for pragmatism when considering the approach to recalculation of a measure originally intended to serve as the bridge between two regimes. We call for an allowance for doing what is sensible in a principles-based regime balancing what might be more theoretically correct with what is practical and possible to support effective management of the business.
A variety of paediatric tracheostomy tubes are available. This article reviews the tubes in current use at Great Ormond Street Hospital for Children and Evelina London Children's Hospital.
Methods
This paper outlines our current preferences, and the particular indications for different tracheostomy tubes, speaking valves and other attachments.
Results
Our preferred types of tubes have undergone significant design changes. This paper also reports further experience with certain tubes that may be useful in particular circumstances. An updated sizing chart is included for reference purposes.
Conclusion
The choice of a paediatric tracheostomy tube remains largely determined by individual clinical requirements. Although we still favour a small range of tubes for use in the majority of our patients, there are circumstances in which other varieties are indicated.
Excitable temperament disrupts physiological events required for reproductive development in cattle, but no research has investigated the impacts of temperament on growth and puberty attainment in Bos indicus females. Hence, this experiment evaluated the effects of temperament on growth, plasma cortisol concentrations and puberty attainment in B. indicus heifers. A total of 170 Nelore heifers, weaned 4 months before the beginning of this experiment (days 0 to 91), were managed in two groups of 82 and 88 heifers each (mean ± SE; initial BW=238±2 kg, initial age=369±1 days across groups). Heifer temperament was evaluated via exit velocity on day 0. Individual exit score was calculated within each group by dividing exit velocity into quintiles and assigning heifers a score from 1 to 5 (1=slowest; 5=fastest heifer). Heifers were classified according to exit score as adequate (ADQ, n=96; exit score⩽3) or excitable temperament (EXC, n=74; exit score>3). Heifer BW, body condition score (BCS) and blood samples were obtained on days 0, 31, 60 and 91. Heifer exit velocity and score were recorded again on days 31, 60 and 91. Ovarian transrectal ultrasonography was performed on days 0 and 10, 31 and 41, 60 and 70, 81 and 91 for puberty evaluation. A heifer was declared pubertal at the first 10-day interval in which a corpus luteum was detected. Exit velocity and exit score obtained on day 0 were correlated (r⩾0.64, P<0.01) with evaluations on days 31, 60 and 91. During the experiment, ADQ heifers had greater (P<0.01) mean BCS and BW gain, and lower (P<0.01) mean plasma cortisol concentrations compared with EXC heifers. Temperament × time interactions were detected (P<0.01) for exit velocity and exit score, which were always greater (P<0.01) in EXC v. ADQ heifers. A temperament × time interaction was also detected (P=0.03) for puberty attainment, which was delayed in EXC v. ADQ heifers.
At the end of the experiment, a greater (P<0.01) proportion of ADQ heifers were pubertal compared with EXC heifers. In summary, B. indicus heifers classified as EXC had reduced growth, increased plasma cortisol concentrations and hindered puberty attainment compared to ADQ heifers. Moreover, exit velocity may serve as a temperament selection criterion to optimize the development of B. indicus replacement heifers.
We propose here a generalization of the problem addressed by the SHGH conjecture. The SHGH conjecture posits a solution to the question of how many conditions a general union $X$ of fat points imposes on the complete linear system of curves in $\mathbb{P}^{2}$ of fixed degree $d$, in terms of the occurrence of certain rational curves in the base locus of the linear subsystem defined by $X$. As a first step towards a new theory, we show that rational curves play a similar role in a special case of a generalized problem, which asks how many conditions are imposed by a general union of fat points on linear subsystems defined by imposed base points. Moreover, motivated by work of Di Gennaro, Ilardi and Vallès and of Faenzi and Vallès, we relate our results to the failure of a strong Lefschetz property, and we give a Lefschetz-like criterion for Terao’s conjecture on the freeness of line arrangements.