To evaluate the association between novel pre- and post-operative biomarker levels and 30-day unplanned readmission or mortality after paediatric congenital heart surgery.
Children aged 18 years or younger undergoing congenital heart surgery (n = 162) at Johns Hopkins Hospital from 2010 to 2014 were enrolled in the prospective cohort. Novel pre- and post-operative biomarkers collected included soluble suppression of tumorigenicity 2, galectin-3, N-terminal prohormone of brain natriuretic peptide, and glial fibrillary acidic protein. A model based on clinical variables from the Society of Thoracic Surgeons database was developed and evaluated against two augmented models.
Unplanned readmission or mortality within 30 days of cardiac surgery occurred among 21 (13%) children. The clinical model augmented with pre-operative biomarkers demonstrated a statistically significant improvement over the clinical model alone, with an area under the receiver-operating characteristic curve of 0.754 (95% confidence interval: 0.65–0.86) compared to 0.617 (95% confidence interval: 0.47–0.76; p-value: 0.012). The clinical model augmented with pre- and post-operative biomarkers also demonstrated a significant improvement over the clinical model alone, with an area under the receiver-operating characteristic curve of 0.802 (95% confidence interval: 0.72–0.89; p-value: 0.003).
Novel biomarkers add significant predictive value when assessing the likelihood of unplanned readmission or mortality after paediatric congenital heart surgery. Further exploration of these novel biomarkers during the pre- and post-operative periods to identify early risk of mortality or readmission will help determine their clinical utility and their place in routine risk assessment.
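The discrimination statistics reported above follow from the Mann-Whitney formulation of the area under the ROC curve. A minimal pure-Python sketch, using hypothetical labels and risk scores rather than the study's data:

```python
def roc_auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney formulation:
    the probability that a randomly chosen positive case is scored
    higher than a randomly chosen negative case (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores for 4 children with the outcome (label 1)
# and 4 without (label 0) -- illustrative values, not study data.
labels = [1, 1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.6, 0.4, 0.7, 0.3, 0.2, 0.1]
print(roc_auc(labels, scores))  # -> 0.875
```

Comparing two AUCs from the same patients, as the study does, additionally requires a paired test such as DeLong's, which accounts for the correlation between models.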
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
Using existing data from clinical registries to support clinical trials and other prospective studies has the potential to improve research efficiency. However, little has been reported about staff experiences and lessons learned from implementation of this method in pediatric cardiology.
We describe the process of using existing registry data in the Pediatric Heart Network Residual Lesion Score Study, report stakeholders’ perspectives, and provide recommendations to guide future studies using this methodology.
The Residual Lesion Score Study, a 17-site prospective, observational study, piloted the use of existing local surgical registry data (collected for submission to the Society of Thoracic Surgeons-Congenital Heart Surgery Database) to supplement manual data collection. A survey regarding processes and perceptions was administered to study site and data coordinating center staff.
Survey response rate was 98% (54/55). Overall, 57% perceived that using registry data saved research staff time in the current study, and 74% perceived that it would save time in future studies; 55% noted significant upfront time in developing a methodology for extracting registry data. Survey recommendations included simplifying data extraction processes and tailoring to the needs of the study, understanding registry characteristics to maximise data quality and security, and involving all stakeholders in design and implementation processes.
Use of existing registry data was perceived to save time and promote efficiency. Consideration must be given to the upfront investment of time and resources needed. Ongoing efforts focussed on automating and centralising data management may aid in further optimising this methodology for future studies.
We sought to address the prior limitations of symptom checker accuracy by analysing the diagnostic and triage feasibility of online symptom checkers using a consecutive series of real-life emergency department (ED) patient encounters, focusing on a complex patient population: those with hepatitis C or HIV. We aimed to study the diagnostic and triage accuracy of these symptom checkers against the ED physician-determined diagnosis. A retrospective analysis was performed on 8363 consecutive adult ED patients. Eligible patients included 90 with HIV, 67 with hepatitis C and 11 with both HIV and hepatitis C. Five online symptom checkers were utilised for diagnosis (Mayo Clinic, WebMD, Symptomate, Symcat, Isabel), three of which had triage capabilities. Symptom checker output was compared with the ED physician-determined diagnosis with regard to diagnostic accuracy and differential diagnosis listing, along with triage advice. All symptom checkers, whether for combined HIV and hepatitis C, HIV alone or hepatitis C alone, had poor diagnostic accuracy: Top1 (<20%), Top3 (<35%), Top10 (<40%), Listed at All (<45%). Significant variation existed between individual symptom checkers, with some more accurate at listing the diagnosis near the top of the differential and others more apt to list the diagnosis at all. With regard to ED triage data, a significantly higher percentage of hepatitis C patients (59.7%; 40/67) than HIV patients (35.6%; 32/90) had an initial diagnosis meeting emergent criteria. Symptom checker diagnostic capabilities are markedly inferior to physician diagnostic capabilities. Complex patients such as those with HIV or hepatitis C may carry a more specific differential diagnosis, warranting diagnostic algorithms in symptom checkers that account for such complexity.
Symptom checkers carry the potential for real-time epidemiologic monitoring of patient symptoms, as symptom entries and subsequent symptom checker diagnosis could allow health officials a means to track illnesses in specific patient populations and geographic regions. In order to do this, accurate and reliable symptom checkers are warranted.
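The Top1/Top3/Top10/Listed-at-All metrics used above are top-k accuracies over each checker's ranked differential. A small sketch with hypothetical encounters (all diagnoses and values are illustrative, not study data):

```python
def top_k_accuracy(differentials, truths, k):
    """Fraction of encounters whose physician-determined diagnosis
    appears within the first k entries of the checker's ranked
    differential ("Listed at All" uses the full list length as k)."""
    hits = sum(truth in diff[:k] for diff, truth in zip(differentials, truths))
    return hits / len(truths)

# Hypothetical checker output for three encounters (illustrative only).
differentials = [
    ["influenza", "pneumonia", "bronchitis"],
    ["gastritis", "appendicitis"],
    ["migraine", "tension headache"],
]
truths = ["pneumonia", "cholecystitis", "migraine"]
print(top_k_accuracy(differentials, truths, 1))  # Top1 -> 1/3
print(top_k_accuracy(differentials, truths, 3))  # Top3 -> 2/3
```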
Polyphenol oxidase (PPO) in red clover (RC) has been shown to reduce both lipolysis and proteolysis in silo and has been implicated (in vitro) in doing so in the rumen. However, all in vivo comparisons have set RC against other forages, typically with lower PPO levels, introducing confounding factors as to the cause of the greater protection of dietary nitrogen (N) and C18 polyunsaturated fatty acids (PUFA) on RC silage. This study compared two RC silages with contrasting PPO activities at ensiling (RC+ and RC−) against a control of perennial ryegrass silage (PRG) to ascertain the effect of PPO activity on dietary N digestibility and PUFA biohydrogenation. Two studies were performed: the first investigated rumen and duodenal flow in six Hereford×Friesian steers prepared with rumen and duodenal cannulae, and the second investigated whole-tract N balance in six Holstein-Friesian non-lactating dairy cows. All diets were offered at a restricted level based on animal live weight, with each experiment consisting of two 3×3 Latin squares using big bale silages ensiled in 2010 and 2011, respectively. In the first experiment, digesta flow at the duodenum was estimated using a dual-phase marker system with ytterbium acetate and chromium ethylenediaminetetraacetic acid as particulate and liquid phase markers, respectively. Total N intake was higher on the RC silages in both experiments and higher on RC− than RC+. Rumen ammonia-N reflected intake, with ammonia-N per unit of N intake lower on RC+ than RC−. Microbial N duodenal flow was comparable across all silage diets, with non-microbial N higher on RC than PRG and no difference between RC+ and RC−, even when reported on a N intake basis. C18 PUFA biohydrogenation was lower on RC silage diets than PRG, but with no difference between RC+ and RC−. The N balance trial showed greater retention of N on RC+ than RC−; however, this response is more likely related to the difference in N intake than to any PPO-driven protection.
The lack of difference between the RC silages, despite contrasting levels of PPO, may reflect the similar level of protein-bound-phenol complexing determined in each RC silage. This complexing has previously been associated with PPO's protection mechanism; however, this study has shown that protection is not related to total PPO activity.
The north-west European population of Bewick’s Swan Cygnus columbianus bewickii declined by 38% between 1995 and 2010 and is listed as ‘Endangered’ on the European Red List of birds. Here, we combined information on food resources within the landscape with long-term data on swan numbers, habitat use, behaviour and two complementary measures of body condition, to examine whether changes in food type and availability have influenced the Bewick’s Swan’s use of their main wintering site in the UK, the Ouse Washes and surrounding fens. The maximum number of Bewick’s Swans rose from 620 in winter 1958/59 to a high of 7,491 in winter 2004/05, before falling to 1,073 birds in winter 2013/14. Between winters 1958/59 and 2014/15, the Ouse Washes supported between 0.5 and 37.9% of the total population wintering in north-west Europe (mean ± 95% CI = 18.1 ± 2.4%). Swans fed on agricultural crops, shifting from post-harvest remains of root crops (e.g. sugar beet and potatoes) in November and December to winter-sown cereals (e.g. wheat) in January and February. Inter-annual variation in the area cultivated for these crops did not result in changes in the peak numbers of swans occurring on the Ouse Washes. Behavioural and body condition data indicated that food supplies on the Ouse Washes and surrounding fens remain adequate to allow the birds to gain and maintain good body condition throughout winter with no increase in foraging effort. Our findings suggest that the recent decline in numbers of Bewick’s Swans at this internationally important site was not linked to inadequate food resources.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
Introduction: The ECG diagnosis of acute coronary occlusion (ACO) in the setting of ventricular paced rhythm (VPR) is purported to be impossible. However, VPR has a similar ECG morphology to LBBB. The validated Smith-modified Sgarbossa criteria (MSC) have high sensitivity (Sens) and specificity (Spec) for ACO in LBBB. The MSC consist of ≥1 of the following in ≥1 lead: concordant ST elevation (STE) ≥1 mm, concordant ST depression ≥1 mm in V1-V3, or ST/S ratio <−0.25 (in leads with ≥1 mm STE). We hypothesized that the MSC would have higher Sens for diagnosis of ACO in VPR when compared with the original Sgarbossa criteria. We report preliminary findings of the Paced Electrocardiogram Requiring Fast Emergency Coronary Therapy (PERFECT) study. Methods: The PERFECT study is a retrospective, multicenter, international investigation of ED patients from 1/2008 - 12/2016 with VPR on the ECG and symptoms suggestive of acute coronary syndrome (e.g. chest pain or shortness of breath). Data from four sites are presented. Acute myocardial infarction (AMI) was defined by the Third Universal Definition of AMI. A blinded cardiologist adjudicated ACO, defined as a thrombolysis in myocardial infarction score of 0 or 1 on coronary angiography; a pre-defined subgroup of ACO patients with peak cardiac troponin (cTn) >100 times the 99th percentile upper reference limit (URL) of the cTn assay was also analyzed. Another blinded physician measured all ECGs. Statistics were by Mann-Whitney U, chi-square, and McNemar's test. Results: The ACO and No-AMI groups consisted of 15 and 79 encounters, respectively. For the ACO and No-AMI groups, median age was 78 [IQR 72-82] vs. 70 [61-75] and 13 (86%) vs. 48 (61%) patients were male. The median peak cTn ratio (cTn/URL) was 260 [33-663] vs. 0.5 [0-1.3] for ACO vs. No-AMI. The Sens and Spec for the MSC and the original Sgarbossa criteria were 67% (95% CI 39-87) vs. 46% (22-72; p=0.25) and 99% (92-100) vs. 99% (92-100; p=0.5).
In a pre-defined subgroup analysis of ACO patients with peak cTn >100 times the URL (n=10), the Sens was 90% (54-100) for the MSC vs. 60% (27-86) for the original Sgarbossa criteria (p=0.25). Conclusion: ACO in VPR is an uncommon condition. The MSC showed good Sens for diagnosis of ACO in the presence of VPR, especially among patients with high peak cTn, and Spec was excellent. These methods and results are consistent with studies that have used the MSC to diagnose ACO in LBBB.
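The paired sensitivity comparison above (McNemar's test) depends only on the discordant pairs: encounters flagged by one criteria set but not the other. A minimal exact-test sketch; the discordant split shown is a hypothetical one consistent with the reported p = 0.25, not the study's actual tabulation:

```python
from math import comb

def mcnemar_exact(b, c):
    """Exact two-sided McNemar p-value from the discordant pairs:
    b = cases detected by test 1 only, c = by test 2 only.
    Under H0 the smaller count follows Binomial(b + c, 0.5)."""
    n = b + c
    tail = sum(comb(n, i) for i in range(min(b, c) + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Hypothetical discordant split: 3 ACO encounters caught by the MSC
# only and 0 by the original criteria only (illustrative counts).
print(mcnemar_exact(3, 0))  # -> 0.25
```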
Background: High comorbidity rates among emotional disorders have led researchers to examine transdiagnostic factors that may contribute to shared psychopathology. Bifactor models provide a unique method for examining transdiagnostic variables by modelling the common and unique factors within measures. Previous findings suggest that the bifactor model of the Depression Anxiety and Stress Scale (DASS) may provide a method for examining transdiagnostic factors within emotional disorders. Aims: This study aimed to replicate the bifactor model of the DASS, a multidimensional measure of psychological distress, within a US adult sample and provide initial estimates of the reliability of the general and domain-specific factors. Furthermore, this study hypothesized that Worry, a theorized transdiagnostic variable, would show stronger relations to general emotional distress than domain-specific subscales. Method: Confirmatory factor analysis was used to evaluate the bifactor model structure of the DASS in 456 US adult participants (279 females and 177 males, mean age 35.9 years) recruited online. Results: The DASS bifactor model fitted well (CFI = 0.98; RMSEA = 0.05). The General Emotional Distress factor accounted for most of the reliable variance in item scores. Domain-specific subscales accounted for modest portions of reliable variance in items after accounting for the general scale. Finally, structural equation modelling indicated that Worry was strongly predicted by the General Emotional Distress factor. Conclusions: The DASS bifactor model is generalizable to a US community sample and General Emotional Distress, but not domain-specific factors, strongly predict the transdiagnostic variable Worry.
On 27 April 2015, Washington health authorities identified Escherichia coli O157:H7 infections associated with dairy education school field trips held in a barn 20–24 April. Investigation objectives were to determine the magnitude of the outbreak, identify the source of infection, prevent secondary illness transmission and develop recommendations to prevent future outbreaks. Case-finding, hypothesis-generating interviews, environmental site visits and a case–control study were conducted. Parents and children were interviewed regarding event activities. Odds ratios (OR) and 95% confidence intervals (CI) were computed. Environmental testing was conducted in the barn; isolates were compared to patient isolates using pulsed-field gel electrophoresis (PFGE). Sixty people were ill, 11 (18%) were hospitalised and six (10%) developed haemolytic uraemic syndrome. Ill people ranged in age from <1 year to 47 years (median: 7), and 20 (33%) were female. Twenty-seven case-patients and 88 controls were enrolled in the case–control study. Among first-grade students, handwashing (i.e. soap and water, or hand sanitiser) before lunch was protective (adjusted OR 0.13; 95% CI 0.02–0.88, P = 0.04). Barn samples yielded E. coli O157:H7 with PFGE patterns indistinguishable from patient isolates. This investigation provided epidemiological, laboratory and environmental evidence for a large outbreak of E. coli O157:H7 infections from exposure to a contaminated barn. The investigation highlights the often overlooked risk of infection through exposure to animal environments as well as the importance of handwashing for disease prevention. Increased education and encouragement of infection prevention measures, such as handwashing, can prevent illness.
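The adjusted OR above comes from a regression model, but the crude OR and a Woolf (log-based) 95% CI from a 2x2 table can be sketched in a few lines; the counts below are hypothetical, not the investigation's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts for a protective exposure (e.g. handwashing):
# 4 of 27 cases exposed vs. 40 of 88 controls -- illustrative only.
or_, lower, upper = odds_ratio_ci(4, 23, 40, 48)
print(round(or_, 2), round(lower, 2), round(upper, 2))  # -> 0.21 0.07 0.65
```

An OR below 1 with a CI excluding 1, as here, indicates a protective association in this framework.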
Developing countries are experiencing an increase in total demand for livestock commodities, as populations and per capita demands increase. Increased production is therefore required to meet this demand and maintain food security. Production increases will lead to proportionate increases in greenhouse gas (GHG) emissions unless offset by reductions in the emissions intensity (Ei) (i.e. the amount of GHG emitted per kg of commodity produced) of livestock production. It is therefore important to identify measures that can increase production whilst reducing Ei cost-effectively. This paper seeks to do this for smallholder agro-pastoral cattle systems in Senegal; ranging from low input to semi-intensified, they are representative of a large proportion of the national cattle production. Specifically, it identifies a shortlist of mitigation measures with potential for application to the various herd systems and estimates their GHG emissions abatement potential (using the Global Livestock Environmental Assessment Model) and cost-effectiveness. Limitations and future requirements are identified and discussed. This paper demonstrates that the Ei of meat and milk from livestock systems in a developing region can be reduced through measures that would also benefit food security, many of which are likely to be cost-beneficial. The ability to make such quantification can assist future sustainable development efforts.
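The emissions-intensity metric defined above is a simple ratio; a one-function sketch with hypothetical herd-level figures (not values from the Senegalese systems studied):

```python
def emissions_intensity(total_ghg_kg_co2e, product_kg):
    """Emissions intensity (Ei): kg of GHG (CO2-equivalent) emitted
    per kg of commodity produced."""
    return total_ghg_kg_co2e / product_kg

# Hypothetical herd-level figures (illustrative only): 120 t CO2e per
# year over 15,000 kg of milk and meat output.
print(emissions_intensity(120_000, 15_000))  # -> 8.0 kg CO2e per kg
```

A measure that raises output faster than it raises emissions lowers Ei even if total emissions grow, which is the mechanism the paper relies on.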
To assess relationships between mothers’ feeding practices (food as a reward, food for emotion regulation, modelling of healthy eating) and mothers’ willingness to purchase child-marketed foods and fruits/vegetables (F&V) requested by their children during grocery co-shopping.
Cross-sectional. Mothers completed an online survey that included questions about feeding practices and willingness (i.e. intentions) to purchase child-requested foods during grocery co-shopping. Feeding practices scores were dichotomized at the median. Foods were grouped as nutrient-poor or nutrient-dense (F&V) based on national nutrition guidelines. Regression models compared mothers with above-the-median v. at-or-below-the-median feeding practices scores on their willingness to purchase child-requested food groupings, adjusting for demographic covariates.
Participants completed an online survey generated at a public university in the USA.
Mothers (n 318) of 2- to 7-year-old children.
Mothers who scored above-the-median on using food as a reward were more willing to purchase nutrient-poor foods (β=0·60, P<0·0001), mothers who scored above-the-median on use of food for emotion regulation were more willing to purchase nutrient-poor foods (β=0·29, P<0·0031) and mothers who scored above-the-median on modelling of healthy eating were more willing to purchase nutrient-dense foods (β=0·22, P<0·001) than were mothers with at-or-below-the-median scores, adjusting for demographic covariates.
Mothers who reported using food to control children’s behaviour were more willing to purchase child-requested, nutrient-poor foods. Parental feeding practices may facilitate or limit the foods children request in grocery stores. Parent–child food consumer behaviours should be investigated as a route that may contribute to children’s eating patterns.
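The median dichotomisation of feeding-practice scores described in the design can be sketched as follows; the scores are hypothetical:

```python
from statistics import median

def median_split(scores):
    """Dichotomise scores at the median: True = above the median,
    False = at or below, mirroring the grouping described above."""
    m = median(scores)
    return [s > m for s in scores]

# Hypothetical feeding-practice scores for six mothers (illustrative).
print(median_split([1.0, 2.0, 2.5, 3.0, 3.5, 4.0]))
# -> [False, False, False, True, True, True]
```

The resulting indicator can then serve as the binary predictor in the regression models the abstract reports.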
Historically, community engagement (CE) in research has been implemented in the fields of public health, education and agricultural development. In recent years, international discussions on the ethical and practical goals of CE have been extended to human genomic research and biobanking, particularly in the African context. While there is some consensus on the goals and value of CE generally, questions remain about the effectiveness of CE practices and how to evaluate this. Under the auspices of the Human Heredity and Health in Africa Initiative (H3Africa), the H3Africa CE working group organized a workshop in Stellenbosch, South Africa, in March 2016 to explore the extent to which communities should be involved in genomic research and biobanking and to examine various methods of evaluating the effectiveness of CE. In this paper, we present the key themes that emerged from the workshop and make a case for rigorous application, evaluation of and learning from CE approaches that promote a more systematic process of engaging relevant communities. We highlight the key ways in which CE should be embedded into genomic research and biobanking projects.
Background: Standardized data collection for traumatic brain injury (TBI) (including concussion) using common data elements (CDEs) has strengthened clinical care and research capacity in the United States and Europe. Currently, Ontario healthcare providers do not collect uniform data on adult patients diagnosed with concussion. Objective: The Ontario Concussion Care Strategy (OCCS) is a collaborative network of multidisciplinary healthcare providers, brain injury advocacy groups, patient representatives, and researchers with a shared vision to improve concussion care across the province, starting with the collection of standardized data. Methods: The International Classification of Functioning, Disability and Health was selected as the conceptual framework to inform the selection of CDEs. The CDEs recommended by the OCCS were identified using key literature, including the National Institute of Neurological Disorders and Stroke–Zurich Consensus Statements for concussion in sport and the Ontario Neurotrauma Foundation Concussion/mTBI clinical guidelines. Results: The OCCS has recommended and piloted CDEs for Ontario that are readily available at no cost, clinically relevant, patient friendly, easy to interpret, and recognized by the international scientific community. Conclusions: The implementation of CDEs can help to shift Ontario toward internationally recognized standard data collection, and in so doing yield a more comprehensive evidence-based approach to care while also supporting rigorous research.
Introduction: We characterised tobacco use, cessation patterns, and patient satisfaction with a cessation support program at an NCI-Designated Comprehensive Cancer Center following a mandatory tobacco assessment and automatic referral.
Methods: A 3-month follow-up survey (via web, paper, or telephone) was administered between March 2013 and November 2013 for all patients referred to and contacted by a cessation support service, and who consented to participation three months prior to administration. Patients were asked about their perceived importance and self-efficacy to quit smoking, quit attempts, and satisfaction with the cessation service.
Results: Fifty-two percent (257/499) of patients who participated in the cessation support service, and consented to be contacted again, completed a follow-up survey. Of those who participated, 9.7% were referred to the service as having recently quit tobacco (in the past 30 days) and 23.6% reported having quit at the time of first contact. At the 3-month follow-up, 48.1% reported being smoke-free for the previous seven days. When patients were asked about their experience with the cessation service, 86.4% reported being very or mostly satisfied with the service, and 64.3% reported that their experience with the service increased their satisfaction with the care received at the cancer centre.
Conclusions: Our findings suggest that recently diagnosed cancer patients are aware that quitting tobacco is important, are making attempts to quit, and are amenable to an opt-out automatic referral cessation support service as part of their cancer care.
Salmonella is a leading cause of bacterial foodborne illness. We report the collaborative investigative efforts of US and Canadian public health officials during the 2013–2014 international outbreak of multiple Salmonella serotype infections linked to sprouted chia seed powder. The investigation included open-ended interviews of ill persons, traceback, product testing, facility inspections, and trace forward. Ninety-four persons infected with outbreak strains from 16 states and four provinces were identified; 21% were hospitalized and none died. Fifty-four (96%) of 56 persons who consumed chia seed powder reported 13 different brands that traced back to a single Canadian firm, distributed by four US and eight Canadian companies. Laboratory testing yielded outbreak strains from leftover and intact product. Contaminated product was recalled. Although chia seed powder is a novel outbreak vehicle, sprouted seeds are recognized as an important cause of foodborne illness; firms should follow available guidance to reduce the risk of bacterial contamination during sprouting.
The past decade has been a tumultuous one for Japanese higher education, with faculty pitted against students, students against government, and government against both in an often chaotic and seemingly incessant search for an answer to the question “Who governs Japan's universities?” The answer given by many analysts is “no one”—that the conflict between these three bodies has rendered Japanese universities ungovernable. Failure to achieve consensus and implement needed reforms in higher education has been attributed largely to the decentralized internal organization of the leading universities and the tradition of deep-seated hostility between academic intellectuals and the Japanese government. Although the more violent and dramatic of the conflicts in the late 1960s and early 1970s have received considerable attention from observers outside Japan, the long tradition of conflict over university governance which provides such a significant part of the intellectual and political context for those caught up in the contemporary debates has received far less attention. The purpose of this essay is to provide an historical perspective on this conflict by sketching in the prewar background that constitutes the heritage of academic self-government at Japan's oldest and still foremost universities, the imperial universities of Tokyo and Kyoto.