The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Less than half of stool samples from people symptomatic with infectious intestinal disease (IID) will identify a causative organism. A secondary data analysis was undertaken to explore whether symptomology alone could be used to make inferences about causative organisms. Data were utilised from the Second Study of Infectious Intestinal Disease in the Community. A total of 844 cases were analysed. Few symptoms differentiated individual pathogens, but grouping pathogens together showed that viral IID was more likely when symptom onset was in winter (odds ratio (OR) 2.08, 95% confidence interval (CI) 1.16–3.75) or spring (OR 1.92, 95% CI 1.11–3.33), the patient was aged under 5 years (OR 3.63, 95% CI 2.24–6.03) and there was loss of appetite (OR 2.19, 95% CI 1.29–3.72). The odds of bacterial IID were higher with diarrhoea in the absence of vomiting (OR 3.54, 95% CI 2.37–5.32), diarrhoea which persisted for >3 days (OR 2.69, 95% CI 1.82–3.99), bloody diarrhoea (OR 4.17, 95% CI 1.63–11.83) and fever (OR 1.67, 95% CI 1.11–2.53). Symptom profiles could be of value to help guide clinicians and public health professionals in the management of IID, in the absence of microbiological confirmation.
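The symptom-group results above are reported as odds ratios with 95% confidence intervals. The study's estimates come from multivariable models, but the crude odds-ratio calculation they generalise can be sketched as follows (the 2×2 cell counts in the example are hypothetical, not taken from the study):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts: 40 of 100 winter-onset cases were viral,
# versus 60 of 300 cases with onset in other seasons.
or_, lo, hi = odds_ratio_ci(40, 60, 60, 240)
```

A confidence interval whose lower bound exceeds 1 (as for the viral and bacterial predictors above) indicates an association unlikely to be due to chance at the 5% level.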
To enhance enrollment into randomized clinical trials (RCTs), we proposed electronic health record-based clinical decision support for patient–clinician shared decision-making about care and RCT enrollment, based on “mathematical equipoise.”
As an example, we created the Knee Osteoarthritis Mathematical Equipoise Tool (KOMET) to determine the presence of patient-specific equipoise between treatments for the choice between total knee replacement (TKR) and nonsurgical treatment of advanced knee osteoarthritis.
With input from patients and clinicians about important pain and physical function treatment outcomes, we created a database from non-RCT sources of knee osteoarthritis outcomes. We then developed multivariable linear regression models that predict 1-year individual-patient knee pain and physical function outcomes for TKR and for nonsurgical treatment. These predictions allowed detecting mathematical equipoise between these two options for patients eligible for TKR. Decision support software was developed to graphically illustrate, for a given patient, the degree of overlap of pain and functional outcomes between the treatments and was pilot tested for usability, responsiveness, and as support for shared decision-making.
The KOMET predictive regression model for knee pain included four patient-specific variables and had an r² of 0.32; the model for physical functioning included six patient-specific variables and had an r² of 0.34. These models were incorporated into prototype KOMET decision support software, which was pilot tested in clinics and generally well received.
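The notion of patient-specific mathematical equipoise above amounts to asking how far the predicted-outcome intervals of the two treatment models overlap for a given patient. A minimal sketch of that comparison follows; the coefficients, patient values, residual SDs and overlap cutoff are all invented for illustration and are not the published KOMET models:

```python
def prediction_interval(intercept, coefs, x, resid_sd, z=1.96):
    """Point prediction +/- z * residual SD for one patient under one model."""
    mean = intercept + sum(c * xi for c, xi in zip(coefs, x))
    return mean - z * resid_sd, mean + z * resid_sd

def overlap_fraction(a, b):
    """Length of the intersection of two intervals, relative to the
    narrower interval (0 = disjoint, 1 = fully nested)."""
    inter = min(a[1], b[1]) - max(a[0], b[0])
    narrower = min(a[1] - a[0], b[1] - b[0])
    return max(0.0, inter / narrower)

# Hypothetical patient and models (not the published KOMET coefficients).
x = [62, 28.5, 1, 7]                        # e.g. age, BMI, sex, baseline pain
tkr = prediction_interval(20.0, [0.1, 0.2, 1.0, 0.5], x, resid_sd=8.0)
nonsurg = prediction_interval(25.0, [0.1, 0.2, 1.0, 0.6], x, resid_sd=8.0)
equipoise = overlap_fraction(tkr, nonsurg) >= 0.5   # illustrative cutoff
```

Graphically, the KOMET software displays this degree of overlap to the patient and clinician; a large overlap suggests neither treatment is clearly predicted to be superior for that individual, which is the condition under which RCT enrollment can be offered.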
Use of predictive models and mathematical equipoise may help discern patient-specific equipoise to support shared decision-making for selecting between alternative treatments and considering enrollment into an RCT.
To identify potential participants for clinical trials, electronic health records (EHRs) are searched at potential sites. As an alternative, we investigated using medical devices used for real-time diagnostic decisions for trial enrollment.
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph (the e-ACI-TIPI), using the same data as the original ACI-TIPI (development set n = 3,453; test set n = 2,315). We also tested both instruments on data from emergency department electrocardiographs from across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared their performance to cohorts identified from EHR data at the hospitals.
Receiver-operating characteristic (ROC) curve areas on the test set were excellent, 0.89 for the ACI-TIPI and 0.84 for the e-ACI-TIPI, as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, with very good calibration. When tested for detection of patients with > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than EHRs did.
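The ROC area reported above is, equivalently, the probability that a randomly chosen ACS patient receives a higher predicted probability than a randomly chosen non-ACS patient. That rank-based (Mann–Whitney) view of the statistic can be sketched as follows, on made-up scores rather than the study's data:

```python
def roc_auc(scores, labels):
    """ROC curve area as the Mann-Whitney statistic: the probability that
    a random positive case outscores a random negative case (ties = 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Made-up predicted ACS probabilities and true outcomes (1 = ACS).
scores = [0.92, 0.80, 0.65, 0.40, 0.30, 0.10]
labels = [1,    1,    0,    1,    0,    0]
auc = roc_auc(scores, labels)
```

An area of 0.5 corresponds to a model no better than chance; the 0.84 to 0.89 values reported for the test set indicate strong discrimination.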
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
To determine which healthcare worker (HCW) roles and patient care activities are associated with acquisition of vancomycin-resistant Enterococcus (VRE) on HCW gloves or gowns after patient care, as a surrogate for transmission to other patients.
Prospective cohort study.
Medical and surgical intensive care units at a tertiary-care academic institution.
VRE-colonized patients on Contact Precautions and their HCWs.
Overall, 94 VRE-colonized patients and 469 HCW–patient interactions were observed. Research staff recorded patient care activities and cultured HCW gloves and gowns for VRE before the HCWs doffed and exited the patient room.
VRE were isolated from 71 of 469 HCWs’ gloves or gowns (15%) following patient care. Occupational/physical therapists, patient care technicians, nurses, and physicians were more likely than environmental services workers and other HCWs to have contaminated gloves or gowns. Compared to touching the environment alone, the odds ratio (OR) for VRE contamination associated with touching both the patient (or objects in the immediate vicinity of the patient) and the environment was 2.78 (95% confidence interval [CI], 0.99–7.77), and the OR associated with touching only the patient (or objects in the immediate vicinity) was 3.65 (95% CI, 1.17–11.41). Independent risk factors for transmission of VRE to HCWs were touching the patient’s skin (OR, 2.18; 95% CI, 1.15–4.13) and transferring the patient into or out of bed (OR, 2.66; 95% CI, 1.15–6.43).
Patient contact is a major risk factor for HCW contamination and subsequent transmission. Interventions should prioritize contact precautions and hand hygiene for HCWs whose activities involve touching the patient.
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic–environmental trade-offs, whose existence is theoretically assured when on-farm resource allocation is perfect. Recent research conducted on commercial farms indicates, however, that the economic–environmental nexus is not nearly as straightforward in the real world, with the environmental performance of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in southwest United Kingdom, this paper proposes a novel, information-driven approach for carrying out comprehensive assessments of the economic–environmental trade-offs inherent within pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest a potentially systematic interaction between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with better animal performance and lower nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for developing scientifically sound and biologically informative metrics of agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.
Crib-biting is a stereotypic behaviour performed by approximately 5% of captive domestic horses. Dietary factors have been strongly associated with the development of oral stereotypies, and risk factors for crib-biting identified in recent epidemiological studies include feeding high-concentrate and/or low-forage diets (Waters et al., 2002). Experimental work has shown that such diets are likely to result in increased gastric acidity (Murray and Eichorn, 1996; Nadeau et al., 2000). We therefore propose that young horses initiate crib-biting in an attempt to produce alkaline saliva to buffer their stomachs when alternative opportunities for mastication are limited. The aim of this study was to determine whether there was an association between crib-biting behaviour and stomach condition in foals.
Foals that had recently started to perform crib-biting were recruited into the study and compared with non-stereotypic foals. The stomachs of 15 crib-biting foals and 9 normal foals were examined using a video endoscope.
Oxidative stress occurs when antioxidant defence mechanisms are overwhelmed by free radicals and may lead to damage to DNA, which has been implicated in processes such as ageing and cancer. The Comet assay allows detection of oxidative DNA damage in individual cells. As horses with recurrent airway obstruction (RAO) have been shown to demonstrate low antioxidant status and oxidative stress, we hypothesised that peripheral blood mononuclear cells (PBMC) of horses with RAO would demonstrate increases in DNA damage following natural allergen challenge.
Six horses (mean age 15 years, range 8–23 years) diagnosed with RAO (in remission) and 6 healthy breed-matched controls (mean age 9 years, range 5–15 years) were studied. Blood samples were collected 7 days prior to challenge, and immediately and 3 days after stabling on mouldy hay and straw for 24 h. All animals were kept at grass prior to and after the challenge period. Bronchoalveolar lavage (BAL) was performed and neutrophil counts determined.
Recombined milks manufactured from milk powders made from milk produced by β-lactoglobulin (β-LG) AA phenotype cows were not suitable for processing into ultra-heat-treated (UHT) milk products, as these milks rapidly fouled heat exchanger surfaces when compared with standard mixed β-LG variant milk. Recombined milks manufactured from powders from β-LG BB phenotype milk generally gave low fouling rates upon UHT treatment and in some cases gave almost negligible fouling of UHT heat exchanger surfaces. Fresh milk from β-LG AA phenotype cows fouled evaporator preheaters more rapidly than standard milk, whereas fresh milk produced from β-LG BB phenotype cows fouled evaporator preheaters less rapidly than standard milk. Recombined milks manufactured from powders made with milk from κ-casein (κ-CN) BB phenotype cows fouled heat exchanger surfaces more rapidly than recombined milks manufactured from powders from milk from κ-CN AA phenotype cows.
It has been shown that horses and ponies at pasture usually graze for 15–17 hours per day and consume between 16 and 33 g dry matter (DM)/kg live weight per day, depending on animal size and physiological status. However, many predominantly stabled horses have restricted access to pasture, often only 1–3 hours/day. There is no information on the voluntary food intake (VFI) of horses under such regimens. Therefore, the aim of this pilot study was to determine the voluntary intake of fresh herbage by ponies when their access to pasture was restricted.
We previously found that guar gum (GG) and chickpea flour (CPF) added to flatbread wheat flour lowered postprandial blood glucose (PPG) and insulin responses dose dependently. However, rates of glucose influx cannot be determined from PPG, which integrates rates of influx, tissue disposal and hepatic glucose production. The objective was to quantify rates of glucose influx and related fluxes as contributors to changes in PPG with GG and CPF additions to wheat-based flatbreads. In a randomised cross-over design, twelve healthy males consumed each of three different 13C-enriched meals: control flatbreads (C), or C incorporating 15 % CPF with either 2 % (GG2) or 4 % (GG4) GG. A dual isotope technique was used to determine the time to reach 50 % absorption of exogenous glucose (T50 %abs, primary objective), rate of appearance of exogenous glucose (RaE), rate of appearance of total glucose (RaT), endogenous glucose production (EGP) and rate of disappearance of total glucose (RdT). Additional exploratory outcomes included PPG, insulin, glucose-dependent insulinotropic peptide and glucagon-like peptide 1, which were additionally measured over 4 h. Compared with C, GG2 and GG4 had no significant effect on T50 %abs. However, GG4 significantly reduced 4-h AUC values for RaE, RaT, RdT and EGP, by 11, 14, 14 and 64 %, respectively, whereas GG2 showed minor effects. Effect sizes over 2 and 4 h were similar except for significantly greater reduction in EGP for GG4 at 2 h. In conclusion, a soluble fibre mix added to flatbreads only slightly reduced rates of glucose influx, but more substantially affected rates of postprandial disposal and hepatic glucose production.
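The 2-h and 4-h AUC comparisons above rest on integrating each flux curve over its sampling times, conventionally by the trapezoidal rule. A minimal sketch of that calculation follows; the time grid and rate-of-appearance values are invented for illustration, not taken from the study:

```python
def trapezoid_auc(times, values):
    """Total area under a sampled curve by the trapezoidal rule."""
    return sum((t1 - t0) * (v0 + v1) / 2.0
               for t0, t1, v0, v1 in zip(times, times[1:], values, values[1:]))

# Invented rate-of-appearance curve sampled at 0, 30, ..., 240 min.
times = [0, 30, 60, 90, 120, 180, 240]            # minutes after the meal
rae   = [0.0, 2.5, 3.8, 3.2, 2.4, 1.2, 0.5]       # hypothetical flux values
auc_4h = trapezoid_auc(times, rae)
```

Comparing such AUCs between meals (here C vs. GG2 vs. GG4) summarises each flux over the whole postprandial window rather than at single time points, which is why the abstract reports percentage reductions in 4-h AUC.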
We have previously reported an association between childhood abuse and psychotic experiences (PEs) in survey data from South East London. Childhood abuse is related to subsequent adulthood adversity, which could form one pathway to PEs. We aimed to investigate evidence of mediation of the association between childhood abuse and PEs by adverse life events.
Data were analysed from the South East London Community Health Study (SELCoH, n = 1698). Estimates of the total effects on PEs of any physical or sexual abuse while growing up were partitioned into direct (i.e. unmediated) and indirect (total and specific) effects, mediated via violent and non-violent life events.
There was strong statistical evidence for direct (OR 1.58, 95% CI: 1.19–2.1) and indirect (OR 1.51, 95% CI: 1.32–1.72) effects of childhood abuse on PEs after adjustment for potential confounders, indicating partial mediation of this effect via violent and non-violent life events. An estimated 47% of the total effect of abuse on PEs was mediated via adulthood adverse life events, of which violent life events made up 33% and non-violent life events the remaining 14%.
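The 47% figure is consistent with combining the reported direct and indirect odds ratios on the log-odds scale, where the proportion mediated is log(OR_indirect) / (log(OR_direct) + log(OR_indirect)). That arithmetic can be checked directly (a sketch of the standard log-scale decomposition, not the study's estimation code):

```python
import math

def proportion_mediated(or_direct, or_indirect):
    """Share of the total effect carried by the mediated (indirect) path,
    computed on the log-odds scale from direct and indirect odds ratios."""
    log_d = math.log(or_direct)
    log_i = math.log(or_indirect)
    return log_i / (log_d + log_i)

# Reported estimates: direct OR 1.58, indirect OR 1.51.
share = proportion_mediated(1.58, 1.51)
```

With the abstract's estimates, `share` is approximately 0.47, matching the stated 47% of the total effect mediated via adulthood adverse life events.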
The association between childhood abuse and PEs is partly mediated through the experience of adverse life events in adulthood. There is some evidence that a larger proportion of this effect was mediated through violent life events than non-violent life events.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
To determine whether patients using the Centers for Medicare and Medicaid Services (CMS) Hospital Compare website (http://medicare.gov/hospitalcompare) can use nationally reported healthcare-associated infection (HAI) data to differentiate hospitals.
Secondary analysis of publicly available HAI data for calendar year 2013.
We assessed the availability of HAI data for geographically proximate hospitals (ie, hospitals within the same referral region) and then analyzed these data to determine whether they are useful to differentiate hospitals. We assessed data for the 6 HAIs reported by hospitals to the Centers for Disease Control and Prevention (CDC).
Data were analyzed for 4,561 hospitals representing 88% of registered community and federal government hospitals in the United States. Healthcare-associated infection data are only useful for comparing hospitals if they are available for multiple hospitals within a geographic region. We found that data availability differed by HAI. Clostridium difficile infection (CDI) data were most available, with 82% of geographic regions (ie, hospital referral regions) having >50% of hospitals reporting them. In contrast, surgical site infections (SSI) for hysterectomies had the lowest availability, with only 4% of geographic regions having >50% of member hospitals reporting them. The ability of HAI data to differentiate hospitals also differed by HAI: 72% of hospital referral regions had at least 1 pair of hospitals with statistically different risk-adjusted CDI standardized infection ratios (SIRs), compared with 9% for SSI (hysterectomy).
HAI data generally are reported by enough hospitals to meet minimal criteria for useful comparisons in many geographic locations, though this varies by type of HAI. CDI and catheter-associated urinary tract infection (CAUTI) are more likely to differentiate hospitals than the other publicly reported HAIs.
Despite the ubiquity of tunnel channels and valleys within formerly glaciated areas, their origin remains enigmatic. Few modern analogues exist for event-related subglacial erosion. This paper presents evidence of subglacial meltwater erosion and tunnel channel formation during the November 1996 jökulhlaup, Skeiðarárjökull, Iceland. The jökulhlaup reached a peak discharge of 45 000 to 50 000 m³ s⁻¹, with flood outbursts emanating from multiple outlets across the entire 23 km wide glacier snout. Subsequent retreat of the southeast margin of Skeiðarárjökull has revealed a tunnel channel excavated into the surrounding moraine sediment and ascending 11.5 m over a distance of 160 m from a larger trough to join the apex of an ice-contact fan formed in November 1996. The tunnel channel formed via hydro-mechanical erosion of 14 000 m³ to 24 000 m³ of unconsolidated glacier substrate, evidenced by copious rip-up clasts within the ice-contact fan. Flow reconstruction provides peak discharge estimates of 680 ± 140 m³ s⁻¹. The tunnel channel orientation, oblique to the local ice flow direction and within a col, suggests that local jökulhlaup routing was controlled by (a) subglacial topography and (b) the presence of a nearby proglacial lake. We describe the first modern example of tunnel channel formation and illustrate the importance of pressurized subglacial jökulhlaup flow in that process.
Cognitive deficits are a core feature of schizophrenia, and impairments in most domains are thought to be stable over the course of the illness. However, cross-sectional evidence indicates that some areas of cognition, such as visuospatial associative memory, may be preserved in the early stages of psychosis, but become impaired in later established illness stages. This longitudinal study investigated change in visuospatial and verbal associative memory following psychosis onset.
In total 95 first-episode psychosis (FEP) patients and 63 healthy controls (HC) were assessed on neuropsychological tests at baseline, with 38 FEP and 22 HCs returning for follow-up assessment at 5–11 years. Visuospatial associative memory was assessed using the Cambridge Neuropsychological Test Automated Battery Visuospatial Paired-Associate Learning task, and verbal associative memory was assessed using Verbal Paired Associates subtest of the Wechsler Memory Scale - Revised.
Visuospatial and verbal associative memory at baseline did not differ significantly between FEP patients and HCs. However, over follow-up, visuospatial associative memory deteriorated significantly for the FEP group, relative to healthy individuals. Conversely, verbal associative memory improved to a similar degree observed in HCs. In the FEP cohort, visuospatial (but not verbal) associative memory ability at baseline was associated with functional outcome at follow-up.
Areas of cognition that develop prior to psychosis onset, such as visuospatial and verbal associative memory, may be preserved early in the illness. Later deterioration in visuospatial memory ability may relate to progressive structural and functional brain abnormalities that occur following psychosis onset.
Currently it is estimated that about 1 billion people globally have non-alcoholic fatty liver disease (NAFLD), a condition in which liver fat exceeds 5 % of liver weight in the absence of significant alcohol intake. Due to the central role of the liver in metabolism, the prevalence of NAFLD is increasing in parallel with the prevalence of obesity, insulin resistance and other risk factors of metabolic diseases. However, the contribution of liver fat to the risk of type 2 diabetes mellitus and CVD, relative to other ectopic fat depots and to other risk markers, is unclear. Various studies have suggested that the accumulation of liver fat can be reduced or prevented via dietary changes. However, the amount of liver fat reduction that would be physiologically relevant, and the timeframes and dose–effect relationships for achieving this through different diet-based approaches, are unclear. Also, it is still uncertain whether the changes in liver fat per se or the associated metabolic changes are relevant. Furthermore, the methods available to measure liver fat, or even individual fatty acids, differ in sensitivity and reliability. The present report summarises key messages of presentations from different experts and related discussions from a workshop intended to capture current views and research gaps relating to the points above.
To review the clinical signs of vocal fold paresis on laryngeal videostroboscopy, to quantify its impact on patients’ quality of life and to confirm the benefit of laryngeal electromyography in its diagnosis.
Twenty-nine vocal fold paresis patients were referred for laryngeal electromyography. Voice Handicap Index 10 results were compared to 43 patients diagnosed with vocal fold paralysis. Laryngeal videostroboscopy analysis was conducted to determine side of paresis.
Blinded laryngeal electromyography confirmed vocal fold paresis in 92.6 per cent of cases, with vocal fold lag being the most common diagnostic sign. The laryngology team accurately predicted side of paresis in 76 per cent of cases. Total Voice Handicap Index 10 responses were not significantly different between vocal fold paralysis and vocal fold paresis groups (26.08 ± 0.21 and 22.93 ± 0.17, respectively).
Vocal fold paresis has a significant impact on quality of life. This study shows that laryngeal electromyography is an important diagnostic tool. Patients with persisting dysphonia and apparently normal vocal fold movement, who fail to respond to appropriate speech therapy, should be investigated for a diagnosis of vocal fold paresis.