Several life-threatening diseases of the kidney have their origins in mutational events that occur during embryonic development. In this study, we investigate the role of the Wolffian duct (WD), the earliest embryonic epithelial progenitor of renal tubules, in the etiology of autosomal dominant polycystic kidney disease (ADPKD). ADPKD is associated with a germline mutation of one of the two Pkd1 alleles. For the disease to occur, a second event that disrupts the expression of the other inherited Pkd1 allele must occur. We postulated that this secondary event can occur in the pronephric WD. Using Cre-Lox recombination, mice with WD-specific deletion of one or both Pkd1 alleles were generated. Homozygous Pkd1-targeted deletion in WD-derived tissues resulted in mice with large cystic kidneys and serologic evidence of renal failure. In contrast, heterozygous deletion of Pkd1 in the WD led to kidneys that were phenotypically indistinguishable from controls in the early postnatal period. High-throughput sequencing, however, revealed underlying gene and microRNA (miRNA) changes in these heterozygous mutant kidneys that suggest a strong predisposition toward developing ADPKD. Bioinformatic analysis of these data demonstrated an upregulation of several miRNAs that have been previously associated with PKD; pathway analysis further demonstrated that the differentially expressed genes in the heterozygous mutant kidneys were overrepresented in signaling pathways associated with maintenance and function of the renal tubular epithelium. These results suggest that the WD may be an early epithelial target for the genetic or molecular signals that can lead to cyst formation in ADPKD.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
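The variance decomposition at the heart of such twin analyses can be sketched briefly. The following is an illustration only (the correlations below are invented, not CODATwins estimates), using Falconer's classical formula, which infers heritability from the gap between monozygotic and dizygotic twin correlations:

```python
# Illustrative sketch of the classical twin design (not CODATwins code).
# Falconer's formula: additive genetic variance A = 2*(rMZ - rDZ),
# shared environment C = 2*rDZ - rMZ, unique environment E = 1 - rMZ.

def falconer_ace(r_mz, r_dz):
    """Decompose trait variance from MZ and DZ twin correlations."""
    a2 = 2.0 * (r_mz - r_dz)   # heritability (h^2)
    c2 = 2.0 * r_dz - r_mz     # shared-environment share
    e2 = 1.0 - r_mz            # unique-environment share (incl. error)
    return {"h2": a2, "c2": c2, "e2": e2}

# Invented correlations roughly typical for adult height
est = falconer_ace(r_mz=0.90, r_dz=0.50)
print(est)  # h2 = 0.8, c2 = 0.1, e2 = 0.1 (up to float rounding)
```

Estimates of this kind, computed separately by age band, birth cohort and region, are the sort of quantities behind the statement that heritability of height and BMI changes from infancy to old age.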
Antimicrobial stewardship programs typically use days of therapy to assess antimicrobial use. However, this metric does not account for the antimicrobial spectrum of activity. We applied an antibiotic spectrum index to a population of very-low-birth-weight infants to assess its utility to evaluate the impact of antimicrobial stewardship interventions.
Introduction: Although acute gastroenteritis is an extremely common childhood illness, there is a paucity of literature characterizing the associated pain and its management. Our primary objective was to quantify the pain experienced by children with acute gastroenteritis in the 24 hours prior to emergency department (ED) presentation. Secondary objectives included describing maximum pain, analgesic use, discharge recommendations, and factors that influenced analgesic use in the ED. Methods: Study participants were recruited into this prospective cohort study by the Alberta Provincial Pediatric EnTeric Infection TEam between January 2014 and September 2017. This study was conducted at two Canadian pediatric EDs: the Alberta Children's Hospital (Calgary) and the Stollery Children's Hospital (Edmonton). Eligibility criteria included age <18 years, acute gastroenteritis (≥3 episodes of diarrhea or vomiting in the previous 24 hours), and symptom duration <7 days. The primary study outcome, caregiver-reported maximum pain in the 24 hours prior to presentation, was assessed using the 11-point Verbal Numerical Rating Scale. Results: We recruited 2136 patients, median age 20.8 months (IQR 10.4, 47.4); 45.8% (979/2136) female. In the 24 hours prior to enrolment, 28.6% (610/2136) of caregivers reported that their child experienced moderate (4-6) pain and 46.2% (986/2136) severe (7-10) pain. During the emergency visit, 31.1% (664/2136) described pain as moderate and 26.7% (571/2136) as severe. In the ED, analgesia was provided to 21.2% (452/2131) of children. The most commonly administered analgesics in the ED were ibuprofen (68.1%, 308/452) and acetaminophen (43.4%, 196/452); at home, acetaminophen was most commonly administered (77.7%, 700/901), followed by ibuprofen (37.5%, 338/901).
Factors associated with analgesia use in the ED were greater pain scores during the visit, having a primary-care physician, shorter illness duration, fewer diarrheal episodes, presence of fever and hospitalization. Conclusion: Although children presenting to the ED with acute gastroenteritis experience moderate to severe pain, both prior to and during their emergency visit, analgesic use is limited. Future research should focus on appropriate pain management through the development of effective and safe pain treatment plans.
We assessed self-reported drives for alcohol use and their impact on clinical features of alcohol use disorder (AUD) patients. Our prediction was that, in contrast to “affectively” (reward or fear) driven drinking, “habitual” drinking would be associated with worse clinical features in relation to alcohol use and higher occurrence of associated psychiatric symptoms.
Fifty-eight Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) alcohol abuse patients were assessed with a comprehensive battery of reward- and fear-based behavioral tendencies. An 18-item self-report instrument (the Habit, Reward and Fear Scale; HRFS) was employed to quantify affective (fear or reward) and non-affective (habitual) motivations for alcohol use. To characterize clinical and demographic measures associated with habit, reward, and fear, we conducted a partial least squares analysis.
Habitual alcohol use was significantly associated with the severity of alcohol dependence reflected across a range of domains and with a lower number of detoxifications across multiple settings. In contrast, reward-driven alcohol use was associated with a single domain of alcohol dependence, reward-related behavioral tendencies, and a lower number of detoxifications.
These results seem to be consistent with a shift from goal-directed to habit-driven alcohol use with the severity and progression of addiction, complementing preclinical work and informing biological models of addiction. Both reward-related and habit-driven alcohol use were associated with a lower number of detoxifications, perhaps stemming from a more benign course in the reward-related group and a lack of treatment engagement in the habit-related group. Future work should further explore the role of habit in this and other addictive disorders, and in obsessive-compulsive related disorders.
Reciprocal space mapping can be efficiently carried out using a position-sensitive x-ray detector (PSD) coupled to a traditional double-axis diffractometer. The PSD offers parallel measurement of the total scattering angle of all diffracted x-rays during a single rocking-curve scan. As a result, a two-dimensional reciprocal space map can be made in a very short time similar to that of a one-dimensional rocking-curve scan. Fast, efficient reciprocal space mapping offers numerous routine advantages to the x-ray diffraction analyst. Some of these advantages are the explicit differentiation of lattice strain from crystal orientation effects in strain-relaxed heteroepitaxial layers; the nondestructive characterization of the size, shape and orientation of nanocrystalline domains in ordered-alloy epilayers; and the ability to measure the average size and shape of voids in porous epilayers. Here, the PSD-based diffractometer is described, and specific examples clearly illustrating the advantages of complete reciprocal space analysis are presented.
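The conversion behind such maps can be sketched as follows. This is a generic double-axis expression, not code from the paper; the Cu Kα1 wavelength and the sign convention are assumptions:

```python
import math

# Sketch: map a rocking angle omega and the PSD-measured total scattering
# angle 2theta to reciprocal-space coordinates (generic double-axis geometry;
# wavelength and sign convention are assumptions, not taken from the paper).
WAVELENGTH = 1.5406  # angstroms, Cu K-alpha1

def angles_to_q(omega_deg, two_theta_deg, wavelength=WAVELENGTH):
    """Return (q_parallel, q_perpendicular) in inverse angstroms."""
    omega = math.radians(omega_deg)
    two_theta = math.radians(two_theta_deg)
    k = 2.0 * math.pi / wavelength
    q_par = k * (math.cos(omega) - math.cos(two_theta - omega))
    q_perp = k * (math.sin(omega) + math.sin(two_theta - omega))
    return q_par, q_perp

# For a symmetric reflection (omega = 2theta/2) the in-plane component vanishes
q_par, q_perp = angles_to_q(17.0, 34.0)
print(q_par, q_perp)
```

Scanning omega while the PSD records all 2theta channels in parallel fills a two-dimensional grid of such (q_par, q_perp) points in a single rocking-curve scan, which is what makes the mapping fast.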
Porphyrins absorb light to initiate photocatalytic activity. The complex, asymmetric structures of natural porphyrins such as heme, chlorophyll, and their derivatives hold unique interest. A platform for biosynthesis of porphyrins in Escherichia coli is developed with the aim of producing a variety of porphyrins for examining their photocatalytic properties within a porous material. Bioderived protoporphyrin IX is tethered inside the highly porous metal-organic framework (MOF) NU-1000 via solvent-assisted ligand incorporation. This MOF catalyzes the photocatalytic oxidation of 2-chloroethyl ethyl sulfide with improved performance over an expanded range of the visible spectrum when compared to unmodified NU-1000.
The Pain Catastrophizing Scale (PCS) measures three aspects of catastrophic cognitions about pain—rumination, magnification, and helplessness. To facilitate assessment and clinical application, we aimed to (a) develop a short version on the basis of its factorial structure and the items’ correlations with key pain-related outcomes, and (b) identify the threshold on the short form indicative of risk for depression.
Setting: Social centers for older people.
Participants: 664 Chinese older adults with chronic pain.
Measures: Besides the PCS, pain intensity, pain disability, and depressive symptoms were assessed.
For the full scale, confirmatory factor analysis showed that the hypothesized 3-factor model fit the data moderately well. On the basis of the factor loadings, two items were selected from each of the three dimensions. An additional item significantly associated with pain disability and depressive symptoms, over and above these six items, was identified through regression analyses. A short-PCS composed of seven items was formed, which correlated at r=0.97 with the full scale. Subsequently, receiver operating characteristic (ROC) curves were plotted against clinically significant depressive symptoms, defined as a score of ≥12 on a 10-item version of the Center for Epidemiologic Studies-Depression Scale. This analysis showed a score of ≥7 to be the optimal cutoff for the short-PCS, with sensitivity = 81.6% and specificity = 78.3% when predicting clinically significant depressive symptoms.
The short-PCS may be used in lieu of the full scale and as a brief screen to identify individuals with serious catastrophizing.
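The threshold-finding step reported above can be illustrated with a short sketch. This is not the study's code and the toy scores below are invented; it shows one common approach (maximizing Youden's J) to choosing a cutoff that balances sensitivity and specificity against a binary criterion such as clinically significant depressive symptoms:

```python
# Hypothetical sketch: pick the questionnaire cutoff maximizing Youden's J
# (sensitivity + specificity - 1). Toy data, not the study sample.

def best_cutoff(scores, outcome):
    """scores: questionnaire totals; outcome: 1 if the criterion
    (e.g. CES-D-10 >= 12) is met, else 0."""
    best = None
    for c in sorted(set(scores)):
        tp = sum(1 for s, d in zip(scores, outcome) if s >= c and d)
        fn = sum(1 for s, d in zip(scores, outcome) if s < c and d)
        tn = sum(1 for s, d in zip(scores, outcome) if s < c and not d)
        fp = sum(1 for s, d in zip(scores, outcome) if s >= c and not d)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1.0
        if best is None or j > best[1]:
            best = (c, j, sens, spec)
    return best

scores = [2, 3, 5, 6, 7, 8, 9, 10, 11, 12]
outcome = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
cutoff, j, sens, spec = best_cutoff(scores, outcome)
print(cutoff, sens, spec)  # cutoff 7 with sens 1.0, spec 0.8 on this toy data
```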
A comprehensive analysis of early dinosaur relationships raised the possibility that the group may have originated in Laurasia (Northern Hemisphere), rather than Gondwana (Southern Hemisphere) as often thought. However, that study focused solely on morphology and phylogenetic relationships and did not quantitatively evaluate this issue. Here, we investigate dinosaur origins using a novel Bayesian framework uniting tip-dated phylogenetics with dynamic, time-sliced biogeographic methods, which explicitly account for the age and locality of fossils and the changing interconnections of areas through time due to tectonic and eustatic change. Our analysis finds strong support for a Gondwanan origin of Dinosauria, with 99% probability for South America (83% for southern South America). Parsimony analysis gives concordant results. Inclusion of time-sliced biogeographic information affects ancestral state reconstructions (e.g., high connectivity between two regions increases uncertainty over which is the ancestral area) and influences tree topology (disfavouring uniting fossil taxa from localities that were widely separated during the relevant time slice). Our approach directly integrates plate tectonics with phylogenetics and divergence dating, and in doing so reaffirms southern South America as the most likely area for the geographic origin of Dinosauria.
Cognitive dysfunction is a symptomatic domain identified across many mental disorders. Cognitive deficits in individuals with major depressive disorder (MDD) contribute significantly to occupational and functional disability. Notably, cognitive subdomains such as learning and memory, executive functioning, processing speed, and attention and concentration are significantly impaired during, and between, episodes in individuals with MDD. Most antidepressants have not been developed and/or evaluated for their ability to directly and independently ameliorate cognitive deficits. Multiple interacting neurobiological mechanisms (eg, neuroinflammation) are implicated as subserving cognitive deficits in MDD. A testable hypothesis, with preliminary support, posits that improving performance across cognitive domains in individuals with MDD may improve psychosocial function, workplace function, quality of life, and other patient-reported outcomes, independent of effects on core mood symptoms. Herein we aim to (1) provide a rationale for prioritizing cognitive deficits as a therapeutic target, (2) briefly discuss the neurobiological substrates subserving cognitive dysfunction, and (3) provide an update on current and future treatment avenues.
To assess variability in antimicrobial use and associations with infection testing in pediatric ventilator-associated events (VAEs).
Descriptive retrospective cohort with nested case-control study.
Pediatric intensive care units (PICUs), cardiac intensive care units (CICUs), and neonatal intensive care units (NICUs) in 6 US hospitals.
Children ≤18 years ventilated for ≥1 calendar day.
We identified patients with pediatric ventilator-associated conditions (VACs), pediatric VACs with antimicrobial use for ≥4 days (AVACs), and possible ventilator-associated pneumonia (PVAP, defined as pediatric AVAC with a positive respiratory diagnostic test) according to previously proposed criteria.
Among 9,025 ventilated children, we identified 192 VAC cases, 43 in CICUs, 70 in PICUs, and 79 in NICUs. AVAC criteria were met in 79 VAC cases (41%) (58% CICU; 51% PICU; and 23% NICU), and varied by hospital (CICU, 20–67%; PICU, 0–70%; and NICU, 0–43%). Type and duration of AVAC antimicrobials varied by ICU type. AVAC cases in CICUs and PICUs received broad-spectrum antimicrobials more often than those in NICUs. Among AVAC cases, 39% had respiratory infection diagnostic testing performed; PVAP was identified in 15 VAC cases. Also, among AVAC cases, 73% had no associated positive respiratory or nonrespiratory diagnostic test.
Antimicrobial use is common in pediatric VAC, with variability in spectrum and duration of antimicrobials within hospitals and across ICU types, while PVAP is uncommon. Prolonged antimicrobial use despite low rates of PVAP or positive laboratory testing for infection suggests that AVAC may provide a lever for antimicrobial stewardship programs to improve utilization.
We observed pediatric S. aureus hospitalizations decreased 36% from 26.3 to 16.8 infections per 1,000 admissions from 2009 to 2016, with methicillin-resistant S. aureus (MRSA) decreasing by 52% and methicillin-susceptible S. aureus decreasing by 17%, among 39 pediatric hospitals. Similar decreases were observed for days of therapy of anti-MRSA antibiotics.
BACKGROUND: Meningiomas are the most common primary benign brain tumors in adults. Given the extended life expectancy of most patients with meningiomas, consideration of quality of life (QOL) is important when selecting the optimal management strategy. There is currently a dearth of meningioma-specific QOL tools in the literature. OBJECTIVE: In this systematic review, we analyze the prevailing themes and propose steps toward building a meningioma-specific QOL assessment tool. METHODS: A systematic search was conducted, and only original studies based on adult patients were considered. QOL tools used in the various studies were analyzed for identification of prevailing themes in the qualitative analysis. The quality of the studies was also assessed. RESULTS: Sixteen articles met all inclusion criteria. Fifteen different QOL assessment tools assessed social functioning, physical functioning, and psychological and emotional well-being. Patient perceptions and support networks had a major impact on QOL scores. Surgery negatively affected social functioning in younger patients, while radiation therapy had a variable impact. Any intervention appeared to have a greater negative impact on physical functioning compared to observation. CONCLUSION: Younger patients with meningiomas appear to be more vulnerable within social and physical functioning domains. All of these findings must be interpreted with great caution due to great clinical heterogeneity, limited generalizability, and risk of bias. For meningioma patients, the ideal QOL questionnaire would present outcomes that can be easily measured, presented, and compared across studies. Existing scales can be the foundation upon which a comprehensive, standard, and simple meningioma-specific survey can be prospectively developed and validated.
Brain tumor behavior is driven by aberrations in the genome and epigenome. Many of these changes, such as IDH mutations in diffuse low-grade glioma (DLGG), are common amongst the same class of tumour and can be incorporated into the diagnostic criteria. However, any given tumor may have other, less common genomic aberrations that are essential for its biological behavior and may point to underlying aberrant cellular pathways and potential therapeutic agents. Precision oncology is a genomics-based approach that profiles these alterations to better manage cancer patients; it has established itself within the practice of oncology and is slowly making its way into neuro-oncology. BC Cancer's Personalized OncoGenomics (POG) program has, for the first time, profiled 16 adult tumours originating in the central nervous system using whole genome and transcriptome analysis (WGTA) within a clinically meaningful timeframe and setting. As expected, primary genomic drivers were consistent with their respective diagnoses, though secondary drivers were found to be unique to each tumour. Although these analyses did not result in altered clinical management for these patients, primarily due to the limited availability of drugs or clinical trials, they highlight the heterogeneity of secondary drivers in cancers and provide clinicians with meaningful biological information. Lastly, the data generated by POG have highlighted the frequency and complexity of novel driver fusions, which are predicted to behave similarly to canonical driver events in their respective tumours. The information available to clinicians through POG has provided invaluable insight into the biology of each unique tumour.
Giardia duodenalis is the most common intestinal parasite of humans in the USA, but the risk factors for sporadic (non-outbreak) giardiasis are not well described. The Centers for Disease Control and Prevention and the Colorado and Minnesota public health departments conducted a case-control study to assess risk factors for sporadic giardiasis in the USA. Cases (N = 199) were patients with non-outbreak-associated laboratory-confirmed Giardia infection in Colorado and Minnesota, and controls (N = 381) were matched by age and site. Identified risk factors included international travel (aOR = 13.9; 95% CI 4.9–39.8), drinking water from a river, lake, stream, or spring (aOR = 6.5; 95% CI 2.0–20.6), swimming in a natural body of water (aOR = 3.3; 95% CI 1.5–7.0), male–male sexual behaviour (aOR = 45.7; 95% CI 5.8–362.0), having contact with children in diapers (aOR = 1.6; 95% CI 1.01–2.6), taking antibiotics (aOR = 2.5; 95% CI 1.2–5.0) and having a chronic gastrointestinal condition (aOR = 1.8; 95% CI 1.1–3.0). Eating raw produce was inversely associated with infection (aOR = 0.2; 95% CI 0.1–0.7). Our results highlight the diversity of risk factors for sporadic giardiasis and the importance of non-international-travel-associated risk factors, particularly those involving person-to-person transmission. Prevention measures should focus on reducing risks associated with diaper handling, sexual contact, swimming in untreated water, and drinking untreated water.
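For readers unfamiliar with the reported measures, an unadjusted odds ratio with a Woolf (log-scale) 95% confidence interval can be computed from a 2×2 table as sketched below. The counts are invented, and the study's aORs came from an adjusted, matched analysis, so this is only an illustration of the arithmetic:

```python
import math

# Sketch: unadjusted odds ratio with Woolf 95% CI from a 2x2 table.
# Counts are invented for illustration; the study reported adjusted ORs
# from a matched case-control analysis.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a: exposed cases, b: unexposed cases,
    c: exposed controls, d: unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical exposure: 40/199 cases vs 20/381 controls exposed
or_, lo, hi = odds_ratio_ci(a=40, b=159, c=20, d=361)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 4.54 2.57 8.02
```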
Introduction: Gastroenteritis accounts for 1.7 million emergency department visits by children annually in the United States. We conducted a double-blind trial to determine whether twice daily probiotic administration for 5 days improves outcomes. Methods: 886 children aged 3-48 months with gastroenteritis were enrolled in six Canadian pediatric emergency departments. Participants were randomly assigned to twice daily Lactobacillus rhamnosus R0011 and Lactobacillus helveticus R0052 in a 95:5 ratio, 4.0 × 10⁹ CFU, or placebo. The primary outcome was development of moderate-severe disease within 14 days of randomization, defined by a Modified Vesikari Scale score ≥9. Secondary outcomes included duration of diarrhea and vomiting, subsequent physician visits, and adverse events. Results: Moderate-severe disease occurred in 108 (26.1%) participants administered probiotics and 102 (24.7%) participants allocated to placebo (OR 1.06; 95% CI 0.77, 1.46; P=0.72). After adjustment for site, age, and frequency of vomiting and diarrhea, treatment assignment did not predict moderate-severe disease (OR 1.11; 95% CI 0.80, 1.56; P=0.53). In the probiotic versus placebo groups, there were no differences in the median duration of diarrhea [52.5 (18.3, 95.8) vs. 55.5 (20.2, 102.3) hours; P=0.31], vomiting [17.7 (0, 58.6) vs. 18.7 (0, 51.6) hours; P=0.18], physician visits (30.2% vs. 26.6%; OR 1.19; 95% CI 0.87, 1.62; P=0.27), or adverse events (32.9% vs. 36.8%; OR 0.83; 95% CI 0.62, 1.11; P=0.21). Conclusion: In children presenting to an emergency department with gastroenteritis, twice daily administration of 4.0 × 10⁹ CFU of a Lactobacillus rhamnosus/helveticus probiotic does not prevent development of moderate-severe disease, nor does it improve the other outcomes measured.
Introduction: Ideal management of alcohol withdrawal syndrome (AWS) incorporates a symptom-driven approach, whereby patients are regularly assessed using a standardized scoring system (Clinical Institute Withdrawal Assessment for Alcohol-Revised; CIWA-Ar) and treated according to severity. Accurate administration of the CIWA-Ar requires experience, yet there is no training program to teach this competency. The objective of this study was to develop and evaluate a web-based curriculum to teach clinicians how to accurately assess and treat AWS. Methods: This was a three-phase educational program consisting of a series of 3 e-learning modules of core competency material, an in-person seminar to orient learners to high-fidelity simulation, and a summative evaluation in an objective structured clinical examination (OSCE) setting using a standardized patient. To determine the ED impact of the AWS curriculum, we recorded how often the CIWA-Ar was appropriately applied in the ED pre and post training. ED length of stay, total dose of benzodiazepines administered in the ED, and number of prescriptions and unit benzodiazepine doses given upon discharge were also recorded. Results: 74 nurses from an academic ED completed the AWS curriculum. There were 130 and 126 patients in the pre and post AWS training periods, respectively. Management of AWS was not compliant with the CIWA-Ar protocol in 78 (60.0%) and 46 (36.5%) patients pre and post AWS training, respectively (absolute difference 23.5%; 95% CI: 11.3%, 34.7%), resulting in administration of benzodiazepines when they were not required, or failure to give benzodiazepines with a CIWA-Ar score ≥10. There was an average of 4 CIWA-Ar scores per patient in both the pre and post implementation periods. Prior to AWS training, 144/560 (25.5%) CIWA-Ar scores resulted in a breach of protocol, compared to 64/547 (11.7%) following AWS training (absolute difference 13.8%; 95% CI: 9.3%, 18.3%).
Median total dose of benzodiazepines administered in the ED was lower after the implementation of the AWS curriculum (40 mg vs 30 mg; difference 10 mg; 95% CI: 0 mg, 20 mg). ED length of stay and the amount of benzodiazepines given to patients at discharge were similar between groups. Conclusion: This AWS curriculum appears to be an effective way to train ED clinicians on the proper administration of the CIWA-Ar protocol, and results in improved patient care.
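The compliance improvement reported above (78/130 non-compliant pre-training vs 46/126 post-training) can be reproduced, approximately, as a difference in proportions. The Wald interval below is an assumed choice of method and yields a CI close to, though not identical with, the one reported:

```python
import math

# Difference in proportions with a Wald 95% CI, using the counts given in
# the abstract (78/130 non-compliant pre-training vs 46/126 post-training).
# The Wald interval is an assumption; the authors may have used another method.

def prop_diff_ci(x1, n1, x2, n2, z=1.96):
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

diff, lo, hi = prop_diff_ci(78, 130, 46, 126)
print(f"{diff:.1%} ({lo:.1%}, {hi:.1%})")  # 23.5% (11.6%, 35.4%)
```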
Introduction: The optimal management of emergency department (ED) patients with alcohol withdrawal syndrome (AWS) includes a symptom-driven approach with scheduled reassessments using a standardized scoring system (Clinical Institute Withdrawal Assessment for Alcohol-Revised; CIWA-Ar) and treatment according to symptom severity. The subjective nature of the CIWA-Ar and the lack of standardized competency-based education related to alcohol withdrawal result in widely variable treatment. The objective of this study was to perform a summative evaluation of clinical staff during an objective structured clinical examination (OSCE) of a simulated patient (SP) with AWS. Methods: The AWS education curriculum was completed by all staff nurses in our ED (mandatory for full-time, optional for part-time staff). It was based on a real clinical scenario depicting moderate alcohol withdrawal and portrayed by a single SP. Prior to the OSCE, participants attended a seminar orienting them to the simulation. Each participant was asked to complete a full assessment of the SP and was graded for completeness on 37 individual components of the history/physical exam, including the 10 domains of the CIWA-Ar. Results: 74 participants completed the educational curriculum over 8 weeks. At least 9/10 domains of the CIWA-Ar assessment were completed by 65 (88%) participants, and 28 (38%) correctly assessed at least 80% of all summative evaluation components. 63 (85%) participants correctly identified the need for treatment of withdrawal symptoms. Only 13 (18%) participant assessments exactly matched the target CIWA-Ar score of 15; however, 61% were within 2 points on the CIWA-Ar scale. In only 4 (5%) instances would a participant have inappropriately rated AWS severity below the treatment threshold. 62/72 (86%) participants rated the SP tremor as 2-4 (intended tremor = 3).
Clinical features most often overlooked were history of other addictions (25 participants, 33%) and history of liver disease (15 participants, 20%). Conclusion: The majority of participants in this OSCE correctly assessed the important elements in the assessment of AWS, and diagnosed the SP as having moderate alcohol withdrawal. Thus our educational intervention resulted in 85% of participants properly identifying the severity of AWS, and developing an appropriate treatment strategy. The impact of this curriculum on actual patient treatment requires further evaluation.
Children with CHD and acquired heart disease have unique, high-risk physiology. They may have a higher risk of adverse tracheal-intubation-associated events, as compared with children with non-cardiac disease.
Materials and methods
We sought to evaluate the occurrence of adverse tracheal-intubation-associated events in children with cardiac disease compared to children with non-cardiac disease. A retrospective analysis of tracheal intubations from 38 international paediatric ICUs was performed using the National Emergency Airway Registry for Children (NEAR4KIDS) quality improvement registry. The primary outcome was the occurrence of any tracheal-intubation-associated event. Secondary outcomes included the occurrence of severe tracheal-intubation-associated events, multiple intubation attempts, and oxygen desaturation.
A total of 8851 intubations were reported between July 2012 and March 2016. Cardiac patients were younger, more likely to have haemodynamic instability, and less likely to have respiratory failure as an indication. The overall frequency of tracheal-intubation-associated events was not different (cardiac: 17% versus non-cardiac: 16%, p=0.13), nor was the rate of severe tracheal-intubation-associated events (cardiac: 7% versus non-cardiac: 6%, p=0.11). Tracheal-intubation-associated cardiac arrest occurred more often in cardiac patients (2.80% versus 1.28%; p<0.001), even after adjusting for patient and provider differences (adjusted odds ratio 1.79; p=0.03). Multiple intubation attempts occurred less often in cardiac patients (p=0.04), and oxygen desaturations occurred more often, even after excluding patients with cyanotic heart disease.
The overall incidence of adverse tracheal-intubation-associated events in cardiac patients was not different from that in non-cardiac patients. However, the presence of a cardiac diagnosis was associated with a higher occurrence of both tracheal-intubation-associated cardiac arrest and oxygen desaturation.
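Rate comparisons such as "17% versus 16%" are typically tested with a two-proportion z-test, sketched below. The group sizes used here are hypothetical, since the abstract does not report the cardiac/non-cardiac split of the 8851 intubations, so the printed statistics are illustrative only:

```python
import math

# Sketch: two-proportion z-test of event rates in two groups.
# The counts below are hypothetical (the cardiac/non-cardiac split of the
# 8851 intubations is not reported in the abstract).

def two_prop_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_prop_z(340, 2000, 1096, 6851)    # hypothetical 17% vs ~16%
print(round(z, 2), round(p, 3))
```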
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, whose existence is theoretically assured when on-farm resource allocation is perfect. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with the environmental performances of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in southwest United Kingdom, this paper proposes a novel, information-driven approach to carry out comprehensive assessments of economic-environmental trade-offs inherent within pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with better animal performance and lower nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for deriving scientifically sound and biologically informative metrics of agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.