Several life-threatening diseases of the kidney have their origins in mutational events that occur during embryonic development. In this study, we investigate the role of the Wolffian duct (WD), the earliest embryonic epithelial progenitor of renal tubules, in the etiology of autosomal dominant polycystic kidney disease (ADPKD). ADPKD is associated with a germline mutation of one of the two Pkd1 alleles. For the disease to occur, a second event that disrupts the expression of the other inherited Pkd1 allele must occur. We postulated that this secondary event can occur in the pronephric WD. Using Cre-Lox recombination, mice with WD-specific deletion of one or both Pkd1 alleles were generated. Homozygous Pkd1-targeted deletion in WD-derived tissues resulted in mice with large cystic kidneys and serologic evidence of renal failure. In contrast, heterozygous deletion of Pkd1 in the WD led to kidneys that were phenotypically indistinguishable from control in the early postnatal period. High-throughput sequencing, however, revealed underlying gene and microRNA (miRNA) changes in these heterozygous mutant kidneys that suggest a strong predisposition toward developing ADPKD. Bioinformatic analysis of this data demonstrated an upregulation of several miRNAs that have been previously associated with PKD; pathway analysis further demonstrated that the differentially expressed genes in the heterozygous mutant kidneys were overrepresented in signaling pathways associated with maintenance and function of the renal tubular epithelium. These results suggest that the WD may be an early epithelial target for the genetic or molecular signals that can lead to cyst formation in ADPKD.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
We used multivariable analyses to assess whether meeting core elements was associated with antibiotic utilization. Compliance with all 7 core elements, compared with noncompliance, was associated with higher use of broad-spectrum agents for community-acquired infections [days of therapy per 1,000 patient days: 155 (39) vs 133 (29); P = .02] and of anti-methicillin-resistant Staphylococcus aureus agents [days of therapy per 1,000 patient days: 145 (37) vs 124 (30); P = .03].
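The utilization metric above, days of therapy (DOT) per 1,000 patient days, is a simple rate. The short sketch below illustrates how such a rate is typically computed; it is illustrative only, and the function name and example numbers are hypothetical rather than taken from the study.

```python
# Illustrative sketch: days of therapy (DOT) per 1,000 patient days,
# the antibiotic-utilization rate reported above. Example numbers are hypothetical.

def dot_per_1000_patient_days(days_of_therapy: float, patient_days: float) -> float:
    """Antibiotic use expressed as days of therapy per 1,000 patient days."""
    return 1000.0 * days_of_therapy / patient_days

# e.g. 4,650 days of broad-spectrum therapy accrued over 30,000 patient days
print(dot_per_1000_patient_days(4650, 30000))  # -> 155.0
```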
Introduction: Although acute gastroenteritis is an extremely common childhood illness, there is a paucity of literature characterizing the associated pain and its management. Our primary objective was to quantify the pain experienced by children with acute gastroenteritis in the 24 hours prior to emergency department (ED) presentation. Secondary objectives included describing maximum pain, analgesic use, discharge recommendations, and factors that influenced analgesic use in the ED. Methods: Study participants were recruited into this prospective cohort study by the Alberta Provincial Pediatric EnTeric Infection TEam between January 2014 and September 2017. The study was conducted at two Canadian pediatric EDs: the Alberta Children's Hospital (Calgary) and the Stollery Children's Hospital (Edmonton). Eligibility criteria included age <18 years, acute gastroenteritis (≥3 episodes of diarrhea or vomiting in the previous 24 hours), and symptom duration <7 days. The primary study outcome, caregiver-reported maximum pain in the 24 hours prior to presentation, was assessed using the 11-point Verbal Numerical Rating Scale. Results: We recruited 2136 patients; median age was 20.8 months (IQR 10.4, 47.4) and 45.8% (979/2136) were female. In the 24 hours prior to enrolment, 28.6% (610/2136) of caregivers reported that their child experienced moderate (4-6) pain and 46.2% (986/2136) reported severe (7-10) pain. During the emergency visit, 31.1% (664/2136) described pain as moderate and 26.7% (571/2136) as severe. In the ED, analgesia was provided to 21.2% (452/2131) of children. The most commonly administered analgesics in the ED were ibuprofen (68.1%, 308/452) and acetaminophen (43.4%, 196/452); at home, acetaminophen was most commonly administered (77.7%, 700/901), followed by ibuprofen (37.5%, 338/901). Factors associated with analgesia use in the ED were greater pain scores during the visit, having a primary-care physician, shorter illness duration, fewer diarrheal episodes, presence of fever, and hospitalization. Conclusion: Although children presenting to the ED with acute gastroenteritis experience moderate to severe pain, both prior to and during their emergency visit, analgesic use is limited. Future research should focus on appropriate pain management through the development of effective and safe pain treatment plans.
We assessed self-reported drives for alcohol use and their impact on the clinical features of patients with alcohol use disorder (AUD). Our prediction was that, in contrast to “affectively” (reward or fear) driven drinking, “habitual” drinking would be associated with worse alcohol-related clinical features and a higher occurrence of associated psychiatric symptoms.
Fifty-eight patients with Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) alcohol abuse were assessed with a comprehensive battery measuring reward- and fear-based behavioral tendencies. An 18-item self-report instrument (the Habit, Reward and Fear Scale; HRFS) was employed to quantify affective (fear or reward) and non-affective (habitual) motivations for alcohol use. To characterize the clinical and demographic measures associated with habit, reward, and fear, we conducted a partial least squares analysis.
Habitual alcohol use was significantly associated with the severity of alcohol dependence, as reflected across a range of domains, and with a lower number of detoxifications across multiple settings. In contrast, reward-driven alcohol use was associated with a single domain of alcohol dependence, with reward-related behavioral tendencies, and with a lower number of detoxifications.
These results seem consistent with a shift from goal-directed to habit-driven alcohol use with the severity and progression of addiction, complementing preclinical work and informing biological models of addiction. Both reward-driven and habit-driven alcohol use were associated with a lower number of detoxifications, perhaps stemming from a more benign course in the reward-driven group and a lack of treatment engagement in the habit-driven group. Future work should further explore the role of habit in this and other addictive disorders, and in obsessive-compulsive related disorders.
Porphyrins absorb light to initiate photocatalytic activity. The complex, asymmetric structures of natural porphyrins such as heme, chlorophyll, and their derivatives hold unique interest. A platform for biosynthesis of porphyrins in Escherichia coli is developed with the aim of producing a variety of porphyrins for examining their photocatalytic properties within a porous material. Bioderived protoporphyrin IX is tethered inside the highly porous metal-organic framework (MOF) NU-1000 via solvent-assisted ligand incorporation. This MOF catalyzes the photocatalytic oxidation of 2-chloroethyl ethyl sulfide with improved performance over an expanded range of the visible spectrum when compared to unmodified NU-1000.
The Pain Catastrophizing Scale (PCS) measures three aspects of catastrophic cognitions about pain—rumination, magnification, and helplessness. To facilitate assessment and clinical application, we aimed to (a) develop a short version on the basis of its factorial structure and the items’ correlations with key pain-related outcomes, and (b) identify the threshold on the short form indicative of risk for depression.
Social centers for older people.
664 Chinese older adults with chronic pain.
Besides the PCS, pain intensity, pain disability, and depressive symptoms were assessed.
For the full scale, confirmatory factor analysis showed that the hypothesized 3-factor model fit the data moderately well. On the basis of the factor loadings, two items were selected from each of the three dimensions. An additional item significantly associated with pain disability and depressive symptoms, over and above these six items, was identified through regression analyses. A short-PCS composed of seven items was formed, which correlated at r=0.97 with the full scale. Subsequently, receiver operating characteristic (ROC) curves were plotted against clinically significant depressive symptoms, defined as a score of ≥12 on a 10-item version of the Center for Epidemiologic Studies-Depression Scale. This analysis showed a score of ≥7 to be the optimal cutoff for the short-PCS, with sensitivity = 81.6% and specificity = 78.3% when predicting clinically significant depressive symptoms.
The short-PCS may be used in lieu of the full scale and as a brief screen to identify individuals with serious catastrophizing.
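As a rough illustration of the ROC-based cutoff selection described above, the sketch below picks the threshold that maximizes Youden's J (sensitivity + specificity − 1) on simulated data. The abstract does not state the exact criterion the authors used, and none of the values below are study data.

```python
# Illustrative only: choosing a screening cutoff from an ROC curve by maximizing
# Youden's J. Simulated scores stand in for short-PCS totals; not study data.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
n = 600
depressed = rng.integers(0, 2, size=n)                # 1 = clinically significant depressive symptoms
short_pcs = rng.normal(5 + 3 * depressed, 3, size=n)  # hypothetical 7-item short-PCS scores

fpr, tpr, thresholds = roc_curve(depressed, short_pcs)
j = tpr - fpr                                          # Youden's J at each candidate threshold
best = int(np.argmax(j))
print(f"cutoff >= {thresholds[best]:.1f}: "
      f"sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%}")
```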
A comprehensive analysis of early dinosaur relationships raised the possibility that the group may have originated in Laurasia (Northern Hemisphere), rather than Gondwana (Southern Hemisphere) as often thought. However, that study focused solely on morphology and phylogenetic relationships and did not quantitatively evaluate this issue. Here, we investigate dinosaur origins using a novel Bayesian framework uniting tip-dated phylogenetics with dynamic, time-sliced biogeographic methods, which explicitly account for the age and locality of fossils and the changing interconnections of areas through time due to tectonic and eustatic change. Our analysis finds strong support for a Gondwanan origin of Dinosauria, with 99% probability for South America (83% for southern South America). Parsimony analysis gives concordant results. Inclusion of time-sliced biogeographic information affects ancestral state reconstructions (e.g., high connectivity between two regions increases uncertainty over which is the ancestral area) and influences tree topology (disfavouring uniting fossil taxa from localities that were widely separated during the relevant time slice). Our approach directly integrates plate tectonics with phylogenetics and divergence dating, and in doing so reaffirms southern South America as the most likely area for the geographic origin of Dinosauria.
To assess variability in antimicrobial use and associations with infection testing in pediatric ventilator-associated events (VAEs).
Descriptive retrospective cohort with nested case-control study.
Pediatric intensive care units (PICUs), cardiac intensive care units (CICUs), and neonatal intensive care units (NICUs) in 6 US hospitals.
Children ≤18 years ventilated for ≥1 calendar day.
We identified patients with pediatric ventilator-associated conditions (VACs), pediatric VACs with antimicrobial use for ≥4 days (AVACs), and possible ventilator-associated pneumonia (PVAP, defined as pediatric AVAC with a positive respiratory diagnostic test) according to previously proposed criteria.
Among 9,025 ventilated children, we identified 192 VAC cases: 43 in CICUs, 70 in PICUs, and 79 in NICUs. AVAC criteria were met in 79 VAC cases (41%) (58% CICU; 51% PICU; and 23% NICU), and varied by hospital (CICU, 20–67%; PICU, 0–70%; and NICU, 0–43%). Type and duration of AVAC antimicrobials varied by ICU type. AVAC cases in CICUs and PICUs received broad-spectrum antimicrobials more often than those in NICUs. Among AVAC cases, 39% had respiratory infection diagnostic testing performed; PVAP was identified in 15 VAC cases. Also, among AVAC cases, 73% had no associated positive respiratory or nonrespiratory diagnostic test.
Antimicrobial use is common in pediatric VAC, with variability in spectrum and duration of antimicrobials within hospitals and across ICU types, while PVAP is uncommon. Prolonged antimicrobial use despite low rates of PVAP or positive laboratory testing for infection suggests that AVAC may provide a lever for antimicrobial stewardship programs to improve utilization.
Chlamydia trachomatis (CT) infections remain highly prevalent. CT reinfection occurs frequently within months after treatment, likely contributing to the sustained high prevalence of CT infection. Sparse studies have suggested that CT reinfection is associated with a lower organism load, but it is unclear whether CT load at the time of treatment influences CT reinfection risk. In this study, women presenting for treatment of a positive CT screening test were enrolled, treated and returned for 3- and 6-month follow-up visits. CT organism loads were quantified at each visit. We evaluated whether the CT bacterial load at initial infection was associated with reinfection risk and investigated factors influencing the CT load at baseline and follow-up in those with CT reinfection. We found no association of initial CT load with reinfection risk. We found a significant decrease in the median log10 CT load from baseline to follow-up in those with reinfection (5.6 vs. 4.5 log10 CT/ml; P = 0.015). When reinfected subjects were stratified by the presence or absence of a history of CT infections prior to their infection at the baseline visit, a significant decline in CT load from baseline to follow-up (5.7 vs. 4.3 log10 CT/ml; P = 0.021) was found exclusively in patients with a history of CT infections prior to our study. Our findings suggest that repeated CT infections may lead to the development of partial immunity against CT.
Red supergiant stars (RSGs) are important probes of stellar and chemical evolution in star-forming environments. They represent the brightest near-IR stellar components of external galaxies and probe the most recent stellar population, providing robust, independent abundance estimates. The Local Group dwarf irregular galaxy NGC 6822 is a reasonably isolated galaxy with an interesting structure and turbulent history. Using RSGs as chemical abundance probes, we estimate metallicities in the central region of NGC 6822 and find a suggestion of a metallicity gradient (in broad agreement with nebular tracers); however, this requires further study for confirmation. With intermediate-resolution multi-object spectroscopy (from e.g. KMOS, EMIR, MOSFIRE) combined with state-of-the-art stellar model atmospheres, we demonstrate how RSGs can be used to estimate stellar abundances in external galaxies. In this context, we compare stellar and nebular abundance tracers in NGC 6822 and, by combining them, estimate an abundance gradient of −0.18 ± 0.05 dex/kpc.
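For readers unfamiliar with the units, an abundance gradient such as the −0.18 dex/kpc quoted above is simply the slope of a straight-line fit of metallicity against galactocentric radius. The sketch below shows that calculation on simulated points; the radii and metallicities are invented, not NGC 6822 measurements.

```python
# Illustrative only: an abundance gradient is the slope of metallicity vs. radius.
# The points below are simulated, not NGC 6822 data.
import numpy as np

rng = np.random.default_rng(1)
radius_kpc = np.array([0.2, 0.5, 0.8, 1.2, 1.6, 2.0])          # hypothetical tracer positions
metallicity_dex = -0.5 - 0.18 * radius_kpc + rng.normal(0, 0.05, radius_kpc.size)

slope, intercept = np.polyfit(radius_kpc, metallicity_dex, 1)  # least-squares straight line
print(f"abundance gradient ~ {slope:+.2f} dex/kpc")
```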
BACKGROUND: Meningiomas are the most common primary benign brain tumors in adults. Given the extended life expectancy of most patients with meningiomas, consideration of quality of life (QOL) is important when selecting the optimal management strategy. There is currently a dearth of meningioma-specific QOL tools in the literature. OBJECTIVE: In this systematic review, we analyze the prevailing themes in the literature and propose steps toward building a meningioma-specific QOL assessment tool. METHODS: A systematic search was conducted, and only original studies based on adult patients were considered. The QOL tools used in the various studies were analyzed to identify prevailing themes in the qualitative analysis. The quality of the studies was also assessed. RESULTS: Sixteen articles met all inclusion criteria. Fifteen different QOL assessment tools were used, assessing social and physical functioning and psychological and emotional well-being. Patient perceptions and support networks had a major impact on QOL scores. Surgery negatively affected social functioning in younger patients, while radiation therapy had a variable impact. Any intervention appeared to have a greater negative impact on physical functioning compared with observation. CONCLUSION: Younger patients with meningiomas appear to be more vulnerable within the social and physical functioning domains. All of these findings must be interpreted with great caution owing to substantial clinical heterogeneity, limited generalizability, and risk of bias. For meningioma patients, the ideal QOL questionnaire would yield outcomes that can be easily measured, presented, and compared across studies. Existing scales can be the foundation upon which a comprehensive, standard, and simple meningioma-specific survey can be prospectively developed and validated.
Brain tumour behaviour is driven by aberrations in the genome and epigenome. Many of these changes, such as IDH mutations in diffuse low-grade glioma (DLGG), are common amongst tumours of the same class and can be incorporated into the diagnostic criteria. However, any given tumour may have other, less common genomic aberrations that are essential for its biological behaviour and may inform on underlying aberrant cellular pathways and potential therapeutic agents. Precision oncology is a genomics-based approach that profiles these alterations to better manage cancer patients; it has established itself within the practice of oncology and is slowly making its way into neuro-oncology. BC Cancer's Personalized OncoGenomics (POG) program has, for the first time, profiled 16 adult tumours originating from the central nervous system using whole genome and transcriptome analysis (WGTA) within a meaningful clinical timeframe and setting. As expected, primary genomic drivers were consistent with their respective diagnoses, though secondary drivers were found to be unique to each tumour. Although these analyses did not result in altered clinical management for these patients, primarily because of limited availability of drugs or clinical trials, they highlight the heterogeneity of secondary drivers in cancers and provide clinicians with meaningful biological information. Lastly, the data generated by POG have highlighted the frequency and complexity of novel driver fusions, which are predicted to behave similarly to canonical driver events in their respective tumours. The information available to clinicians through POG has provided valuable insight into the biology of each unique tumour.
Giardia duodenalis is the most common intestinal parasite of humans in the USA, but the risk factors for sporadic (non-outbreak) giardiasis are not well described. The Centers for Disease Control and Prevention and the Colorado and Minnesota public health departments conducted a case-control study to assess risk factors for sporadic giardiasis in the USA. Cases (N = 199) were patients with non-outbreak-associated laboratory-confirmed Giardia infection in Colorado and Minnesota, and controls (N = 381) were matched by age and site. Identified risk factors included international travel (aOR = 13.9; 95% CI 4.9–39.8), drinking water from a river, lake, stream, or spring (aOR = 6.5; 95% CI 2.0–20.6), swimming in a natural body of water (aOR = 3.3; 95% CI 1.5–7.0), male–male sexual behaviour (aOR = 45.7; 95% CI 5.8–362.0), having contact with children in diapers (aOR = 1.6; 95% CI 1.01–2.6), taking antibiotics (aOR = 2.5; 95% CI 1.2–5.0) and having a chronic gastrointestinal condition (aOR = 1.8; 95% CI 1.1–3.0). Eating raw produce was inversely associated with infection (aOR = 0.2; 95% CI 0.1–0.7). Our results highlight the diversity of risk factors for sporadic giardiasis and the importance of non-international-travel-associated risk factors, particularly those involving person-to-person transmission. Prevention measures should focus on reducing risks associated with diaper handling, sexual contact, swimming in untreated water, and drinking untreated water.
Introduction: Survival from cardiac arrest has been linked to the quality of resuscitation care. Unfortunately, healthcare providers frequently underperform in these critical scenarios, with a well-documented deterioration in skills weeks to months following advanced life support courses. Improving initial training and preventing decay in knowledge and skills are priorities in resuscitation education. The spacing effect has repeatedly been shown to have an impact on learning and retention. Despite its potential advantages, the spacing effect has seldom been applied to organized education and training or to complex motor skill learning, where it has the potential to make a significant impact. The purpose of this study was to determine whether a resuscitation course taught in a spaced format, compared to the usual massed instruction, results in improved retention of procedural skills. Methods: EMS providers (paramedics and emergency medical technicians (EMTs)) were block randomized to receive a Pediatric Advanced Life Support (PALS) course in either a spaced format (four 210-minute weekly sessions) or a massed format (two sequential 7-hour days). Blinded observers used expert-developed 4-point global rating scales to assess video recordings of each learner performing various resuscitation skills before, after and 3 months following course completion. Primary outcomes were performance on infant bag-valve-mask ventilation (BVMV), intraosseous (IO) insertion, infant intubation, and infant and adult chest compressions. Results: Forty-eight of 50 participants completed the study protocol (26 spaced and 22 massed). There was no significant difference between the two groups on testing before and immediately after the course. Three months following course completion, participants in the spaced cohort scored higher overall for BVMV (2.2 ± 0.13 versus 1.8 ± 0.14, p=0.012), without statistically significant differences in scores for IO insertion (3.0 ± 0.13 versus 2.7 ± 0.13, p=0.052), intubation (2.7 ± 0.13 versus 2.5 ± 0.14, p=0.249), infant compressions (2.5 ± 0.28 versus 2.5 ± 0.31, p=0.831) or adult compressions (2.3 ± 0.24 versus 2.2 ± 0.26, p=0.728). Conclusion: Procedural skills taught in a spaced format result in learning at least as good as that achieved with the traditional massed format; more complex skills taught in a spaced format may result in better long-term retention than traditional massed training, given the clear difference in BVMV and the trend toward a difference in IO insertion.
Introduction: Pediatric musculoskeletal (MSK) image interpretation has been identified as a knowledge gap among emergency medicine trainees. The main objective of this study was to implement a validated online pediatric MSK radiograph interpretation system with a performance-based competency endpoint into pediatric emergency fellowship programs and to examine the number of cases needed to achieve a competency threshold of 80% accuracy, sensitivity and specificity. We further determined the proportion of participants who successfully achieved competency in a given module and the change in accuracy from baseline to competency. Methods: This was a prospective cohort multi-centre study. There were seven MSK radiograph modules, each containing 200-400 cases (demo: https://imagesim.com/course-information/demo/). Thirty-seven pediatric emergency medicine fellows participated for 12 months. Participants completed cases until they reached competency, defined as at least 80% accuracy, sensitivity and specificity. We calculated the overall and per-module median number of cases required to achieve competency, the proportion of participants who achieved competency, the median time per case, and the mean change in accuracy from baseline to competency. Results: Overall, the median number of cases required to achieve competency was 76 (min 54, max 756). There was a significant difference between body parts in the median number of cases needed to achieve competency, p<0.0001, with ankle and knee being among the most challenging modules. The proportion of participants who started a module and completed it to competency varied significantly, ranging from 32.4% for the ankle module to 97.1% for the forearm/hand module, p<0.0001. The overall median time per case was 34.1 (min 7.6, max 89.5) seconds. The overall change in accuracy from baseline to 80% competency was 13.5% (95% CI 12.1, 14.8), with a corresponding Cohen's effect size of 1.98. The change in accuracy differed between modules, p=0.001, with post-hoc analyses demonstrating that the ankle/foot radiograph module had a greater increase in accuracy relative to the elbow (p=0.009) and pelvis/femur (p=0.006) modules. Conclusion: With the exception of the ankle module, it was feasible for pediatric emergency medicine fellows to complete each pediatric MSK learning module to competency within approximately one hour. Learners who completed the modules to competency demonstrated substantial increases in interpretation skill.
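The competency endpoint above requires at least 80% accuracy, sensitivity and specificity over a learner's cases. The sketch below shows how those three metrics fall out of a simple confusion-matrix count; it is a minimal illustration under stated assumptions, not the ImageSim platform's actual scoring logic, and any rolling-window rules are not described in the abstract.

```python
# Minimal sketch (not the platform's actual logic): accuracy, sensitivity and
# specificity from a learner's interpreted cases, each recorded as
# (radiograph_truly_abnormal, learner_called_abnormal).

def competency_metrics(cases):
    tp = sum(1 for truth, call in cases if truth and call)
    tn = sum(1 for truth, call in cases if not truth and not call)
    fp = sum(1 for truth, call in cases if not truth and call)
    fn = sum(1 for truth, call in cases if truth and not call)
    accuracy = (tp + tn) / len(cases)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0  # abnormal films correctly flagged
    specificity = tn / (tn + fp) if (tn + fp) else 0.0  # normal films correctly cleared
    return accuracy, sensitivity, specificity

def reached_competency(cases, threshold=0.80):
    """True once accuracy, sensitivity and specificity all meet the threshold."""
    return all(metric >= threshold for metric in competency_metrics(cases))
```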
Introduction: Gastroenteritis accounts for 1.7 million emergency department visits by children annually in the United States. We conducted a double-blind trial to determine whether twice-daily probiotic administration for 5 days improves outcomes. Methods: 886 children aged 3-48 months with gastroenteritis were enrolled in six Canadian pediatric emergency departments. Participants were randomly assigned to twice-daily Lactobacillus rhamnosus R0011 and Lactobacillus helveticus R0052 (4.0 × 10^9 CFU, in a 95:5 ratio) or placebo. The primary outcome was development of moderate-severe disease within 14 days of randomization, defined by a Modified Vesikari Scale score ≥9. Secondary outcomes included duration of diarrhea and vomiting, subsequent physician visits and adverse events. Results: Moderate-severe disease occurred in 108 (26.1%) participants administered probiotics and 102 (24.7%) participants allocated to placebo (OR 1.06; 95% CI: 0.77, 1.46; P=0.72). After adjustment for site, age, and frequency of vomiting and diarrhea, treatment assignment did not predict moderate-severe disease (OR 1.11; 95% CI: 0.80, 1.56; P=0.53). In the probiotic versus placebo groups, there were no differences in the median duration of diarrhea [52.5 (18.3, 95.8) vs. 55.5 (20.2, 102.3) hours; P=0.31], vomiting [17.7 (0, 58.6) vs. 18.7 (0, 51.6) hours; P=0.18], physician visits (30.2% vs. 26.6%; OR 1.19; 95% CI: 0.87, 1.62; P=0.27), or adverse events (32.9% vs. 36.8%; OR 0.83; 95% CI: 0.62, 1.11; P=0.21). Conclusion: In children presenting to an emergency department with gastroenteritis, twice-daily administration of 4.0 × 10^9 CFU of a Lactobacillus rhamnosus/helveticus probiotic did not prevent development of moderate-severe disease or improve the other outcomes measured.
Children with CHD and acquired heart disease have unique, high-risk physiology. They may have a higher risk of adverse tracheal-intubation-associated events, as compared with children with non-cardiac disease.
Materials and methods
We sought to evaluate the occurrence of adverse tracheal-intubation-associated events in children with cardiac disease compared to children with non-cardiac disease. A retrospective analysis of tracheal intubations from 38 international paediatric ICUs was performed using the National Emergency Airway Registry for Children (NEAR4KIDS) quality improvement registry. The primary outcome was the occurrence of any tracheal-intubation-associated event. Secondary outcomes included the occurrence of severe tracheal-intubation-associated events, multiple intubation attempts, and oxygen desaturation.
A total of 8851 intubations were reported between July 2012 and March 2016. Cardiac patients were younger, more likely to have haemodynamic instability, and less likely to have respiratory failure as an indication. The overall frequency of tracheal-intubation-associated events was not different (cardiac: 17% versus non-cardiac: 16%, p=0.13), nor was the rate of severe tracheal-intubation-associated events (cardiac: 7% versus non-cardiac: 6%, p=0.11). Tracheal-intubation-associated cardiac arrest occurred more often in cardiac patients (2.80% versus 1.28%; p<0.001), even after adjusting for patient and provider differences (adjusted odds ratio 1.79; p=0.03). Multiple intubation attempts occurred less often in cardiac patients (p=0.04), and oxygen desaturations occurred more often, even after excluding patients with cyanotic heart disease.
The overall incidence of adverse tracheal-intubation-associated events in cardiac patients was not different from that in non-cardiac patients. However, the presence of a cardiac diagnosis was associated with a higher occurrence of both tracheal-intubation-associated cardiac arrest and oxygen desaturation.
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, whose existence is theoretically assured when on-farm resource allocation is perfect. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with the environmental performance of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in the southwest of the United Kingdom, this paper proposes a novel, information-driven approach for carrying out comprehensive assessments of the economic-environmental trade-offs inherent in pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with better animal performance and lower nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for developing scientifically sound and biologically informative metrics of agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.