Several life-threatening diseases of the kidney have their origins in mutational events that occur during embryonic development. In this study, we investigate the role of the Wolffian duct (WD), the earliest embryonic epithelial progenitor of renal tubules, in the etiology of autosomal dominant polycystic kidney disease (ADPKD). ADPKD is associated with a germline mutation of one of the two Pkd1 alleles. For the disease to occur, a second event that disrupts expression of the other inherited Pkd1 allele is required. We postulated that this secondary event can occur in the pronephric WD. Using Cre-Lox recombination, mice with WD-specific deletion of one or both Pkd1 alleles were generated. Homozygous Pkd1-targeted deletion in WD-derived tissues resulted in mice with large cystic kidneys and serologic evidence of renal failure. In contrast, heterozygous deletion of Pkd1 in the WD led to kidneys that were phenotypically indistinguishable from control in the early postnatal period. High-throughput sequencing, however, revealed underlying gene and microRNA (miRNA) changes in these heterozygous mutant kidneys that suggest a strong predisposition toward developing ADPKD. Bioinformatic analysis of these data demonstrated an upregulation of several miRNAs that have been previously associated with PKD; pathway analysis further demonstrated that the differentially expressed genes in the heterozygous mutant kidneys were overrepresented in signaling pathways associated with maintenance and function of the renal tubular epithelium. These results suggest that the WD may be an early epithelial target for the genetic or molecular signals that can lead to cyst formation in ADPKD.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Introduction: Many drugs, including cannabis and alcohol, cause impairment and contribute to motor vehicle collisions (MVCs). Policy makers require knowledge of the prevalence of drug use in crash-involved drivers, and the types of drugs used, in order to develop effective prevention programs. This issue is particularly relevant with the recent legalization of cannabis. We aim to study the prevalence of alcohol, cannabis, sedating medications, and other drugs in injured drivers from 4 Canadian provinces. Methods: This prospective cohort study obtained excess clinical blood samples from consecutive injured drivers who attended a participating Canadian trauma centre following an MVC. Blood samples were analyzed using a broad-spectrum toxicology screen capable of detecting cannabinoids, cocaine, amphetamines (including their major analogues), and opioids as well as psychotropic pharmaceuticals (including antihistamines, benzodiazepines, other hypnotics, and sedating antidepressants). Alcohol and cannabinoids were quantified. Health records were reviewed to extract demographic, medical, and MVC information using a standardized data collection tool. Results: This study has been collecting data in 4 trauma centres in British Columbia (BC) since 2011 and was launched in 2 trauma centres in Alberta (AB), 1 in Saskatchewan (SK), and 2 in Ontario (ON) in 2018. In preliminary results from BC (n = 2412), 8% of injured drivers tested positive for THC and 13% for alcohol. Preliminary results from other provinces (n = 301) suggest a regional variation in the prevalence of drivers testing positive for THC (10% - 27%), alcohol (17% - 29%), and other drugs. By May 2018, an estimated 4500 cases from BC, 600 from AB, 150 from SK, and 650 from ON will have been analyzed. We will report the prevalence of positive tests for alcohol, THC, other recreational drugs, and sedating medications, pre- and post-cannabis legalization.
The number of cases with alcohol and/or THC levels above Canadian per se limits will also be reported. Results will be reported according to province, driver sex, age, single- vs. multi-vehicle crashes, and requirement for hospital admission. Conclusion: This will be among the largest international datasets on drug use by injured drivers. Our findings will provide patterns of drug and alcohol impairment in 4 Canadian provinces pre- and post-cannabis legalization. The significance of these findings and implications for impaired driving policy and prevention programs in Canada will be discussed.
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of ±10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLR) with 95% CIs. Results: We prospectively enrolled 7,569 patients (mean age 68.4 ± 14.7 years; 52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%; iLR 0.20 [95% CI 0.091-0.44]), Moderate (probability 1.3%; iLR 0.79 [0.68-0.92]), High (probability 2.6%; iLR 2.2 [1.9-2.6]). Sensitivity analysis for stroke alone ≤7 days yielded similar results: Low iLR 0.17 [95% CI 0.056-0.52], Moderate iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4].
Conclusion: The Canadian TIA Score accurately identifies TIA patients' risk for stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
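The interval likelihood ratios reported above follow directly from stratum counts: an iLR is the proportion of outcome patients falling in a risk stratum divided by the proportion of non-outcome patients in the same stratum. A minimal sketch (the counts below are invented for illustration, not taken from the study's data):

```python
def interval_lr(events_in_stratum, total_events, nonevents_in_stratum, total_nonevents):
    """Interval likelihood ratio: P(stratum | outcome) / P(stratum | no outcome)."""
    return (events_in_stratum / total_events) / (nonevents_in_stratum / total_nonevents)

# Illustrative counts only: suppose 4 of 181 outcome patients and
# 2,000 of 7,388 non-outcome patients fell in the low-risk stratum.
lr_low = interval_lr(4, 181, 2000, 7388)
print(round(lr_low, 2))  # 0.08
```

An iLR well below 1 in the low-risk stratum, as in the study's 0.20, is what justifies discharge with elective follow-up.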
Infrared absorption spectroscopy is a powerful tool for structural and functional studies of biomolecules. The technique enables direct access to the vibrational fingerprints of molecular bonds in the mid-infrared spectral region (3–20 μm). Although intrinsic absorption cross-sections are nearly ten orders of magnitude greater than corresponding Raman cross-sections, they are still small in comparison with those of fluorescent molecules. Sensitivity improvements are therefore required for the method to be applicable to single-molecule or molecular-layer studies. In this work, we demonstrate the use of lithographically patterned arrays of nanoantennas to enhance the absorption signature of the protein amide I and II backbone vibrations. Strong absorption signals from monolayer-thickness films are obtained. By arranging ensembles of tailored antennas in specific lattices, higher quality factor resonances and increased near-field intensities are possible. These features are leveraged to obtain 10⁴–10⁵-fold signal enhancements and the direct measurement of vibrational spectra of proteins at zeptomole sensitivity levels.
A new ESCA (electron spectroscopy for chemical analysis) instrument has been developed to provide high sensitivity and efficient operation for laboratory analysis of composition and chemical bonding in very thin surface layers of solid samples. High sensitivity is achieved by means of the high-intensity, efficient X-ray source described by Davies and Herglotz at the 1968 Denver X-Ray Conference, in combination with the new electron energy analyzer described by Lee at the 1972 Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy. A sample chamber designed for rapid introduction and replacement of samples has adequate facilities for various sample treatments and conditioning, followed immediately by ESCA analysis of the sample.
Examples of application are presented, demonstrating the sensitivity and resolution achievable with this instrument. Its usefulness in trace surface analysis is shown and some “chemical shifts” measured by the instrument are compared with those obtained by X-ray spectroscopy.
Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks at the second patient, except for one outbreak involving >1 transmission route that was detected at the eighth patient. Up to 40 or 34 infections (78% or 66% of possible preventable infections, respectively) could have been prevented if data mining had been implemented in real time, assuming the initiation of an effective intervention within 7 or 14 days of identification of the transmission route, respectively.
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
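The core of the common-exposure search described above can be sketched as a frequency count over each suspected outbreak patient's exposure set. Everything below (patient IDs, locations, device names) is hypothetical and not drawn from the study's EHR system:

```python
from collections import Counter

# Hypothetical EHR exposure records: patient -> set of locations/devices/providers
exposures = {
    "pt1": {"OR-3", "ICU-A", "bronchoscope-B12", "Dr-X"},
    "pt2": {"OR-7", "ICU-A", "bronchoscope-B12"},
    "pt3": {"ward-5", "bronchoscope-B12", "Dr-Y"},
}

# Count how many suspected outbreak patients share each exposure
counts = Counter(e for exp in exposures.values() for e in exp)

# Exposures common to every case are candidate transmission routes
candidates = [e for e, n in counts.items() if n == len(exposures)]
print(candidates)  # ['bronchoscope-B12']
```

A real implementation would weight exposures by their background frequency among non-cases, since common locations (e.g., the ED) would otherwise dominate; this sketch shows only the intersection step.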
Polyphenol oxidase (PPO) in red clover (RC) has been shown to reduce both lipolysis and proteolysis in silo and has been implicated (in vitro) in the rumen. However, all in vivo comparisons have been between RC and other forages, typically with lower PPO levels, which introduces confounding factors when attributing the greater protection of dietary nitrogen (N) and C18 polyunsaturated fatty acids (PUFA) in RC silage to PPO. This study compared two RC silages with contrasting PPO activities at ensiling (RC+ and RC−) against a control of perennial ryegrass silage (PRG) to ascertain the effect of PPO activity on dietary N digestibility and PUFA biohydrogenation. Two studies were performed: the first investigated rumen and duodenal flow using six Hereford×Friesian steers prepared with rumen and duodenal cannulae, and the second investigated whole-tract N balance using six Holstein-Friesian non-lactating dairy cows. All diets were offered at a restricted level based on animal live weight, with each experiment consisting of two 3×3 Latin squares using big-bale silages ensiled in 2010 and 2011, respectively. In the first experiment, digesta flow at the duodenum was estimated using a dual-phase marker system with ytterbium acetate and chromium ethylenediaminetetraacetic acid as particulate- and liquid-phase markers, respectively. Total N intake was higher on the RC silages in both experiments and higher on RC− than RC+. Rumen ammonia-N reflected intake, with ammonia-N per unit of N intake lower on RC+ than RC−. Microbial N duodenal flow was comparable across all silage diets, with non-microbial N higher on RC than on PRG and no difference between RC+ and RC−, even when reported on an N intake basis. C18 PUFA biohydrogenation was lower on RC silage diets than on PRG, but with no difference between RC+ and RC−. The N balance trial showed greater retention of N on RC+ than RC−; however, this response is more likely related to the difference in N intake than to any PPO-driven protection.
The lack of difference between the RC silages, despite contrasting levels of PPO, may reflect the similar level of protein-bound-phenol complexing determined in each RC silage. This complexing has previously been associated with PPO's protection mechanism; however, this study has shown that protection is not related to total PPO activity.
To assess variability in antimicrobial use and associations with infection testing in pediatric ventilator-associated events (VAEs).
Descriptive retrospective cohort with nested case-control study.
Pediatric intensive care units (PICUs), cardiac intensive care units (CICUs), and neonatal intensive care units (NICUs) in 6 US hospitals.
Children ≤18 years ventilated for ≥1 calendar day.
We identified patients with pediatric ventilator-associated conditions (VACs), pediatric VACs with antimicrobial use for ≥4 days (AVACs), and possible ventilator-associated pneumonia (PVAP, defined as pediatric AVAC with a positive respiratory diagnostic test) according to previously proposed criteria.
Among 9,025 ventilated children, we identified 192 VAC cases, 43 in CICUs, 70 in PICUs, and 79 in NICUs. AVAC criteria were met in 79 VAC cases (41%) (58% CICU; 51% PICU; and 23% NICU), and varied by hospital (CICU, 20–67%; PICU, 0–70%; and NICU, 0–43%). Type and duration of AVAC antimicrobials varied by ICU type. AVAC cases in CICUs and PICUs received broad-spectrum antimicrobials more often than those in NICUs. Among AVAC cases, 39% had respiratory infection diagnostic testing performed; PVAP was identified in 15 VAC cases. Also, among AVAC cases, 73% had no associated positive respiratory or nonrespiratory diagnostic test.
Antimicrobial use is common in pediatric VAC, with variability in spectrum and duration of antimicrobials within hospitals and across ICU types, while PVAP is uncommon. Prolonged antimicrobial use despite low rates of PVAP or positive laboratory testing for infection suggests that AVAC may provide a lever for antimicrobial stewardship programs to improve utilization.
We observed pediatric S. aureus hospitalizations decreased 36% from 26.3 to 16.8 infections per 1,000 admissions from 2009 to 2016, with methicillin-resistant S. aureus (MRSA) decreasing by 52% and methicillin-susceptible S. aureus decreasing by 17%, among 39 pediatric hospitals. Similar decreases were observed for days of therapy of anti-MRSA antibiotics.
The white-backed planthopper, Sogatella furcifera (Horváth) (Hemiptera: Delphacidae), has emerged as a serious rice pest in Asia. In the present study, 12 microsatellite markers were employed to investigate the genetic structure, diversity and migration route of 43 populations sampled from seven Asian countries (Bangladesh, China, Korea, Laos, Nepal, Thailand, and Vietnam). The Mantel test of isolation by distance showed a significant positive correlation between genetic and geographic distances (r² = 0.4585, P = 0.01), indicating a role for geographic isolation in the genetic structure of S. furcifera. A population assignment test using the first-generation migrant detection method (threshold α = 0.01) revealed southern China and northern Vietnam as the main sources of S. furcifera in Korea. Nepal and Bangladesh may be additional sources via interconnection with Vietnamese populations. This paper provides useful data on the migration route and origin of S. furcifera in Korea and will contribute to planthopper resistance management.
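A Mantel test like the one reported correlates the off-diagonal entries of a genetic-distance matrix with those of a geographic-distance matrix and assesses significance by permuting the sample labels of one matrix. A minimal permutation sketch with toy matrices (the data are invented, and the study's exact implementation may differ):

```python
import numpy as np

def mantel(d_gen, d_geo, n_perm=999, seed=0):
    """Permutation Mantel test: Pearson correlation between two distance matrices."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(d_gen, k=1)          # use upper-triangle entries only
    r_obs = np.corrcoef(d_gen[iu], d_geo[iu])[0, 1]
    n = d_gen.shape[0]
    hits = 0
    for _ in range(n_perm):
        p = rng.permutation(n)                     # relabel samples in one matrix
        r = np.corrcoef(d_gen[np.ix_(p, p)][iu], d_geo[iu])[0, 1]
        if r >= r_obs:
            hits += 1
    return r_obs, (hits + 1) / (n_perm + 1)       # one-sided permutation P value

# Toy symmetric distance matrices (illustrative only)
geo = np.array([[0.0, 1.0, 4.0], [1.0, 0.0, 3.0], [4.0, 3.0, 0.0]])
gen = 0.5 * geo + 0.1                              # genetic distance tracks geography
np.fill_diagonal(gen, 0.0)
r, p_value = mantel(gen, geo)
print(round(r, 3), p_value)
```

With real data the correlation is reported on the raw distances (the study quotes r² = 0.4585), and dedicated packages apply the same permutation logic at scale.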
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
Chlamydia trachomatis (CT) infections remain highly prevalent. CT reinfection occurs frequently within months after treatment, likely contributing to sustaining the high CT infection prevalence. Sparse studies have suggested CT reinfection is associated with a lower organism load, but it is unclear whether CT load at the time of treatment influences CT reinfection risk. In this study, women presenting for treatment of a positive CT screening test were enrolled, treated and returned for 3- and 6-month follow-up visits. CT organism loads were quantified at each visit. We evaluated for an association of CT bacterial load at initial infection with reinfection risk and investigated factors influencing the CT load at baseline and follow-up in those with CT reinfection. We found no association of initial CT load with reinfection risk. We found a significant decrease in the median log10 CT load from baseline to follow-up in those with reinfection (5.6 CT/ml vs. 4.5 CT/ml; P = 0.015). Upon stratification of reinfected subjects based upon presence or absence of a history of CT infections prior to their infection at the baseline visit, we found a significant decline in the CT load from baseline to follow-up (5.7 CT/ml vs. 4.3 CT/ml; P = 0.021) exclusively in patients with a history of CT infections prior to our study. Our findings suggest repeated CT infections may lead to possible development of partial immunity against CT.
BACKGROUND: Meningiomas are the most common primary benign brain tumors in adults. Given the extended life expectancy of most patients with meningioma, consideration of quality of life (QOL) is important when selecting the optimal management strategy. There is currently a dearth of meningioma-specific QOL tools in the literature. OBJECTIVE: In this systematic review, we analyze the prevailing themes and propose steps toward building a meningioma-specific QOL assessment tool. METHODS: A systematic search was conducted, and only original studies based on adult patients were considered. QOL tools used in the various studies were analyzed to identify prevailing themes in the qualitative analysis. The quality of the studies was also assessed. RESULTS: Sixteen articles met all inclusion criteria. Fifteen different QOL assessment tools assessed social and physical functioning and psychological and emotional well-being. Patient perceptions and support networks had a major impact on QOL scores. Surgery negatively affected social functioning in younger patients, while radiation therapy had a variable impact. Any intervention appeared to have a greater negative impact on physical functioning compared to observation. CONCLUSION: Younger patients with meningiomas appear to be more vulnerable within the social and physical functioning domains. All of these findings must be interpreted with great caution due to considerable clinical heterogeneity, limited generalizability, and risk of bias. For meningioma patients, the ideal QOL questionnaire would present outcomes that can be easily measured, presented, and compared across studies. Existing scales can be the foundation upon which a comprehensive, standard, and simple meningioma-specific survey can be prospectively developed and validated.
OBJECTIVES/SPECIFIC AIMS: Reducing radiologic exams has been a focus of cost reduction in healthcare systems. The utility and justification of obtaining preprocedural cross-sectional imaging (PPCSI) before surgical intervention continue to be evaluated. For peripheral artery disease (PAD), consensus guidelines regarding PPCSI do not exist, and practice may be influenced by patient complexity, variation of disease presentation, and physician preference. The objective of this study was to determine the utility of PPCSI before percutaneous PAD intervention. METHODS/STUDY POPULATION: Patients receiving a first-time endovascular revascularization procedure for PAD from 2013 to 2015 were evaluated for PPCSI done within 180 days prior to revascularization. Patient and physician demographics, perioperative characteristics, and disease distribution/severity were evaluated. The primary outcome was technical success, defined as improving inflow and/or revascularization of the target outflow vessels to <50% stenosis. RESULTS/ANTICIPATED RESULTS: Of the 348 patients who underwent an attempted revascularization procedure, 159 (45.7%) underwent PPCSI, including 151 CTA and 8 MRA. Of these, 48% were ordered by the referring provider (84% at an outside institution), and 52% were ordered by the treating physician. PPCSI was performed a median of 26 days (IQR 9-53) prior to the procedure. Individual vascular surgeon practice identified PPCSI rates ranging from 31% to 70%. On multivariate analysis, chronic kidney disease (OR=0.35; CI 0.17–0.73) was most strongly associated with omission of PPCSI, while inpatient/ED evaluation (OR=3.20; CI 1.58–6.50), aorto-iliac occlusions (OR=2.78; CI 1.46–5.29) and femoral-popliteal occlusions (OR=2.51; CI 1.38–4.55) most strongly predicted PPCSI. After excluding 31 diagnostic procedures, technical success did not differ between endovascular procedures with PPCSI (91.3%) or without PPCSI (85.6%), p=0.11.
When analyzing 89 femoral-popliteal occlusions, technical success was higher with PPCSI (88%) than without PPCSI (69%), p=0.026. DISCUSSION/SIGNIFICANCE OF IMPACT: PPCSI use is influenced by inpatient status, chronic kidney disease, and anatomic considerations. PPCSI was not associated with overall technical success, although it appeared beneficial for femoral-popliteal occlusions. Routine ordering of PPCSI may not be warranted when considering technical success alone but may be important in treatment planning. Further studies are warranted to determine whether radiation, cost, and contrast load justify PPCSI.
Childhood obesity rates are higher among Indigenous compared with non-Indigenous Australian children. It has been hypothesized that early-life influences, beginning with the intrauterine environment, predict the development of obesity in the offspring. The aim of this paper was to assess, in 227 mother–child dyads from the Gomeroi gaaynggal cohort, associations between prematurity, Gestation Related-Optimal Weight (GROW) centiles, maternal adiposity (percentage body fat, visceral fat area), maternal non-fasting plasma glucose levels (measured at a mean gestational age of 23.1 weeks) and offspring BMI and adiposity (abdominal circumference, subscapular skinfold thickness) in early childhood (mean age 23.4 months). Maternal non-fasting plasma glucose concentrations were positively associated with infant birth weight (P=0.005) and GROW customized birth weight centiles (P=0.008). There was a significant association of maternal percentage body fat (P=0.02) and visceral fat area (P<0.005) with infant body weight in early childhood. Body mass index (BMI) in early childhood was significantly higher in offspring born preterm compared with those born at term (P=0.03). GROW customized birth weight centiles were significantly associated with body weight (P=0.01), BMI (P=0.007) and abdominal circumference (P=0.039) in early childhood. Our findings suggest that being born preterm or large for gestational age, and exposure to an obesogenic intrauterine environment with higher maternal non-fasting plasma glucose concentrations, are associated with increased obesity risk in early childhood. Future strategies should aim to reduce the prevalence of overweight/obesity in women of child-bearing age and emphasize the importance of optimal glycemia during pregnancy, particularly in Indigenous women.
Giardia duodenalis is the most common intestinal parasite of humans in the USA, but the risk factors for sporadic (non-outbreak) giardiasis are not well described. The Centers for Disease Control and Prevention and the Colorado and Minnesota public health departments conducted a case-control study to assess risk factors for sporadic giardiasis in the USA. Cases (N = 199) were patients with non-outbreak-associated laboratory-confirmed Giardia infection in Colorado and Minnesota, and controls (N = 381) were matched by age and site. Identified risk factors included international travel (aOR = 13.9; 95% CI 4.9–39.8), drinking water from a river, lake, stream, or spring (aOR = 6.5; 95% CI 2.0–20.6), swimming in a natural body of water (aOR = 3.3; 95% CI 1.5–7.0), male–male sexual behaviour (aOR = 45.7; 95% CI 5.8–362.0), having contact with children in diapers (aOR = 1.6; 95% CI 1.01–2.6), taking antibiotics (aOR = 2.5; 95% CI 1.2–5.0) and having a chronic gastrointestinal condition (aOR = 1.8; 95% CI 1.1–3.0). Eating raw produce was inversely associated with infection (aOR = 0.2; 95% CI 0.1–0.7). Our results highlight the diversity of risk factors for sporadic giardiasis and the importance of non-international-travel-associated risk factors, particularly those involving person-to-person transmission. Prevention measures should focus on reducing risks associated with diaper handling, sexual contact, swimming in untreated water, and drinking untreated water.
Introduction: Gastroenteritis accounts for 1.7 million emergency department visits by children annually in the United States. We conducted a double-blind trial to determine whether twice-daily probiotic administration for 5 days improves outcomes. Methods: 886 children aged 3–48 months with gastroenteritis were enrolled in six Canadian pediatric emergency departments. Participants were randomly assigned to twice-daily Lactobacillus rhamnosus R0011 and Lactobacillus helveticus R0052 (4.0 × 10⁹ CFU, in a 95:5 ratio) or placebo. The primary outcome was development of moderate-to-severe disease within 14 days of randomization, defined by a Modified Vesikari Scale score ≥9. Secondary outcomes included duration of diarrhea and vomiting, subsequent physician visits and adverse events. Results: Moderate-to-severe disease occurred in 108 (26.1%) participants administered probiotics and 102 (24.7%) participants allocated to placebo (OR 1.06; 95% CI 0.77–1.46; P=0.72). After adjustment for site, age, and frequency of vomiting and diarrhea, treatment assignment did not predict moderate-to-severe disease (OR 1.11; 95% CI 0.80–1.56; P=0.53). In the probiotic versus placebo groups, there were no differences in the median duration of diarrhea [52.5 (18.3, 95.8) vs. 55.5 (20.2, 102.3) hours; P=0.31], vomiting [17.7 (0, 58.6) vs. 18.7 (0, 51.6) hours; P=0.18], physician visits (30.2% vs. 26.6%; OR 1.19; 95% CI 0.87–1.62; P=0.27), or adverse events (32.9% vs. 36.8%; OR 0.83; 95% CI 0.62–1.11; P=0.21). Conclusion: In children presenting to an emergency department with gastroenteritis, twice-daily administration of 4.0 × 10⁹ CFU of a Lactobacillus rhamnosus/helveticus probiotic does not prevent development of moderate-to-severe disease or improve other measured outcomes.
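The unadjusted odds ratio reported for the primary outcome can be approximated from a 2×2 table. The sketch below back-calculates illustrative group sizes from the published percentages, so the result (≈1.08 with a Wald CI) will not exactly match the trial's 1.06, which may reflect rounding or the exact method used:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with 95% Wald CI from a 2x2 table:
    a=events/treated, b=non-events/treated, c=events/control, d=non-events/control."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# Group sizes back-calculated from the reported percentages (approximate):
# probiotic 108 events / ~414 children, placebo 102 events / ~413 children
or_, lo, hi = odds_ratio_ci(108, 306, 102, 311)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.08 0.79 1.47
```

Since the 95% CI comfortably spans 1, the table reproduces the trial's null result for the primary outcome.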
Introduction: Pulse check by manual palpation (MP) is an unreliable skill even in the hands of healthcare professionals. In the context of cardiac arrest, this may translate into inappropriate chest compressions when a pulse is present, or conversely omitting chest compressions when one is absent. To date, no study has assessed the utility of B-mode ultrasound (US) for the detection of a carotid pulse. The primary objective of this study is to assess the time required to detect a carotid pulse in live subjects using US compared to the standard MP method. Methods: This is a prospective randomized controlled cross-over non-inferiority trial. Healthcare professionals from various backgrounds were invited to participate. They attended a 15-minute focused US workshop on identification of the carotid pulse. Following a washout period, they were randomized to detect a pulse in live subjects either by MP first or by US first. Both pulse check methods were timed for each participant on 2 different subjects. The primary outcome measure was time to carotid pulse detection in seconds. Secondary outcome measures included comfort levels of carotid pulse detection measured on a 100-mm visual analog scale (VAS), and rates of prolonged pulse checks (greater than 5 or 10 seconds) for each technique. Mean pulse detection times were compared using Student's t-test. The study was powered to determine whether US was no more than 2 seconds slower than MP. Results: A total of 93 participants completed the study. Time to detect pulse was 4.2 (SD=3.4) seconds by US compared with 4.7 (SD=6.5) seconds by MP (P=0.43). Seventeen (18%) participants took >5 seconds to identify the carotid pulse using US compared to 19 (20%) by MP (P=0.74). Eight (9%) candidates took >10 seconds to identify the pulse using US compared to 9 (10%) by MP (P=0.81). Prior to training, participants had a higher comfort level using MP than US pulse checks (67 vs 26 mm, P<0.001).
Following the study, participants reported higher comfort levels using US than MP (88 vs 78 mm, P<0.001). Conclusion: Carotid pulse detection in live subjects was not slower using US as compared to MP in this study. A brief teaching session was sufficient to improve confidence of carotid pulse identification even in those with little to no previous US training. The preliminary results from this study provide the groundwork for larger studies to evaluate this pulse check method for patients in actual cardiac arrest.
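The non-inferiority framing above (US not more than 2 seconds slower than MP) can be checked by asking whether the upper confidence bound on the mean difference stays below the margin. This is a simplified normal-approximation sketch using the abstract's summary statistics and assuming independent groups of n = 93, which may not match the study's paired cross-over analysis:

```python
import math

def noninferior(mean_a, sd_a, n_a, mean_b, sd_b, n_b, margin, z=1.96):
    """Declare A non-inferior to B if the upper 95% CI bound of (A - B) < margin."""
    diff = mean_a - mean_b
    se = math.sqrt(sd_a**2 / n_a + sd_b**2 / n_b)  # SE of the mean difference
    upper = diff + z * se
    return diff, upper, upper < margin

# Summary statistics from the abstract: US 4.2 (SD 3.4) vs MP 4.7 (SD 6.5) seconds,
# with an assumed n = 93 per method and the stated 2-second margin
diff, upper, ok = noninferior(4.2, 3.4, 93, 4.7, 6.5, 93, margin=2.0)
print(round(diff, 1), round(upper, 2), ok)  # -0.5 0.99 True
```

Because the upper bound (~1 s) sits below the 2-second margin, this crude check is consistent with the study's conclusion that US was not slower than MP.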
Introduction: Pediatric musculoskeletal (MSK) image interpretation has been identified as a knowledge gap among emergency medicine trainees. The main objective of this study was to implement a validated online pediatric MSK radiograph interpretation system with a performance-based competency endpoint into pediatric emergency fellowship programs and examine the number of cases needed to achieve a competency threshold of 80% accuracy, sensitivity and specificity. We further determined the proportion who successfully achieved competency in a given module and the change in accuracy from baseline to competency. Methods: This was a prospective cohort multi-centre study. There were seven MSK radiograph modules, each containing 200-400 cases (demo: https://imagesim.com/course-information/demo/). Thirty-seven pediatric emergency medicine fellows participated for 12 months. Participants completed cases until they reached competency, defined as at least 80% accuracy, sensitivity and specificity. We calculated the overall and per-module median number of cases required to achieve competency, the proportion of participants who achieved competency, the median time on each case, and the mean change in accuracy from baseline to competency. Results: Overall, the median number of cases required to achieve competency was 76 (min 54, max 756). Between different body parts, there was a significant difference in the median number of cases needed to achieve competency (p<0.0001), with ankle and knee being among the most challenging modules. The proportion of those who started a module and completed it to competency varied significantly, ranging from 32.4% in the ankle module to 97.1% in the forearm/hand module (p<0.0001). The overall median time on each case was 34.1 (min 7.6, max 89.5) seconds. The overall change in accuracy from baseline to 80% competency was 13.5% (95% CI 12.1, 14.8), with a corresponding Cohen's effect size of 1.98.
The change in accuracy differed between modules (p=0.001), with post-hoc analyses demonstrating that the ankle/foot radiograph module had a greater increase in accuracy relative to the elbow (p=0.009) and pelvis/femur (p=0.006) modules. Conclusion: It was feasible for pediatric emergency medicine fellows to complete each pediatric MSK learning module to competency within approximately one hour, with the exception of the ankle module. Learners who completed the modules to competency demonstrated substantial increases in interpretation skill.