To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention.
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation and infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time series and generalized linear mixed-effects multivariable analyses. These analyses were run separately for days to line removal from identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were also evaluated.
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). According to the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3, after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). In the same multivariable framework, days to removal of lines with a CLISA score of 2 or 3 were 3.19 days faster after the intervention (P < .001). Also, line dwell time decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001). Device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
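The adjusted odds ratio above comes from a generalized linear mixed-effects model; as a minimal sketch of the underlying quantity, the snippet below computes an unadjusted odds ratio and its Wald 95% confidence interval from a 2×2 table. The counts are hypothetical for illustration, not the study's data, and no adjustment for covariates or clustering is attempted.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 20/400 flagged lines post-intervention
# versus 90/410 pre-intervention.
or_, lo, hi = odds_ratio_ci(20, 380, 90, 320)
```

An OR below 1 with a CI excluding 1, as in the study's adjusted estimate (0.15; 0.06–0.34), indicates lower odds of a CLISA score of 2 or 3 after the intervention.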
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined whether time-of-day eating patterns were associated with mood disorders among adults.
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log binomial and adjacent categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). Compared to those in the lowest third of the respective pattern at baseline and follow-up, during the 5-year follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a 1.07 higher relative risk of being in a higher Late pattern score category at follow-up than those without mood disorder (95% CI 1.00–1.14).
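The prevalence ratios above come from log-binomial regression with covariate adjustment; as a simplified sketch, an unadjusted prevalence ratio and its Katz log-based 95% confidence interval can be computed directly from group counts. The counts below are hypothetical, not the study's data.

```python
import math

def prevalence_ratio_ci(a, n1, c, n0, z=1.96):
    """Unadjusted prevalence ratio with Katz log 95% CI.
    a cases among n1 exposed; c cases among n0 unexposed."""
    pr = (a / n1) / (c / n0)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)  # SE of log(PR)
    lo = math.exp(math.log(pr) - z * se_log)
    hi = math.exp(math.log(pr) + z * se_log)
    return pr, lo, hi

# Hypothetical: 30/150 with mood disorder in the highest Late-pattern
# third versus 15/150 in the reference third.
pr, lo, hi = prevalence_ratio_ci(30, 150, 15, 150)
```

A PR of 2.0 with a CI excluding 1 would correspond to a doubled prevalence in the exposed group, analogous to the study's Late-pattern estimate (PR = 2.04; 95% CI 1.20–3.48).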
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
An improved understanding of diagnostic and treatment practices for patients with rare primary mitochondrial disorders can support benchmarking against guidelines and establish priorities for evaluative research. We aimed to describe physician care for patients with mitochondrial diseases in Canada, including variation in care.
We conducted a cross-sectional survey of Canadian physicians involved in the diagnosis and/or ongoing care of patients with mitochondrial diseases. We used snowball sampling to identify potentially eligible participants, who were contacted by mail up to five times and invited to complete a questionnaire by mail or internet. The questionnaire addressed: personal experience in providing care for mitochondrial disorders; diagnostic and treatment practices; challenges in accessing tests or treatments; and views regarding research priorities.
We received 58 survey responses (52% response rate). Most respondents (83%) reported spending 20% or less of their clinical practice time caring for patients with mitochondrial disorders. We identified important variation in diagnostic care, although assessments frequently reported as diagnostically helpful (e.g., brain magnetic resonance imaging, MRI/MR spectroscopy) were also recommended in published guidelines. Approximately half (49%) of participants would recommend “mitochondrial cocktails” for all or most patients, but we identified variation in responses regarding specific vitamins and cofactors. A majority of physicians recommended studies on the development of effective therapies as the top research priority.
While Canadian physicians’ views about diagnostic care and disease management are aligned with published recommendations, important variations in care reflect persistent areas of uncertainty and a need for empirical evidence to support and update standard protocols.
Several research teams have previously traced patterns of emerging conduct problems (CP) from early or middle childhood. The current study expands on this previous literature by using a genetically-informed, experimental, and long-term longitudinal design to examine trajectories of early-emerging conduct problems and early childhood discriminators of such patterns from the toddler period to adolescence. The sample represents a cohort of 731 toddlers and diverse families recruited based on socioeconomic, child, and family risk, varying in urbanicity and assessed on nine occasions between ages 2 and 14. In addition to examining child, family, and community level discriminators of patterns of emerging conduct problems, we were able to account for genetic susceptibility using polygenic scores and the study's experimental design to determine whether random assignment to the Family Check-Up (FCU) discriminated trajectory groups. In addition, in accord with differential susceptibility theory, we tested whether the effects of the FCU were stronger for those children with higher genetic susceptibility. Results augmented previous findings documenting the influence of child (inhibitory control [IC], gender) and family (harsh parenting, parental depression, and educational attainment) risk. In addition, children in the FCU were overrepresented in the persistent low versus persistent high CP group, but such direct effects were qualified by an interaction between the intervention and genetic susceptibility that was consistent with differential susceptibility. Implications are discussed for early identification and specifically, prevention efforts addressing early child and family risk.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study located 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly higher proportion of patients with moderate/severe pain (45.8%) demonstrated an improvement to mild/no pain, whereas 27.2% with mild/no pain demonstrated worsening to moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM and reaches the MCIDs for VAS-AP and VAS-NP at 12 months.
Introduction: Simulation is becoming widely adopted across medical disciplines and by different medical professionals. For medical students, emergency medicine simulation has been shown to increase knowledge, confidence and satisfaction. At the University of Ottawa Skills and Simulation Centre, third-year medical students participate in simulated scenarios common to Emergency Medicine (EM) as part of their mandatory EM clerkship rotation. This study aims to evaluate simulation as part of the EM clerkship rotation by assessing changes in student confidence following a simulation session. Methods: In groups of seven, third-year medical students at the University of Ottawa completed simulation sessions on the following: Status Asthmaticus, Status Epilepticus, Urosepsis and Breaking Bad News. Student confidence with each topic was assessed before and after simulation with a written survey. Confidence scores pre- and post-simulation were compared with the Wilcoxon signed-rank test. Results: Forty-eight third-year medical students in their core EM clerkship rotation between September 2017 and August 2018 participated in this study. Medical student confidence with the diagnosis of status asthmaticus (N = 44, p = 0.0449) and status epilepticus (N = 45, p = 0.0011) increased significantly following simulation, whereas confidence with the diagnosis of urosepsis was unchanged (N = 45, p = 0.0871). Treatment confidence increased significantly for status asthmaticus (N = 47, p = 0.0009), status epilepticus (N = 48, p = 0.0005) and urosepsis (N = 48, p < 0.0001). Confidence in breaking bad news was not significantly changed after simulation (N = 47, p = 0.0689). Conclusion: Simulation training in our EM clerkship rotation significantly increased the confidence of medical students for certain common EM presentations, but not for all. Further work will aim to understand why some simulation scenarios did not improve confidence and look to improve existing scenarios.
Introduction: Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best-practices algorithm targeted at emergency medicine physicians. Methods: We chose to adapt existing high-quality clinical practice guidelines (CPGs) previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee consisting of 21 members from across Canada, including academic, community and remote/rural emergency physicians/nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback, discussion and an iterative process until consensus was achieved. Results: The diagnostic algorithm comprises an updated pre-test probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best-practice diagnostic algorithm defined risk levels as low (0.5%; no further testing), moderate (0.6–5%; further testing required) and high (>5%; computed tomography, magnetic resonance imaging or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns.
D-dimer can be used to reduce the probability of AAS in the moderate-risk group but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pre-test probability assessment for aortic insufficiency and abdominal/thoracic aortic aneurysms. Conclusion: We have created the first Canadian best-practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve the diagnosis of AAS in emergency departments across Canada.
Declining mortality following invasive pneumococcal disease (IPD) has been observed concurrent with a reduced incidence due to effective pneumococcal conjugate vaccines. However, with IPD now increasing due to serotype replacement, we undertook a statistical analysis to estimate the trend in all-cause 30-day case fatality rate (CFR) in the North East of England (NEE) following IPD. Clinical, microbiological and demographic data were obtained for all laboratory-confirmed IPD cases (April 2006–March 2016), and the adjusted association between CFR and epidemiological year was estimated using logistic regression. Of the 2510 episodes of IPD included in the analysis, 486 patients died within 30 days of IPD (CFR 19%). Increasing age, male sex, a diagnosis of septicaemia, being in ⩾1 clinical risk groups, alcohol abuse and individual serotypes were independently associated with increased CFR. A significant decline in CFR over time was observed following adjustment for these significant predictors (adjusted odds ratio 0.93, 95% confidence interval 0.89–0.98; P = 0.003). A small but significant decline in 30-day all-cause CFR following IPD has been observed in the NEE. Nonetheless, certain population groups remain at increased risk of dying following IPD. Despite the introduction of effective vaccines, further strategies to reduce the ongoing burden of mortality from IPD are needed.
The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or ⩽3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
Total-reflection X-ray fluorescence (TXRF) has been applied to the detection and quantification of metal contamination on the surface and near-surface regions of silicon wafers in the semiconductor industry. The need for improving the sensitivity and detection limit of the TXRF technique is driven by the progress in producing thinner films and finer features in the development of larger-Mbit DRAMs.
Advancements in trace element analysis require improvements in both the signal-to-noise ratio and accurate background correction. With a sequential spectrometer, one can obtain detection limits of around 0.1 ppm for medium- to heavy-Z elements. Conditions can be individually optimized for each element, for example, through the selection of filters, collimators, crystals and background subtraction. The disadvantage is that the analysis time may become “long” if many elements are to be analyzed. This long exposure time can lead to the deterioration of some samples.
Over the past few years there has been substantial progress in the TXRF analysis of heavy element surface contamination on silicon wafers. Further advances and improvements are desired in the analytical performance and hardware. Extension of the analytical range to include the light elements is particularly desirable.
In the case of light element analysis, sodium and aluminum impurities have been monitored in the IC production process. An increase in sodium impurity in a silicon wafer degrades the insulation of IC devices, and growth of the SiO2 film is disturbed by the presence of aluminum impurity on the silicon wafer surface.
Aging processes exhibiting cluster-to-precipitate transitions were studied in polycrystalline austenitic iron-base alloys with a Siemens Guinier camera. This camera combines the Seemann-Bohlin focusing geometry with a curved-crystal monochromator and thus maximizes the resolution of observed sidebands and the weak precipitate lines. Growth studies encompassing a cluster-size range of 15 to 70 unit cells were followed. For the systems of interest, this coincided with a variation from a detectable hardness increase to a stage of maximum hardness immediately preceding precipitation. Cluster sizes were calculated on the basis of the Guinier model; their variation with time and temperature permitted calculation of an apparent activation energy in the one system where decomposition was spontaneous. An iron-nickel-titanium alloy was used to study aging in a ternary system. Behavior was classic in that the cluster size present on quenching grew with aging, coincident with a simultaneous hardness increase. Calculation of activation energies indicated strongly that transport of nickel to, or iron from, the cluster was rate determining. Upon overaging, the nickel-titanium-enriched clusters gave way to the hexagonal Ni2Ti phase. An iron-nickel-chromium-niobium quaternary, in addition to presenting a clustering system similar to the above ternary, showed two rather interesting phenomena. First, chromium was necessary for precipitation; the ternary iron-nickel-niobium did not age. Second, a stable Fe2Nb Laves phase present upon quenching from 2200°F disappeared on aging in favor of nickel-niobium clusters; an incubation time for the formation of these clusters existed, and its duration was about 4 hr. An asymmetry was noted in the diffraction intensities about the (311)γ line in both systems. In the iron-nickel-titanium case, the asymmetry was only in intensity, whereas, with the iron-nickel-chromium-niobium alloys, the asymmetry existed in both intensity and position.
Interpretation of these observations is made on the basis of anticipated variations in scattering factors, lattice spacings, and cluster sizes.
Several interesting phenomena involving ultra-soft X-rays and synthetic multilayer crystals were studied as a result of the on-going process of improving the Rigaku Model 3630 Wafer Analyzer for the measurement of BPSG (1000-2500 Å) and other thin films.1-3 These phenomena can be divided into four categories: “ghost” peaks, diffraction from the substrate, fluorescence from the multilayer and higher-order lines from the multilayer. Each of these is a potential source of error in the measurement of ultra-soft X-rays. Fortunately, as will be shown, each can be readily dealt with.
Soldier operational performance is determined by their fitness, nutritional status, quality of rest/recovery, and remaining injury/illness free. Understanding large fluctuations in nutritional status during operations is critical to safeguarding health and well-being. There are limited data world-wide describing the effect of extreme climate change on nutrient profiles. This study investigated the effect of hot-dry deployments on vitamin D status (assessed from 25-hydroxyvitamin D (25(OH)D) concentration) of young, male, military volunteers. Two data sets are presented (pilot study, n 37; main study, n 98), examining serum 25(OH)D concentrations before and during 6-month summer operational deployments to Afghanistan (March to October/November). Body mass, percentage of body fat, dietary intake and serum 25(OH)D concentrations were measured. In addition, parathyroid hormone (PTH), adjusted Ca and albumin concentrations were measured in the main study to better understand 25(OH)D fluctuations. Body mass and fat mass (FM) losses were greater for early (pre- to mid-) deployment compared with late (mid- to post-) deployment (P<0·05). Dietary intake was well-maintained despite high rates of energy expenditure. A pronounced increase in 25(OH)D was observed between pre- (March) and mid-deployment (June) (pilot study: 51 (sd 20) v. 212 (sd 85) nmol/l, P<0·05; main study: 55 (sd 22) v. 167 (sd 71) nmol/l, P<0·05) and remained elevated post-deployment (October/November). In contrast, PTH was highest pre-deployment, decreasing thereafter (main study: 4·45 (sd 2·20) v. 3·79 (sd 1·50) pmol/l, P<0·05). The typical seasonal cycling of vitamin D appeared exaggerated in this active male population undertaking an arduous summer deployment. Further research is warranted, where such large seasonal vitamin D fluctuations may be detrimental to bone health in the longer-term.
Roger Ascham's dialogue Toxophilus is a book made of books. Ascham himself indicated the importance of classical sources to his ‘schole of shootinge’ by providing an apparatus of marginal references in Toxophilus. In its substance his dialogue of expert knowledge envisioned the old learning, drawing upon its forms in a new imaginative style. It is remarkable that for his use and conception of classical learning Ascham always had a model. He admired the great teachers Cheke and Sturm, and his writing abounds in judgments and appreciations of other classical scholars of his time. But his model for his relation to the literary past was Cicero. I do not mean Cicero the master orator but Cicero the student of the ancient Greeks.
Dementia is a leading cause of morbidity and mortality without pharmacologic prevention or cure. Mounting evidence suggests that adherence to a Mediterranean dietary pattern may slow cognitive decline, and is important to characterise in at-risk cohorts. Thus, we determined the reliability and validity of the Mediterranean Diet and Culinary Index (MediCul), a new tool, among community-dwelling individuals with mild cognitive impairment (MCI). A total of sixty-eight participants (66 % female) aged 75·9 (sd 6·6) years, from the Study of Mental and Resistance Training MCI cohort, completed the fifty-item MediCul at two time points, followed by a 3-d food record (FR). MediCul test–retest reliability was assessed using intra-class correlation coefficients (ICC), Bland–Altman plots and κ agreement within seventeen dietary element categories. Validity was assessed against the FR using the Bland–Altman method and nutrient trends across MediCul score tertiles. The mean MediCul score was 54·6/100·0, with few participants reaching thresholds for key Mediterranean foods. MediCul had very good test–retest reliability (ICC=0·93, 95 % CI 0·884, 0·954, P<0·0001) with fair-to-almost-perfect agreement for classifying elements within the same category. Validity was moderate with no systematic bias between methods of measurement, according to the regression coefficient (y=−2·30+0·17x) (95 % CI −0·027, 0·358; P=0·091). MediCul over-estimated the mean FR score by 6 %, with limits of agreement being under- and over-estimated by 11 and 23 %, respectively. Nutrient trends were significantly associated with increased MediCul scoring, consistent with a Mediterranean pattern. MediCul provides reliable and moderately valid information about Mediterranean diet adherence among older individuals with MCI, with potential application in future studies assessing relationships between diet and cognitive function.
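The Bland–Altman method used above compares two measurement methods via the mean difference (bias) and 95% limits of agreement (bias ± 1.96 SD of the differences). The sketch below computes these quantities for hypothetical paired scores; the values are illustrative only and are not the study's data.

```python
import math

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired
    measurement methods (Bland-Altman analysis)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n                       # mean difference
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired diet scores: questionnaire vs. food record.
tool = [54, 60, 48, 66, 52, 58]
record = [50, 57, 47, 60, 55, 56]
bias, loa_low, loa_high = bland_altman(tool, record)
```

A positive bias, as here, corresponds to the questionnaire over-estimating relative to the food record, analogous to MediCul's 6 % over-estimation of the FR score.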