In a tertiary-care hospital and affiliated long-term care facility, a stewardship intervention focused on patients with Clostridioides difficile infection (CDI) was associated with a significant reduction in unnecessary non-CDI antibiotic therapy. However, there was no significant reduction in total non-CDI therapy or in the frequency of CDI recurrence.
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined whether time-of-day eating patterns were associated with mood disorders among adults.
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log binomial and adjacent categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). Compared to those in the lowest third of the respective pattern at baseline and follow-up, during the 5-year follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a 1.07 higher relative risk of being in a higher Late pattern score category at follow-up than those without mood disorder (95% CI 1.00–1.14).
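The prevalence ratios quoted above compare the proportion with mood disorder in one exposure group against another. A minimal sketch of that calculation, with a Wald confidence interval on the log scale, is below; the counts are hypothetical and are not the study's data.

```python
import math

def prevalence_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Prevalence ratio with a 95% Wald CI computed on the log scale."""
    p1 = exposed_cases / exposed_total
    p0 = unexposed_cases / unexposed_total
    pr = p1 / p0
    # Standard error of log(PR) for two independent binomial proportions
    se = math.sqrt(1/exposed_cases - 1/exposed_total
                   + 1/unexposed_cases - 1/unexposed_total)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# Hypothetical counts: 30/150 with mood disorder in a persistent Late-pattern
# group vs 20/200 in a reference group
pr, lo, hi = prevalence_ratio(30, 150, 20, 200)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The study itself fitted log binomial regression with covariate adjustment, which generalises this two-group comparison.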
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
An improved understanding of diagnostic and treatment practices for patients with rare primary mitochondrial disorders can support benchmarking against guidelines and establish priorities for evaluative research. We aimed to describe physician care for patients with mitochondrial diseases in Canada, including variation in care.
We conducted a cross-sectional survey of Canadian physicians involved in the diagnosis and/or ongoing care of patients with mitochondrial diseases. We used snowball sampling to identify potentially eligible participants, who were contacted by mail up to five times and invited to complete a questionnaire by mail or internet. The questionnaire addressed: personal experience in providing care for mitochondrial disorders; diagnostic and treatment practices; challenges in accessing tests or treatments; and views regarding research priorities.
We received 58 survey responses (52% response rate). Most respondents (83%) reported spending 20% or less of their clinical practice time caring for patients with mitochondrial disorders. We identified important variation in diagnostic care, although assessments frequently reported as diagnostically helpful (e.g., brain magnetic resonance imaging, MRI/MR spectroscopy) were also recommended in published guidelines. Approximately half (49%) of participants would recommend “mitochondrial cocktails” for all or most patients, but we identified variation in responses regarding specific vitamins and cofactors. A majority of physicians recommended studies on the development of effective therapies as the top research priority.
While Canadian physicians’ views about diagnostic care and disease management are aligned with published recommendations, important variations in care reflect persistent areas of uncertainty and a need for empirical evidence to support and update standard protocols.
Building on prior work using Tom Dishion's Family Check-Up, the current article examined intervention effects on dysregulated irritability in early childhood. Dysregulated irritability, defined as a reactive and intense response to frustration and prolonged angry mood, is an ideal marker of neurodevelopmental vulnerability to later psychopathology because it is a transdiagnostic indicator of decrements in self-regulation that are measurable in the first years of life and have lifelong implications for health and disease. This study is perhaps the first randomized trial to examine the direct effects of an evidence- and family-based intervention, the Family Check-Up (FCU), on irritability in early childhood and the effects of reductions in irritability on later risk of child internalizing and externalizing symptomatology. Data from the geographically and sociodemographically diverse multisite Early Steps randomized prevention trial were used. Path modeling revealed intervention effects on irritability at age 4, which predicted lower externalizing and internalizing symptoms at age 10.5. Results indicate that family-based programs initiated in early childhood can reduce early childhood irritability and later risk for psychopathology. This holds promise for earlier identification and prevention approaches that target transdiagnostic pathways. Implications for future basic and prevention research are discussed.
Kochia is one of the most problematic weeds in the United States. Field studies were conducted in five states (Wyoming, Colorado, Kansas, Nebraska, and South Dakota) over 2 yr (2010 and 2011) to evaluate kochia control with selected herbicides registered in five common crop scenarios (winter wheat, fallow, corn, soybean, and sugar beet), to provide insight for diversifying kochia management in crop rotations. Kochia control varied by experimental site, such that more variation in kochia control and biomass production was explained by experimental site than by herbicide choice within a crop. Kochia control with herbicides currently labeled for use in sugar beet averaged 32% across locations. Kochia control was greatest and most consistent with corn herbicide programs (99%), followed by fallow (97%) and soybean (96%) herbicide programs. Kochia control with wheat herbicide programs was 93%. With respect to the availability of effective herbicide options, glyphosate-resistant kochia was easiest to control in corn, soybean, and fallow, followed by wheat, and was difficult to manage with herbicides in sugar beet.
Introduction: Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best-practice algorithm targeted at emergency medicine physicians. Methods: We adapted existing high-quality clinical practice guidelines (CPGs) previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee of 21 members from across Canada, including academic, community and remote/rural emergency physicians and nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback and discussion in an iterative process until consensus was achieved. Results: The diagnostic algorithm comprises an updated pretest probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best-practice diagnostic algorithm defined risk levels as Low (≤0.5%: no further testing), Moderate (0.6–5%: further testing required) and High (>5%: computed tomography, magnetic resonance imaging or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns. D-dimer can be used to reduce the probability of AAS in the intermediate-risk group, but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pretest probability assessment for aortic insufficiency and abdominal/thoracic aortic aneurysms. Conclusion: We have created the first Canadian best-practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve diagnosis of AAS in emergency departments across Canada.
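The risk-banded logic described above can be sketched as a small decision function. The cut-points mirror the bands quoted in the abstract; treating the low band as "≤0.5%" and the moderate-band D-dimer step as the follow-up action are readings of the text, not the published algorithm itself.

```python
def aas_testing_recommendation(pretest_probability_pct: float) -> str:
    """Map an AAS pretest probability (%) to a testing tier.

    Bands follow the abstract: Low (<=0.5%), Moderate (0.6-5%), High (>5%).
    This is an illustrative sketch, not clinical guidance.
    """
    if pretest_probability_pct <= 0.5:
        return "low: no further testing"
    elif pretest_probability_pct <= 5.0:
        return "moderate: further testing (e.g. D-dimer to reduce probability)"
    else:
        return "high: advanced imaging (CT, MRI or transesophageal echo)"

for p in (0.3, 2.0, 12.0):
    print(f"{p}% -> {aas_testing_recommendation(p)}")
```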
The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or within 3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
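For 1:1 matched pairs with a binary exposure, the matched odds ratio reduces to the ratio of the two kinds of discordant pairs, which is the quantity a conditional logistic model estimates in the simplest case. A minimal sketch with hypothetical discordant-pair counts (not the study's data):

```python
import math

def matched_pairs_or(case_exposed_only: int, control_exposed_only: int):
    """Matched odds ratio for 1:1 pairs with a 95% CI.

    `case_exposed_only`: pairs where only the case had the exposure.
    `control_exposed_only`: pairs where only the control had it.
    Concordant pairs carry no information and are omitted.
    """
    b, c = case_exposed_only, control_exposed_only
    or_ = b / c
    se = math.sqrt(1/b + 1/c)  # SE of log(OR) for matched pairs
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 30 pairs with only the case recently on antibiotics,
# 5 pairs with only the control
or_, lo, hi = matched_pairs_or(30, 5)
print(f"matched OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```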
Dementia is a leading cause of morbidity and mortality without pharmacologic prevention or cure. Mounting evidence suggests that adherence to a Mediterranean dietary pattern may slow cognitive decline, making such adherence important to characterise in at-risk cohorts. Thus, we determined the reliability and validity of the Mediterranean Diet and Culinary Index (MediCul), a new tool, among community-dwelling individuals with mild cognitive impairment (MCI). A total of sixty-eight participants (66 % female) aged 75·9 (sd 6·6) years, from the Study of Mental and Resistance Training MCI cohort, completed the fifty-item MediCul at two time points, followed by a 3-d food record (FR). MediCul test–retest reliability was assessed using intra-class correlation coefficients (ICC), Bland–Altman plots and κ agreement within seventeen dietary element categories. Validity was assessed against the FR using the Bland–Altman method and nutrient trends across MediCul score tertiles. The mean MediCul score was 54·6/100·0, with few participants reaching thresholds for key Mediterranean foods. MediCul had very good test–retest reliability (ICC=0·93, 95 % CI 0·884, 0·954, P<0·0001) with fair-to-almost-perfect agreement for classifying elements within the same category. Validity was moderate with no systematic bias between methods of measurement, according to the regression coefficient (y=−2·30+0·17x) (95 % CI −0·027, 0·358; P=0·091). MediCul over-estimated the mean FR score by 6 %, with limits of agreement being under- and over-estimated by 11 and 23 %, respectively. Nutrient trends were significantly associated with increased MediCul scoring, consistent with a Mediterranean pattern. MediCul provides reliable and moderately valid information about Mediterranean diet adherence among older individuals with MCI, with potential application in future studies assessing relationships between diet and cognitive function.
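The Bland–Altman method used above summarises agreement between two measurements as the mean difference (bias) and 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch with hypothetical scores, not the study's data:

```python
import statistics

def bland_altman(scores_a, scores_b):
    """Mean difference and 95% limits of agreement between two methods."""
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical MediCul vs food-record scores for six participants
medicul = [58, 52, 61, 47, 55, 50]
food_record = [55, 50, 57, 49, 51, 48]
bias, lower, upper = bland_altman(medicul, food_record)
print(f"bias = {bias:.2f}, limits of agreement [{lower:.2f}, {upper:.2f}]")
```

A positive bias here would correspond to the questionnaire over-estimating the food record, as reported in the abstract.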
So far, only one distribution function giving rise to a collisionless nonlinear force-free current sheet equilibrium allowing for a plasma beta less than one is known (Allanson et al., Phys. Plasmas, vol. 22 (10), 2015, 102116; Allanson et al., J. Plasma Phys., vol. 82 (3), 2016a, 905820306). This distribution function can only be expressed as an infinite series of Hermite functions with very slow convergence, which makes its practical use cumbersome. The purpose of this paper is to present a general method that allows us to find distribution functions consisting of a finite number of terms (and therefore easier to use in practice), but which still allow for current sheet equilibria that can, in principle, have an arbitrarily low plasma beta. The method involves taking known solutions and transforming them into new solutions using transformations based on taking integer powers of one component of the pressure tensor. The plasma beta of the current sheet corresponding to the transformed distribution functions can then, in principle, be made arbitrarily low. We present the general form of the distribution functions for an arbitrary integer power and then discuss a specific case as an example.
The Molonglo Observatory Synthesis Telescope (MOST) is an 18 000 m² radio telescope located 40 km from Canberra, Australia. Its operating band (820–851 MHz) is partly allocated to telecommunications, making radio astronomy challenging. We describe how the deployment of new digital receivers, Field Programmable Gate Array-based filterbanks, and server-class computers equipped with 43 Graphics Processing Units has transformed the telescope into a versatile new instrument (UTMOST) for studying the radio sky on millisecond timescales. UTMOST has 10 times the bandwidth and twice the field of view of the MOST, and voltage record and playback capability has facilitated rapid implementation of many new observing modes, most of which operate commensally. UTMOST can simultaneously excise interference, make maps, coherently dedisperse pulsars, and perform real-time searches of coherent fan-beams for dispersed single pulses. UTMOST operates as a robotic facility, deciding how to efficiently target pulsars and how long to stay on source via real-time pulsar folding, while searching for single pulse events. Regular timing of over 300 pulsars has yielded seven pulsar glitches and three Fast Radio Bursts during commissioning. UTMOST demonstrates that if sufficient signal processing is applied to voltage streams, innovative science remains possible even in hostile radio frequency environments.
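Dedispersion, mentioned above, corrects for the frequency-dependent delay a pulse accumulates in the interstellar medium. The size of that delay across UTMOST's band follows from the standard cold-plasma dispersion relation; the dispersion measure (DM) value below is illustrative.

```python
def dispersion_delay_ms(dm, f_lo_ghz, f_hi_ghz):
    """Dispersion sweep (ms) across a band for a dispersion measure
    dm in pc cm^-3, using the standard constant 4.1488 ms GHz^2 cm^3 pc^-1."""
    return 4.1488 * dm * (f_lo_ghz**-2 - f_hi_ghz**-2)

# Sweep across UTMOST's 820-851 MHz band for an illustrative DM of 100 pc cm^-3
delay = dispersion_delay_ms(100.0, 0.820, 0.851)
print(f"{delay:.1f} ms")
```

A sweep of tens of milliseconds across a 31 MHz band is why millisecond-timescale work at these frequencies requires dedispersion in the signal chain.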
The class of radio transients called Fast Radio Bursts (FRBs) encompasses enigmatic single pulses, each unique in its own way, hindering a consensus for their origin. The key to demystifying FRBs lies in discovering many of them in order to identify commonalities, and in real time, in order to find potential counterparts at other wavelengths. The recently upgraded UTMOST in Australia is undergoing a backend transformation to rise as a fast transient detection machine. The first interferometric detections of FRBs with UTMOST place their origin beyond the near-field region of the telescope, thus ruling out local sources of interference as a possible origin. We have localised these bursts far more precisely than those discovered at the Parkes radio telescope, and have plans to upgrade UTMOST to be capable of much better localisation still.
Post-traumatic stress disorder (PTSD) is often associated with attention allocation and emotional regulation difficulties, but the brain dynamics underlying these deficits are unknown. The emotional Stroop task (EST) is an ideal means to monitor these difficulties, because participants are asked to attend to non-emotional aspects of the stimuli. In this study, we used magnetoencephalography (MEG) and the EST to monitor attention allocation and emotional regulation during the processing of emotionally charged stimuli in combat veterans with and without PTSD.
A total of 31 veterans with PTSD and 20 without PTSD performed the EST during MEG. Three categories of stimuli were used, including combat-related, generally threatening and neutral words. MEG data were imaged in the time-frequency domain and the network dynamics were probed for differences in processing threatening and non-threatening words.
Behaviorally, veterans with PTSD were significantly slower in responding to combat-related relative to neutral and generally threatening words. Veterans without PTSD exhibited no significant differences in responding to the three different word types. Neurophysiologically, we found a significant three-way interaction between group, word type and time period across multiple brain regions. Follow-up testing indicated stronger theta-frequency (4–8 Hz) responses in the right ventral prefrontal (0.4–0.8 s) and superior temporal cortices (0.6–0.8 s) of veterans without PTSD compared with those with PTSD during the processing of combat-related words.
Our data indicated that veterans with PTSD exhibited deficits in attention allocation and emotional regulation when processing trauma cues, while those without PTSD were able to regulate emotion by directing attention away from threat.
Introduction: Cellulitis and erysipelas are common presentations for the general practitioner. Antibiotic therapy targeting beta-hemolytic streptococci and Staphylococcus aureus is the mainstay of treatment for children and adults with these infections. Although evidence-based Canadian guidelines for appropriate management exist, inconsistent practices persist. Our objective was to determine the level of adherence to current evidence by emergency physicians at two academic hospitals in Kingston, Ontario. Methods: We conducted a retrospective chart review of 200 randomly selected electronic medical records. Records belonged to patients with a discharge diagnosis of cellulitis or erysipelas who were seen in the emergency departments of Kingston General Hospital or Hotel Dieu Hospital between January 1 and June 30, 2015. We manually collected data describing patient demographics, medical history, and medical management. Results: There were 707 total visits to the emergency departments in the study period for cellulitis or erysipelas. In our random sample, for those diagnosed with cellulitis, 44% received oral cephalexin alone, which was the most common form of therapy for uncomplicated infection. Of all the patients who received any antibiotics, 36% received at least one dose of parenteral antibiotics, despite only 6.7% showing systemic signs of illness. Emergency physicians chose ceftriaxone for 88% of the patients who received parenteral antibiotics. Conclusion: There was wide variation in antibiotic selection and route of administration for patients with cellulitis or erysipelas. Ceftriaxone was chosen for most patients receiving parenteral antibiotics, but it may not have been the most effective antibiotic in some cases. Overuse of antibiotics is common, and we believe medication choice should be justified based on disease severity, spectrum of activity, and regional antibiotic resistance patterns, among other factors. 
In conclusion, we found that emergency physicians could more closely align management plans with current guidelines to improve management of uncomplicated infection and reduce unnecessary administration of parenteral antibiotics.
Previous research regarding anxiety as a predictor of future cognitive decline in older adults is limited and inconsistent. We examined the independent relationship between anxiety symptoms and subsequent cognitive decline.
We included 2,818 community-dwelling older men (mean age = 76.1, SD ±5.3 years) who were followed on an average for 3.4 years. We assessed anxiety symptoms at baseline using the Goldberg Anxiety Scale (GAS; range = 0–9). We assessed cognitive function at baseline and at two subsequent visits using the Modified Mini-Mental State Examination (3MS; global cognition) and the Trails B test (executive function).
At baseline, there were 690 (24%) men with mild anxiety symptoms (GAS 1–4) and 226 (8%) men with moderate/severe symptoms (GAS 5–9). Men with anxiety symptoms were more likely to have depressed mood, poor sleep, more chronic medical conditions, and more impairment in activities of daily living compared to those with no anxiety symptoms. Compared to those with no anxiety symptoms at baseline, men with any anxiety symptoms were more likely to have substantial worsening in Trails B completion time (OR = 1.56, 95% CI 1.19, 2.05). The association was attenuated after adjusting for potential confounders, including depression and poor sleep, but remained significant (OR = 1.40, 95% CI 1.04, 1.88).
In cognitively healthy older men, mild anxiety symptoms may potentially predict future decline in executive functioning. Anxiety is likely a manifestation of an underlying neurodegenerative process rather than a cause.
Accurate models of X-ray absorption and re-emission in partly stripped ions are necessary to calculate the structure of stars, the performance of hohlraums for inertial confinement fusion and many other systems in high-energy-density plasma physics. Despite theoretical progress, a persistent discrepancy exists with recent experiments at the Sandia Z facility studying iron in conditions characteristic of the solar radiative–convective transition region. The increased iron opacity measured at Z could help resolve a longstanding issue with the standard solar model, but would require a radical departure for opacity theory. To replicate the Z measurements, an opacity experiment has been designed for the National Ignition Facility (NIF). The design uses established techniques scaled to NIF. A laser-heated hohlraum will produce X-ray-heated uniform iron plasmas in local thermodynamic equilibrium (LTE) at the target temperatures and electron densities. The iron will be probed using continuum X-rays emitted by a small-diameter source from a 2 mm diameter polystyrene (CH) capsule implosion. In this design, a subset of the NIF beams delivers 500 kJ to the hohlraum, and the remaining beams directly drive the CH capsule with 200 kJ. Calculations indicate this capsule backlighter should outshine the iron sample, delivering a point-projection transmission opacity measurement to a time-integrated X-ray spectrometer viewing down the hohlraum axis. Preliminary experiments to develop the backlighter and hohlraum are underway, informing simulated measurements to guide the final design.
Our study aimed to evaluate changes in the epidemiology of pathogens causing surgical site infections (SSIs) in England between 2000 and 2013, in the context of intensified national interventions to reduce healthcare-associated infections introduced since 2006. National prospective surveillance data on target surgical procedures were used for this study. Data on causative organism were available for 72% of inpatient-detected SSIs meeting the standard case definitions for superficial, deep and organ-space infections (9767/13 531), which were analysed for trends. A multivariable logistic linear mixed model with hospital random effects was fitted to evaluate trends by pathogen. Staphylococcus aureus was the predominant cause of SSI between 2000 (41%) and 2009 (24%), decreasing from 2006 onwards and reaching 16% in 2013. Data for 2005–2013 showed that the odds of SSI caused by S. aureus decreased significantly by 14% per year [adjusted odds ratio (aOR) 0·86, 95% confidence interval (CI) 0·83–0·89], driven by significant decreases in methicillin-resistant S. aureus (MRSA) (aOR 0·71, 95% CI 0·68–0·75). However, a small significant increase in methicillin-sensitive S. aureus was identified (aOR 1·06, 95% CI 1·02–1·10). Enterobacteriaceae were stable during 2000–2007 (12% of cases overall), increasing from 2008 (18%) onwards and being present in 25% of cases in 2013; the model supported these increasing trends during 2007–2013 (aOR 1·12, 95% CI 1·07–1·18). The decreasing trends in S. aureus SSIs from 2006 and the increases in Enterobacteriaceae SSIs from 2008 may be related to intensified national efforts targeted at reducing MRSA bacteraemia, combined with changes in antibiotic use aimed at controlling C. difficile infections.
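A per-year adjusted odds ratio like the aOR of 0·86 above compounds multiplicatively on the odds scale. A small sketch of how such a trend translates into expected proportions over time; the starting share is taken from the abstract, but the projection itself is purely illustrative arithmetic, not the fitted model.

```python
def project_odds(base_odds, annual_or, years):
    """Odds after `years`, applying a constant per-year odds ratio."""
    return base_odds * annual_or ** years

def odds_to_proportion(odds):
    """Convert odds back to a proportion."""
    return odds / (1 + odds)

# Illustrative: S. aureus caused ~24% of SSIs in 2009 (odds 0.24/0.76); with
# odds falling 14% per year (aOR 0.86), the expected share four years later:
odds_2013 = project_odds(0.24 / 0.76, 0.86, 4)
share_2013 = odds_to_proportion(odds_2013)
print(f"projected 2013 share: {share_2013:.0%}")
```

The projected share lands close to the 16% actually observed in 2013, showing how a modest annual odds ratio accumulates over several years.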
A recent consideration in aircraft design is the use of folding wing-tips with the aim of enabling higher aspect ratio aircraft with less induced drag while also meeting airport gate limitations. This study investigates the effect of exploiting folding wing-tips in flight as a device to reduce both static and dynamic loads. A representative civil jet aircraft aeroelastic model was used to explore the effect of introducing a wing-tip device, connected to the wings with an elastic hinge, on the load behaviour. For the dynamic cases, vertical discrete gusts and continuous turbulence were considered. The effects of hinge orientation, stiffness, damping and wing-tip weight on the static and dynamic response were investigated. It was found that significant reductions in both the static and dynamic loads were possible. For the case considered, a 25% increase in span using folding wing-tips resulted in almost no increase in loads.
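The hinge parameters studied above (orientation, stiffness, damping, tip weight) can be illustrated with a toy single-degree-of-freedom model of an elastically hinged tip driven by a "1-cosine" gust moment, the discrete-gust shape commonly used in loads analysis. All parameter values are invented for illustration and bear no relation to the aircraft model in the study.

```python
import math

def hinged_tip_response(inertia, damping, stiffness, gust_peak_moment,
                        gust_duration, t_end=2.0, dt=1e-4):
    """Toy model: I*theta'' + c*theta' + k*theta = M(t), with a '1-cosine'
    gust moment of the given peak and duration. Integrates with explicit
    Euler and returns the peak fold angle (rad). Illustrative only."""
    theta, omega, t, peak = 0.0, 0.0, 0.0, 0.0
    while t < t_end:
        if t < gust_duration:
            m = 0.5 * gust_peak_moment * (1 - math.cos(2 * math.pi * t / gust_duration))
        else:
            m = 0.0
        alpha = (m - damping * omega - stiffness * theta) / inertia
        omega += alpha * dt
        theta += omega * dt
        peak = max(peak, abs(theta))
        t += dt
    return peak

# Hypothetical hinge: stiffer springs or heavier damping reduce the peak angle,
# which is the trade-off the study explores for load alleviation
peak = hinged_tip_response(inertia=50.0, damping=40.0, stiffness=2000.0,
                           gust_peak_moment=500.0, gust_duration=0.5)
print(f"peak fold angle: {peak:.3f} rad")
```

Re-running with a larger `stiffness` or `damping` shrinks the peak angle but transfers more of the gust moment to the wing root, which is the essence of the static-versus-dynamic loads trade the abstract describes.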