Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC. Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines. Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. Nonconfirming isolates were tested on either a MicroScan (235 of 462; 50.9%), BD Phoenix (249 of 411; 60.6%), or VITEK 2 (371 of 765; 48.5%). Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%, Fig. 1B). Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs. Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
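To make the case-definition logic above concrete, here is a minimal sketch of classifying an isolate as CRE from carbapenem MICs. The breakpoint values shown are the CLSI M100 resistant breakpoints (ertapenem ≥2 µg/mL; doripenem, imipenem and meropenem ≥4 µg/mL) as an assumption for illustration; the function and data layout are not from the study and should be checked against the current CLSI edition before any real use.

```python
# Assumed CLSI M100 resistant breakpoints (ug/mL); verify against the
# current CLSI edition before real use.
RESISTANT_BREAKPOINTS = {
    "ertapenem": 2.0,
    "imipenem": 4.0,
    "meropenem": 4.0,
    "doripenem": 4.0,
}

def is_cre(mics: dict) -> bool:
    """Return True if any carbapenem MIC meets or exceeds its
    resistant breakpoint (i.e., the isolate qualifies as CRE)."""
    return any(
        mic >= RESISTANT_BREAKPOINTS[drug]
        for drug, mic in mics.items()
        if drug in RESISTANT_BREAKPOINTS
    )

# Example: an isolate resistant only to ertapenem still counts as CRE,
# which matches the predominance of ertapenem-only resistance above.
print(is_cre({"ertapenem": 4.0, "meropenem": 0.25}))  # True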
The primary purpose of this statement is to improve neuroprognostication after devastating brain injury (DBI), with a secondary benefit of potential organ and tissue donation.
Objective:
Obtaining objective dietary exposure information from individuals is challenging because of the complexity of food consumption patterns and the limitations of self-reporting tools (e.g., FFQ and diet diaries). This hinders research efforts to associate intakes of specific foods or eating patterns with population health outcomes.
Design:
Dietary exposure can be assessed by the measurement of food-derived chemicals in urine samples. We aimed to develop methodologies for urine collection that minimised impact on the day-to-day activities of participants but also yielded samples that were data-rich in terms of targeted biomarker measurements.
Setting:
Urine collection methodologies were developed within home settings.
Participants:
Different cohorts of free-living volunteers.
Results:
Home collection of urine samples using vacuum transfer technology was deemed highly acceptable by volunteers. Statistical analysis of both the metabolome and selected dietary exposure biomarkers in spot urine collected and stored using this method showed that the samples were compositionally similar to urine collected using a standard method with immediate sample freezing. Even without chemical preservatives, samples could be stored under different temperature regimes without any significant impact on the overall urine composition or the concentrations of forty-six exemplar dietary exposure biomarkers. Importantly, the samples could be posted directly to analytical facilities, without the need for refrigerated transport or the involvement of clinical professionals.
Conclusions:
This urine sampling methodology appears to be suitable for routine use and may provide a scalable, cost-effective means to collect urine samples and to assess diet in epidemiological studies.
For patients with methicillin-resistant Staphylococcus aureus (MRSA) colonization, a traditional fist-bump greeting did not significantly reduce MRSA transfer in comparison to a handshake. However, transfer was reduced with a modified fist bump that minimized the surface area of contact and when hand hygiene was performed before the handshake.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
Methods
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
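As an illustration of the modelling approach just described, a minimal sketch follows using statsmodels; the file name and column names (fs_level, any_psychotropic, and the covariates) are hypothetical placeholders, not WIHS variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wihs_analysis.csv")  # hypothetical analysis file

# Multivariable logistic regression: food security level as the
# exposure, adjusting for the covariates named in the text.
result = smf.logit(
    "any_psychotropic ~ C(fs_level, Treatment(reference='high'))"
    " + age + C(race_ethnicity) + income + education"
    " + alcohol_use + substance_use",
    data=df,
).fit()

print(np.exp(result.params))      # adjusted odds ratios
print(np.exp(result.conf_int()))  # 95% confidence intervals
```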
Results
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Conclusions
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
An improved understanding of diagnostic and treatment practices for patients with rare primary mitochondrial disorders can support benchmarking against guidelines and establish priorities for evaluative research. We aimed to describe physician care for patients with mitochondrial diseases in Canada, including variation in care.
Methods:
We conducted a cross-sectional survey of Canadian physicians involved in the diagnosis and/or ongoing care of patients with mitochondrial diseases. We used snowball sampling to identify potentially eligible participants, who were contacted by mail up to five times and invited to complete a questionnaire by mail or internet. The questionnaire addressed: personal experience in providing care for mitochondrial disorders; diagnostic and treatment practices; challenges in accessing tests or treatments; and views regarding research priorities.
Results:
We received 58 survey responses (52% response rate). Most respondents (83%) reported spending 20% or less of their clinical practice time caring for patients with mitochondrial disorders. We identified important variation in diagnostic care, although assessments frequently reported as diagnostically helpful (e.g., brain magnetic resonance imaging, MRI/MR spectroscopy) were also recommended in published guidelines. Approximately half (49%) of participants would recommend “mitochondrial cocktails” for all or most patients, but we identified variation in responses regarding specific vitamins and cofactors. A majority of physicians recommended studies on the development of effective therapies as the top research priority.
Conclusions:
While Canadian physicians’ views about diagnostic care and disease management are aligned with published recommendations, important variations in care reflect persistent areas of uncertainty and a need for empirical evidence to support and update standard protocols.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study included 402 patients identified through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly higher proportion of patients with moderate/severe pain (45.8%) improved to mild/no pain, whereas 27.2% with mild/no pain worsened to moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
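As a worked illustration of the MCID thresholds reported above, the following sketch flags which patients improve by at least the VAS-NP MCID; the per-patient scores are invented, not study data.

```python
import numpy as np

MCID_NP = 2.6  # VAS-NP MCID threshold reported above

# Invented per-patient scores (0-10 scale) for illustration only
vas_np_baseline = np.array([7.0, 5.0, 6.5, 4.0, 8.0])
vas_np_12mo = np.array([3.0, 4.5, 2.0, 3.5, 4.0])

improvement = vas_np_baseline - vas_np_12mo
reached = improvement >= MCID_NP
print(f"{reached.mean():.0%} of patients reached the VAS-NP MCID")
```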
Introduction: Simulation is becoming widely adopted across medical disciplines and by different medical professionals. For medical students, emergency medicine simulation has been shown to increase knowledge, confidence and satisfaction. At the University of Ottawa Skills and Simulation Centre, third-year medical students participate in simulated scenarios common to Emergency Medicine (EM) as part of their mandatory EM clerkship rotation. This study aims to evaluate simulation as part of the EM clerkship rotation by assessing changes in student confidence following a simulation session. Methods: In groups of seven, third-year medical students at the University of Ottawa completed simulation sessions of the following: Status Asthmaticus, Status Epilepticus, Urosepsis and Breaking Bad News. Student confidence with each topic was assessed before and after simulation with a written survey. Confidence scores pre- and post-simulation were compared with the Wilcoxon signed-rank test. Results: Forty-eight third-year medical students in their core EM clerkship rotation between September 2017 and August 2018 participated in this study. Medical student confidence with diagnosis of status asthmaticus (N = 44, p = 0.0449) and status epilepticus (N = 45, p = 0.0011) increased significantly following simulation, whereas confidence with diagnosis of urosepsis was unchanged (N = 45, p = 0.0871). Treatment confidence increased significantly for status asthmaticus (N = 47, p = 0.0009), status epilepticus (N = 48, p = 0.0005) and urosepsis (N = 48, p < 0.0001). Confidence for breaking bad news was not significantly changed after simulation (N = 47, p = 0.0689). Conclusion: Simulation training in our EM clerkship rotation significantly increased the confidence of medical students for certain common EM presentations, but not for all. Further work will aim to understand why some simulation scenarios did not improve confidence and will look to improve existing scenarios.
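A minimal sketch of the paired pre/post comparison described above, using SciPy's Wilcoxon signed-rank test; the confidence scores are invented Likert-style values for illustration only.

```python
from scipy.stats import wilcoxon

# Invented paired confidence ratings (1-5 scale), one pair per student
pre = [2, 3, 2, 4, 3, 2, 3, 1]   # pre-simulation confidence
post = [4, 4, 3, 5, 4, 3, 4, 3]  # post-simulation confidence

stat, p_value = wilcoxon(pre, post)
print(f"Wilcoxon statistic={stat}, p={p_value:.4f}")
```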
Declining mortality following invasive pneumococcal disease (IPD) has been observed concurrent with reduced incidence due to effective pneumococcal conjugate vaccines. However, with IPD now increasing due to serotype replacement, we undertook a statistical analysis to estimate the trend in all-cause 30-day case fatality rate (CFR) following IPD in the North East of England (NEE). Clinical, microbiological and demographic data were obtained for all laboratory-confirmed IPD cases (April 2006–March 2016), and the adjusted association between CFR and epidemiological year was estimated using logistic regression. Of the 2510 episodes of IPD included in the analysis, 486 patients died within 30 days of IPD (CFR 19%). Increasing age, male sex, a diagnosis of septicaemia, being in ⩾1 clinical risk group, alcohol abuse and individual serotypes were independently associated with increased CFR. A significant decline in CFR over time was observed after adjustment for these significant predictors (adjusted odds ratio 0.93, 95% confidence interval 0.89–0.98; P = 0.003). A small but significant decline in 30-day all-cause CFR following IPD has been observed in the NEE. Nonetheless, certain population groups remain at increased risk of dying following IPD. Despite the introduction of effective vaccines, further strategies to reduce the ongoing burden of mortality from IPD are needed.
The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or within 3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
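For the matched-pair design above, a minimal sketch of conditional logistic regression follows, using statsmodels; the file name and variable names are hypothetical placeholders for the study's exposures.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("ca_cdi_pairs.csv")  # hypothetical matched-pair file

# Conditioning on pair_id accounts for the case-control matching.
model = ConditionalLogit(
    df["case"],                               # 1 = case, 0 = control
    df[["recent_antibiotics", "comorbidity"]],
    groups=df["pair_id"],                     # matched-pair identifier
)
result = model.fit()
print(np.exp(result.params))  # matched (conditional) odds ratios
```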
Filamentary structures can form within the beam of protons accelerated during the interaction of an intense laser pulse with an ultrathin foil target. Such behaviour is shown to be dependent upon the formation time of quasi-static magnetic field structures throughout the target volume and the extent of the rear surface proton expansion over the same period. This is observed via both numerical and experimental investigations. By controlling the intensity profile of the laser drive, via the use of two temporally separated pulses, both the initial rear surface proton expansion and magnetic field formation time can be varied, resulting in modification to the degree of filamentary structure present within the laser-driven proton beam.
Objective
To test the feasibility of using telehealth to support antimicrobial stewardship at Veterans Affairs medical centers (VAMCs) that have limited access to infectious disease-trained specialists.
Design
A prospective quasi-experimental pilot study.
Setting
Two rural VAMCs with acute-care and long-term care units.
Intervention
At each intervention site, medical providers, pharmacists, infection preventionists, staff nurses, and off-site infectious disease physicians formed a videoconference antimicrobial stewardship team (VAST) that met weekly to discuss cases and antimicrobial stewardship-related education.
Methods
Descriptive measures included fidelity of implementation, number of cases discussed, infectious syndromes, types of recommendations, and acceptance rate of recommendations made by the VAST. Qualitative results stemmed from semi-structured interviews with VAST participants at the intervention sites.
Results
Each site adapted the VAST to suit their local needs. On average, sites A and B discussed 3.5 and 3.1 cases per session, respectively. At site A, 98 of 140 cases (70%) were from the acute-care units; at site B, 59 of 119 cases (50%) were from the acute-care units. The most common clinical syndrome discussed was pneumonia or respiratory syndrome (41% and 35% for sites A and B, respectively). Providers implemented most VAST recommendations, with an acceptance rate of 73% (186 of 256 recommendations) and 65% (99 of 153 recommendations) at sites A and B, respectively. Qualitative results based on 24 interviews revealed that participants valued the multidisciplinary aspects of the VAST sessions and felt that it improved their antimicrobial stewardship efforts and patient care.
Conclusions
This pilot study has successfully demonstrated the feasibility of using telehealth to support antimicrobial stewardship at rural VAMCs with limited access to local infectious disease expertise.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, and perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU (CAM-ICU) and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (odds ratios) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). Of the 136 patients, 58 (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age was associated with decreased risk for delirium) (p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age. Peak delirium risk occurred in patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped risk of delirium after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
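A minimal sketch of the likelihood ratio test for effect-measure modification described above, comparing logistic models with and without an age-by-catatonia interaction; the file and column names are hypothetical placeholders.

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

df = pd.read_csv("delirium_catatonia.csv")  # hypothetical data file

# Reduced model: main effects only; full model: adds the interaction.
reduced = smf.logit("delirium ~ age + catatonia_signs", data=df).fit()
full = smf.logit("delirium ~ age * catatonia_signs", data=df).fit()

# Likelihood ratio test: 2 * difference in log-likelihoods,
# chi-squared with df equal to the number of added parameters.
lr_stat = 2 * (full.llf - reduced.llf)
df_diff = full.df_model - reduced.df_model
p_value = chi2.sf(lr_stat, df_diff)
print(f"LR statistic={lr_stat:.2f}, p={p_value:.4f}")
```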
Children reared in impoverished environments are at risk for enduring psychological and physical health problems. Mechanisms by which poverty affects development, however, remain unclear. To explore one potential mechanism of poverty's impact on social–emotional and cognitive development, an experimental examination of a rodent model of scarcity-adversity was conducted and compared to results from a longitudinal study of human infants and families followed from birth (N = 1,292) who faced high levels of poverty-related scarcity-adversity. Cross-species results supported the hypothesis that altered caregiving is one pathway by which poverty adversely impacts development. Rodent mothers assigned to the scarcity-adversity condition exhibited decreased sensitive parenting and increased negative parenting relative to mothers assigned to the control condition. Furthermore, scarcity-adversity reared pups exhibited decreased developmental competence as indicated by disrupted nipple attachment, distress vocalization when in physical contact with an anesthetized mother, and reduced preference for maternal odor with corresponding changes in brain activation. Human results indicated that scarcity-adversity was inversely correlated with sensitive parenting and positively correlated with negative parenting, and that parenting fully mediated the association of poverty-related risk with infant indicators of developmental competence. Findings are discussed from the perspective of the usefulness of bidirectional–translational research to inform interventions for at-risk families.
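As a sketch of the mediation logic described above (parenting mediating the association between poverty-related risk and child outcomes), the following uses the Mediation class in statsmodels; all variable and file names are hypothetical placeholders, not the study's measures.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.mediation import Mediation

df = pd.read_csv("poverty_cohort.csv")  # hypothetical data file

# Outcome model includes exposure and mediator; mediator model
# regresses the mediator on the exposure.
outcome_model = sm.OLS.from_formula(
    "child_competence ~ poverty_risk + sensitive_parenting", data=df)
mediator_model = sm.OLS.from_formula(
    "sensitive_parenting ~ poverty_risk", data=df)

med = Mediation(outcome_model, mediator_model,
                exposure="poverty_risk",
                mediator="sensitive_parenting").fit(n_rep=500)
print(med.summary())  # direct, indirect and total effect estimates
```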
The Molonglo Observatory Synthesis Telescope (MOST) is an 18,000 m² radio telescope located 40 km from Canberra, Australia. Its operating band (820–851 MHz) is partly allocated to telecommunications, making radio astronomy challenging. We describe how the deployment of new digital receivers, Field Programmable Gate Array-based filterbanks, and server-class computers equipped with 43 Graphics Processing Units has transformed the telescope into a versatile new instrument (UTMOST) for studying the radio sky on millisecond timescales. UTMOST has 10 times the bandwidth and double the field of view of the MOST, and its voltage record and playback capability has facilitated rapid implementation of many new observing modes, most of which operate commensally. UTMOST can simultaneously excise interference, make maps, coherently dedisperse pulsars, and perform real-time searches of coherent fan-beams for dispersed single pulses. UTMOST operates as a robotic facility, deciding how to efficiently target pulsars and how long to stay on source via real-time pulsar folding, while searching for single-pulse events. Regular timing of over 300 pulsars has yielded seven pulsar glitches and three Fast Radio Bursts during commissioning. UTMOST demonstrates that, with sufficient signal processing applied to voltage streams, innovative science remains possible even in hostile radio-frequency environments.
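As background to the dedispersion mentioned above, here is a minimal sketch of the cold-plasma dispersion delay that dedispersion corrects for, delay(s) ≈ 4.1488×10³ × DM / ν², with DM in pc cm⁻³ and ν in MHz, evaluated across UTMOST's 820–851 MHz band; the dispersion measure is an illustrative value.

```python
def dispersion_delay_s(dm: float, freq_mhz: float) -> float:
    """Dispersion delay in seconds relative to infinite frequency,
    for dispersion measure dm (pc cm^-3) and frequency freq_mhz (MHz)."""
    return 4.1488e3 * dm / freq_mhz**2

dm = 100.0  # pc cm^-3, illustrative value
delay_across_band = (dispersion_delay_s(dm, 820.0)
                     - dispersion_delay_s(dm, 851.0))
print(f"Delay across 820-851 MHz at DM={dm}: "
      f"{delay_across_band * 1e3:.1f} ms")
```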
The class of radio transients called Fast Radio Bursts (FRBs) encompasses enigmatic single pulses, each unique in its own way, hindering a consensus on their origin. The key to demystifying FRBs lies in discovering many of them, in order to identify commonalities, and in real time, in order to find potential counterparts at other wavelengths. The recently upgraded UTMOST in Australia is undergoing a backend transformation to become a fast transient detection machine. The first interferometric detections of FRBs with UTMOST place their origin beyond the near-field region of the telescope, thus ruling out local sources of interference as a possible origin. We have localised these bursts far more precisely than those discovered at the Parkes radio telescope, and we plan to upgrade UTMOST to achieve still better localisation.