Background: Carbapenem-resistant Enterobacteriaceae (CRE) are gram-negative bacteria resistant to at least 1 carbapenem and are associated with high mortality (50%). Carbapenemase-producing CRE (CP-CRE) are particularly serious because they are more likely to transmit carbapenem resistance genes to other gram-negative bacteria and are resistant to all carbapenem antibiotics. Few studies have evaluated risk factors associated with CP-CRE colonization. The goal of this study was to determine the risk factors associated with CP-CRE colonization in a cohort of US veterans. Methods: We conducted a retrospective cohort study of patients seen at VA medical centers between 2013 and 2018 who had positive cultures for CRE from any site, defined by resistance to at least 1 of the following carbapenems: imipenem, meropenem, doripenem, or ertapenem. CP-CRE was defined via antibiotic sensitivity data that coded the culture as ‘carbapenemase producing,’ ‘Hodge test positive,’ or ‘KPC producing.’ Only the first positive culture for CRE was included. Patient demographics (year of culture, age, sex, race, major comorbidities, infectious organism, culture site, inpatient status, and CP-CRE status) and facility demographics (rurality, geographic region, and facility complexity) were collected. Bivariate analysis and multiple logistic regression were performed to determine variables associated with CP-CRE versus non–CP-CRE. Results: In total, 3,322 patients were identified with a positive CRE culture: 546 (16.4%) with CP-CRE and 2,776 (83.6%) with non–CP-CRE. Most patients were men (95%), were older (mean age, 71 years; SD, 12.5), and were diagnosed at a high-complexity VA medical center (65%). Most cultures were from urine (63%), followed by sputum (13%) and blood (7%). Cultures most often came from inpatients (46%), followed by outpatients (42%) and long-term care facilities (12%).
Multivariable analysis showed the following variables to be associated with CP-CRE–positive cultures: congestive heart failure (P = .0136), African American race (P = .0760), Klebsiella spp (P < .0001), GI cancers (P = .0087), culture collected in 2017 (P = .0004), and culture collected in 2018 (P < .0001). There were also significant differences in CP-CRE frequencies by geographic region (P < .001). Discussion: CP-CRE diagnoses are relatively rare; however, the serious complications associated with them make these important infections to investigate. In our analysis, we found that congestive heart failure and gastric cancer were the comorbidities most strongly associated with CP-CRE. In 2017, the VA formalized its CP-CRE definition, which led to more accurate reporting. Conclusions: After the guideline was implemented, CP-CRE detection dramatically increased in noncontinental US facilities. More work should be done to determine the different risk factors between non–CP-CRE and CP-CRE infections.
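The bivariate comparisons described above rest on standard 2×2 odds-ratio arithmetic. As a hedged illustration only, the counts below are hypothetical and are not the study's data; this is a minimal sketch of how such an association measure is computed:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
                exposed  unexposed
    CP-CRE         a         b
    non-CP-CRE     c         d
    """
    return (a * d) / (b * c)

def or_ci_95(a, b, c, d):
    """Approximate 95% CI via the standard error of the log odds ratio."""
    or_ = odds_ratio(a, b, c, d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, lower, upper

# Hypothetical counts: 40 of 546 CP-CRE patients exposed to a risk factor,
# versus 100 of 2,776 non-CP-CRE patients (illustrative numbers only)
print(or_ci_95(40, 506, 100, 2676))
```

A multivariable logistic regression, as used in the study, adjusts such odds ratios for the other covariates simultaneously; the 2×2 version shown here corresponds to the unadjusted bivariate step.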
Decisions to treat large-vessel occlusion with endovascular therapy (EVT) or intravenous alteplase depend on how physicians weigh benefits against risks when considering patients’ comorbidities. We explored EVT/alteplase decision-making by stroke experts in the setting of comorbidity/disability.
In an international multi-disciplinary survey, experts chose treatment approaches under current resources and under assumed ideal conditions for 10 of 22 randomly assigned case scenarios. Five included comorbidities (cancer, cardiac/respiratory/renal disease, mild cognitive impairment [MCI], physical dependence). We examined scenario/respondent characteristics associated with EVT/alteplase decisions using multivariable logistic regressions.
Among 607 physicians (38 countries), EVT was chosen less often in comorbidity-related scenarios (79.6% under current resources, 82.7% assuming ideal conditions) versus six “level-1A” scenarios for which EVT/alteplase was clearly indicated by current guidelines (91.1% and 95.1%, respectively, odds ratio [OR] [current resources]: 0.38, 95% confidence interval 0.31–0.47). However, EVT was chosen more often in comorbidity-related scenarios compared to all other 17 scenarios (79.6% versus 74.4% under current resources, OR: 1.34, 1.17–1.54). Responses favoring alteplase for comorbidity-related scenarios (e.g. 75.0% under current resources) were comparable to level-1A scenarios (72.2%) and higher than all others (60.4%). No comorbidity independently diminished EVT odds when considering all scenarios. MCI and dependence carried higher alteplase odds; cancer and cardiac/respiratory/renal disease had lower odds. Being older/female carried lower EVT odds. Relevant respondent characteristics included performing more EVT cases/year (higher EVT-, lower alteplase odds), practicing in East Asia (higher EVT odds), and in interventional neuroradiology (lower alteplase odds vs neurology).
Moderate-to-severe comorbidities did not consistently deter experts from EVT, suggesting equipoise about withholding EVT based on comorbidities. However, alteplase was often foregone when respondents chose EVT. Differences in decision-making by patient age/sex merit further study.
The music that was produced in Dunedin, New Zealand, during the 1980s occupies a unique place in the global indie music canon. In writing about this supposed ‘Dunedin sound’, critics and scholars alike have fixated on the city's remoteness: it is believed to be distant from metropolitan centres of music industry power and influence, and consequently to have supported a subversive and democratised local music scene. This article explores the implications of the ongoing historicisation of Dunedin's popular music scene along these lines, and highlights the ways in which the valorisation of the city's musical heritage obscures problematic power dynamics that shape the way young musicians in the city express place and musical identity. Our research applies an embedded participatory ethnography to unpack the ideological positions occupied by contemporary local musicians, and to critique factions within the city's contemporary musical scene.
Over the past decade, a growing interest has developed in the archaeology, palaeontology, and palaeoenvironments of the Arabian Peninsula. It is now clear that hominins repeatedly dispersed into Arabia, notably during pluvial interglacial periods when much of the peninsula was characterised by a semiarid grassland environment. During the intervening glacial phases, however, grasslands were replaced with arid and hyperarid deserts. These millennial-scale climatic fluctuations have subjected bones and fossils to a dramatic suite of environmental conditions, affecting their fossilisation and preservation. Yet, as relatively few palaeontological assemblages have been reported from the Pleistocene of Arabia, our understanding of the preservational pathways that skeletal elements can take in these types of environments is lacking. Here, we report the first widespread taxonomic and taphonomic assessment of Arabian fossil deposits. Novel fossil fauna are described, and overall the fauna are consistent with a well-watered semiarid grassland environment. Likewise, the taphonomic results suggest that bones were deposited under more humid conditions than are present in the region today. However, fossils often exhibit significant attrition, which obscures and fragments most finds. This attrition is likely tied to wind abrasion, insolation, and salt weathering following fossilisation and exhumation, processes particularly prevalent in desert environments.
High-quality data are critical to the entire scientific enterprise, yet the complexity and effort involved in data curation are vastly under-appreciated. This is especially true for large observational, clinical studies because of the amount of multimodal data that is captured and the opportunity for addressing numerous research questions through analysis, either alone or in combination with other data sets. However, a lack of details concerning data curation methods can result in unresolved questions about the robustness of the data, its utility for addressing specific research questions or hypotheses and how to interpret the results. We aimed to develop a framework for the design, documentation and reporting of data curation methods in order to advance the scientific rigour, reproducibility and analysis of the data.
Forty-six experts participated in a modified Delphi process to reach consensus on indicators of data curation that could be used in the design and reporting of studies.
We identified 46 indicators that are applicable to the design, training/testing, run time and post-collection phases of studies.
The Data Acquisition, Quality and Curation for Observational Research Designs (DAQCORD) Guidelines are the first comprehensive set of data quality indicators for large observational studies. They were developed around the needs of neuroscience projects, but we believe they are relevant and generalisable, in whole or in part, to other fields of health research, and also to smaller observational studies and preclinical research. The DAQCORD Guidelines provide a framework for achieving high-quality data, a cornerstone of health research.
Policy makers across the political spectrum have extolled the virtues of volunteering in achieving social policy aims. Yet little is known about the role that volunteering plays in addressing one of the significant challenges of an ageing population: the provision of care and support to people with dementia. We combine organisational survey data, secondary social survey data, and in-depth interviews with people with dementia, family carers and volunteers in order to better understand the contexts in which volunteers support people with dementia, the roles they play and the challenges they face. Social policies connecting volunteering and dementia care in homes and communities often remain separate and disconnected, and our paper draws on the concept of policy ‘assemblages’ to suggest that dementia care is a dynamic mixture of formal and informal volunteering activities that bridge and blur traditional policy boundaries. Linking home and community environments is a key motivation, benefit and outcome for volunteers, carers and those living with dementia. The paper calls for widening the definition and investigation of volunteering in social policy to include and support informal volunteering activity.
Iron-rich meteorites are significantly underrepresented in collection statistics from Antarctica. This has led to the hypothesis that a sparse layer of iron-rich meteorites lies hidden below the surface of the ice, which would explain the apparent shortfall. As standard Antarctic meteorite collecting techniques rely upon a visual surface search approach, the need has thus arisen to develop a system that can detect iron objects under a few tens of centimetres of ice, where the expected number density is of the order of one per square kilometre. To test this hypothesis, a large-scale pulse induction metal detector array has been constructed for deployment in Antarctica. The metal detector array is 6 m wide, able to travel at 15 km h-1 and can scan 1 km2 in ~11 hours. This paper details the construction of the metal detector system with respect to design criteria, notably the ruggedisation of the system for Antarctic deployment. Some preliminary results from UK and Antarctic testing are presented. We show that the system performs as specified and should reach the pre-agreed target of detecting a 100 g iron meteorite at 300 mm when deployed in Antarctica.
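The quoted scan rate follows from simple arithmetic: a 6 m swath travelling at 15 km h-1 sweeps 90,000 m² per hour, so covering 1 km² takes roughly 11 hours. A quick sketch using the figures from the abstract above:

```python
swath_width_m = 6.0        # width of the metal detector array
speed_m_per_h = 15_000.0   # 15 km/h expressed in metres per hour

# Area swept per hour, in square metres
area_rate_m2_per_h = swath_width_m * speed_m_per_h

# Hours needed to cover one square kilometre (1,000,000 m^2)
hours_per_km2 = 1_000_000.0 / area_rate_m2_per_h
print(round(hours_per_km2, 1))  # → 11.1
```

This matches the ~11 hours per square kilometre stated in the abstract.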
An improved understanding of diagnostic and treatment practices for patients with rare primary mitochondrial disorders can support benchmarking against guidelines and establish priorities for evaluative research. We aimed to describe physician care for patients with mitochondrial diseases in Canada, including variation in care.
We conducted a cross-sectional survey of Canadian physicians involved in the diagnosis and/or ongoing care of patients with mitochondrial diseases. We used snowball sampling to identify potentially eligible participants, who were contacted by mail up to five times and invited to complete a questionnaire by mail or internet. The questionnaire addressed: personal experience in providing care for mitochondrial disorders; diagnostic and treatment practices; challenges in accessing tests or treatments; and views regarding research priorities.
We received 58 survey responses (52% response rate). Most respondents (83%) reported spending 20% or less of their clinical practice time caring for patients with mitochondrial disorders. We identified important variation in diagnostic care, although assessments frequently reported as diagnostically helpful (e.g., brain magnetic resonance imaging, MRI/MR spectroscopy) were also recommended in published guidelines. Approximately half (49%) of participants would recommend “mitochondrial cocktails” for all or most patients, but we identified variation in responses regarding specific vitamins and cofactors. A majority of physicians recommended studies on the development of effective therapies as the top research priority.
While Canadian physicians’ views about diagnostic care and disease management are aligned with published recommendations, important variations in care reflect persistent areas of uncertainty and a need for empirical evidence to support and update standard protocols.
Kochia is one of the most problematic weeds in the United States. Field studies were conducted in five states (Wyoming, Colorado, Kansas, Nebraska, and South Dakota) over 2 yr (2010 and 2011) to evaluate kochia control with selected herbicides registered in five common crop scenarios (winter wheat, fallow, corn, soybean, and sugar beet) to provide insight for diversifying kochia management in crop rotations. Kochia control varied by experimental site, such that more variation in kochia control and biomass production was explained by experimental site than by herbicide choice within a crop. Kochia control with herbicides currently labeled for use in sugar beet averaged 32% across locations. Kochia control was greatest and most consistent from corn herbicide programs (99%), followed by fallow (97%) and soybean (96%) herbicide programs. Kochia control from wheat herbicide programs was 93%. With respect to the availability of effective herbicide options, glyphosate-resistant kochia was easiest to control in corn, soybean, and fallow, followed by wheat, and was difficult to manage with herbicides in sugar beet.
The need for assistive technologies in Canada is increasing, but access is inconsistent and fragmented, which can result in unmet needs. We aimed to identify citizens’ values and preferences for how to enhance equitable access to assistive technologies, and to engage policymakers, stakeholders, and researchers in deliberations to spark action. In spring 2017, we convened three citizen panels and a stakeholder dialogue. Key panel findings were included in an evidence brief that informed dialogue participants. Thirty-seven citizens participated in the panels and emphasized the need for access to reliable information, equitable access to assistive technologies regardless of ability to pay, and the need for collaboration. Twenty-two dialogue participants focused on the need for a guiding framework that supports fundamental change across the country. The proposed policy framework can enhance access to assistive technologies by enabling simplified policies and programs, along with fostering robust data collection and evaluation to support countrywide innovation and accountability.
Dementia is a leading cause of morbidity and mortality without pharmacologic prevention or cure. Mounting evidence suggests that adherence to a Mediterranean dietary pattern may slow cognitive decline, making such adherence important to characterise in at-risk cohorts. Thus, we determined the reliability and validity of the Mediterranean Diet and Culinary Index (MediCul), a new tool, among community-dwelling individuals with mild cognitive impairment (MCI). A total of sixty-eight participants (66 % female) aged 75·9 (sd 6·6) years, from the Study of Mental and Resistance Training MCI cohort, completed the fifty-item MediCul at two time points, followed by a 3-d food record (FR). MediCul test–retest reliability was assessed using intra-class correlation coefficients (ICC), Bland–Altman plots and κ agreement within seventeen dietary element categories. Validity was assessed against the FR using the Bland–Altman method and nutrient trends across MediCul score tertiles. The mean MediCul score was 54·6/100·0, with few participants reaching thresholds for key Mediterranean foods. MediCul had very good test–retest reliability (ICC=0·93, 95 % CI 0·884, 0·954, P<0·0001) with fair-to-almost-perfect agreement for classifying elements within the same category. Validity was moderate with no systematic bias between methods of measurement, according to the regression coefficient (y=−2·30+0·17x) (95 % CI −0·027, 0·358; P=0·091). MediCul over-estimated the mean FR score by 6 %, with limits of agreement being under- and over-estimated by 11 and 23 %, respectively. Nutrient trends were significantly associated with increased MediCul scoring, consistent with a Mediterranean pattern. MediCul provides reliable and moderately valid information about Mediterranean diet adherence among older individuals with MCI, with potential application in future studies assessing relationships between diet and cognitive function.
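The Bland–Altman agreement analysis described above reduces to the mean difference (bias) between the two measurement methods plus limits of agreement at ±1.96 standard deviations of the paired differences. A minimal sketch using made-up paired scores (not the MediCul data):

```python
import statistics

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement
    methods applied to the same subjects."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical questionnaire vs food-record scores for six participants
questionnaire = [54, 60, 48, 57, 63, 51]
food_record = [50, 58, 49, 55, 60, 47]
bias, lower, upper = bland_altman(questionnaire, food_record)
print(bias, lower, upper)
```

A positive bias, as in this toy example, corresponds to the questionnaire over-estimating the reference method, mirroring the 6 % over-estimation reported in the abstract.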
The intent of this study was to determine whether there are differences in disaster preparedness between urban and rural community hospitals across New York State.
Descriptive and analytical cross-sectional survey study of 207 community hospitals; thirty-five questions evaluated six disaster preparedness elements: disaster plan development, on-site surge capacity, available materials and resources, disaster education and training, disaster preparedness funding levels, and perception of disaster preparedness.
Completed surveys were received from 48 urban hospitals and 32 rural hospitals. There were differences in disaster preparedness between urban and rural hospitals with respect to disaster plan development, on-site surge capacity, available materials and resources, disaster education and training, and perception of disaster preparedness. No difference was identified between these hospitals with respect to disaster preparedness funding levels.
The results of this study provide an assessment of the current state of disaster preparedness in urban and rural community hospitals in New York. Differences in preparedness between the two settings may reflect differing priorities with respect to perceived threats, as well as opportunities for improvement that may require additional advocacy and legislation. (Disaster Med Public Health Preparedness. 2019;13:424-428)
Rapid identification of esophageal intubations is critical to avoid patient morbidity and mortality. Continuous waveform capnography remains the gold standard for endotracheal tube (ETT) confirmation, but it has limitations. Point-of-care ultrasound (POCUS) may be a useful alternative for confirming ETT placement. The objective of this study was to determine the accuracy of paramedic-performed POCUS identification of esophageal intubations with and without ETT manipulation.
A prospective, observational study using a cadaver model was conducted. Local paramedics were recruited as subjects, and each completed a survey of their demographics, employment history, intubation experience, and prior POCUS training. Subjects participated in a didactic session in which they learned POCUS identification of ETT location. During each study session, investigators randomly placed an ETT in either the trachea or esophagus of four cadavers, confirmed with direct laryngoscopy. Subjects then attempted to determine tube position using POCUS, both without and with manipulation of the ETT; manipulation consisted of twisting the tube. Descriptive statistics and logistic regression were used to assess the results and the effects of previous paramedic experience.
During 12 study sessions, from March 2014 through December 2015, 57 subjects participated, evaluating a total of 228 intubations: 113 tracheal and 115 esophageal. Subjects were 84.0% male, with a mean age of 39 years (range: 22-62 years) and a median experience of seven years (range: 0.6-39 years). Paramedics correctly identified ETT location in 158 (69.3%) cases without and 194 (85.1%) cases with ETT manipulation. The sensitivity of identifying esophageal location increased from 52.2% (95% confidence interval [CI], 43.0-61.0) without ETT manipulation to 87.0% (95% CI, 81.0-93.0) with manipulation (P<.0001), without a significant change in specificity (86.7% [95% CI, 81.0-93.0] vs 83.2% [95% CI, 76.0-90.0]; P=.45). Subjects correctly identified 41 previously incorrectly identified esophageal intubations. Paramedic experience, previous intubations, and POCUS experience did not correlate with ability to identify tube location.
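The diagnostic-accuracy figures above follow from the usual confusion-matrix definitions, where an esophageal intubation flagged as esophageal is a true positive. A hedged sketch with hypothetical counts (not the study's raw data):

```python
def sensitivity(tp, fn):
    """Proportion of true positives correctly detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of true negatives correctly ruled out."""
    return tn / (tn + fp)

# Hypothetical: of 100 esophageal intubations, 87 were flagged as esophageal;
# of 100 tracheal intubations, 83 were correctly identified as tracheal
print(sensitivity(87, 13), specificity(83, 17))  # → 0.87 0.83
```

With counts like these, sensitivity and specificity land near the post-manipulation values reported in the abstract, illustrating how the published percentages are derived.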
Paramedics can accurately identify esophageal intubations with POCUS, and manipulation improves identification. Further studies of paramedic use of dynamic POCUS to identify inadvertent esophageal intubations are needed.
Lema PC, O’Brien M, Wilson J, St. James E, Lindstrom H, DeAngelis J, Caldwell J, May P, Clemency B. Avoid the Goose! Paramedic Identification of Esophageal Intubation by Ultrasound. Prehosp Disaster Med. 2018;33(4):406–410.
OBJECTIVES/SPECIFIC AIMS: Rodent models can be used to study neonatal abstinence syndrome (NAS), but the applicability of findings from the models to NAS in humans is not well understood. The objective of this study was to develop a rat model of norbuprenorphine-induced NAS and validate its translational value by comparing blood concentrations in the norbuprenorphine-treated pregnant rat to those previously reported in pregnant women undergoing buprenorphine treatment. METHODS/STUDY POPULATION: Pregnant Long-Evans rats were implanted with 14-day osmotic minipumps containing vehicle, morphine (positive control), or norbuprenorphine (0.3–3 mg/kg/d) on gestation day 9. Within 12 hours of delivery, pups were tested for spontaneous or precipitated opioid withdrawal by injecting them with saline (10 mL/kg, i.p.) or naltrexone (1 or 10 mg/kg, i.p.), respectively, and observing them for well-validated neonatal withdrawal signs. Blood was sampled via indwelling jugular catheters from a subset of norbuprenorphine-treated dams on gestation days 8, 10, 13, 17, and 20. Norbuprenorphine concentrations in whole blood samples were quantified using LC/MS/MS. RESULTS/ANTICIPATED RESULTS: Blood concentrations of norbuprenorphine in rats exposed to 1–3 mg/kg/d of norbuprenorphine were similar to levels previously reported in pregnant women undergoing buprenorphine treatment. Pups born to dams treated with these doses exhibited robust withdrawal signs. Blood concentrations of norbuprenorphine decreased across gestation, which is similar to previous reports in humans. DISCUSSION/SIGNIFICANCE OF IMPACT: These results suggest that dosing dams with 1–3 mg/kg/d norbuprenorphine produces maternal blood concentrations and withdrawal severity similar to those previously reported in humans. This provides evidence that, at these doses, this model is useful for testing hypotheses about norbuprenorphine that are applicable to NAS in humans.
OBJECTIVES/SPECIFIC AIMS: Test the hypothesis that neprilysin inhibition with sacubitril/valsartan will increase endogenous intact GLP-1 after a mixed meal compared with valsartan. METHODS/STUDY POPULATION: Adults 18–80 years with pre-diabetes or type 2 diabetes and elevated blood pressure. RESULTS/ANTICIPATED RESULTS: We anticipate a higher intact GLP-1 area under the curve after the meal when subjects receive sacubitril/valsartan compared with valsartan. DISCUSSION/SIGNIFICANCE OF IMPACT: Neprilysin inhibition may be a target for anti-diabetes therapy by decreasing degradation of GLP-1.
Directors of Student Teaching from the Western Canadian provinces participated in focus groups about the realities and decision-making processes around practicum for preservice teachers with disabilities. Results showed current standards, when applied rigidly, served to reify a static, homogenous, and unrealistic definition of ‘teacher’ that marginalises preservice teachers with disabilities. However, the effort of directors to challenge this notion of ‘teacher’, framed within the constructionist model of disability, gives hope for a more inclusive future teaching force.