Objective: To investigate the timing and routes of contamination of the rooms of patients newly admitted to the hospital.
Design: Observational cohort study and simulations of pathogen transfer.
Setting: A Veterans’ Affairs hospital.
Participants: Patients newly admitted to the hospital with no known carriage of healthcare-associated pathogens.
Methods: Interactions between the participants and personnel or portable equipment were observed, and cultures of high-touch surfaces, floors, bedding, and patients’ socks and skin were collected for up to 4 days. Cultures were processed for Clostridioides difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE). Simulations were conducted with bacteriophage MS2 to assess the plausibility of transfer from contaminated floors to high-touch surfaces and the effectiveness of wearing slippers in reducing that transfer.
Results: Environmental cultures became positive for at least 1 pathogen in 10 (59%) of the 17 rooms, with cultures positive for MRSA, C. difficile, and VRE in the rooms of 10 (59%), 2 (12%), and 2 (12%) participants, respectively. For all 14 instances of pathogen detection, the initial site of recovery was the floor, followed in a subset of patients by detection on sock bottoms, bedding, and high-touch surfaces. In simulations, wearing slippers over hospital socks dramatically reduced transfer of bacteriophage MS2 from the floor to hands and high-touch surfaces.
Conclusions: Floors may be an underappreciated source of pathogen dissemination in healthcare facilities. Simple interventions such as having patients wear slippers could potentially reduce the risk of pathogen transfer from floors to hands and high-touch surfaces.
Background: Gloves and gowns are used during patient care to reduce contamination of personnel and prevent pathogen transmission.
Objective: To determine whether the use of gowns adds a substantial benefit over gloves alone in preventing patient-to-patient transfer of a viral DNA surrogate marker.
Methods: In total, 30 source patients had 1 cauliflower mosaic virus surrogate marker applied to their skin and clothing and a second applied to their bed rail and bedside table. Personnel caring for the source patients were randomized to wear gloves, gloves plus cover gowns, or no barrier. Interactions with up to 7 subsequent patients were observed, and the percentages of transfer of the DNA markers were compared among the 3 groups.
Results: In comparison to the no-barrier group (57.8% transfer of 1 or both markers), there were significant reductions in transfer of the DNA markers in the gloves group (31.1% transfer; odds ratio [OR], 0.16; 95% confidence interval [CI], 0.02–0.73) and the gloves-plus-gown group (25.9% transfer; OR, 0.11; 95% CI, 0.01–0.51). The addition of a cover gown to gloves during the interaction with the source patient did not significantly reduce transfer of the DNA marker (P = .53). During subsequent patient interactions, transfer of the DNA markers was significantly reduced if gloves plus gowns were worn and if hand hygiene was performed (P < .05).
Conclusions: Wearing gloves or gloves plus gowns reduced the frequency of patient-to-patient transfer of a viral DNA surrogate marker. The use of gloves plus gowns during interactions with the source patient did not reduce transfer in comparison to gloves alone.
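As an aside on the statistic reported above: an unadjusted odds ratio and Wald 95% confidence interval can be computed directly from a 2 × 2 transfer table. The Python sketch below uses hypothetical counts, not the study’s data; the reported ORs presumably come from models that additionally account for clustering of interactions within personnel.

```python
import math

# Hypothetical 2 x 2 counts (NOT the study's data):
# rows = barrier condition, columns = marker transferred yes/no.
a, b = 14, 31   # gloves group: transferred, not transferred
c, d = 26, 19   # no-barrier group: transferred, not transferred

or_hat = (a * d) / (b * c)                # unadjusted odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # Wald SE of log(OR)
lo = math.exp(math.log(or_hat) - 1.96 * se)
hi = math.exp(math.log(or_hat) + 1.96 * se)
print(f"OR = {or_hat:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```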
The Cognitive Battery of the National Institutes of Health Toolbox (NIH-TB) is a collection of assessments that have been adapted and normed for administration across the lifespan and is increasingly used in large-scale population-level research. However, despite increasing adoption in longitudinal investigations of neurocognitive development, and growing recommendations that the Toolbox be used in clinical applications, little is known about the long-term temporal stability of the NIH-TB, particularly in youth.
The present study examined the long-term temporal reliability of the NIH-TB in a large cohort of youth (9–15 years old) recruited across two data collection sites. Participants were invited to complete testing annually for 3 years.
Reliability was generally low-to-moderate, with intraclass correlation coefficients ranging between 0.31 and 0.76 for the full sample. There were multiple significant differences between sites, with one site generally exhibiting stronger temporal stability than the other.
Reliability of the NIH-TB Cognitive Battery was lower than expected given early work examining shorter test-retest intervals. Moreover, there were very few instances of tests meeting stability requirements for use in research; none of the tests exhibited adequate reliability for use in clinical applications. Reliability is paramount to establishing the validity of the tool, thus the constructs assessed by the NIH-TB may vary over time in youth. We recommend further refinement of the NIH-TB Cognitive Battery and its norming procedures for children before further adoption as a neuropsychological assessment. We also urge researchers who have already employed the NIH-TB in their studies to interpret their results with caution.
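For reference, the intraclass correlation behind such estimates can be sketched as a two-way consistency ICC for single measures, ICC(3,1), computed from a subjects-by-sessions score matrix. This is a minimal illustration on synthetic data; the study itself may have used a different ICC variant.

```python
import numpy as np

def icc_3_1(x):
    """Two-way mixed, consistency, single-measures ICC(3,1).
    x: (n_subjects, k_sessions) array of test scores."""
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    resid = (x - x.mean(axis=1, keepdims=True)
               - x.mean(axis=0, keepdims=True) + grand)
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

rng = np.random.default_rng(0)
true_score = rng.normal(100, 15, size=(200, 1))          # stable trait
scores = true_score + rng.normal(0, 12, size=(200, 3))   # 3 annual sessions
print(f"ICC(3,1) = {icc_3_1(scores):.2f}")               # ~0.6 by construction
```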
Decisions to treat large-vessel occlusion with endovascular therapy (EVT) or intravenous alteplase depend on how physicians weigh benefits against risks when considering patients’ comorbidities. We explored EVT/alteplase decision-making by stroke experts in the setting of comorbidity/disability.
In an international multi-disciplinary survey, experts chose treatment approaches under current resources and under assumed ideal conditions for 10 of 22 randomly assigned case scenarios. Five included comorbidities (cancer, cardiac/respiratory/renal disease, mild cognitive impairment [MCI], physical dependence). We examined scenario/respondent characteristics associated with EVT/alteplase decisions using multivariable logistic regressions.
Among 607 physicians (38 countries), EVT was chosen less often in comorbidity-related scenarios (79.6% under current resources, 82.7% assuming ideal conditions) versus six “level-1A” scenarios for which EVT/alteplase was clearly indicated by current guidelines (91.1% and 95.1%, respectively, odds ratio [OR] [current resources]: 0.38, 95% confidence interval 0.31–0.47). However, EVT was chosen more often in comorbidity-related scenarios compared to all other 17 scenarios (79.6% versus 74.4% under current resources, OR: 1.34, 1.17–1.54). Responses favoring alteplase for comorbidity-related scenarios (e.g. 75.0% under current resources) were comparable to level-1A scenarios (72.2%) and higher than all others (60.4%). No comorbidity independently diminished EVT odds when considering all scenarios. MCI and dependence carried higher alteplase odds; cancer and cardiac/respiratory/renal disease had lower odds. Being older/female carried lower EVT odds. Relevant respondent characteristics included performing more EVT cases/year (higher EVT-, lower alteplase odds), practicing in East Asia (higher EVT odds), and in interventional neuroradiology (lower alteplase odds vs neurology).
Moderate-to-severe comorbidities did not consistently deter experts from EVT, suggesting equipoise about withholding EVT based on comorbidities. However, alteplase was often foregone when respondents chose EVT. Differences in decision-making by patient age/sex merit further study.
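A minimal sketch of the kind of multivariable logistic regression described in the methods above, fitted with statsmodels on synthetic survey responses. All variable names and coefficients are hypothetical, and a full analysis would additionally account for repeated responses per physician.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000  # hypothetical scenario-response records

# Synthetic stand-ins for scenario and respondent characteristics.
df = pd.DataFrame({
    "comorbidity": rng.integers(0, 2, n),      # scenario includes a comorbidity
    "patient_age": rng.normal(70, 10, n),      # scenario patient age
    "evt_cases_per_year": rng.poisson(30, n),  # respondent procedural volume
})
linpred = (0.5 - 0.6 * df["comorbidity"]
           - 0.03 * (df["patient_age"] - 70)
           + 0.02 * df["evt_cases_per_year"])
df["chose_evt"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

X = sm.add_constant(df[["comorbidity", "patient_age", "evt_cases_per_year"]])
fit = sm.Logit(df["chose_evt"], X).fit(disp=False)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% CIs on the OR scale
```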
The false codling moth (FCM), Thaumatotibia leucotreta (Lepidoptera: Tortricidae), is an insect pest that poses an important threat to the production and marketing of a wide range of agricultural crops in the African-Caribbean-Pacific (ACP) countries. The FCM not only reduces the yield and quality of the crop but, as a quarantine insect pest, also restricts the trade of susceptible agricultural produce on the international market. In addition, little research has been conducted in the ACP countries on the bio-ecology and sustainable management of this pest, especially on vegetables for export. Thus, action-oriented research aimed at understanding the bio-ecology of this important pest is essential to achieve effective management. Various management interventions against this pest have been used in some parts of the world, especially in South Africa on citrus. Currently, farm sanitation is regarded as the key management strategy. Exploring and improving other interventions, such as the sterile insect technique, monitoring and mass trapping of male moths, augmentative biological control, bio-pesticides, protected cultivation and cold treatment, may help to mitigate the expansion of FCM into other countries, especially in the European and Mediterranean Plant Protection Organization region, where it has been a regulated insect pest since 2014. This review discusses the bio-ecology of FCM and highlights some of the challenges and opportunities for its effective management, along with the implications for international trade, especially the export of chillies from the ACP countries into the European Union market, which is subject to strict phytosanitary regulations.
Alluvial mineral sands rank among the most complex subjects for mineral characterization due to the diverse range of minerals present in the sediments, which may collectively contain a daunting number of elements (>20) in major or minor concentrations (>1 wt%). To comprehensively characterize the phase abundance and chemistry of these complex mineral specimens, a method was developed using hyperspectral x-ray and cathodoluminescence mapping in an electron probe microanalyser (EPMA), coupled with automated cluster analysis and quantitative analysis of clustered x-ray spectra. This method proved successful in identifying and quantifying over 40 phases from mineral sand specimens, including unexpected phases with low modal abundance (<0.1%). The standard-based quantification method measured compositions in agreement with expected stoichiometry, with elemental detection limits in the range of <10–1,000 ppm, depending on phase abundance, and proved reliable even for challenging mineral species, such as the multi-rare earth element (REE) bearing mineral xenotime [(Y,REE)PO4] for which 24 elements were analyzed, including 12 overlapped REEs. The mineral identification procedure was also capable of characterizing mineral groups that exhibit significant compositional variability due to the substitution of multiple elements, such as garnets (Mg, Ca, Fe, Mn, Cr), pyroxenes (Mg, Ca, Fe), and amphiboles (Na, Mg, Ca, Fe, Al).
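The automated cluster-analysis step can be pictured with a short sketch: per-pixel spectra from a hyperspectral data cube are grouped by k-means, and each cluster’s mean spectrum serves as a candidate phase signature for subsequent quantification. The code below runs on synthetic data with scikit-learn; the abstract does not specify the actual EPMA software pipeline, so this is illustrative only.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Synthetic hyperspectral cube: 64 x 64 pixels, 256 x-ray energy channels,
# built from three hypothetical phase spectra plus counting noise.
phases = rng.gamma(2.0, 1.0, size=(3, 256))
labels_true = rng.integers(0, 3, size=(64, 64))
cube = rng.poisson(50 * phases[labels_true]).astype(float)

# Flatten to (n_pixels, n_channels) and normalise each spectrum to unit sum
# so clustering responds to spectral shape rather than total count rate.
spectra = cube.reshape(-1, cube.shape[-1])
spectra /= spectra.sum(axis=1, keepdims=True)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(spectra)
phase_map = km.labels_.reshape(64, 64)   # phase assignment per pixel
mean_spectra = km.cluster_centers_       # one candidate spectrum per phase
print("modal abundance (%):",
      np.round(100 * np.bincount(km.labels_) / km.labels_.size, 1))
```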
An improved understanding of diagnostic and treatment practices for patients with rare primary mitochondrial disorders can support benchmarking against guidelines and establish priorities for evaluative research. We aimed to describe physician care for patients with mitochondrial diseases in Canada, including variation in care.
We conducted a cross-sectional survey of Canadian physicians involved in the diagnosis and/or ongoing care of patients with mitochondrial diseases. We used snowball sampling to identify potentially eligible participants, who were contacted by mail up to five times and invited to complete a questionnaire by mail or internet. The questionnaire addressed: personal experience in providing care for mitochondrial disorders; diagnostic and treatment practices; challenges in accessing tests or treatments; and views regarding research priorities.
We received 58 survey responses (52% response rate). Most respondents (83%) reported spending 20% or less of their clinical practice time caring for patients with mitochondrial disorders. We identified important variation in diagnostic care, although assessments frequently reported as diagnostically helpful (e.g., brain magnetic resonance imaging, MRI/MR spectroscopy) were also recommended in published guidelines. Approximately half (49%) of participants would recommend “mitochondrial cocktails” for all or most patients, but we identified variation in responses regarding specific vitamins and cofactors. A majority of physicians recommended studies on the development of effective therapies as the top research priority.
While Canadian physicians’ views about diagnostic care and disease management are aligned with published recommendations, important variations in care reflect persistent areas of uncertainty and a need for empirical evidence to support and update standard protocols.
The ALMA twenty-six arcmin2 survey of GOODS-S at one millimeter (ASAGAO) is a deep (1σ ∼ 61 μJy/beam), wide-area (26 arcmin2) survey of a contiguous field at 1.2 mm. By combining with archival data, we obtained a deeper map of the same region (1σ ∼ 30 μJy/beam; synthesized beam size 0.59″ × 0.53″), providing the largest sample of sources (25 sources at 5σ, 45 sources at 4.5σ) among ALMA blank-field surveys. The median redshift of the 4.5σ sources is 2.4. The number counts show that 52% of the extragalactic background light at 1.2 mm is resolved into discrete sources. We construct IR luminosity functions (LFs) at z = 1–3 and constrain the faint end of the LF at 2 < z < 3. The LFs are consistent with previous results based on other ALMA and SCUBA-2 observations, suggesting a positive luminosity evolution and negative density evolution.
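The resolved fraction quoted above is, in essence, the summed flux of detected sources divided by the total extragalactic background light over the map area. The sketch below shows the arithmetic with hypothetical source fluxes and an assumed background surface brightness, not the survey’s actual values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 1.2 mm fluxes (mJy) for 45 sources detected at >= 4.5 sigma.
source_fluxes_mjy = rng.uniform(0.2, 1.5, size=45)

area_arcmin2 = 26.0          # survey area
ebl_jy_per_arcmin2 = 0.005   # assumed EBL surface brightness (illustrative)

resolved = source_fluxes_mjy.sum() / 1000.0     # mJy -> Jy
total_ebl = ebl_jy_per_arcmin2 * area_arcmin2   # Jy in the mapped area
print(f"resolved fraction = {100 * resolved / total_ebl:.0f}%")
```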
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study located 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the Neck Disability Index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly higher proportion of patients with moderate/severe pain (45.8%) improved to mild/no pain, whereas 27.2% of those with mild/no pain worsened to moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or ⩽3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
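For 1:1 matched case–control data with a binary exposure, the conditional maximum-likelihood odds ratio reduces to the ratio of discordant pairs. The sketch below illustrates this with hypothetical pair counts; the study’s adjusted OR came from a multivariable conditional logistic model rather than this simple estimator.

```python
import math

# Hypothetical discordant-pair counts for antibiotic exposure (NOT study data):
b = 25  # pairs where the case was exposed but the matched control was not
c = 5   # pairs where the control was exposed but the case was not

or_matched = b / c             # conditional ML estimate for 1:1 matching
se = math.sqrt(1 / b + 1 / c)  # SE of log(OR) based on discordant pairs
lo = math.exp(math.log(or_matched) - 1.96 * se)
hi = math.exp(math.log(or_matched) + 1.96 * se)
print(f"matched OR = {or_matched:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```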
Soldiers’ operational performance is determined by their fitness, nutritional status, quality of rest/recovery, and remaining injury/illness free. Understanding large fluctuations in nutritional status during operations is critical to safeguarding health and well-being. There are limited data world-wide describing the effect of extreme climate change on nutrient profiles. This study investigated the effect of hot-dry deployments on vitamin D status (assessed from 25-hydroxyvitamin D (25(OH)D) concentration) of young, male, military volunteers. Two data sets are presented (pilot study, n 37; main study, n 98), examining serum 25(OH)D concentrations before and during 6-month summer operational deployments to Afghanistan (March to October/November). Body mass, percentage of body fat, dietary intake and serum 25(OH)D concentrations were measured. In addition, parathyroid hormone (PTH), adjusted Ca and albumin concentrations were measured in the main study to better understand 25(OH)D fluctuations. Body mass and fat mass (FM) losses were greater for early (pre- to mid-) deployment compared with late (mid- to post-) deployment (P<0·05). Dietary intake was well-maintained despite high rates of energy expenditure. A pronounced increase in 25(OH)D was observed between pre- (March) and mid-deployment (June) (pilot study: 51 (sd 20) v. 212 (sd 85) nmol/l, P<0·05; main study: 55 (sd 22) v. 167 (sd 71) nmol/l, P<0·05) and remained elevated post-deployment (October/November). In contrast, PTH was highest pre-deployment, decreasing thereafter (main study: 4·45 (sd 2·20) v. 3·79 (sd 1·50) pmol/l, P<0·05). The typical seasonal cycling of vitamin D appeared exaggerated in this active male population undertaking an arduous summer deployment. Further research is warranted, as such large seasonal vitamin D fluctuations may be detrimental to bone health in the longer term.
Dementia is a leading cause of morbidity and mortality without pharmacologic prevention or cure. Mounting evidence suggests that adherence to a Mediterranean dietary pattern may slow cognitive decline; such adherence is therefore important to characterise in at-risk cohorts. Thus, we determined the reliability and validity of the Mediterranean Diet and Culinary Index (MediCul), a new tool, among community-dwelling individuals with mild cognitive impairment (MCI). A total of sixty-eight participants (66 % female) aged 75·9 (sd 6·6) years, from the Study of Mental and Resistance Training MCI cohort, completed the fifty-item MediCul at two time points, followed by a 3-d food record (FR). MediCul test–retest reliability was assessed using intra-class correlation coefficients (ICC), Bland–Altman plots and κ agreement within seventeen dietary element categories. Validity was assessed against the FR using the Bland–Altman method and nutrient trends across MediCul score tertiles. The mean MediCul score was 54·6/100·0, with few participants reaching thresholds for key Mediterranean foods. MediCul had very good test–retest reliability (ICC=0·93, 95 % CI 0·884, 0·954, P<0·0001) with fair-to-almost-perfect agreement for classifying elements within the same category. Validity was moderate with no systematic bias between methods of measurement, according to the regression coefficient (y=−2·30+0·17x) (95 % CI −0·027, 0·358; P=0·091). MediCul over-estimated the mean FR score by 6 %, with limits of agreement being under- and over-estimated by 11 and 23 %, respectively. Nutrient trends were significantly associated with increased MediCul scoring, consistent with a Mediterranean pattern. MediCul provides reliable and moderately valid information about Mediterranean diet adherence among older individuals with MCI, with potential application in future studies assessing relationships between diet and cognitive function.
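The Bland–Altman method used above assesses agreement by examining, for each participant, the difference between the two measurements: the bias is the mean difference and the limits of agreement are bias ± 1.96 sd. A minimal sketch on synthetic paired scores (not the study’s data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic paired scores (NOT study data): food-record score vs. tool score.
food_record = rng.normal(55, 10, 68)
medicul = food_record + rng.normal(3, 6, 68)  # tool over-estimates slightly

diff = medicul - food_record
bias = diff.mean()                  # mean difference between methods
loa = 1.96 * diff.std(ddof=1)       # half-width of the limits of agreement
print(f"bias = {bias:.1f}; limits of agreement = "
      f"{bias - loa:.1f} to {bias + loa:.1f}")
```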
The Zika virus was largely unknown to many health care systems before the outbreak of 2015. The unique public health threat posed by the Zika virus and the evolving understanding of its pathology required continuous communication between a health care delivery system and a local public health department. By leveraging an existing relationship, NYC Health+Hospitals worked closely with New York City Department of Health and Mental Hygiene to ensure that Zika-related processes and procedures within NYC Health+Hospitals facilities aligned with the most current Zika virus guidance. Support given by the public health department included prenatal clinical and laboratory support and the sharing of data on NYC Health+Hospitals Zika virus screening and testing rates, thus enabling this health care delivery system to make informed decisions and practices. The close coordination, collaboration, and communication between the health care delivery system and the local public health department examined in this article demonstrate the importance of working together to combat a complex public health emergency and how this relationship can serve as a guide for other jurisdictions to optimize collaboration between external partners during major outbreaks, emerging threats, and disasters that affect public health. (Disaster Med Public Health Preparedness. 2018;12:689-691)
Arthropod communities in the tropics are increasingly impacted by rapid changes in land use. Because species showing distinct seasonal patterns of activity are thought to be at higher risk of climate-related extirpation, global warming is generally considered a lower threat to arthropod biodiversity in the tropics than in temperate regions. To examine changes associated with land use and weather variables in tropical arthropod communities, we deployed Malaise traps at three major anthropogenic forest types (secondary reserve forest, oil palm forest, and urban ornamental forest (UOF)) in Peninsular Malaysia and collected arthropods continuously for 12 months. We used metabarcoding protocols to characterize the diversity within weekly samples. We found that changes in the composition of arthropod communities were significantly associated with maximum temperature in all three forests, but the shifts in the UOF were reversed compared with the other forests. This suggests that arthropods in the forests of Peninsular Malaysia face a double threat: community shifts and biodiversity loss due to the exploitation and disturbance of forests, which in turn place species at further risk from global warming. We highlight the positive feedback between land use and temperature, which poses a threat to arthropod communities and has further implications for ecosystem functioning and human well-being. Consequently, conservation and mitigation plans are urgently needed.