Enteric illness outbreaks are complex events; consequently, outbreak investigators use many different hypothesis generation methods depending on the situation. This scoping review was conducted to describe methods used to generate hypotheses during enteric illness outbreak investigations. The search included five databases and grey literature for articles published between 1 January 2000 and 2 May 2015. Relevance screening and article characterisation were conducted by two independent reviewers using pretested forms. There were 903 outbreaks that described hypothesis generation methods and 33 papers that focused on the evaluation of hypothesis generation methods. The most common hypothesis generation methods described were analytic studies (64.8%), descriptive epidemiology (33.7%), food or environmental sampling (32.8%) and facility inspections (27.9%). The least common methods included the use of a single interviewer (0.4%) and investigation of outliers (0.4%). Most studies reported using two or more methods to generate hypotheses (81.2%), with 29.2% reporting four or more. The use of multiple hypothesis generation methods both within and between outbreaks highlights the complexity of enteric illness outbreak investigations. Future research should examine the effectiveness of each method and the contexts in which each most efficiently leads to source identification.
To describe an outbreak of bacteremia caused by vancomycin-sensitive Enterococcus faecalis (VSEfe).
A retrospective case-control investigation with molecular typing by whole-genome sequencing (WGS).
A tertiary-care neonatal unit in Melbourne, Australia.
Risk factors for 30 consecutive neonates with VSEfe bacteremia from June 2011 to December 2014 were analyzed using a case-control study. Controls were neonates matched for gestational age, birth weight, and year of birth. Isolates were typed using WGS, and multilocus sequence types (MLST) were determined.
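As an illustration of the matched case-control design described above, the sketch below fits a conditional logistic regression grouped by matched set. This is a minimal sketch in Python using statsmodels; the exposure columns (line_days, tpn) and all data are invented for the example, not the study's variables.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(0)
n_sets = 30                                      # one matched set per case
df = pd.DataFrame({
    "set_id": np.repeat(np.arange(n_sets), 3),   # 1 case + 2 matched controls
    "case": np.tile([1, 0, 0], n_sets),          # outcome: VSEfe bacteremia
    "line_days": rng.poisson(10, n_sets * 3),    # hypothetical exposure
    "tpn": rng.integers(0, 2, n_sets * 3),       # hypothetical exposure
})

# Matching (gestational age, birth weight, year) is encoded via set_id groups
model = ConditionalLogit(df["case"], df[["line_days", "tpn"]],
                         groups=df["set_id"])
res = model.fit()
print(np.exp(res.params))                        # odds ratios per risk factor
```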
Bacteremia in case patients occurred at a median of 23.5 days after delivery (interquartile range, 14.9–35.8). Previously described risk factors for nosocomial bacteremia did not contribute to excess risk for VSEfe. WGS typing designated 43% of isolates as ST179 and identified 14 other sequence types, indicating a polyclonal outbreak. A multimodal intervention that included education, insertion checklists, guidelines on maintenance and access of central lines, adjustments to the late-onset sepsis antibiotic treatment, and the introduction of diaper bags for disposal of soiled diapers handled inside the bed led to termination of the outbreak.
Typing using WGS identified this outbreak as predominantly nonclonal and therefore not due to cross-transmission. A multimodal approach was therefore adopted to reduce the incidence of VSEfe bacteremia.
Two-dimensional particle-in-cell (PIC) simulations have been used to investigate the interaction between a laser pulse and a foil exposed to a strong external longitudinal magnetic field. Compared with the case without the external magnetic field, the proton beam divergence in the radiation pressure acceleration (RPA) regime is remarkably improved when the field is applied, owing to the restriction of electron transverse expansion. During the RPA process without the field, the foil develops into a typical bubble-like shape resulting from the combined action of the transverse ponderomotive force and instabilities; with the magnetic field applied, the foil instead assumes a cone-like shape. The dependence of proton divergence on the magnetic field strength has been studied, and an optimal magnetic field of nearly 60 kT is found in these simulations.
To determine the effectiveness and ease of use of an electronic reminder device in reducing urinary catheterization duration.
A randomized controlled trial with a cross-sectional anonymous online survey and focus group.
Ten wards in an Australian hospital.
All hospitalized patients with a urinary catheter.
An electronic reminder system, the CATH TAG, applied to urinary catheter bags to prompt removal of urinary catheters.
Catheterization duration and perceptions of nurses about the ease of use.
A Cox proportional hazards model was used to assess the rate of removal of catheters. A phenomenological approach underpinned data collection and analysis methods associated with the focus group.
In total, 1,167 patients with a urinary catheter were included. The mean durations in the control and intervention phases were 5.51 days (95% confidence interval [CI], 4.9–6.2) and 5.08 days (95% CI, 4.6–5.6), respectively. For patients who had a CATH TAG applied, the hazard ratio (HR) was 1.02 (95% CI, 0.91–1.14; P = .75). In a subgroup analysis excluding patients in an intensive care unit (ICU), use of the CATH TAG was associated with a 23% decrease in mean duration, from 5.00 days (95% CI, 4.44–5.56) to 3.84 days (95% CI, 3.47–4.21). Overall, 82 nurses completed a survey and 5 nurses participated in a focus group. Responses regarding the device were largely positive, and benefits for patient care were identified.
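A minimal sketch of the time-to-removal analysis described above, assuming the lifelines library; the column names and generated data are illustrative, not the trial's actual dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1167
df = pd.DataFrame({
    "duration": rng.exponential(5.3, n),  # days until catheter removal
    "removed": np.ones(n, dtype=int),     # event indicator (all removals observed)
    "cath_tag": rng.integers(0, 2, n),    # 1 = CATH TAG applied
    "icu": rng.integers(0, 2, n),         # ICU admission indicator
})

# Cox proportional hazards model of the rate of catheter removal
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="removed")
cph.print_summary()                       # an HR near 1 mirrors the null result

# Subgroup analysis excluding ICU patients, as reported in the abstract
sub = df[df["icu"] == 0].drop(columns="icu")
cph.fit(sub, duration_col="duration", event_col="removed")
cph.print_summary()
```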
The CATH TAG did not reduce the duration of catheterization, but potential benefits in patients outside the ICU were identified. Electronic reminders may be useful to aid prompt removal of urinary catheters in the non-ICU hospital setting.
Rabies is one of the major public health problems in China, and its mortality rate remains the highest among all notifiable infectious diseases. A meta-analysis was conducted to investigate the post-exposure prophylaxis (PEP) vaccination rate and risk factors for human rabies in mainland China. The PubMed, Web of Science, Chinese National Knowledge Infrastructure, Chinese Science and Technology Periodical and Wanfang databases were searched for articles on rabies vaccination status published between 2007 and 2017. In total, 10 174 human rabies cases from 136 studies were included in this meta-analysis. Approximately 97.2% (95% confidence interval (CI) 95.1–98.7%) of rabies cases occurred in rural areas and 72.6% (95% CI 70.0–75.1%) occurred among farmers. Overall, the vaccination rate among the reported human rabies cases was 15.4% (95% CI 13.7–17.4%). However, among vaccinated individuals, 85.5% (95% CI 79.8–83.4%) did not complete the vaccination regimen. In a subgroup analysis, the PEP vaccination rate in the eastern region (18.8%, 95% CI 15.9–22.1%) was higher than that in the western region (13.3%, 95% CI 11.1–15.8%), and this rate decreased after 2007. Approximately 68.9% (95% CI 63.6–73.8%) of rabies cases experienced category-III exposures, but their PEP vaccination rate was only 27.0% (95% CI 14.4–44.9%) and only 6.1% (95% CI 4.4–8.4%) received rabies immunoglobulin. Together, these results suggest that the PEP vaccination rate among human rabies cases in mainland China is low. Therefore, standardised treatment of dog bites and vaccination programmes need to be further strengthened, particularly in rural areas.
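As a sketch of how such rates are pooled across studies, the following implements DerSimonian-Laird random-effects pooling of proportions on the logit scale; the per-study counts are invented for illustration (the review pooled 136 studies).

```python
import numpy as np

events = np.array([12, 30, 8, 45, 20])      # vaccinated rabies cases (assumed)
totals = np.array([80, 150, 70, 300, 110])  # rabies cases per study (assumed)

p = events / totals
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (totals - events)    # variance of the logit

# Fixed-effect weights, Q statistic, and DL between-study variance tau^2
w = 1 / var
q = np.sum(w * (logit - np.sum(w * logit) / np.sum(w)) ** 2)
tau2 = max(0.0, (q - (len(p) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooled estimate, back-transformed to a proportion
w_re = 1 / (var + tau2)
pooled_logit = np.sum(w_re * logit) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
ci = pooled_logit + np.array([-1.96, 1.96]) * se
print(1 / (1 + np.exp(-pooled_logit)), 1 / (1 + np.exp(-ci)))  # rate, 95% CI
```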
The response of soil microbial communities to soil quality changes is a sensitive indicator of soil ecosystem health. The current work investigated soil microbial communities under different fertilization treatments in a 31-year experiment using the phospholipid fatty acid (PLFA) profile method. The experiment consisted of five fertilization treatments: without fertilizer input (CK), chemical fertilizer alone (MF), rice (Oryza sativa L.) straw residue and chemical fertilizer (RF), low manure rate and chemical fertilizer (LOM), and high manure rate and chemical fertilizer (HOM). Soil samples were collected from the plough layer, and the results indicated that the content of PLFAs increased in all fertilization treatments compared with the control. The iC15:0 fatty acids increased significantly in the MF treatment but decreased in RF, LOM and HOM, while aC15:0 fatty acids increased in these three treatments. Principal component (PC) analysis was conducted on the 21 PLFAs detected in all treatments to determine factors defining soil microbial community structure: the first and second PCs explained 89.8% of the total variance. All unsaturated and cyclopropyl PLFAs except C12:0 and C15:0 were highly weighted on the first PC. The first and second PCs also explained 87.1% of the total variance among the fertilization treatments. There was no difference in the first and second PCs between the RF and HOM treatments. The results indicated that long-term combined application of straw residue or organic manure with chemical fertilizer improved soil microbial community structure more than mineral fertilizer alone in double-cropped paddy fields in Southern China.
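The PCA step described above can be illustrated with a short sketch; here the 21-PLFA data matrix is simulated and the replicate layout is assumed, so the outputs will not reproduce the reported variance figures.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
treatments = ["CK", "MF", "RF", "LOM", "HOM"]
X = rng.normal(size=(5 * 3, 21))            # 5 treatments x 3 replicates, 21 PLFAs

X_std = StandardScaler().fit_transform(X)   # put PLFAs on a common scale
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)           # sample coordinates on PC1/PC2

print(pca.explained_variance_ratio_)        # study: PC1+PC2 explained ~89.8%
print(pca.components_)                      # loadings: which PLFAs weight each PC
```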
Our objective was to identify predictors of severe acute respiratory infection in hospitalised patients and to understand the impact of vaccination and neuraminidase inhibitor administration on severe influenza. We analysed data from a study evaluating influenza vaccine effectiveness in two Michigan hospitals during the 2014–2015 and 2015–2016 influenza seasons. Adults admitted to the hospital with an acute respiratory infection were eligible. Through patient interview and medical record review, we evaluated potential risk factors for severe disease, using ICU admission, 30-day readmission, and hospital length of stay (LOS) as outcomes. In total, 216 of 1119 participants had PCR-confirmed influenza. Frailty score, Charlson score and tertile of prior-year healthcare visits were associated with LOS. Charlson score >2 (OR 1.5 (1.0–2.3)) was associated with ICU admission. The highest tertile of prior-year visits (OR 0.3 (0.2–0.7)) was associated with decreased ICU admission. Increasing tertile of visits (OR 1.5 (1.2–1.8)) was associated with 30-day readmission. Frailty and prior-year healthcare visits were associated with 30-day readmission among influenza-positive participants. Neuraminidase inhibitors were associated with decreased LOS among vaccinated participants with influenza A (HR 1.6 (1.0–2.4)). Overall, frailty and lack of prior-year healthcare visits were predictors of disease severity. Neuraminidase inhibitors were associated with reduced severity among vaccine recipients.
Dengue is the fastest spreading mosquito-transmitted disease in the world. In China, Guangzhou City is believed to be the most important epicenter of dengue outbreaks, although the transmission patterns are still poorly understood. We developed an autoregressive integrated moving average model incorporating external regressors to examine the association between the monthly number of locally acquired dengue infections and imported cases, mosquito densities, temperature and precipitation in Guangzhou. In multivariate analysis, imported cases and minimum temperature (both at lag 0) were associated with the number of locally acquired infections (P < 0.05). This multivariate model performed best, with the lowest fitting root mean squared error (RMSE) (0.7520), AIC (393.7854) and test RMSE (0.6445), and it performed best in validation for outbreak detection, with a sensitivity of 1.0000, a specificity of 0.7368 and a consistency rate of 0.7917. Our findings suggest that imported cases and minimum temperature are two key determinants of local dengue transmission in Guangzhou. The modelling method can be used to predict dengue transmission in non-endemic countries and to inform dengue prevention and control strategies.
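A minimal sketch of the class of model described (ARIMA with external regressors), using the SARIMAX implementation in statsmodels; the model order, series, and regressor values are placeholders rather than the fitted Guangzhou model.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(7)
idx = pd.date_range("2006-01-01", periods=120, freq="MS")   # monthly series
local = pd.Series(rng.poisson(3, 120).astype(float), index=idx)
exog = pd.DataFrame({
    "imported_cases": rng.poisson(2, 120),
    "min_temperature": 18 + 8 * np.sin(np.arange(120) * 2 * np.pi / 12),
}, index=idx)

# Train/test split, fit, and out-of-sample RMSE, as in the abstract
train, test = slice(None, 96), slice(96, None)
model = SARIMAX(local.iloc[train], exog=exog.iloc[train], order=(1, 0, 1))
res = model.fit(disp=False)
pred = res.forecast(steps=24, exog=exog.iloc[test])
rmse = np.sqrt(np.mean((pred.values - local.iloc[test].values) ** 2))
print(res.aic, rmse)
```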
Chilling injury is an important natural stress that can threaten cotton production, especially at the sowing and seedling stages in early spring; improving chilling tolerance at these stages is therefore important for cotton production. The current work examines the potential for glycine betaine (GB) seed treatment to increase the chilling tolerance of cotton at the seedling stage. Germination under cold stress was increased significantly by GB treatment. Under low temperature, the leaves of seedlings from treated seeds exhibited a higher net photosynthetic rate (PN), higher antioxidant enzyme activity (superoxide dismutase, ascorbate peroxidase and catalase), lower hydrogen peroxide (H2O2) content and less damage to the cell membrane. Enzyme activity was correlated negatively with H2O2 content and the degree of cell membrane damage but positively with GB content. The results suggest that although GB was applied only to the seed, its beneficial effect persisted through germination to at least the four-leaf seedling stage. Soaking seeds in GB is therefore a practical means of improving chilling resistance at the seedling stage in agricultural production.
Central line-associated bloodstream infections (CLABSIs) in intensive care units (ICUs) result in poor clinical outcomes and increased costs. Although frequently regarded as preventable, infection risk may be influenced by non-modifiable factors. The objectives of this study were to evaluate organisational factors associated with CLABSI in Victorian ICUs to determine the nature and relative contribution of modifiable and non-modifiable risk factors. Data captured by the Australian and New Zealand Intensive Care Society regarding ICU-admitted patients and resources were linked to CLABSI surveillance data collated by the Victorian Healthcare Associated Infection Surveillance System between 1 January 2010 and 31 December 2013. Accepted CLABSI surveillance methods were applied and hospital/patient characteristics were classified as ‘modifiable’ and ‘non-modifiable’, enabling longitudinal Poisson regression modelling of CLABSI risk. In total, 26 ICUs were studied. Annual CLABSI rates were 1·72, 1·37, 1·00 and 0·93/1000 CVC days for 2010–2013. Of the non-modifiable factors, the number of non-invasively ventilated patients standardised to total ICU bed days was independently associated with infection (RR 1·07; 95% CI 1·01–1·13; P = 0·030). Modelling of modifiable risk factors demonstrated that a policy of mandatory ultrasound guidance for central venous catheter (CVC) localisation (RR 0·51; 95% CI 0·37–0·70; P < 0·001) and a greater number of sessional specialist full-time equivalents (RR 0·52; 95% CI 0·29–0·93; P = 0·027) were independently associated with protection against infection. Modifiable factors associated with reduced CLABSI risk therefore include ultrasound guidance for CVC localisation and increased availability of sessional medical specialists.
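The longitudinal Poisson modelling described above can be sketched as a generalized linear model with log(CVC-days) as an exposure offset. This is a minimal illustration in Python with statsmodels; the covariates and simulated data are assumptions, not the Victorian surveillance dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 26 * 4                                   # 26 ICUs x 4 years of observation
df = pd.DataFrame({
    "clabsi": rng.poisson(2, n),             # infection count per ICU-year
    "cvc_days": rng.integers(800, 4000, n),  # central-line days (exposure)
    "ultrasound_policy": rng.integers(0, 2, n),
    "sessional_fte": rng.uniform(0, 2, n),
})

# Poisson GLM with a log(CVC-days) offset models the infection *rate*
X = sm.add_constant(df[["ultrasound_policy", "sessional_fte"]])
model = sm.GLM(df["clabsi"], X, family=sm.families.Poisson(),
               offset=np.log(df["cvc_days"]))
res = model.fit()
print(np.exp(res.params))                    # rate ratios (RR) for each factor
```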
Aging is accompanied by cognitive decline that is exacerbated in older adults reporting extreme sleep durations. Social relationships can influence health outcomes and thus may moderate the association between sleep duration and cognitive function. The present study examines the moderating effects of marital status, household size, and social networks with friends and relatives on the sleep–cognition association among older adults.
Data (N = 4,169) came from the Social Isolation, Health, and Lifestyles Survey, a nationally representative survey of community-dwelling older Singaporeans (≥ 60 years). Sleep duration and social relationships were self-reported. Cognitive function was assessed with the Short Portable Mental Status Questionnaire.
Regression analysis revealed that the inverted U-shaped association between sleep duration and cognitive function was less pronounced among older adults who were married (vs. unmarried) and those who had stronger (vs. weaker) social networks. In contrast, it was more pronounced among individuals who had more (vs. fewer) household members.
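A hedged sketch of one common way to model an inverted U and its moderation: an OLS regression with a quadratic sleep term and interaction terms. The data are simulated and the variable names assumed, not the survey's actual measures.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 4169
sleep = rng.uniform(3, 12, n)
married = rng.integers(0, 2, n)
# Simulate a weaker inverted U for married respondents (assumed effect)
cognition = (-0.5 + 0.3 * married) * (sleep - 7.5) ** 2 \
    + 25 + rng.normal(0, 1.5, n)

df = pd.DataFrame({"sleep": sleep, "married": married, "cog": cognition})
# The quadratic term captures the U; interactions test the moderation
model = smf.ols("cog ~ sleep + I(sleep**2) + married "
                "+ married:sleep + married:I(sleep**2)", data=df).fit()
print(model.summary())   # a positive married:I(sleep**2) term flattens the U
```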
Being married and having stronger social networks may buffer against the negative cognitive impact of extreme sleep duration. But larger household size might imply more stress for older persons, and therefore strengthen the sleep duration–cognitive function association. We discuss the potential biological underpinnings and the policy implications of the findings. Although our findings are based on a large sample, replication studies using objective measures of sleep duration and other cognitive measures are needed.
Introduction: Recently, campaigns placing considerable emphasis on improving emergency department (ED) care by reducing unnecessary tests, treatments, and/or procedures have been initiated. This study explored how Canadian emergency physicians (EPs) conceptualize unnecessary care in the ED. Methods: An online 60-question survey was distributed to EP members of the Canadian Association of Emergency Physicians (CAEP) with valid emails. The survey explored respondents’ awareness of and support for initiatives to improve ED care (i.e., reduce unnecessary tests, treatments, and/or procedures) and asked respondents to define “unnecessary care” in the ED. Thematic qualitative analysis was performed on these responses to identify key themes and sub-themes and to explore variation among EPs’ definitions of unnecessary care. Results: A total of 324 surveys were completed (response rate: 18%); 300 provided free-text definitions of unnecessary care. Most commonly, unnecessary ED care was defined as: 1) performing tests, treatments, procedures, and/or consults that were not indicated or were potentially harmful (n=169); and/or 2) care that should have been provided in a non-emergent context for a non-urgent patient (n=143). Emergency physicians also cited system-level factors and system failures that result in ED presentations as forms of unnecessary care (n=69). They noted a distinction between providing necessary care for a non-urgent patient and performing inappropriate or non-evidence-based care. Finally, a tension emerged between descriptions of frustration with patient expectations (n=17) and/or non-ED referrals (n=24) for specific tests, treatments, and/or procedures, and the views of participants who asserted that “in a patient-centred care environment, no care is unnecessary” (Participant 50; n=12). Conclusion: Variation in the definition of unnecessary ED care is evident among EPs and illustrates that EPs’ conceptualization of unnecessary care is more nuanced than current campaigns addressing ED care improvements suggest. This may contribute to a perceived lack of uptake of or support for these initiatives. Further exploring EPs’ perceptions of these campaigns has the potential to improve EP engagement and to influence the language used by these programs.
Introduction: Choosing Wisely Canada® (CWC) launched in April 2012. Since then, the Emergency Medicine (EM) top-10 list of tests, treatments and procedures to avoid has been released, and initiatives are ongoing. This study explored CWC awareness and support among emergency physicians. Methods: A 60-question online survey was distributed to Canadian Association of Emergency Physicians (CAEP) members with valid e-mails. The survey collected information on demographics, awareness of and support for CWC, and physicians’ perceived barriers and facilitators to implementation. Descriptive statistics were performed in SPSS (Version 24). Results: Overall, 324 surveys were completed (response rate: 18%). Respondents were more often male (64%) and practiced at academic/tertiary care hospitals (56%) with mixed patient populations (74%) and annual ED volumes of >50,000 (70%). Most respondents (90%) were familiar with campaigns to improve care. Among these respondents, 98% were specifically familiar with CWC, and 73% felt these campaigns assisted them in providing high-quality care. Respondents felt that the top-5 EM recommendations were supported by high-quality evidence, particularly the first 4 recommendations (>90% each). The most frequently reported barriers to implementation were patients’ expectations/requests (33%), the possibility of missing severe condition(s) (20%), and requirements of ED consultations (12%). The most frequently identified potential facilitators were a strong evidence base for recommendations (37%), medico-legal protection for clinicians who adhere to guidelines (13%), and support from institutional leadership (11%). Conclusion: CWC is well known and supported by emergency physicians. Despite the low response rate, exploring the barriers and facilitators identified here could enhance CWC’s uptake in Canadian emergency departments.
Durations of nocturnal sleep and daytime naps influence the well-being of older adults, so it is essential to understand their determinants. However, much previous research has not assessed sleep duration and nap duration separately, and longitudinal data are lacking. This study aimed to examine the impact of demographic, psychosocial, and health factors, including ethnicity, social networks outside the household, smoking, and physical exercise, on sleep duration and nap duration among community-dwelling older adults.
Our study involved over 2,600 older adults (≥60 years) from a longitudinal, nationally representative survey – the Panel on Health and Ageing of Singaporean Elderly. Sleep and nap durations at Time 2 (two years later) were regressed on predictors measured at Time 1.
Time 2 short nocturnal sleep duration was predicted by Malay ethnicity (relative to Chinese and Indian), older age, lower education level, more depressive symptoms, and obesity, whereas future long nocturnal sleep duration was predicted by weaker social networks, older age, and more chronic diseases. Furthermore, smoking, obesity, Malay or Indian ethnicity (relative to Chinese), older age, male gender, and cognitive impairment predicted longer daytime nap duration in the future.
Older adults’ nocturnal sleep and daytime nap durations may be affected by different demographic, psychosocial, and health factors. Thus, it is important to differentiate these two attributes in this age group.
The objective of the present study was to investigate live weight (LW) gain, urinary nitrogen (UN) excretion and urination behaviour of dairy heifers grazing pasture, chicory and plantain in autumn and spring. The study comprised a 35-day autumn trial and a 28-day spring trial, each with a 7-day acclimation period. For each trial, 56 Friesian × Jersey heifers were blocked into five dietary treatments balanced for LW and breeding worth (i.e. genetic merit of a cow for production and reproduction): 1·00 perennial ryegrass–white clover pasture (PA); 1·00 chicory (CH); 1·00 plantain (PL); 0·50 pasture + 0·50 chicory (PA + CH); and 0·50 pasture + 0·50 plantain (PA + PL). A fresh allocation of herbage was offered every 3 days, with allowance calculated according to the feed requirement for maintenance plus a gain of 1·0 kg LW/day. In both trials, LW gain was lower on CH than on the other treatments. In the spring trial, UN concentration and UN excretion were lower on CH and PL than on the other treatments. In autumn, a higher urination frequency was observed over the first 6 h after forage allocation on CH and PA + CH than on the other treatments. Data from the present study indicate that feeding CH alone limited heifer LW gain. However, heifers grazing swards containing chicory (CH and PA + CH) or plantain (PL and PA + PL) had the potential to lower nitrous oxide emissions and nitrate leaching from soil compared with heifers grazing PA, by reducing N loading in urine patches.
The present study investigated alterations in brain resting-state activity induced by antidepressant treatment and examined whether treatment efficacy can be predicted at an early stage of pharmacological treatment.
Forty-eight first-episode, medication-free patients diagnosed with major depression received treatment with escitalopram. Resting-state functional magnetic resonance imaging was performed prior to treatment, 5 h after the first dose, during the course of pharmacological treatment (week 4) and at endpoint (week 8). Resting-state activity was evaluated over the course of the 8-week treatment and in relation to clinical improvement.
Escitalopram dynamically modified resting-state activity in depression during treatment. After 5 h, the antidepressant induced a significant decrease in signal in the occipital cortex and an increase in the dorsolateral and dorsomedial prefrontal cortices and middle cingulate cortex. Furthermore, while remitters demonstrated more pronounced changes following treatment, changes in non-responders were more modest, suggesting possible tonic and dynamic differences in the serotonergic system. Changes after 5 h in the caudate, occipital and temporal cortices were the best predictors of clinical remission at endpoint.
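One way such early-change prediction can be evaluated, shown here as an illustrative sketch with invented features: cross-validated logistic regression scored by ROC AUC. The feature set (three regional 5-hour signal changes) and data are assumptions, not the study's analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(11)
n = 48
# Assumed features: 5-h signal change in caudate, occipital, temporal ROIs
X = rng.normal(size=(n, 3))
remitted = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 0).astype(int)

clf = LogisticRegression()
# Cross-validated probabilities guard against overfitting in a small sample
proba = cross_val_predict(clf, X, remitted, cv=5, method="predict_proba")[:, 1]
print(roc_auc_score(remitted, proba))   # discrimination of remitters
```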
This study suggests that measuring resting-state neural changes a few hours after acute administration of an antidepressant could identify individuals likely to remit after a few weeks of treatment.
To study the association between gastrointestinal colonization of carbapenemase-producing Enterobacteriaceae (CPE) and proton pump inhibitors (PPIs).
We analyzed 31,526 patients with prospective collection of fecal specimens for CPE screening: upon admission (targeted screening) and during hospitalization (opportunistic screening, safety net screening, and extensive contact tracing), in our healthcare network with 3,200 beds from July 1, 2011, through December 31, 2015. Specimens were collected at least once weekly during hospitalization for CPE carriers and subjected to broth enrichment culture and multiplex polymerase chain reaction.
Of 66,672 fecal specimens collected, 345 specimens (0.5%) from 100 patients (0.3%) had CPE. The number and prevalence (per 100,000 patient-days) of CPE increased from 2 (0.3) in 2012 to 63 (8.0) in 2015 (P<.001). Male sex (odds ratio, 1.91 [95% CI, 1.15–3.18], P=.013), presence of wound or drain (3.12 [1.70–5.71], P<.001), and use of cephalosporins (3.06 [1.42–6.59], P=.004), carbapenems (2.21 [1.10–4.48], P=.027), and PPIs (2.84 [1.72–4.71], P<.001) in the preceding 6 months were significant risk factors by multivariable analysis. Of 79 patients with serial fecal specimens, spontaneous clearance of CPE was noted in 57 (72.2%), with a median (range) of 30 (3–411) days. Compared with patients who used neither antibiotics nor PPIs, those who consumed both after CPE identification showed later clearance of CPE (hazard ratio, 0.35 [95% CI, 0.17–0.73], P=.005).
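A sketch, with invented data, of the time-to-clearance comparison described above: Kaplan-Meier estimates of time to spontaneous CPE clearance, stratified by antibiotic-plus-PPI use, using the lifelines library.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(9)
n = 79
df = pd.DataFrame({
    "days": rng.exponential(60, n).round() + 3,  # days to clearance (assumed)
    "cleared": rng.integers(0, 2, n),            # 1 = spontaneous clearance
    "abx_ppi": rng.integers(0, 2, n),            # antibiotics + PPI after CPE ID
})

kmf = KaplanMeierFitter()
for label, grp in df.groupby("abx_ppi"):
    kmf.fit(grp["days"], event_observed=grp["cleared"],
            label=f"abx_ppi={label}")
    print(kmf.median_survival_time_)   # compare median time to clearance
```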
Concomitant use of antibiotics and PPIs prolonged duration of gastrointestinal colonization by CPE.
We describe the performance of the Boolardy Engineering Test Array (BETA), the prototype for the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. BETA is the first aperture-synthesis radio telescope to use phased array feed technology, giving it the ability to electronically form up to nine dual-polarisation beams. We report the methods developed for forming and measuring the beams, and the adaptations made to traditional calibration and imaging procedures to allow BETA to function as a multi-beam aperture-synthesis telescope. We describe the commissioning of the instrument and present details of BETA's performance: sensitivity, beam characteristics, polarimetric properties, and image quality. We summarise the astronomical science it has produced and draw lessons from operating BETA that will be relevant to the commissioning and operation of the final ASKAP telescope.
We calibrated portions of the radiocarbon time scale with combined 230Th, 231Pa and 14C measurements of corals collected from Espiritu Santo, Vanuatu, and the Huon Peninsula, Papua New Guinea. The new data map 14C variations ranging from the current limit of the tree-ring calibration [11,900 calendar years before present (cal BP), Kromer and Spurk 1998, now updated to 12,400 cal BP, see Kromer et al., this issue] to the 14C-dating limit of 50,000 cal BP, with detailed structure between 14 and 16 cal kyr BP and between 19 and 24 cal kyr BP. Samples older than 25,000 cal BP were analyzed with high-precision 231Pa dating methods (Pickett et al. 1994; Edwards et al. 1997) as a rigorous second check on the accuracy of the 230Th ages. These are the first coral calibration data to receive this additional check, adding confidence to the age data forming the older portion of the calibration. Our results show that the offset between calibrated and 14C ages generally increases with age until about 28,000 cal BP, when the recorded 14C age is nearly 6800 yr too young. The offset is smaller at older ages; at 50,000 cal BP, the recorded 14C age is 4600 yr too young. Two major 14C-age plateaus result from a 130‰ drop in Δ14C between 14–15 cal kyr BP and a 700‰ drop in Δ14C between 22–25 cal kyr BP. In addition, a large atmospheric Δ14C excursion to values over 1000‰ occurs at 28 cal kyr BP. Between 20 and 10 cal kyr BP, a component of atmospheric Δ14C anti-correlates with Greenland ice δ18O, indicating that some portion of the variability in atmospheric Δ14C is related to climate change, most likely through climate-related changes in the carbon cycle. Furthermore, the 28-kyr excursion occurs at about the time of significant climate shifts. Taken as a whole, our data indicate that in addition to changes in the Earth's magnetic field, factors related to climate change have affected the history of atmospheric 14C.
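For context (not taken from this paper), the standard relations of Stuiver and Polach (1977) tie the quantities above together: a measured fraction modern F gives the conventional 14C age, and, with an independent calendar age, Δ14C:

```latex
% Standard radiocarbon relations (Stuiver & Polach 1977), added for context.
% F: measured (background-corrected) fraction modern; t_cal: calendar age (yr BP).
\begin{align}
  t_{^{14}\mathrm{C}} &= -8033 \,\ln F
  % conventional 14C age, using the Libby mean life of 8033 yr
  \\
  \Delta^{14}\mathrm{C} &= \left( F \, e^{\,t_{\mathrm{cal}}/8267} - 1 \right) \times 1000
  % decay-corrected deviation, true mean life 8267 yr; expressed in per mil
\end{align}
```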
Carbapenem-resistant Acinetobacter baumannii (CRAB) with diverse multilocus sequence types emerged among our nursing home residents (6.5%), against a high background rate of MRSA (32.2%). Rectal swabs yielded a higher rate of CRAB detection than axillary or nasal swabs. Bed-bound status, use of adult diapers, and use of a nasogastric tube were risk factors for CRAB colonization.