Two introduced carnivores, the European red fox Vulpes vulpes and domestic cat Felis catus, have had extensive impacts on Australian biodiversity. In this study, we collate information on consumption of Australian birds by the fox, paralleling a recent study reporting on birds consumed by cats. We found records of consumption by foxes of 128 native bird species (18% of the non-vagrant bird fauna and 25% of those species within the fox’s range), a smaller tally than for cats (343 species, including 297 within the fox’s Australian range, a subset of that of the cat). Most (81%) bird species eaten by foxes are also eaten by cats, suggesting that predation impacts are compounded. As with consumption by cats, birds that nest or forage on the ground are most likely to be consumed by foxes. However, there is also some partitioning, with records of consumption by foxes but not cats for 25 bird species, indicating that impacts of the two predators may also be complementary. Bird species ≥3.4 kg were more likely to be eaten by foxes, and those <3.4 kg by cats. Our compilation provides an inventory and describes characteristics of Australian bird species known to be consumed by foxes, but we acknowledge that records of predation do not imply population-level impacts. Nonetheless, there is sufficient information from other studies to demonstrate that fox predation has significant impacts on the population viability of some Australian birds, especially larger birds, and those that nest or forage on the ground.
Antimicrobial resistance (AMR) is one of the defining global health threats of our time, but no international legal instrument currently offers the framework and mechanisms needed to address it. Fortunately, the actions needed to address AMR have considerable overlap with the actions needed to confront other pandemic threats.
Ceftazidime/avibactam (C/A), ceftolozane/tazobactam (C/T), imipenem/relebactam (I/R), and meropenem/vaborbactam (M/V) combine either a cephalosporin (C/T and C/A) or a carbapenem antibiotic (M/V and I/R) with a β-lactamase inhibitor. They are used to treat carbapenem-resistant Enterobacterales (CRE) and/or multidrug-resistant Pseudomonas aeruginosa (MDRPA).
We compared the pooled clinical success of these medications to older therapies.
PubMed and EMBASE were searched from January 1, 2012, through September 2, 2020, for C/A, C/T, I/R, and M/V studies. The main outcome was clinical success, which was assessed using random-effects models. Stratified analyses were conducted for study drug, sample size, quality, infection source, study design, and multidrug-resistant gram-negative organism (MDRGNO) population. Microbiological success and 28- and 30-day mortality were assessed as secondary outcomes. Heterogeneity was determined using I² values.
Overall, 25 articles met the inclusion criteria: 8 observational studies and 17 randomized controlled trials. We detected no difference in clinical success comparing new combination antibiotics with standard therapies for all included organisms (pooled OR, 1.21; 95% CI, 0.96–1.51). We detected a moderate level of heterogeneity among the included studies (I² = 56%). Studies that focused on patients with CRE or MDRPA infections demonstrated a strong association between treatment with new combination antibiotics and clinical success (pooled OR, 2.20; 95% CI, 1.60–3.57).
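The pooled odds ratios and the I² heterogeneity statistic reported above follow standard inverse-variance meta-analysis formulas. A minimal sketch under stated assumptions: the per-study odds ratios and confidence intervals below are made-up illustrative values, not the included studies.

```python
import math

# Hypothetical per-study results as (OR, 95% CI lower, 95% CI upper);
# illustrative only, not the studies from the meta-analysis above.
studies = [
    (1.10, 0.70, 1.73),
    (1.45, 0.95, 2.21),
    (0.90, 0.55, 1.47),
    (1.60, 1.05, 2.44),
]

# Work on the log-OR scale; back out each study's standard error from its CI:
#   SE = (ln(upper) - ln(lower)) / (2 * 1.96)
log_or = [math.log(o) for o, _, _ in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]

# Inverse-variance (fixed-effect) pooled estimate, needed for Cochran's Q.
w = [1 / s**2 for s in se]
pooled = sum(wi * y for wi, y in zip(w, log_or)) / sum(w)

# Cochran's Q and the I^2 heterogeneity statistic:
#   I^2 = max(0, (Q - df) / Q) * 100%
q = sum(wi * (y - pooled) ** 2 for wi, y in zip(w, log_or))
df = len(studies) - 1
i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

print(f"pooled OR (fixed effect): {math.exp(pooled):.2f}")
print(f"Q = {q:.2f} on {df} df, I^2 = {i_squared:.0f}%")
```

A full random-effects model (as used in the review) would additionally estimate the between-study variance tau² and re-weight each study by 1/(SE² + tau²); the I² computation itself is unchanged.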
C/T, C/A, I/R, and M/V are not inferior to standard therapies for treating various complicated infections, but they may have greater clinical success for treating MDRPA and CRE infections. More studies that evaluate the use of these antibiotics for drug-resistant infections are needed to determine their effectiveness.
We assessed trends in treatment of patients with CRE from 2012 through 2018. We detected decreased utilization of aminoglycosides and colistin and increased utilization of extended-spectrum cephalosporins and ceftazidime-avibactam. We found significant uptake of ceftazidime-avibactam, a newly approved antibiotic, to treat CRE infections.
For 40 patients with methicillin-resistant Staphylococcus aureus (MRSA) colonization, fist bump and elbow bump greetings resulted in frequent transfer of MRSA (25% vs 15%, respectively), but significantly fewer colonies were transferred via the elbow bump. Noncontact greetings should be encouraged to reduce the risk of transfer of healthcare-associated pathogens.
A single spray application of a continuously active disinfectant on portable equipment resulted in significant reductions in aerobic colony counts over 7 days and in recovery of Staphylococcus aureus and enterococci: 3 of 93 cultures (3%) versus 11 of 97 (11%) and 20 of 97 (21%) in quaternary ammonium disinfectant and untreated control groups, respectively.
To assess the potential for contamination of personnel, patients, and the environment during use of contaminated N95 respirators and to compare the effectiveness of interventions to reduce contamination.
Simulation study of patient care interactions using N95 respirators contaminated with higher and lower inocula of the benign virus bacteriophage MS2.
In total, 12 healthcare personnel performed 3 standardized examinations of mannequins including (1) control with suboptimal respirator handling technique, (2) improved technique with glove change after each N95 contact, and (3) control with 1-minute ultraviolet-C light (UV-C) treatment prior to donning. The order of the examinations was randomized within each subject. The frequencies of contamination were compared among groups. Observations and simulations with fluorescent lotion were used to assess routes of transfer leading to contamination.
With suboptimal respirator handling technique, bacteriophage MS2 was frequently transferred to the participants, mannequin, and environmental surfaces and fomites. Improved technique resulted in significantly reduced transfer of MS2 in the higher inoculum simulations (P < .01), whereas UV-C treatment reduced transfer in both the higher- and lower-inoculum simulations (P < .01). Observations and simulations with fluorescent lotion demonstrated multiple potential routes of transfer to participants, mannequin, and surfaces, including both direct contact with the contaminated respirator and indirect contact via contaminated gloves.
Reuse of contaminated N95 respirators can result in contamination of personnel and the environment even when correct technique is used. Decontamination technologies, such as UV-C, could reduce the risk for transmission.
Recent estimates of global salt marsh area sit at 5.5 million hectares (Mcowen et al. 2017). Conservatively, this translates to $1 trillion of ecosystem services per annum, potentially as much as $5 trillion (De Groot et al. 2012, Mehvar et al. 2018), equivalent to the entire US federal budget for 2019. There can be little debate as to the value of salt marshes, both in terms of the ecosystem services they provide and the key part they play in helping us understand past climate and sea level trends. This chapter summarizes the preceding work and draws together some key observations and notable knowledge gaps highlighted in the previous chapters. We provide a focus on the expected response of salt marshes to the stresses created by a changing climate.
Salt marshes are considered some of the most biologically diverse and ecologically important regions on Earth, containing thousands of species of robust salt-tolerant plants, crabs, fish, mollusks, zooplankton, algae, and bacteria. Isolated between topographic headlands, laterally continuous behind protective barriers, or associated with extensive delta landscapes, salt marshes are regulated by a variety of physical forces such as waves, tides, rivers, and storm surges, but they are also impacted by climatic variations in temperature and precipitation, riverine flooding, local tectonics, and subsidence (i.e., a deltaic process that describes the lowering of the land surface). Biological forces also play important roles in controlling salt marsh landscapes as many species shape geomorphic development. As these landscapes form and evolve, there exist significant interactions between biology, hydrology, and geology; thus it is impossible to consider salt marsh geomorphology – i.e., how the landscape changes over time – without taking into account these principal interactions.
Vitamin D deficiency is associated with an increased risk of falls and fractures. Assuming this association is causal, we aimed to identify the number and proportion of hospitalisations for falls and hip fractures attributable to vitamin D deficiency (25 hydroxy D (25(OH)D) <50 nmol/l) in Australians aged ≥65 years. We used 25(OH)D data from the 2011/12 Australian Health Survey and relative risks from published meta-analyses to calculate population-attributable fractions for falls and hip fracture. We applied these to data published by the Australian Institute of Health and Welfare to calculate the number of events each year attributable to vitamin D deficiency. In men and women combined, 8·3 % of hospitalisations for falls (7991 events) and almost 8 % of hospitalisations for hip fractures (1315 events) were attributable to vitamin D deficiency. These findings suggest that, even in a sunny country such as Australia, vitamin D deficiency contributes to a considerable number of hospitalisations as a consequence of falls and for treatment of hip fracture in older Australians; in countries where the prevalence of vitamin D deficiency is higher, the impact will be even greater. It is important to mitigate vitamin D deficiency, but whether this should occur through supplementation or increased sun exposure needs consideration of the benefits, harms, practicalities and costs of both approaches.
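The attributable-event counts above rest on the standard population-attributable fraction (Levin) formula, PAF = p(RR − 1) / (1 + p(RR − 1)), applied to national event totals. A minimal sketch under stated assumptions: the prevalence, relative risk, and event count below are hypothetical placeholders, not the survey or meta-analysis values.

```python
# Population-attributable fraction (PAF) for a dichotomous exposure:
#   PAF = p * (RR - 1) / (1 + p * (RR - 1))
# where p is the exposure prevalence (here, vitamin D deficiency) and
# RR is the relative risk of the outcome among the exposed.

def attributable_events(prevalence: float, relative_risk: float,
                        total_events: int) -> tuple[float, float]:
    """Return (PAF, number of events attributable to the exposure)."""
    excess = prevalence * (relative_risk - 1)
    paf = excess / (1 + excess)
    return paf, paf * total_events

# Hypothetical inputs: 20% deficiency prevalence, RR of 1.5 for falls,
# 100,000 hospitalisations for falls per year.
paf, n_attrib = attributable_events(0.20, 1.5, 100_000)
print(f"PAF = {paf:.1%}, attributable hospitalisations = {n_attrib:,.0f}")
```

Multiplying the PAF by the total number of hospitalisations, as done here, is exactly how the abstract's event counts (e.g. 7991 falls) were derived from the attributable fractions.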
To investigate the timing and routes of contamination of the rooms of patients newly admitted to the hospital.
Observational cohort study and simulations of pathogen transfer.
A Veterans’ Affairs hospital.
Patients newly admitted to the hospital with no known carriage of healthcare-associated pathogens.
Interactions between the participants and personnel or portable equipment were observed, and cultures of high-touch surfaces, floors, bedding, and patients’ socks and skin were collected for up to 4 days. Cultures were processed for Clostridioides difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE). Simulations were conducted with bacteriophage MS2 to assess plausibility of transfer from contaminated floors to high-touch surfaces and to assess the effectiveness of wearing slippers in reducing transfer.
Environmental cultures became positive for at least 1 pathogen in 10 (59%) of the 17 rooms, with cultures positive for MRSA, C. difficile, and VRE in the rooms of 10 (59%), 2 (12%), and 2 (12%) participants, respectively. For all 14 instances of pathogen detection, the initial site of recovery was the floor followed in a subset of patients by detection on sock bottoms, bedding, and high-touch surfaces. In simulations, wearing slippers over hospital socks dramatically reduced transfer of bacteriophage MS2 from the floor to hands and to high-touch surfaces.
Floors may be an underappreciated source of pathogen dissemination in healthcare facilities. Simple interventions such as having patients wear slippers could potentially reduce the risk for transfer of pathogens from floors to hands and high-touch surfaces.
Gloves and gowns are used during patient care to reduce contamination of personnel and prevent pathogen transmission.
To determine whether the use of gowns adds a substantial benefit over gloves alone in preventing patient-to-patient transfer of a viral DNA surrogate marker.
In total, 30 source patients had 1 cauliflower mosaic virus surrogate marker applied to their skin and clothing and a second to their bed rail and bedside table. Personnel caring for the source patients were randomized to wear gloves, gloves plus cover gowns, or no barrier. Interactions with up to 7 subsequent patients were observed, and the percentages of transfer of the DNA markers were compared among the 3 groups.
In comparison to the no-barrier group (57.8% transfer of 1 or both markers), there were significant reductions in transfer of the DNA markers in the gloves group (31.1% transfer; odds ratio [OR], 0.16; 95% confidence interval [CI], 0.02–0.73) and the gloves-plus-gown group (25.9% transfer; OR, 0.11; 95% CI, 0.01–0.51). The addition of a cover gown to gloves during the interaction with the source patient did not significantly reduce the transfer of the DNA marker (P = .53). During subsequent patient interactions, transfer of the DNA markers was significantly reduced if gloves plus gowns were worn and if hand hygiene was performed (P < .05).
Wearing gloves or gloves plus gowns reduced the frequency of patient-to-patient transfer of a viral DNA surrogate marker. The use of gloves plus gowns during interactions with the source patient did not reduce transfer in comparison to gloves alone.
This is an epidemiological study of carbapenem-resistant Enterobacteriaceae (CRE) in Veterans’ Affairs medical centers (VAMCs). In 2017, almost 75% of VAMCs had at least 1 CRE case. We observed substantial geographic variability, with more cases in urban, complex facilities. This supports the benefit of tailoring infection control strategies to facility characteristics.
There is controversy regarding whether the addition of cover gowns offers a substantial benefit over gloves alone in reducing personnel contamination and preventing pathogen transmission.
Simulated patient care interactions.
To evaluate the efficacy of different types of barrier precautions and to identify routes of transmission.
In randomly ordered sequence, 30 personnel each performed 3 standardized examinations of mannequins contaminated with pathogen surrogate markers (cauliflower mosaic virus DNA, bacteriophage MS2, nontoxigenic Clostridioides difficile spores, and fluorescent tracer) while wearing no barriers, gloves, or gloves plus gowns followed by examination of a noncontaminated mannequin. We compared the frequency and routes of transfer of the surrogate markers to the second mannequin or the environment.
For a composite of all surrogate markers, transfer by hands occurred at significantly lower rates in the gloves-alone group (OR, 0.02; P < .001) and the gloves-plus-gown group (OR, 0.06; P = .002). Transfer by stethoscope diaphragms was common in all groups and was reduced by wiping the stethoscope between simulations (OR, 0.06; P < .001). Compared to the no-barriers group, wearing a cover gown and gloves resulted in reduced contamination of clothing (OR, 0.15; P < .001), but wearing gloves alone did not.
Wearing gloves alone or gloves plus gowns reduces hand transfer of pathogens but may not address transfer by devices such as stethoscopes. Cover gowns reduce the risk of contaminating the clothing of personnel.
The hands of healthcare personnel are the most important source for transmission of healthcare-associated pathogens. The role of contaminated fomites such as portable equipment, stethoscopes, and clothing of personnel in pathogen transmission is unclear.
To study routes of transmission of cauliflower mosaic virus DNA markers from 31 source patients and from environmental surfaces in their rooms.
A 3-month observational cohort study.
A Veterans’ Affairs hospital.
After providing care for source patients, healthcare personnel were observed during interactions with subsequent patients. Putative routes of transmission were identified based on recovery of DNA markers from sites of contact with the patient or environment. To assess plausibility of fomite-mediated transmission, we assessed the frequency of transfer of methicillin-resistant Staphylococcus aureus (MRSA) from the skin of 25 colonized patients via gloved hands versus fomites.
Of 145 interactions involving contact with patients and/or the environment, 41 (28.3%) resulted in transfer of 1 or both DNA markers to the patient and/or the environment. The DNA marker applied to patients’ skin and clothing was transferred most frequently by stethoscopes, hands, and portable equipment, whereas the marker applied to environmental surfaces was transferred only by hands and clothing. The percentages of MRSA transfer from the skin of colonized patients via gloved hands, stethoscope diaphragms, and clothing were 52%, 40%, and 48%, respectively.
Fomites such as stethoscopes, clothing, and portable equipment may be underappreciated sources of pathogen transmission. Simple interventions such as decontamination of fomites between patients could reduce the risk for transmission.
Reduction in the use of fluoroquinolone antibiotics has been associated with reductions in Clostridioides difficile infections (CDIs) due to fluoroquinolone-resistant strains.
To determine whether facility-level fluoroquinolone use predicts healthcare facility-associated (HCFA) CDI due to fluoroquinolone-resistant 027 strains.
Using a nationwide cohort of hospitalized patients in the Veterans’ Affairs Healthcare System, we identified hospitals that categorized >80% of CDI cases as positive or negative for the 027 strain for at least one-quarter of fiscal years 2011–2018. Within these facilities, we used visual summaries and multilevel logistic regression models to assess the association between facility-level fluoroquinolone use and rates of HCFA-CDI due to 027 strains, controlling for time and facility complexity level, and adjusting for correlated outcomes within facilities.
Between 2011 and 2018, 55 hospitals met criteria for reporting 027 results, including a total of 5,091 HCFA-CDI cases, with 1,017 infections (20.0%) due to 027 strains. Across these facilities, the use of fluoroquinolones decreased by 52% from 2011 to 2018, with concurrent reductions in the overall HCFA-CDI rate and the proportion of HCFA-CDI cases due to the 027 strain of 13% and 55%, respectively. A multilevel logistic model demonstrated a significant effect of facility-level fluoroquinolone use on the proportion of infections in the facility due to the 027 strain, most noticeably in low-complexity facilities.
Our findings provide support for interventions to reduce use of fluoroquinolones as a control measure for CDI, particularly in settings where fluoroquinolone use is high and fluoroquinolone-resistant strains are common causes of infection.
For patients with methicillin-resistant Staphylococcus aureus (MRSA) colonization, a traditional fist-bump greeting did not significantly reduce MRSA transfer in comparison to a handshake. However, transfer was reduced with a modified fist bump that minimized the surface area of contact and when hand hygiene was performed before the handshake.
There is a requirement in some beef markets to slaughter bulls at under 16 months of age. This requires high levels of concentrate feeding. Increasing the slaughter age of bulls to 19 months facilitates the inclusion of a grazing period, thereby decreasing the cost of production. Recent data indicate few quality differences in longissimus thoracis (LT) muscle from conventionally reared 16-month-old bulls and 19-month-old bulls that had a grazing period prior to finishing on concentrates. The aim of the present study was to expand this observation to additional commercially important muscles/cuts. The production systems selected were concentrates offered ad libitum and slaughter at under 16 months of age (16-C) or at 19 months of age (19-CC) to examine the effect of age per se, and the cheaper alternative for 19-month bulls described above (19-GC). The results indicate that muscles from 19-CC were more red, had more intramuscular fat and higher cook loss than those from 16-C. No differences in muscle objective texture or sensory texture and acceptability were found between treatments. The expected differences in composition and quality between the muscles were generally consistent across the production systems examined. Therefore, for the type of animal and range of ages investigated, the effect of the production system on LT quality was generally representative of the effect on the other muscles analysed. In addition, the data do not support the under-16-month age restriction, based on meat acceptability, in commercial suckler bull production.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.