The average yearly net ablation rate on permanently ice-covered Lake Fryxell, Victoria Land, Antarctica, is 30 to 40 cm. This figure was calculated by a novel method utilizing a record of ablation incorporated in the ice cover of the lake. These values are higher than those measured on Ross Island, 80 km to the east; the difference in ablation rates for the two areas is attributed to the prevalence of katabatic winds in the climate of Taylor Valley. The Lake Fryxell ablation figure is applied to the nearby Canada and Commonwealth Glaciers in the calculation of their ice budgets.
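To make the ice-budget application concrete, the following is a minimal sketch, in Python, of converting a net ablation rate into an annual ice loss over a given ablation area. The 0.30 to 0.40 m/yr range is taken from the abstract; the glacier area and the ice density value are assumptions for illustration only, not figures from the study.

```python
# Hypothetical illustration of applying a net ablation rate to a glacier
# ablation area, as in a simple ice-budget calculation. The 0.30-0.40 m/yr
# range comes from the abstract; the area and ice density are assumptions.

ICE_DENSITY_KG_M3 = 917.0          # typical density of glacier ice (assumed here)

def annual_ablation_loss(area_m2: float, ablation_m_per_yr: float) -> float:
    """Return annual mass loss (kg) from an ablation area of the given size."""
    return area_m2 * ablation_m_per_yr * ICE_DENSITY_KG_M3

# Example: a hypothetical 10 km^2 ablation zone at the reported 0.30-0.40 m/yr rate.
area = 10e6  # m^2 (assumed)
for rate in (0.30, 0.40):
    loss_kg = annual_ablation_loss(area, rate)
    print(f"rate {rate:.2f} m/yr -> {loss_kg / 1e9:.2f} x 10^9 kg of ice per year")
```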
This study was conducted to examine the incidence trend of campylobacteriosis in Michigan over a 10-year period and to investigate risk factors and clinical outcomes associated with infection. Campylobacter case data from 2004 to 2013 were obtained from the Michigan Disease Surveillance System. We conducted statistical and spatial analyses to examine trends, identify factors linked to campylobacteriosis, and assess ecological associations using animal density data from the National Agricultural Statistics Service. An increasing trend in Campylobacter incidence and hospitalization was observed and was linked to specific age groups and rural residence. Cases reporting ruminant contact and well water as the primary drinking source had a higher risk of campylobacteriosis, while higher cattle density was associated with an increased risk at the county level. Additional studies are needed to identify age-specific risk factors and to examine prevalence and transmission dynamics in ruminants and the environment to aid in the development of more effective preventive strategies.
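As a rough illustration of how an increasing incidence trend can be quantified from annual surveillance counts, the sketch below fits a log-linear trend to hypothetical yearly case counts over the study period. The counts, the population figure, and the use of a simple least-squares fit are assumptions; they are not the study's data or its statistical methods.

```python
# Minimal sketch of quantifying an incidence trend from annual counts,
# assuming hypothetical case counts and population; the real analysis in
# the study used surveillance data and more formal statistical methods.
import numpy as np

years = np.arange(2004, 2014)                       # study period from the abstract
cases = np.array([620, 650, 700, 690, 740, 760, 800, 830, 870, 900])  # hypothetical
population = 9.9e6                                   # approximate Michigan population (assumed)

rate_per_100k = cases / population * 1e5
# Fit a log-linear trend: log(rate) = a*year + b, so exp(a) is the annual rate ratio.
a, b = np.polyfit(years, np.log(rate_per_100k), 1)
print(f"estimated annual change in incidence: {100 * (np.exp(a) - 1):.1f}% per year")
```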
Geological disposal facilities (GDF) are intended to isolate and contain radioactive waste within multiple protective barriers, deep underground, to ensure that no harmful quantities of radioactivity reach the surface environment. The last line of defense in a multi-barrier GDF is the geosphere, where iron is present in the host rock mineralogy as either Fe(II) or Fe(III), and in groundwater as Fe(II) under reducing conditions. The mobility of risk-driving radionuclides, including uranium and technetium, in the environment is affected significantly by their valence state. Due to its low redox potential, Fe(II) can mediate reduction of these radionuclides from their oxidized, highly mobile, soluble state to their reduced, insoluble state, preventing them from reaching the biosphere. Here, a study of five types of potential host rock is reported: two granitoids, an andesite, a mudstone and a clay-rich carbonate. The bulk rocks and their minerals were analysed for iron content, Fe(II/III) ratio, and the speciation and fine-grained nature of alteration product minerals that might exert important controls on groundwater interaction. Total iron content varies from 0.9% in the clays to 5.6% in the andesite. X-ray absorption spectroscopy reveals that Fe in the granitoids and andesite is predominantly Fe(II), whereas in the mudstones, argillaceous limestone and terrestrial sandstone it is predominantly Fe(III). The redox reactivity of the potential host rocks, both in the presence and absence of Fe(II)-containing 'model' groundwater, was investigated using an azo dye as a probe molecule. Reduction rates determined from reactivity with the azo dye correlated with the ability of the rocks to take up Fe(II) from groundwater rather than with their initial Fe(II) content. Potential GDF host rocks must be characterized in terms of mineralogy, texture, grain size and bulk geochemistry to assess how they might interact with groundwater. This study highlights the importance of redox reactivity, not just total iron content and Fe(II)/(III) ratio, when considering host rock performance as a barrier material to limit transport of radionuclides from the GDF.
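The following is a hedged sketch of how a reduction rate might be estimated from an azo dye probe experiment, assuming pseudo-first-order decay of dye absorbance. The time points and absorbance values are hypothetical, and the study's actual kinetic treatment may differ.

```python
# Hedged sketch: estimating a pseudo-first-order rate constant for azo dye
# reduction from the decay of dye absorbance over time. The time points and
# absorbances below are hypothetical.
import numpy as np

t_hours = np.array([0.0, 2.0, 4.0, 8.0, 16.0, 24.0])
absorbance = np.array([1.00, 0.82, 0.67, 0.45, 0.21, 0.10])  # hypothetical dye decay

# For pseudo-first-order kinetics, ln(A/A0) = -k * t, so -slope gives k.
slope, intercept = np.polyfit(t_hours, np.log(absorbance / absorbance[0]), 1)
k = -slope
print(f"apparent pseudo-first-order rate constant k = {k:.3f} per hour")
print(f"half-life = {np.log(2) / k:.1f} hours")
```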
Objective. To determine the relative risk of invasive methicillin-resistant Staphylococcus aureus (MRSA) infection among non-colonized (NC) patients, intermittently colonized (IC) patients, and persistently colonized (PC) patients.
Design. Observational cohort study of patient data collected longitudinally over a 41-month period.
Setting. Department of Veterans Affairs Eastern Colorado Healthcare System, a tertiary care medical center.
Patients. Any patient who received ≥5 MRSA nasal swab tests between February 20, 2010, and July 26, 2013. In total, 3,872 patients met these criteria and none were excluded; 95% were male, 71% were white, and the mean age was 62.9 years on the date of study entry.
Methods. Patients were divided into cohorts based on MRSA colonization status. Physicians reviewed medical records to identify invasive infection and were blinded to colonization status. Cox and Kaplan-Meier analyses were used to assess the relationship between colonization status and invasive infection (a minimal sketch of this type of analysis follows the citation below).
Results. In total, 102 patients developed invasive MRSA infections: invasive infection occurred in 16.3% of PC patients, 11.2% of IC patients, and 0.5% of NC patients. PC patients were at higher risk of invasive infection than NC patients (hazard ratio [HR], 36.8; 95% CI, 18.4–73.6; P < .001). IC patients were also at higher risk than NC patients (HR, 22.8; 95% CI, 13.3–39.3; P < .001). The difference in risk between PC and IC patients was not statistically significant (HR, 1.61; 95% CI, 0.94–2.78; P = .084). Alternate analysis methods confirmed these results.
Conclusions. The risk of invasive MRSA infection is much higher among PC and IC patients, supporting routine clinical testing for colonization. However, this risk is similar among PC and IC patients, suggesting that distinguishing between the 2 colonization states may not be clinically important.
Infect. Control Hosp. Epidemiol. 2015;36(11):1292–1297
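Below is a minimal sketch of the type of survival analysis named in the Methods (Cox proportional-hazards and Kaplan-Meier estimation), written in Python with the lifelines package. The toy data frame, column names, and dummy coding of colonization status are assumptions for illustration; this is not the study's code or data.

```python
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Toy data frame: far too small for a real analysis, but enough to show the calls.
# Non-colonized patients are the reference group (both indicator columns zero).
df = pd.DataFrame({
    "days_followed":      [120, 400, 900, 300, 700, 850, 200, 600, 950, 1000],
    "invasive_infection": [1,   1,   0,   1,   0,   0,   1,   0,   0,   0],
    "persistent":         [1,   1,   1,   0,   0,   0,   0,   0,   0,   0],
    "intermittent":       [0,   0,   0,   1,   1,   1,   0,   0,   0,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_followed", event_col="invasive_infection")
cph.print_summary()   # hazard ratios are exp(coef) for each colonization indicator

# Kaplan-Meier curve for one cohort, e.g. persistently colonized patients.
pc = df[df["persistent"] == 1]
km = KaplanMeierFitter()
km.fit(pc["days_followed"], event_observed=pc["invasive_infection"])
print(km.survival_function_)
```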
The mechanisms by which agriculture spread across Europe in the Neolithic, and the speed at which it happened, have long been debated. Attempts to quantify the process by constructing spatio-temporal models have yielded a diversity of results. In this paper, a new approach to the problem of modelling is advanced. Data from over 300 Neolithic sites from Asia Minor and Europe are used to produce a global picture of the emergence of farming across Europe that also allows for variable local conditions. Particular attention is paid to coastal enhancement: the more rapid advance of the Neolithic along coasts and rivers, as compared with inland or terrestrial domains. The key outcome of this model is therefore to confirm the importance of waterways and coastal mobilities in the spread of farming in the early Neolithic, and to establish the extent to which this importance varied regionally.
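As a simple illustration of the kind of spatio-temporal reasoning involved (not the model used in the paper), the sketch below regresses hypothetical earliest arrival dates on great-circle distance from an assumed origin in Asia Minor, fitting coastal and inland sites separately to mimic coastal enhancement. All coordinates, dates, and the origin point are invented for illustration.

```python
# Illustrative sketch (not the paper's model): estimate an average spread rate
# by regressing earliest Neolithic dates on great-circle distance from an
# assumed origin, splitting coastal and inland sites to mimic the idea of
# coastal enhancement. Coordinates, dates and the origin are all hypothetical.
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points given in degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = np.sin((lat2 - lat1) / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

origin = (37.0, 35.0)  # assumed origin in Asia Minor (lat, lon)

# site: (lat, lon, earliest date in years BP, coastal flag) -- hypothetical values
sites = [
    (37.7, 27.0, 8400, True), (40.6, 22.9, 8000, True), (43.5, 16.4, 7900, True),
    (41.0, 25.0, 7800, False), (44.8, 20.5, 7300, False), (48.0, 16.4, 7000, False),
]

for coastal in (True, False):
    d = np.array([haversine_km(*origin, s[0], s[1]) for s in sites if s[3] == coastal])
    age = np.array([s[2] for s in sites if s[3] == coastal])
    slope, _ = np.polyfit(age, d, 1)   # km per year; negative because dates (BP) decrease forward in time
    print(f"{'coastal' if coastal else 'inland '} front speed ~ {abs(slope):.2f} km/yr")
```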
Patients with psychiatric disorders have an increased rate of cardiovascular morbidity and mortality compared with the general population. Metabolic issues such as weight gain, dyslipidemia, diabetes mellitus, diabetic ketoacidosis, and pancreatitis have been reported with the use of antipsychotic agents. Although atypical antipsychotics have not been linked directly to the development of metabolic syndrome, these medications have been shown to increase risk factors that can lead to metabolic and endocrine disturbances. Therefore, clinicians should provide ongoing monitoring for patients who are being treated for psychiatric disorders with these agents. According to the 2004 Consensus Report on Antipsychotics, screening measures should include baseline and follow-up monitoring of personal/family histories, weight (body mass index), waist circumference, blood pressure, fasting plasma glucose, and fasting lipid profile.
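To make the recommended screening list concrete, here is a hypothetical sketch of a record holding the baseline measures named in the consensus recommendations, with a body mass index helper. The structure and field names are illustrative assumptions; monitoring intervals are deliberately omitted because the abstract does not state them.

```python
# Hypothetical sketch of recording the baseline screening measures listed in
# the consensus recommendations (names and structure are illustrative only).
from dataclasses import dataclass

@dataclass
class MetabolicScreening:
    personal_family_history: str
    weight_kg: float
    height_m: float
    waist_circumference_cm: float
    blood_pressure_mmhg: tuple            # (systolic, diastolic)
    fasting_glucose_mg_dl: float
    fasting_lipids_mg_dl: dict            # e.g. {"LDL": ..., "HDL": ..., "triglycerides": ...}

    @property
    def bmi(self) -> float:
        """Body mass index = weight (kg) divided by height (m) squared."""
        return self.weight_kg / (self.height_m ** 2)

baseline = MetabolicScreening(
    personal_family_history="no known diabetes or cardiovascular disease",
    weight_kg=92.0,
    height_m=1.75,
    waist_circumference_cm=104.0,
    blood_pressure_mmhg=(132, 86),
    fasting_glucose_mg_dl=98.0,
    fasting_lipids_mg_dl={"LDL": 128, "HDL": 41, "triglycerides": 180},
)
print(f"baseline BMI: {baseline.bmi:.1f} kg/m^2")
```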
Duchenne muscular dystrophy (DMD) is a severe muscle disease that affects males from a young age, and the mdx mouse is an animal model of this disease. Although new drugs are in development, it is also essential to assess potential dietary therapies that could assist in the management of DMD. In the present study, we compared two diets, a high-MUFA diet v. a high-PUFA diet, in mdx mice. To generate the high-PUFA diet, a portion of dietary MUFA (oleic acid) was replaced with the dietary essential n-3 PUFA α-linolenic acid (ALA). We sought to determine whether ALA, compared with oleic acid, was beneficial in mdx mice. Consumption of the high-PUFA diet resulted in significantly higher n-3 PUFA content and reduced arachidonic acid content in skeletal muscle phospholipids (PL), while the high-MUFA diet led to higher oleate content in PL. Mdx mice on the high-MUFA diet exhibited 2-fold lower serum creatine kinase activity than those on the high-PUFA diet (P< 0·05) as well as a lower body fat percentage (P< 0·05), but no significant difference in skeletal muscle histopathology. There was no significant difference between the dietary groups with regard to phosphorylated p65 (an inflammatory marker) in skeletal muscle. In conclusion, alteration of PL fatty acid (FA) composition by the high-PUFA diet made mdx muscle more susceptible to sarcolemmal leakiness, while the high-MUFA diet had a more favourable impact. These results may be important for designing dietary treatments for DMD patients, and future work on dietary FA profiles, such as comparing other FA classes and dose effects, is needed.
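As an illustration of the two-group comparison reported for serum creatine kinase, the sketch below runs a simple t-test on hypothetical CK values showing roughly a 2-fold difference. The values and the choice of test are assumptions, not the study's data or analysis.

```python
# Hedged sketch of the kind of two-group comparison reported for serum creatine
# kinase (CK): the activity values below are hypothetical, and the study's actual
# statistical approach may have differed from a simple t-test.
from scipy import stats

ck_high_mufa = [1800, 2100, 1650, 2400, 1900, 2200]   # hypothetical CK activity (U/L)
ck_high_pufa = [3900, 4400, 3500, 4800, 4100, 3700]   # hypothetical, roughly 2-fold higher

t_stat, p_value = stats.ttest_ind(ck_high_mufa, ck_high_pufa)
print(f"t = {t_stat:.2f}, P = {p_value:.4f}")

fold = (sum(ck_high_pufa) / len(ck_high_pufa)) / (sum(ck_high_mufa) / len(ck_high_mufa))
print(f"high-PUFA / high-MUFA mean CK ratio: {fold:.1f}-fold")
```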
In England, people with a serious mental illness are offered a standardized care plan under the Care Programme Approach (CPA). A crisis plan is a mandatory part of this standard; however, the quality and in particular the level of individualisation of these crisis plans are unknown. In this context, the aim of this study was to assess the quality of crisis planning and the impact of exposure to a specialized crisis planning intervention.
The crisis plans of 424 participants were assessed, before and after exposure to the Joint Crisis Plan (JCP) intervention, for ‘individualisation’ (i.e., at least one item of specific and identifiable information about an individual). Factors associated with individualisation were also investigated (a minimal sketch of such an association test follows this abstract).
A total of 15% of crisis plans were individualised at baseline. There was little or no improvement following exposure to the JCP. Individualised crisis plans were not associated with a history of prior crises or incidents of harm to self or others.
Routine crisis planning for individuals with serious mental illness is not influenced by clinical risk profiles. ‘Top down’ implementation of the policy is unlikely to generate best practice or compliance if clinicians do not perceive clinical value in the process.
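The following is a minimal sketch of an association test of the kind mentioned in the method, using a hypothetical 2×2 table of individualisation against prior crisis history. The counts are invented for illustration (they only echo the 424 participants and 15% baseline figure from the abstract) and do not reproduce the study's analysis.

```python
# Illustrative sketch only: testing whether plan individualisation is associated
# with a history of prior crises, using a hypothetical 2x2 table.
from scipy.stats import fisher_exact

#                      prior crises   no prior crises
individualised     = [30,             34]    # hypothetical counts (64 = ~15% of 424)
not_individualised = [170,            190]

odds_ratio, p_value = fisher_exact([individualised, not_individualised])
print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.3f}")
```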
This report updates US Public Health Service recommendations for the management of healthcare personnel (HCP) who experience occupational exposure to blood and/or other body fluids that might contain human immunodeficiency virus (HIV). Although the principles of exposure management remain unchanged, recommended HIV postexposure prophylaxis (PEP) regimens and the duration of HIV follow-up testing for exposed personnel have been updated. This report emphasizes the importance of primary prevention strategies, the prompt reporting and management of occupational exposures, adherence to recommended HIV PEP regimens when indicated for an exposure, expert consultation in management of exposures, follow-up of exposed HCP to improve adherence to PEP, and careful monitoring for adverse events related to treatment, as well as for virologic, immunologic, and serologic signs of infection. To ensure timely postexposure management and administration of HIV PEP, clinicians should consider occupational exposures as urgent medical concerns, and institutions should take steps to ensure that staff are aware of both the importance of and the institutional mechanisms available for reporting and seeking care for such exposures. The following is a summary of the recommendations: (1) PEP is recommended when occupational exposures to HIV occur; (2) the HIV status of the exposure source patient should be determined, if possible, to guide the need for HIV PEP; (3) PEP medication regimens should be started as soon as possible after occupational exposure to HIV, and they should be continued for a 4-week duration; (4) new recommendation: PEP medication regimens should contain 3 (or more) antiretroviral drugs (listed in Appendix A) for all occupational exposures to HIV; (5) expert consultation is recommended for any occupational exposure to HIV and at a minimum for the situations described in Box 1; (6) close follow-up for exposed personnel (Box 2) should be provided, including counseling, baseline and follow-up HIV testing, and monitoring for drug toxicity; follow-up appointments should begin within 72 hours of an HIV exposure; and (7) new recommendation: if a newer fourth-generation combination HIV p24 antigen-HIV antibody test is utilized for follow-up HIV testing of exposed HCP, HIV testing may be concluded 4 months after exposure (Box 2); if a newer testing platform is not available, follow-up HIV testing is typically concluded 6 months after an HIV exposure.
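The timing rules in the summary recommendations can be expressed as a small scheduling helper. The sketch below encodes only the intervals stated above (a 4-week PEP course, follow-up beginning within 72 hours, final HIV test at 4 or 6 months depending on the test platform); the function name, the 30-day month approximation, and the output format are assumptions for illustration, not part of the recommendations.

```python
# Minimal sketch of the timing rules stated in the summary recommendations:
# start PEP as soon as possible and continue for 4 weeks, begin follow-up within
# 72 hours, and conclude HIV testing at 4 months (fourth-generation Ag/Ab test)
# or 6 months (older platforms). Names and the 30-day month are illustrative.
from datetime import date, timedelta

def pep_schedule(exposure: date, fourth_generation_test: bool) -> dict:
    return {
        "start_pep": "as soon as possible after exposure",
        "pep_end": exposure + timedelta(weeks=4),
        "first_follow_up_by": exposure + timedelta(days=3),   # within 72 hours
        "final_hiv_test": exposure + timedelta(days=(4 if fourth_generation_test else 6) * 30),
    }

print(pep_schedule(date(2024, 1, 10), fourth_generation_test=True))
```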
Although livestock production accounts for a sizeable share of global greenhouse gas (GHG) emissions, numerous technical options have been identified to mitigate these emissions. In this review, a subset of these options, which have proven to be effective, is discussed. These include measures to reduce CH4 emissions from enteric fermentation by ruminants, the largest single emission source from the global livestock sector, and measures to reduce CH4 and N2O emissions from manure. A unique feature of this review is the high level of attention given to interactions between mitigation options and productivity. Among the feed supplement options for lowering enteric emissions, dietary lipids, nitrates and ionophores are identified as the most effective. Forage quality, feed processing and precision feeding have the best prospects among the various available feed and feed management measures. With regard to manure, dietary measures that reduce the amount of N excreted (e.g. better matching of dietary protein to animal needs), shift N excretion from urine to faeces (e.g. tannin inclusion at low levels) and reduce the amount of fermentable organic matter excreted are recommended. Among the many ‘end-of-pipe’ measures available for manure management, approaches that capture and/or process CH4 emissions during storage (e.g. anaerobic digestion, biofiltration, composting), as well as subsurface injection of manure, are among the most encouraging options flagged in this section of the review. A multiple-gas perspective is critical when assessing mitigation potentials, because most of the options reviewed show strong interactions among sources of GHG emissions. The paper reviews current knowledge on potential pollution swapping, whereby the reduction of one GHG or emission source leads to unintended increases in another.
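As an illustration of the multiple-gas perspective and of checking for pollution swapping, the sketch below converts hypothetical changes in CH4 and N2O emissions to CO2-equivalents using IPCC AR5 GWP100 factors (28 for CH4, 265 for N2O), which are assumptions introduced here rather than values from the review.

```python
# Hedged illustration of the "multiple gas perspective": converting CH4 and N2O
# changes to CO2-equivalents so that a mitigation option that trades one gas for
# another can be assessed on a common basis. GWP100 factors are IPCC AR5 values
# used here as assumptions; the emission changes below are hypothetical.
GWP100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def co2_equivalent(emission_changes_kg: dict) -> float:
    """Sum emission changes (kg of each gas) weighted by their GWP100 factors."""
    return sum(GWP100[gas] * kg for gas, kg in emission_changes_kg.items())

# Example of potential pollution swapping: a manure-management option that cuts
# CH4 during storage but slightly increases N2O after field application.
option = {"CH4": -100.0, "N2O": +8.0}   # hypothetical kg per animal per year
net = co2_equivalent(option)
print(f"net effect: {net:+.0f} kg CO2-eq per animal per year "
      f"({'net reduction' if net < 0 else 'net increase'})")
```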