Hendra virus (HeV) continues to cause fatal infection in horses and threaten infection in close-contact humans in eastern Australia. Species of Pteropus bats (flying-foxes) are the natural reservoir of the virus. We caught and sampled flying-foxes from a multispecies roost in southeast Queensland, Australia on eight occasions between June 2013 and June 2014. The effects of sample date, species, sex, age class, body condition score (BCS), pregnancy and lactation on HeV antibody prevalence, log-transformed median fluorescent intensity (lnMFI) values and HeV RNA status were assessed using unbalanced generalised linear models. A total of 1968 flying-foxes were sampled, comprising 1012 Pteropus alecto, 742 P. poliocephalus and 214 P. scapulatus. Sample date, species and age class were each statistically associated with HeV RNA status, antibody status and lnMFI values; BCS was statistically associated with HeV RNA status and antibody status. The findings support immunologically naïve sub-adult P. alecto playing an important role in maintaining HeV infection at a population level. The biological significance of the association between BCS and HeV RNA status, and BCS and HeV antibody status, is less clear and warrants further investigation. Contrary to previous studies, we found no direct association between HeV infection and pregnancy or lactation. The findings in P. poliocephalus suggest that HeV exposure in this species may not result in systemic infection and virus excretion, or alternatively, may reflect assay cross-reactivity with another (unidentified) henipavirus.
Determine the effectiveness of a personal protective equipment (PPE)-free zone intervention on healthcare personnel (HCP) entry hand hygiene (HH) and PPE donning compliance in rooms of patients in contact precautions.
Quasi-experimental, multicenter intervention, before-and-after study with concurrent controls.
All patient rooms on contact precautions on 16 units (5 medical-surgical, 6 intensive care, 5 specialty care units) at 3 acute-care facilities (2 academic medical centers, 1 Veterans Affairs hospital). Observations of PPE donning and entry HH compliance by HCP were conducted during both study phases. Surveys of HCP perceptions of the PPE-free zone were distributed in both study phases.
A PPE-free zone: a low-risk area inside the door threshold of contact precautions rooms, demarcated by red tape on the floor. Inside this area, HCP were not required to wear PPE.
We observed 3,970 room entries. HH compliance did not change between study phases among intervention units (relative risk [RR], 0.92; P = .29) and declined in control units (RR, 0.70; P = .005); however, the PPE-free zone did not significantly affect compliance (P = .07). The PPE-free zone effect on HH was significant only for rooms on enteric precautions (P = .008). PPE use was not significantly different before versus after the intervention (P = .15). HCP perceived the zone positively; 65% agreed that it facilitated communication and 66.8% agreed that it permitted checking on patients more frequently.
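The relative-risk figures above compare compliance rates between study phases. As a minimal sketch of that calculation (the counts below are hypothetical placeholders, not the study's data):

```python
def relative_risk(compliant_after, total_after, compliant_before, total_before):
    """RR = (after-phase compliance rate) / (before-phase compliance rate)."""
    return (compliant_after / total_after) / (compliant_before / total_before)

# Hypothetical counts chosen only to illustrate an RR of 0.92:
# 75/100 compliant entries before, 69/100 after.
rr = relative_risk(69, 100, 75, 100)  # -> 0.92
```

An RR below 1 with a non-significant P value, as in the intervention units, indicates a numerical decline in compliance that the study could not distinguish from chance.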
HCP viewed the PPE-free zone favorably and it did not adversely affect PPE or HH compliance. Future infection prevention interventions should consider the complex sociotechnical system factors influencing behavior change.
Medical procedures and patient care activities may facilitate environmental dissemination of healthcare-associated pathogens such as methicillin-resistant Staphylococcus aureus (MRSA).
Observational cohort study of MRSA-colonized patients to determine the frequency of and risk factors for environmental shedding of MRSA during procedures and care activities in carriers with positive nares and/or wound cultures. Bivariate analyses were performed to identify factors associated with environmental shedding.
A Veterans Affairs hospital.
This study included 75 patients in contact precautions for MRSA colonization or infection.
Of 75 patients in contact precautions for MRSA, 55 (73%) had MRSA in nares and/or wounds and 25 (33%) had positive skin cultures. For the 52 patients with MRSA in nares and/or wounds and at least 1 observed procedure, environmental shedding of MRSA occurred more frequently during procedures and care activities than in the absence of a procedure (59 of 138, 43% vs 8 of 83, 10%; P < .001). During procedures, increased shedding occurred ≤0.9 m versus >0.9 m from the patient (52 of 138, 38% vs 25 of 138, 18%; P = .0004). Contamination occurred frequently on surfaces touched by personnel (12 of 38, 32%) and on portable equipment used for procedures (25 of 101, 25%). By bivariate analysis, the presence of a wound with MRSA was associated with shedding (17 of 29, 59% versus 6 of 23, 26%; P = .04).
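The percentages reported above are simple proportions rounded to the nearest whole number; a minimal sketch that reproduces the arithmetic behind figures such as "59 of 138, 43%":

```python
def pct(numerator, denominator):
    """Percentage of events, rounded to the nearest whole number
    as in the reported results."""
    return round(100 * numerator / denominator)

# Shedding during procedures vs. no procedure, as reported above:
during = pct(59, 138)   # -> 43
without = pct(8, 83)    # -> 10
```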
Environmental shedding of MRSA occurs frequently during medical procedures and patient care activities. There is a need for effective strategies to disinfect surfaces and equipment after procedures.
The Supreme Court's decision in City of Los Angeles Department of Water and Power v. Manhart has engendered a considerable debate, much of which has appeared in the pages of this Journal. Defenders of the Manhart decision take its critics to task for failure to appreciate the place of that decision in the overall jurisprudence of employment discrimination. In this article, the authors challenge the underlying conception of the law of sex discrimination that is said to dictate the result in Manhart. Far from erecting a per se rule against all sex classifications, the Civil Rights Act of 1964 is shown to recognize both the relevance of prevalent social norms about sex differences and the legitimacy of certain interests of employers as limited justifications for the maintenance of sex-conscious lines in some circumstances, a recognition that contrasts sharply with the statute's categorical prohibition on racial classifications. It follows from this discussion that Manhart's outcome was not ordained by the ethos of the laws against sex discrimination.
Fedchenko Glacier experienced a large thickness loss since the first scientific investigations in 1928. As the largest glacier in the Pamir Mountains, this glacier plays an important role for the regional glacier mass budget. We use a series of Global Navigation Satellite Systems observations from 2009 to 2016 and TanDEM-X elevation models from 2011 to 2016 to investigate recent elevation changes. Accounting for radar wave penetration minimizes biases in elevation that can otherwise reach up to 6 m in dry snow on Fedchenko Glacier, with mean values of 3–4 m in the high accumulation regions. The seasonal elevation changes reach up to ±5 m. The glacier surface elevation decreased along its entire length over multi-year periods. Thinning rates increased between 2000 and 2016 by a factor of 1.8 compared with 1928–2000, resulting in peak values of 1.5 m a−1. Even the highest accumulation basins above 5000 m elevation have been affected by glacier thinning with change rates between −0.2 and −0.4 m a−1 from 2009 to 2016. The estimated glacier-wide mass-balance rates are −0.27 ± 0.05 m w.e. a−1 for 2000 to 2011 and −0.51 ± 0.04 m w.e. a−1 between 2011 and 2016.
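Glacier-wide mass-balance rates in m w.e. a⁻¹ are obtained from elevation-change rates via an assumed density conversion; a minimal sketch of that step (the 850 kg m⁻³ density is a commonly assumed value, our assumption rather than a figure stated in the abstract):

```python
RHO_ICE = 850.0     # kg m^-3, commonly assumed for geodetic mass-balance conversion
RHO_WATER = 1000.0  # kg m^-3

def elevation_to_we(dh_dt_m_per_a, rho=RHO_ICE):
    """Convert an elevation-change rate (m a^-1) to water equivalent
    (m w.e. a^-1) using an assumed bulk density."""
    return dh_dt_m_per_a * rho / RHO_WATER

# Illustrative only: a glacier-wide thinning of 0.6 m a^-1 corresponds
# to about -0.51 m w.e. a^-1 under this density assumption.
example = elevation_to_we(-0.6)
```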
To test the hypothesis that long-term care facility (LTCF) residents with Clostridium difficile infection (CDI) or asymptomatic carriage of toxigenic strains are an important source of transmission in the LTCF and in the hospital during acute-care admissions.
A 6-month cohort study with identification of transmission events was conducted based on tracking of patient movement combined with restriction endonuclease analysis (REA) and whole-genome sequencing (WGS).
Veterans Affairs hospital and affiliated LTCF.
The study included 29 LTCF residents identified as asymptomatic carriers of toxigenic C. difficile based on every other week perirectal screening and 37 healthcare facility-associated CDI cases (ie, diagnosis >3 days after admission or within 4 weeks of discharge to the community), including 26 hospital-associated and 11 LTCF-associated cases.
Of the 37 CDI cases, 7 (18·9%) were linked to LTCF residents with LTCF-associated CDI or asymptomatic carriage, including 3 of 26 hospital-associated CDI cases (11·5%) and 4 of 11 LTCF-associated cases (36·4%). Of the 7 transmissions linked to LTCF residents, 5 (71·4%) were linked to asymptomatic carriers versus 2 (28·6%) to CDI cases, and all involved transmission of epidemic BI/NAP1/027 strains. No incident hospital-associated CDI cases were linked to other hospital-associated CDI cases.
Our findings suggest that LTCF residents with asymptomatic carriage of C. difficile or CDI contribute to transmission both in the LTCF and in the affiliated hospital during acute-care admissions. Greater emphasis on infection control measures and antimicrobial stewardship in LTCFs is needed, and these efforts should focus on LTCF residents during hospital admissions.
A novel, alloy-agnostic, nanofunctionalization process has been utilized to produce metal matrix composites (MMCs) via additive manufacturing, providing new geometric freedom for MMC design. MMCs were produced with the addition of tungsten carbide nanoparticles to commercially available AlSi10Mg alloy powder. Tungsten carbide was chosen due to the potential for coherent crystallographic phases that were identified utilizing a lattice-matching approach to promote wetting and increase dislocation interactions. Structures were produced with evenly distributed strengthening phases leading to tensile strengths >385 MPa and a 50% decrease in wear rate over the commercially available AlSi10Mg alloy at only 1 vol% loading of tungsten carbide.
Rib bone biopsy samples are often used to estimate changes in skeletal mineral reserves in cattle but differences in sampling procedures and the bone measurements reported often make interpretation and comparisons among experiments difficult. ‘Full-core’ rib bone biopsy samples, which included the external cortical bone, internal cortical bone and trabecular bone (CBext, CBint and Trab, respectively), were obtained from cattle known to be in phosphorus (P) adequate (Padeq) or severely P-deficient (Pdefic) status. Experiments 1 and 2 examined growing steers and Experiment 3 mature breeder cows. The thickness of cortical bone, specific gravity (SG), and the amount and concentration of ash and P per unit fresh bone volume, differed among CBext, CBint and Trab bone. P concentration (mg/cc) was closely correlated with both SG and ash concentrations (pooled data, r=0.99). Thickness of external cortical bone (CBText) was correlated with full-core P concentration (FC-Pconc) (pooled data, r=0.87). However, an index, the amount of P in CBext per unit surface area of CBext (PSACB; mg P/mm2), was more closely correlated with the FC-Pconc (pooled data, FC-Pconc=37.0+146×PSACB; n=42, r=0.94, RSD=7.7). Results for measured or estimated FC-Pconc in 10 published studies with cattle in various physiological states and expected to be Padeq or in various degrees of Pdefic status were collated and the ranges of FC-Pconc indicative of P adequacy and P deficiency for various classes of cattle were evaluated. FC-Pconc was generally in the range 130 to 170 and 100 to 120 mg/cc fresh bone in Padeq mature cows and young growing cattle, respectively. In conclusion, the FC-Pconc could be estimated accurately from biopsy samples of CBext. This allows comparisons between studies where full-core or only CBext biopsy samples of rib bone have been obtained to estimate changes in the skeletal P status of cattle and facilitates evaluation of the P status of cattle.
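The pooled regression quoted above can be applied directly to estimate full-core P concentration from an external-cortical-bone biopsy; a minimal sketch using the reported coefficients:

```python
def fc_p_conc(psacb_mg_per_mm2):
    """Estimate full-core P concentration (mg/cc fresh bone) from the
    PSACB index (mg P per mm^2 of external cortical bone surface),
    using the pooled regression reported above:
    FC-Pconc = 37.0 + 146 * PSACB (n=42, r=0.94, RSD=7.7)."""
    return 37.0 + 146.0 * psacb_mg_per_mm2

# e.g. a PSACB of 0.5 mg P/mm^2 gives an estimated FC-Pconc of 110 mg/cc,
# within the 100-120 mg/cc range indicative of P adequacy in young growing cattle.
estimate = fc_p_conc(0.5)  # -> 110.0
```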
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections, to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
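Deterministic matching of the kind described links records only on exact agreement of chosen identifier fields; a minimal sketch, assuming a hypothetical composite key of name and date of birth (the jurisdictions' actual key fields are not specified in the abstract):

```python
def deterministic_match(hiv_records, hepatitis_records, key_fields=("name", "dob")):
    """Link two surveillance datasets on exact agreement of key_fields.
    Returns the list of (hiv_record, hepatitis_record) pairs that match."""
    index = {}
    for rec in hepatitis_records:
        index.setdefault(tuple(rec[f] for f in key_fields), []).append(rec)
    matches = []
    for rec in hiv_records:
        for hep in index.get(tuple(rec[f] for f in key_fields), []):
            matches.append((rec, hep))
    return matches

# Hypothetical records: person "A" appears in both registries.
hiv = [{"name": "A", "dob": "1970-01-01"}, {"name": "B", "dob": "1982-05-12"}]
hep = [{"name": "A", "dob": "1970-01-01"}, {"name": "C", "dob": "1990-09-30"}]
pairs = deterministic_match(hiv, hep)  # one coinfection pair found
```

Real surveillance linkage also handles name variants and transposed fields; exact-key matching is the simplest deterministic form.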
Sugarcane is an important forage resource in sub-tropical and tropical areas as it is used during the winter or dry season when the growth rate of pastures is significantly reduced. The current research study assessed the effect of four vertical sections of sugarcane in a pen trial and the level of sugarcane utilization in a grazing trial on the ingestive behaviour and forage intake of two age groups of steers (1 and 2 years old). The pen trial was comprised of two simultaneous 4 × 4 balanced Latin square designs (one for each age group of animals) of four periods, four animals and four feeding treatments, which consisted of four equal vertical sections of sugarcane. Dry matter (DM) and digestible DM (DDM) intake per kilogram of metabolic weight declined gradually from top to bottom of the sugarcane, with no significant differences between the age groups of steers. This difference in intake was associated with a decline in intake of neutral detergent fibre (NDF) as a proportion of the liveweight of the animal and an increase of total chewing time per kilogram of DM or NDF from top to bottom of the sugarcane. It was concluded that the toughness of plant material played a significant role regulating intake, which was higher for the top sections of sugarcane. In the grazing trial, steers of both age groups grazed down sugarcane in three plots over 9 days. Steers grazed up to four distinctive grazing strata. Digestible DM intake (DDM intake) was high at low levels of horizontal utilization of the top grazing stratum but DDM intake started to decline sharply when this stratum was removed in 0·92 of paddock area (i.e. equivalent to 0·08 of the pasture area remaining un-grazed). It was concluded that the proportion of un-grazed area of the pasture can be used as a grazing management strategy to control forage intake for sugarcane.
The paper analyzes issues relating to the applicability of innovative material systems for flexible composite armors. The authors prepared several samples of aramid-fiber (Kevlar 49) composites, replacing the epoxy resin matrix often described in the literature with thermoplastic matrices: high-density polyethylene (HDPE) and polypropylene (PP). The samples were fired upon with .38 Special Full Metal Jacketed (FMJ) ammunition produced by the S&B Company, and the firing process was then modeled in the ABAQUS program. On the basis of the numerical analyses and ballistic tests, the advantages and disadvantages of the new material system are presented, including the possibility of its use in the construction of hybrid composite armors.
Introduction of biofortified cassava as school lunch can increase vitamin A intake, but may increase risk of other deficiencies due to poor nutrient profile of cassava. We assessed the potential effect of introducing a yellow cassava-based school lunch combined with additional food-based recommendations (FBR) on vitamin A and overall nutrient adequacy using Optifood (linear programming tool).
Cross-sectional study to assess dietary intakes (24 h recall) and derive model parameters (list of foods consumed, median serving sizes, food and food (sub)group frequency distributions, food cost). Three scenarios were modelled, namely daily diet including: (i) no school lunch; (ii) standard 5d school lunch with maize/beans; and (iii) 5d school lunch with yellow cassava. Each scenario and scenario 3 with additional FBR were assessed on overall nutrient adequacy using recommended nutrient intakes (RNI).
Primary-school children (n 150) aged 7–9 years.
Best food pattern of yellow cassava-based lunch scenario achieved 100 % RNI for six nutrients compared with no lunch (three nutrients) or standard lunch (five nutrients) scenario. FBR with yellow cassava and including small dried fish improved nutrient adequacy, but could not ensure adequate intake of fat (52 % of average requirement), riboflavin (50 % RNI), folate (59 % RNI) and vitamin A (49 % RNI).
Introduction of yellow cassava-based school lunch complemented with FBR potentially improved vitamin A adequacy, but alternative interventions are needed to ensure dietary adequacy. Optifood is useful to assess potential contribution of a biofortified crop to nutrient adequacy and to develop additional FBR to address remaining nutrient gaps.
This study provides an estimate of fresh water derived from ice melt for the ablation areas of glaciers in the Central Karakoram National Park (CKNP), Pakistan. In the CKNP there are ~700 glaciers, covering ~4600 km2, with widespread debris cover (518 km2). To assess meltwater volume we applied a distributed model able to describe both debris-covered and debris-free ice ablation. The model was calibrated using data collected in the field in the CKNP area and validated by comparison with ablation data collected in the field, independent of the data used in building the model. During 23 July–9 August 2011, the mean model-estimated ablation in the CKNP was 0.024 m w.e. d–1 in debris-covered areas and 0.037 m w.e. d–1 in debris-free areas. We found a mean error of +0.01 m w.e. (corresponding to 2%) and a root-mean-square error equal to 0.09 m w.e. (17%). According to our model, the ablation areas of all the glaciers in the CKNP produced a water volume of 1.963 km3 during the study period. Finally, we performed several sensitivity tests for assessing the impact of the input data variations.
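The quoted water volume follows from integrating ablation rate over area and time; a minimal sketch of that bookkeeping (the example uses the debris-covered area given above as a placeholder — the abstract reports total areas, not the exact ablation-area split used in the study):

```python
def melt_volume_km3(rate_m_we_per_day, area_km2, n_days):
    """Water volume (km^3) from a mean ablation rate (m w.e. d^-1)
    applied over an area (km^2) for n_days.
    1 m of water over 1 km^2 equals 1e-3 km^3."""
    return rate_m_we_per_day * n_days * area_km2 * 1e-3

# Illustrative only: the 18-day window 23 July-9 August 2011, applied to
# the 518 km^2 debris-covered area at the debris-covered ablation rate.
debris_covered = melt_volume_km3(0.024, 518.0, 18)
```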
In this paper we undertake a quantitative analysis of the dynamic process by which ice underneath a dry porous debris layer melts. We show that the incorporation of debris-layer airflow into a theoretical model of glacial melting can capture the empirically observed features of the so-called Østrem curve (a plot of the melt rate as a function of debris depth). Specifically, we show that the turning point in the Østrem curve can be caused by two distinct mechanisms: the increase in the proportion of ice that is debris-covered and/or a reduction in the evaporative heat flux as the debris layer thickens. This second effect causes an increased melt rate because the reduction in (latent) energy used for evaporation increases the amount of energy available for melting. Our model provides an explicit prediction for the melt rate and the temperature distribution within the debris layer, and provides insight into the relative importance of the two effects responsible for the maximum in the Østrem curve. We use the data of Nicholson and Benn (2006) to show that our model is consistent with existing empirical measurements.
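The qualitative shape described — melt rising to a maximum at shallow debris depth, then declining as the layer insulates — can be illustrated with a toy two-term model. This is our illustrative construction, not the paper's governing equations: a thin layer enhances melt (lower albedo, reduced evaporative loss) while a thick layer suppresses it (insulation).

```python
import math

def toy_melt_rate(debris_depth_m, bare_ice_melt=0.05, enhance=0.03,
                  absorption_scale=0.02, insulation_scale=0.1):
    """Toy Ostrem-style curve with illustrative units and parameters:
    an enhancement term saturating with depth, multiplied by an
    insulation term decaying with depth."""
    enhancement = enhance * (1 - math.exp(-debris_depth_m / absorption_scale))
    insulation = 1 / (1 + debris_depth_m / insulation_scale)
    return (bare_ice_melt + enhancement) * insulation

# Thin debris (2 cm) melts faster than bare ice; thick debris (50 cm) slower.
```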
Heat stress has a significant impact on all livestock and poultry species, causing economic losses and animal well-being concerns. Providing shade is one heat-abatement strategy that has been studied for years. The material selected to provide shade greatly influences the overall stress reduction that shade provides. A study was conducted to quantify both the environment and the animal response when cattle had no shade access during summertime exposure or were given access to shade provided by three different materials. A total of 32 Black Angus heifers were assigned to one of four treatment pens according to weight (eight animals per pen). Each pen was assigned a shade treatment: No Shade, Snow Fence, 60% Aluminet Shade Cloth or 100% Shade Cloth. In the shaded treatment pens, the shade structure covered ~40% of the pen (7.5 m2/animal). Animals were moved to a different treatment every 2 weeks in a 4×4 Latin square design to ensure each treatment was applied to each group of animals. Both environmental parameters and physiological responses were measured during the experiment. Environmental parameters included dry-bulb temperature, relative humidity, wind speed, black globe temperature (BGT), solar radiation (SR) and feedlot surface temperature. Animal response measurements included manual respiration rate (RRm), electronic respiration rate (RRe), vaginal temperature (body temperature (BT)), complete blood count (CBC) and plasma cortisol. The environmental data demonstrated changes proportional to the quality of shade offered. However, the animal responses did not follow this same trend. Some of the data suggest that any amount of shade was beneficial to the animals; however, Snow Fence may not offer adequate protection to reduce BT. For some of the parameters (BT, CBC and cortisol), 60% Aluminet and 100% Shade Cloth offer similar protection. The 60% Aluminet lowered RRe the most during extreme conditions. When considering all parameters, environmental and physiological, 60% Aluminet Shade Cloth offered reductions in BGT, SR and feedlot surface temperature, and the best (or equal best) overall protection for the animals (RRe, RRm, BT, blood parameters).
Hendra virus (HeV) was first described in 1994 in an outbreak of acute and highly lethal disease in horses and humans in Australia. Equine cases continue to be diagnosed periodically, yet the predisposing factors for infection remain unclear. We undertook an analysis of equine submissions tested for HeV by the Queensland government veterinary reference laboratory over a 20-year period to identify and investigate any patterns. We found a marked increase in testing from July 2008, primarily reflecting a broadening of the HeV clinical case definition. Peaks in submissions for testing, and visitations to the Government HeV website, were associated with reported equine incidents. Significantly differing between-year HeV detection rates in north and south Queensland suggest a fundamental difference in risk exposure between the two regions. The statistical association between HeV detection and stockhorse type may suggest that husbandry is a more important risk determinant than breed per se. The detection of HeV in horses with neither neurological nor respiratory signs poses a risk management challenge for attending veterinarians and laboratory staff, reinforcing animal health authority recommendations that appropriate risk management strategies be employed for all sick horses, and by anyone handling sick horses or associated biological samples.
Objectives: One of the most prominent features of schizophrenia is relatively lower general cognitive ability (GCA). An emerging approach to understanding the roots of variation in GCA relies on network properties of the brain. In this multi-center study, we determined global characteristics of brain networks using graph theory and related these to GCA in healthy controls and individuals with schizophrenia. Methods: Participants (N=116 controls, 80 patients with schizophrenia) were recruited from four sites. GCA was represented by the first principal component of a large battery of neurocognitive tests. Graph metrics were derived from diffusion-weighted imaging. Results: The global metrics of longer characteristic path length and reduced overall connectivity predicted lower GCA across groups, and group differences were noted for both variables. Measures of clustering, efficiency, and modularity did not differ across groups or predict GCA. Follow-up analyses investigated three topological types of connectivity—connections among high degree “rich club” nodes, “feeder” connections to these rich club nodes, and “local” connections not involving the rich club. Rich club and local connectivity predicted performance across groups. In a subsample (N=101 controls, 56 patients), a genetic measure reflecting mutation load, based on rare copy number deletions, was associated with longer characteristic path length. Conclusions: Results highlight the importance of characteristic path lengths and rich club connectivity for GCA and provide no evidence for group differences in the relationships between graph metrics and GCA. (JINS, 2016, 22, 240–249)
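Characteristic path length — the graph metric the results above turn on — is the mean shortest-path distance over all node pairs. A minimal BFS-based sketch for an unweighted, connected, undirected graph (structural connectomes in such studies are typically weighted, so this is a simplification):

```python
from collections import deque

def characteristic_path_length(adj):
    """Mean shortest-path length over all ordered node pairs of a
    connected, unweighted, undirected graph given as adjacency lists
    keyed 0..n-1."""
    n = len(adj)
    total = 0
    for source in range(n):
        dist = {source: 0}
        queue = deque([source])
        while queue:  # breadth-first search from source
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
    return total / (n * (n - 1))

# 4-node path graph 0-1-2-3: mean pairwise distance is 5/3.
path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
```

Longer characteristic path length means information must traverse more edges on average, the property associated with lower GCA in both groups.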
Studies have produced conflicting evidence regarding whether cognitive control deficits in patients with schizophrenia result from dysfunction within the cognitive control network (CCN; top-down) and/or unisensory cortices (bottom-up).
To investigate CCN and sensory cortex involvement during multisensory cognitive control in patients with schizophrenia.
Patients with schizophrenia and healthy controls underwent functional magnetic resonance imaging while performing a multisensory Stroop task involving auditory and visual distracters.
Patients with schizophrenia exhibited an overall pattern of response slowing, and these behavioural deficits were associated with a pattern of patient hyperactivation within auditory, sensorimotor and posterior parietal cortex. In contrast, there were no group differences in functional activation within prefrontal nodes of the CCN, with small effect sizes observed (incongruent–congruent trials). Patients with schizophrenia also failed to upregulate auditory cortex with concomitant increased attentional demands.
Results suggest a prominent role for dysfunction within auditory, sensorimotor and parietal areas, relative to prefrontal CCN nodes, during multisensory cognitive control.