To assess potential transmission of antibiotic-resistant organisms (AROs) using surrogate markers and bacterial cultures.
A 1,260-bed tertiary-care academic medical center.
The study included 25 patients (17 of whom were on contact precautions for AROs) and 77 healthcare personnel (HCP).
Fluorescent powder (FP) and MS2 bacteriophage were applied in patient rooms. HCP visits to each room were observed for 2–4 hours; hand hygiene (HH) compliance was recorded. Surfaces inside and outside the room and HCP skin and clothing were assessed for fluorescence, and swabs were collected for MS2 detection by polymerase chain reaction (PCR) and selective bacterial cultures.
Transfer of FP was observed for 20 rooms (80%) and 26 HCP (34%). Transfer of MS2 was detected for 10 rooms (40%) and 15 HCP (19%). Bacterial cultures were positive for 1 room and 8 HCP (10%). Interactions with patients on contact precautions resulted in fewer FP detections than interactions with patients not on precautions (P < .001); MS2 detections did not differ by patient isolation status. Fluorescent powder detections did not differ by HCP type, but MS2 was recovered more frequently from physicians than from nurses (P = .03). Overall, HH compliance was better among HCP caring for patients on contact precautions than among HCP caring for patients not on precautions (P = .003), among nurses than among other nonphysician HCP at room entry (P = .002), and among nurses than among physicians at room exit (P = .03). Moreover, HCP who performed HH prior to assessment had fewer fluorescence detections (P = .008).
Contact precautions were associated with greater HCP HH compliance and reduced detection of FP and MS2.
The number of medical mobile phone applications continues to grow. Although otorhinolaryngology-specific applications represent a small proportion of the total, there are exciting innovations emerging for the specialty. This article will assess the number of applications available and review how they may be used in clinical practice.
The application stores of the two most popular mobile phone platforms, Apple and Android, were searched using multiple search terms.
A total of 107 ENT applications were identified and categorised according to intended use. Eight applications were reviewed in more detail and assessed on whether a doctor or allied health professional was involved in their design and whether they were evidence-based.
There are a number of ENT-specific smartphone applications currently available. As the technology has progressed, their scope has extended beyond pure reference. Nevertheless, it remains difficult to assess the validity and security of these applications.
Sub-acute ruminal acidosis (SARA) can reduce the production efficiency and impair the welfare of cattle, potentially in all production systems. The aim of this study was to characterise measurable postmortem observations from divergently managed intensive beef finishing farms with high rates of concentrate feeding. At the time of slaughter, we obtained samples from 19 or 20 animals on each of 6 beef finishing units (119 animals in total) with diverse feeding practices, which had been subjectively classified as being high risk (three farms) or low risk (three farms) for SARA on the basis of the proportions of barley, silage and straw in the ration. We measured the concentrations of histamine, lipopolysaccharide (LPS), lactate and other short-chain fatty acids (SCFAs) in ruminal fluid, and of LPS and SCFA in caecal fluid. We also took samples of the ventral blind sac of the rumen for histopathology, immunohistopathology and gene expression. Subjective assessments were made of the presence of lesions on the ruminal wall, the colour of the lining of the ruminal wall and the shape of the ruminal papillae. Almost all variables differed significantly and substantially among farms. Very few pathological changes were detected in any of the rumens examined. The animals on the high-risk diets had lower concentrations of SCFA and higher concentrations of lactate and LPS in the ruminal fluid. Higher LPS concentrations were found in the caecum than the rumen but were not related to the risk status of the farm. The diameters of the stratum granulosum, stratum corneum and of the vasculature of the papillae, and the expression of the gene TLR4 in the ruminal epithelium were all increased on the high-risk farms. The expression of IFN-γ and IL-1β and the counts of cluster of differentiation 3-positive and major histocompatibility complex class II-positive cells were lower on the high-risk farms.
High among-farm variation and the unbalanced design inherent in this type of study in the field prevented confident assignment of variation in the dependent variables to individual dietary components; however, the crude protein (CP) percentage of the total mixed ration DM was the factor that was most consistently associated with the variables of interest. Despite the strong effect of farm on the measured variables, there was wide inter-animal variation.
What are the implications of international law for attitudes toward wartime violence? Existing research offers contrasting views on the ability of international legal principles to shape individual preferences, especially in difficult situations involving armed conflict. Employing cross-national survey evidence from several conflict and post-conflict countries, this article contributes to this debate by evaluating the relationship between individuals’ knowledge of the laws of war and attitudes toward wartime conduct. Findings show that exposure to international law is associated with a significant reduction in support for wartime abuses, though the results are stronger for prisoner treatment than for targeting civilians. Analysis further reveals that legal principles generate different expectations of conduct than alternative value systems that are rooted in strong moral foundations regarding the impermissibility of wartime abuses. The findings are relevant for understanding the relationship between international law and domestic actors, and how legal principles relate to the resort to violence.
This study evaluated, in a rigorous 18-month randomized controlled trial, the efficacy of an enhanced vocational intervention for helping individuals with a recent first schizophrenia episode to return to and remain in competitive work or regular schooling.
Individual Placement and Support (IPS) was adapted to meet the needs of individuals whose goals might involve either employment or schooling. IPS was combined with a Workplace Fundamentals Module (WFM) for an enhanced, outpatient, vocational intervention. Random assignment to the enhanced integrated rehabilitation program (N = 46) was contrasted with equally intensive clinical treatment at UCLA, including social skills training groups, and conventional vocational rehabilitation by state agencies (N = 23). All patients were provided case management and psychiatric services by the same clinical team and received oral atypical antipsychotic medication.
The IPS–WFM combination led to 83% of patients participating in competitive employment or school in the first 6 months of intensive treatment, compared with 41% in the comparison group (p < 0.005). During the subsequent year, IPS–WFM continued to yield higher rates of schooling/employment (92% v. 60%, p < 0.03). Cumulative number of weeks of schooling and/or employment was also substantially greater with the IPS–WFM intervention (45 v. 26 weeks, p < 0.004).
The results clearly support the efficacy of an enhanced intervention focused on recovery of participation in normative work and school settings in the initial phase of schizophrenia, suggesting potential for prevention of disability.
The diagnosis and management of personality disorders continue to evolve and develop alongside psychiatry internationally, although not always in a linear fashion. Trainees working in a variety of clinical areas have regular exposure to personality disorder presentations. Psychiatry training bodies continue to adapt their training structures and curricula; however, there seems to be insufficient emphasis on this area. We are now embarking on a new diagnostic system for personality disorders; this may affect our clinical practice and our perspective on these patients. The role of psychiatrists in diagnosing and managing personality disorders can be unclear at times and may benefit from ongoing reflection and standardization.
Root-knot nematodes represent a serious threat to world coffee production, especially Meloidogyne incognita and M. paranaensis. Most cultivars of Coffea arabica are highly susceptible to these parasites and cultivation in infested areas has only been possible with the use of resistant C. canephora rootstocks. In this research, three elite clones of C. canephora, selected in areas infested by M. incognita and M. paranaensis, were evaluated in controlled conditions to assess levels of resistance against two populations of M. paranaensis, four populations of M. incognita and a mixed population of both species. The three clones were resistant to both species, but CcK1 and CcR2 were considered most promising because their vegetative growth was not impaired by nematodes.
Cover crop–based, organic rotational no-till (CCORNT) corn and soybean systems have been developed in the mid-Atlantic region to build soil health, increase management flexibility, and reduce labor. In this system, a roller-crimped cover crop mulch provides within-season weed suppression in no-till corn and soybean. A cropping system experiment was conducted in Pennsylvania, Maryland, and Delaware to test the cumulative effects of a multitactic weed management approach in a 3-yr hairy vetch/triticale–corn–cereal rye–soybean–winter wheat CCORNT rotation. Treatments included delayed planting dates (early, intermediate, late) and supplemental weed control using high-residue (HR) cultivation in no-till corn and soybean phases. In the no-till corn phase, HR cultivation decreased weed biomass relative to the uncultivated control by 58%, 23%, and 62% in Delaware, Maryland, and Pennsylvania, respectively. In the no-till soybean phase, HR cultivation decreased weed biomass relative to the uncultivated treatment planted in narrow rows (19 to 38 cm) by 20%, 41%, and 78% in Delaware, Maryland, and Pennsylvania, respectively. Common ragweed was more dominant in soybean (39% of total biomass) compared with corn (10% of total biomass), whereas giant foxtail and smooth pigweed were more dominant in corn, comprising 46% and 22% of total biomass, respectively. Common ragweed became less abundant as corn and soybean planting dates were delayed, whereas giant foxtail and smooth pigweed increased as a percentage of total biomass as planting dates were delayed. At the Pennsylvania location, inconsistent termination of cover crops with the roller-crimper resulted in volunteer cover crops in other phases of the rotation. Our results indicate that HR cultivation is necessary to achieve adequate weed control in CCORNT systems. Integration of winter grain or perennial forages into CCORNT systems will also be an important management tactic for truncating weed seedbank population increases.
In the mid-Atlantic region, there is increasing interest in the use of intercropping strategies to establish cover crops in corn cropping systems. However, intercropping may be limited by potential injury to cover crops from residual herbicide programs. Field experiments were conducted from 2013 to 2015 at Pennsylvania, Maryland, and New York locations (n=8) to evaluate the effect of common residual corn herbicides on interseeded red clover and annual ryegrass. Cover crop establishment and response to herbicide treatments varied across sites and years. S-metolachlor, pyroxasulfone, pendimethalin, and dimethenamid-P reduced annual ryegrass biomass relative to the nontreated check, whereas annual ryegrass biomass in acetochlor treatments was no different from the nontreated check. The rank order of observed annual ryegrass biomass reduction among chloroacetamide herbicides was S-metolachlor>pyroxasulfone>dimethenamid-P>acetochlor. Annual ryegrass biomass was not reduced by any of the broadleaf control herbicides. Mesotrione reduced red clover biomass by 80% compared with the nontreated check. No differences in red clover biomass were observed between the saflufenacil, rimsulfuron, and atrazine treatments and the nontreated check. Red clover biomass was not reduced by any of the grass control herbicides. This research suggests that annual ryegrass and red clover can be successfully interseeded in silt loam soils of Pennsylvania following use of several shorter-lived residual corn herbicides, but further research is needed in areas with soil types other than silt loam or outside of the mid-Atlantic cropping region.
To evaluate healthcare worker (HCW) risk of self-contamination when donning and doffing personal protective equipment (PPE) using fluorescence and MS2 bacteriophage.
Prospective pilot study.
A total of 36 HCWs were included in this study: 18 donned/doffed contact precaution (CP) PPE and 18 donned/doffed Ebola virus disease (EVD) PPE.
HCWs donned PPE according to standard protocols. Fluorescent liquid and MS2 bacteriophage were applied to HCWs. HCWs then doffed their PPE. After doffing, HCWs were scanned for fluorescence and swabbed for MS2. MS2 detection was performed using reverse transcriptase PCR. The donning and doffing processes were videotaped, and protocol deviations were recorded.
Overall, 27% of EVD PPE HCWs and 50% of CP PPE HCWs made ≥1 protocol deviation while donning, and 100% of EVD PPE HCWs and 67% of CP PPE HCWs made ≥1 protocol deviation while doffing (P=.02). The median number of doffing protocol deviations among EVD PPE HCWs was 4, versus 1 among CP PPE HCWs. Also, 15 EVD PPE protocol deviations were committed by doffing assistants and/or trained observers. Fluorescence was detected on 8 EVD PPE HCWs (44%) and 5 CP PPE HCWs (28%), most commonly on hands. MS2 was recovered from 2 EVD PPE HCWs (11%) and 3 CP PPE HCWs (17%).
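The doffing comparison above corresponds to a 2×2 table: 18 of 18 EVD PPE HCWs versus 12 of 18 CP PPE HCWs (67%) made ≥1 deviation. The abstract does not name the statistical test used, but as a hedged sketch, a two-sided Fisher exact test on those counts (implemented here from the standard library, since the exact method is an assumption) reproduces a p-value consistent with the reported P=.02:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums the probabilities of all tables with the same margins whose
    hypergeometric probability does not exceed that of the observed table.
    """
    row1 = a + b          # first-row total
    n = a + b + c + d     # grand total
    col1 = a + c          # first-column total

    def p_table(x):
        # P(first cell = x) under the hypergeometric distribution
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))   # smallest feasible first cell
    hi = min(row1, col1)             # largest feasible first cell
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# 18/18 EVD PPE HCWs vs 12/18 CP PPE HCWs with >=1 doffing deviation
p = fisher_exact_two_sided(18, 0, 12, 6)
print(round(p, 3))  # 0.019, consistent with the reported P=.02
```

This only verifies that the reported counts and p-value are mutually consistent under one plausible test; the authors' actual analysis may have differed.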
Protocol deviations were common during both EVD and CP PPE doffing, and some deviations during EVD PPE doffing were committed by the HCW doffing assistant and/or the trained observer. Self-contamination was common. PPE donning/doffing are complex and deserve additional study.
Haiti has the highest human rabies burden in the Western Hemisphere. There is no published literature describing the public's perceptions of rabies in Haiti, information that is critical to developing effective interventions and government policies. We conducted a knowledge, attitudes and practices survey of 550 community members and 116 health professionals in Pétionville, Haiti in 2013 to understand the perception of rabies in these populations. The majority of respondents (85%) knew that dogs were the primary reservoir for rabies, yet only 1% were aware that bats and mongooses could transmit rabies. Animal bites were recognized as a mechanism of rabies transmission by 77% of the population and 76% were aware that the disease could be prevented by vaccination. Of 172 persons reporting a bite, only 37% sought medical treatment. The annual bite incidence rate in respondents was 0.9%. Only 31% of bite victims reported that they started the rabies vaccination series. Only 38% of respondents reported that their dog had been vaccinated against rabies. The majority of medical professionals recognized that dogs were the main reservoir for rabies (98%), but only 28% reported bats and 14% reported mongooses as posing a risk for rabies infection. Bites were reported as a mechanism of rabies transmission by 73% of respondents; exposure to saliva was reported by 20%. Thirty-four percent of medical professionals reported they would wash a bite wound with soap and water and 2.8% specifically mentioned rabies vaccination as a component of post-bite treatment. The majority of healthcare professionals recommended some form of rabies assessment for biting animals; 68.9% recommended a 14-day observation period, 60.4% recommended a veterinary consultation, and 13.2% recommended checking the vaccination status of the animal.
Fewer than 15% of healthcare professionals had ever received training on rabies prevention and 77% did not know where to go to procure rabies vaccine for bite victims. Both study populations had a high level of knowledge about the primary reservoir for rabies and the mode of transmission. However, there is a need to improve the level of knowledge regarding the importance of seeking medical care for dog bites and additional training on rabies prevention for healthcare professionals. Distribution channels for rabies vaccines should be evaluated, as the majority of healthcare providers did not know where rabies vaccines could be obtained. Canine rabies vaccination is the primary intervention for rabies control programmes, yet most owned dogs in this population were not vaccinated.
This study was undertaken to further develop our understanding of the links between breed, diet and the rumen microbial community and determine their effect on production characteristics and methane (CH4) emissions from beef cattle. The experiment was of a 2×2 factorial design, comprising two breeds (crossbred Charolais (CHX); purebred Luing (LU)) and two diets (concentrate-straw or silage-based). In total, 80 steers were used and balanced for sire within each breed, farm of origin and BW across diets. The diets (fed as total mixed rations) consisted of (g/kg dry matter (DM)) forage to concentrate ratios of either 500 : 500 (Mixed) or 79 : 921 (Concentrate). Steers were adapted to the diets over a 4-week period and performance and feed efficiency were then measured over a 56-day test period. Directly after the 56-day test, CH4 and carbon dioxide (CO2) emissions were measured (six steers/week) over a 13-week period. Compared with LU steers, CHX steers had greater average daily gain (ADG; P<0.05) and significantly (P<0.001) lower residual feed intake. Crossbred Charolais steers had superior conformation and fatness scores (P<0.001) compared with LU steers. Although steers consumed, on a DM basis, more of the Concentrate than the Mixed diet (P<0.01), there were no differences between diets in either ADG or feed efficiency during the 56-day test. At slaughter, however, Concentrate-fed steers were heavier (P<0.05) and had greater carcass weights than Mixed-fed steers (P<0.001). Breed of steer did not influence CH4 production, but it was substantially lower when the Concentrate rather than Mixed diet was fed (P<0.001). Rumen fluid from Concentrate-fed steers contained greater proportions of propionic acid (P<0.001) and lower proportions of acetic acid (P<0.001), fewer archaea (P<0.01) and protozoa (P=0.09), but more Clostridium Cluster XIVa (P<0.01) and Bacteroides plus Prevotella (P<0.001) than that from Mixed-fed steers.
When the CH4 to CO2 molar ratio was considered as a proxy method for CH4 production (g/kg DM intake), only weak relationships were found within diets. In conclusion, although feeding Concentrate and Mixed diets produced substantial differences in CH4 emissions and rumen characteristics, differences in performance were influenced more markedly by breed.
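The molar-ratio proxy mentioned above rests on simple stoichiometry: given a measured CH4:CO2 molar ratio and an estimate of daily CO2 output, CH4 output in grams follows from the molar masses (16.04 g/mol for CH4, 44.01 g/mol for CO2), and dividing by dry-matter intake gives the yield in g/kg DM intake. A minimal sketch of that arithmetic follows; the input values are illustrative assumptions, not data from this study:

```python
M_CH4 = 16.04  # molar mass of methane, g/mol
M_CO2 = 44.01  # molar mass of carbon dioxide, g/mol

def ch4_yield_g_per_kg_dmi(ch4_co2_molar_ratio, co2_g_per_day, dmi_kg_per_day):
    """Estimate CH4 yield (g per kg dry-matter intake) from the CH4:CO2 molar ratio.

    Converts daily CO2 mass to moles, scales by the molar ratio to get moles
    of CH4, converts back to grams, and normalizes by dry-matter intake.
    """
    co2_mol = co2_g_per_day / M_CO2
    ch4_g = ch4_co2_molar_ratio * co2_mol * M_CH4
    return ch4_g / dmi_kg_per_day

# Illustrative inputs (assumed): molar ratio 0.07, 7,000 g CO2/day, 10 kg DM/day
print(round(ch4_yield_g_per_kg_dmi(0.07, 7000, 10), 1))  # 17.9 g CH4/kg DMI
```

The weak within-diet relationships reported above suggest that, in practice, variation in CO2 output and intake estimates limits how well this back-calculation tracks directly measured CH4 yield.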