The study adopts a naturalistic perspective, looking at the relationship between socio-economic status (SES), activities and variation sets in child-directed speech (CDS) to Spanish-speaking Argentinian toddlers. It aims to determine the effect of SES and type of activity on the proportion of words and utterances in variation sets and on the pragmatic function they serve in interaction. Thirty-two children (mean age: 14.3 months) and their families were audio-recorded for four hours, and the middle two hours were analyzed using CLAN. We developed an automatic algorithm for variation set extraction that compares noun, verb and adjective lexemes in successive utterances. Mixed-effects beta regression showed effects of SES and activity type on the proportion of variation sets and on the pragmatic function served by variation sets. Findings revealed that the contextual variables considered affect how interlocutors organize information for young children at the local level of natural at-home interactions.
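The extraction criterion described above — grouping successive utterances that share at least one noun, verb, or adjective lexeme — can be sketched as follows. This is a minimal illustration of the idea, not the authors' CLAN-based implementation; the tagging and lemmatization that would produce the content-lexeme sets are assumed to happen upstream.

```python
# Sketch of variation-set extraction: successive utterances are grouped
# into a variation set when they share at least one content-word lexeme
# (noun, verb, or adjective). A set must span at least two utterances.

def extract_variation_sets(utterances):
    """utterances: list of sets of content lexemes, one set per utterance.
    Returns a list of variation sets, each a list of utterance indices."""
    sets_found = []
    current = [0] if utterances else []
    for i in range(1, len(utterances)):
        if utterances[i] & utterances[i - 1]:  # shared lexeme -> same set
            current.append(i)
        else:
            if len(current) > 1:               # keep only multi-utterance runs
                sets_found.append(current)
            current = [i]
    if len(current) > 1:
        sets_found.append(current)
    return sets_found

# toy example: "look at the dog" / "the dog!" / "want milk?"
utts = [{"look", "dog"}, {"dog"}, {"want", "milk"}]
print(extract_variation_sets(utts))  # [[0, 1]]
```

The proportion of words or utterances in variation sets then follows directly from the indices these runs cover.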
Tamarins and chimpanzees differ in many aspects of their behavior, biology, and evolutionary history; however, both primates are heavily dependent on a diet of ripe fruits during all months of the year (reviewed in Digby et al. 2011; Stumpf 2011). In addition, previous research on cognition in tamarins and chimpanzees indicates that individuals retain spatial information concerning the location of many feeding sites (e.g., Garber 2000; Janmaat et al. 2013a; Normand et al. 2009). Since primates show a high level of site fidelity (Janmaat et al. 2009) and commonly rely on sessile food sources that are revisited many times over a limited part of the year (such as termite nests and trees producing fruits, leaves, flowers, and/or exudates), one might expect foragers to reuse a limited set of travel routes, return to previously visited feeding sites, and search for new food patches in locations nearby current feeding sites.
There is a need to develop feeding strategies that prevent the adverse effects of concentrate feeding in high-performance horses fed energy-dense diets, with the aim of maintaining their health and welfare. The objective of this study was to determine the effect of a VistaEQ product containing 4% live yeast Saccharomyces cerevisiae (S. cerevisiae), with an activity of 5 × 10⁸ colony-forming units/g and fed at 2 g/pony per day, on faecal microbial populations when supplemented with high-starch and high-fibre diets, using Illumina next-generation sequencing of the V3-V4 region of the 16S ribosomal RNA gene. The four treatments were allocated to eight mature Welsh section A pony geldings enrolled in a 4-period × 8-animal crossover design. Each 19-day experimental period consisted of an 18-day adaptation phase and a single collection day, followed by a 7-day washout period. After DNA extraction from faeces and library preparation, α-diversity and linear discriminant analysis effect size were calculated using the 16S metagenomics pipeline in Quantitative Insights Into Microbial Ecology (QIIME™) and Galaxy/Hutlab. Differences between the groups were considered significant when the linear discriminant analysis score was >2, corresponding to P < 0.05. The present study showed that the S. cerevisiae used was able to induce positive changes in the equine microbiota when supplemented to a high-fibre diet: it increased the relative abundance (RA) of Lachnospiraceae and Dehalobacteriaceae family members associated with a healthy core microbiome. Yeast supplementation also increased the RA of fibrolytic bacteria (Ruminococcus) when fed with a high-fibre diet and reduced the RA of lactate-producing bacteria (Streptococcus) when a high-starch diet was fed. In addition, yeast increased the RA of an acetic- and succinic-acid-producing bacterial family (Succinivibrionaceae) and a butyrate-producing bacterial genus (Roseburia) when fed with high-starch and high-fibre diets, respectively.
VistaEQ supplementation of equine diets can potentially be used to prevent acidosis and increase fibre digestibility. It may help to meet the energy requirements of performance horses while maintaining gut health.
Fibre is essential to maintain a healthy gut; however, the energy demands of performance horses can be too high to be met by forages alone. Yeast may support the function of cellulolytic bacteria to digest fibre. The aim of this work was to determine the effect of an oral supplement (VistaEQ) containing 4% live yeast on the in vitro and in vivo digestibility of high-starch (HS) and high-fibre (HF) diets. Eight ponies were used in a 4 × 4 Latin square design consisting of 4 × 19-day periods and four diets: HF, HF + yeast (HFY), HS and HS + yeast (HSY). In vivo apparent digestibility (AD) was estimated using the total collection technique, and faecal particle size was measured using a NASCO digestive analyser. Faeces from the ponies were subsequently used as an inoculum in an ANKOM RF gas production system to assess fermentation kinetics in vitro. Each module contained 1 g of feed substrate DM in the following combinations: 50% grass hay and 50% alfalfa (HF_50 : 50) or concentrate (HS_50 : 50), and 75% grass hay and 25% alfalfa (HF_75 : 25) or concentrate (HS_75 : 25), with or without yeast. Yeast induced more gas production from the HF_75 : 25, HS_75 : 25 and HF_50 : 50 feed substrates incubated with the respective faecal inoculum. Yeast did not affect pH in vitro when the substrates were incubated in the 50 : 50 ratio, while the pH was higher for HF_75 : 25 incubated with the corresponding faecal inoculum compared to HS_75 : 25 and HSY_75 : 25. Yeast had no effect on the ADF and CP AD of either diet. Yeast addition increased the AD of DM (HF: 0.2%, HS: 0.4%), organic matter (HF: 0.7%, HS: 1.3%), NDF (HF: 0.5%, HS: 1.5%) and total detergent fibre (HF: 0.7%; HS: 0.4%) (P < 0.05) and also tended to increase hemicellulose AD (HF: 0.9%, HS: 1.2%) (P < 0.10). Faecal pH in vivo was higher for both HF diets compared to the HS diet without yeast supplementation (P < 0.001; HF and HFY: 6.8; HS: 6.6; HSY: 6.7). However, no difference was observed in faecal pH when HSY was compared to both HF diets. 
Yeast had no effect on the size of the faecal particles (P > 0.05). Yeast increased in vitro gas production, suggesting more energy could be extracted from the feed, and the in vivo AD of some of the nutrients when HF and HS diets were fed.
The job of an engine driver appeared ideal for research because it combines extremely high responsibility with a need for constant concentration.
The main elements of the study focused on applying the EQ-5D (Quality of Life Questionnaire), SVF 78 (Stress Processing Questionnaire) and FBL-R (Freiburg Complaint List; revised version).
It turned out that exactly fifty percent of all train drivers regretted having chosen their job and judged their professional strain very high (p< 0.001). The SVF 78 revealed the following: persons in Group 1 achieved much higher values on the scales for “escape” (p=0.029), “mental preoccupation after work” (p=0.003) and “resignation” (p=0.011) and in the total of negative strategies (p=0.004), while Group 2 showed higher values only on the scale for the “playing down” effect (p=0.039).
There were also wide-ranging differences between the two groups, especially on the scales for “general state of health” (p< 0.001), “cardiovascular complaints” (p< 0.001), “pain” (p< 0.001), “sensory perception” (p< 0.001), “emotional reactivity” (p=0.001) and “total sum of complaints” (p< 0.001). The EQ-5D showed that train drivers who were content with their choice of job judged their quality of life much higher than those who regretted their choice (p< 0.001).
The results make clear that dissatisfaction with the chosen job has negative effects on both physical and psychological well-being.
Evaluating the effectiveness of training courses for basic qualification for occupational re-entry, empowerment and self-determination.
Participants with psychiatric disorders attended a one-year course on basic qualifications and empowerment for occupational re-entry. The main elements of the course were empowerment topics and training in self-determination. Data on the psychiatric disorders were collected in a narrative manner.
A total of 28 people were enrolled in the study; 57.2% of the participants were male and 42.8% female. Mean age was 34.2 years for men and 33.3 years for women (p>0.050). Among other psychiatric disorders, 45.5% of the participants showed problems with alcohol. Before the programme, 100% of the participants were jobless. The retention rate after one year was 9.1%. 27.3% could be placed in a form of occupational training, and the same proportion of participants got into a job after the one-year course. Participants judged their self-determination significantly higher after the one-year course. Training costs amounted to €4,867 per person per year.
A high percentage of the study participants could be reintegrated into the labour market. Basic qualification and empowerment programmes could be an effective tool for people with mental disorders with regard to social and occupational issues.
Evaluation of QOL, stress and coping strategies of people with psychiatric disorders after a one-year training course for social and occupational reintegration.
The main elements of the study focused on the SVF 120 (Stress Processing Questionnaire) and PLC (Quality of Life with Chronic Disease questionnaire).
In total, 20 people were enrolled in the study. Mean age of the participants was 33.4 years. The reintegration programme showed a retention rate of 100%. The PLC questionnaire showed no statistically significant results (p>0.050). The SVF 120 showed a slightly better score after one year for “social encapsulation” (p=0.056) and “trivializing” (p=0.021). After the one-year training course, 60% of the participants could be reintegrated into an occupational activity and 40% could achieve a better social status.
Our results showed that participants in the programme used stress-coping strategies more than before the course. The reintegration data make clear that training courses for people with psychiatric disorders can achieve valuable results.
NASA has put people in unique and extreme environments for over six decades. Supporting these individuals with a comprehensive health-care system has evolved over this period. As the Apollo program ended and NASA began to contemplate a space shuttle and space station program, societal pressures in the late 1960s and early 1970s caused federal agencies such as NASA to reconsider how to link the needs of the space program with a growing pressure to address societal needs by forging interagency partnerships. The Space Technology Applied to the Rural Papago Health Care (STARPAHC) project provides an example of how NASA sought to balance these two imperatives in an age of diminishing federal support. This project can provide lessons for today’s uncertain budgetary future for agencies such as NASA, which are once again being asked to find creative and innovative ways to support their missions while demonstrating their larger value to society.
Interfacility patient movement plays an important role in the dissemination of antimicrobial-resistant organisms throughout healthcare systems. We evaluated how 3 alternative measures of interfacility patient sharing were associated with C. difficile infection incidence in Ontario acute-care facilities.
The cohort included adult acute-care facility stays of ≥3 days between April 2003 and March 2016. We measured 3 facility-level metrics of patient sharing: general patient importation, incidence-weighted patient importation, and C. difficile case importation. Each of the 3 patient-sharing metrics was examined against the incidence of C. difficile infection in the facility per 1,000 stays, using Poisson regression models.
The analyzed cohort included 6.70 million stays at risk of C. difficile infection across 120 facilities. Over the 13-year period, we included 62,189 new cases of healthcare-associated CDI (incidence, 9.3 per 1,000 stays). After adjustment for facility characteristics, general importation was not strongly associated with C. difficile infection incidence (risk ratio [RR] per doubling, 1.10; 95% confidence interval [CI], 0.97–1.24; proportional change in variance [PCV], −2.0%). Incidence-weighted (RR per doubling, 1.18; 95% CI, 1.06–1.30; PCV, −8.4%) and C. difficile case importation (RR per doubling, 1.43; 95% CI, 1.29–1.58; PCV, −30.1%) were strongly associated with C. difficile infection incidence.
In this 13-year study of acute-care facilities in Ontario, interfacility variation in C. difficile infection incidence was associated with importation of patients from other high-incidence acute-care facilities or specifically of patients with a recent history of C. difficile infection. Regional infection control strategies should consider the potential impact of importation of patients at high risk of C. difficile shedding from outside facilities.
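The modelling approach described in this abstract — facility-level infection counts regressed on a log2-transformed patient-sharing metric, with exposure entering as an offset so the exponentiated coefficient is a rate ratio per doubling — can be sketched as follows. The data here are simulated and noise-free for illustration (only the facility count of 120, the baseline incidence of 9.3 per 1,000 stays, and an RR of 1.18 per doubling are taken from the abstract); this is not the study's actual analysis.

```python
import numpy as np

def poisson_irls(X, y, offset, n_iter=25):
    """Fit log E[y] = X @ beta + offset by iteratively reweighted least squares."""
    # initialize from a log-scale least-squares fit (y assumed > 0 here)
    beta = np.linalg.lstsq(X, np.log(y) - offset, rcond=None)[0]
    for _ in range(n_iter):
        eta = X @ beta + offset
        mu = np.exp(eta)
        z = eta - offset + (y - mu) / mu          # working response
        w = mu                                    # Poisson IRLS weights
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
    return beta

rng = np.random.default_rng(0)
n = 120                                            # facility count, as in the abstract
stays = rng.integers(2000, 50000, size=n).astype(float)
importation = rng.uniform(1, 64, size=n)           # hypothetical patient-sharing metric
X = np.column_stack([np.ones(n), np.log2(importation)])  # log2 -> effect per doubling
offset = np.log(stays / 1000.0)                    # exposure: per 1,000 stays
beta_true = np.array([np.log(9.3), np.log(1.18)])  # baseline incidence, RR per doubling
y = np.exp(X @ beta_true + offset)                 # noise-free expected counts
beta_hat = poisson_irls(X, y, offset)
print(np.exp(beta_hat[1]))                         # recovers the RR per doubling, ~1.18
```

With real count data the fit would carry sampling noise and the facility-characteristic adjustments the study describes; the offset formulation is the key point of the sketch.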
To better understand barriers and facilitators that contribute to antibiotic overuse in long-term care and to use this information to inform an evidence and theory-informed program.
Information on barriers and facilitators associated with the assessment and management of urinary tract infections were identified from a mixed-methods survey and from focus groups with stakeholders working in long-term care. Each barrier or facilitator was mapped to corresponding determinants of behavior change, as described by the theoretical domains framework (TDF). The Rx for Change database was used to identify strategies to address the key determinants of behavior change.
In total, 19 distinct barriers and facilitators were mapped to 8 domains from the TDF: knowledge, skills, environmental context and resources, professional role or identity, beliefs about consequences, social influences, emotions, and reinforcements. The assessment of barriers and facilitators informed the need for a multifaceted approach with the inclusion of strategies (1) to establish buy-in for the changes; (2) to align organizational policies and procedures; (3) to provide education and ongoing coaching support to staff; (4) to provide information and education to residents and families; (5) to establish process surveillance with feedback to staff; and (6) to deliver reminders.
The use of a stepped approach was valuable to ensure that locally relevant barriers and facilitators to practice change were addressed in the development of a regional program to help long-term care facilities minimize antibiotic prescribing for asymptomatic bacteriuria. This stepped approach provides considerable opportunity to advance the design and impact of antimicrobial stewardship programs.
Although emergency service personnel experience markedly elevated rates of post-traumatic stress disorder (PTSD), there are no rigorously conducted trials for PTSD in this population. This study assessed the efficacy of cognitive behaviour therapy (CBT) for PTSD in emergency service personnel, and examined whether brief exposure (CBT-B) to trauma memories is any less efficacious than prolonged exposure (CBT-L).
One hundred emergency service personnel with PTSD were randomised to either immediate CBT-L, CBT-B or wait-list (WL). Following post-treatment assessment, WL participants were randomised to an active treatment. Participants randomised to CBT-L or CBT-B were assessed at baseline, post-treatment and at 6-month follow-up. Both CBT conditions involved 12 weekly individual sessions comprising education, CBT skills building, imaginal exposure, in vivo exposure, cognitive restructuring and relapse prevention. Imaginal exposure occurred for 40 min per session in CBT-L and for 10 min in CBT-B.
At post-treatment, participants in WL had smaller reductions in PTSD severity (Clinician Administered PTSD Scale), depression, maladaptive appraisals about oneself and the world, and smaller improvements on psychological and social quality of life than CBT-L and CBT-B. There were no differences between CBT-L and CBT-B at follow-up on primary or secondary outcome measures but both CBT-L and CBT-B had large baseline to follow-up effect sizes for reduction of PTSD symptoms.
This study highlights that CBT, which can include either long or brief imaginal exposure, is efficacious in reducing PTSD in emergency service personnel.
This chapter examines Spinoza’s recommendation that all the patricians in an aristocracy “should be of the same Religion, a very simple and most Universal Religion, such as we described in that Treatise.” What, Garber asks, does Spinoza mean here by the “very simple and most Universal Religion”? Garber argues against the view that Spinoza intends the dogmas of the TTP outlining a religion of reason to replace traditional religions. Religion for Spinoza, Garber argues, is practice, not faith, and it involves imperatives to be followed, not dogmas or beliefs to be held. The “very simple and most Universal Religion,” he argues, consists only of the imperative to love one’s neighbor as oneself, and to love God above all. The dogmas of Universal Faith are needed only for those not capable of attaining religion through reason: for the rational agent, the imperatives are not laws given by a divine lawgiver, but eternal truths.
Clostridium difficile spores play an important role in transmission and can survive in the environment for several months. Optimal methods for measuring environmental C. difficile are unknown. We sought to determine whether increased sample surface area improved detection of C. difficile from environmental samples.
Samples were collected from 12 patient rooms in a tertiary-care hospital in Toronto, Canada.
Samples represented small surface-area and large surface-area floor and bedrail pairs from single-bed rooms of patients with low (without prior antibiotics), medium (with prior antibiotics), and high (C. difficile infected) shedding risk. Presence of C. difficile in samples was measured using quantitative polymerase chain reaction (qPCR) with targets on the 16S rRNA and toxin B genes and using enrichment culture.
Of the 48 samples, 64·6% were positive by 16S qPCR (geometric mean, 13·8 spores); 39·6% were positive by toxin B qPCR (geometric mean, 1·9 spores); and 43·8% were positive by enrichment culture. By 16S qPCR, each 10-fold increase in sample surface area yielded 6·6 times (95% CI, 3·2–13) more spores. Floor surfaces yielded 27 times (95% CI, 4·9–181) more spores than bedrails, and rooms of C. difficile–positive patients yielded 11 times (95% CI, 0·55–164) more spores than those of patients without prior antibiotics. Toxin B qPCR and enrichment culture returned analogous findings.
Clostridium difficile spores were identified in most floor and bedrail samples, and increased surface area improved detection. Future research aiming to understand the role of environmental C. difficile in transmission should prefer samples with large surface areas.
Antibiotic use varies widely between hospitals, but the influence of antimicrobial stewardship programs (ASPs) on this variability is not known. We aimed to determine the key structural and strategic aspects of ASPs associated with differences in risk-adjusted antibiotic utilization across facilities.
Observational study of acute-care hospitals in Ontario, Canada.
A survey was sent to hospitals asking about both structural (8 elements) and strategic (32 elements) components of their ASP. Antibiotic use from hospital purchasing data was acquired for January 1 to December 31, 2014. Crude and adjusted defined daily doses per 1,000 patient days, accounting for hospital and aggregate patient characteristics, were calculated across facilities. Rate ratios (RR) of defined daily doses per 1,000 patient days were compared for hospitals with and without each antimicrobial stewardship element of interest.
Of 127 eligible hospitals, 73 (57%) participated in the study. There was a 7-fold range in antibiotic use across these facilities (min, 253 defined daily doses per 1,000 patient days; max, 1,872 defined daily doses per 1,000 patient days). The presence of designated funding or resources for the ASP (RRadjusted, 0·87; 95% CI, 0·75–0·99), prospective audit and feedback (RRadjusted, 0·80; 95% CI, 0·67–0·96), and intravenous-to-oral conversion policies (RRadjusted, 0·79; 95% CI, 0·64–0·99) were associated with lower risk-adjusted antibiotic use.
Wide variability in antibiotic use across hospitals may be partially explained by both structural and strategic ASP elements. The presence of funding and resources, prospective audit and feedback, and intravenous-to-oral conversion should be considered priority elements of a robust ASP.
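The utilization metric underlying these comparisons — defined daily doses (DDD) per 1,000 patient days, compared between hospitals with and without a given stewardship element — can be illustrated with toy numbers. All figures below are hypothetical, and the crude ratio shown omits the risk adjustment for hospital and aggregate patient characteristics that the study performed.

```python
# Toy illustration of the antibiotic-use metric: defined daily doses (DDD)
# per 1,000 patient days, compared across hospitals by stewardship element.
# All hospital numbers are made up.

def ddd_per_1000_patient_days(total_ddd, patient_days):
    return 1000.0 * total_ddd / patient_days

# (total DDD dispensed, patient days, has prospective audit and feedback)
hospitals = [
    (52_000, 80_000, True),
    (95_000, 70_000, False),
    (31_000, 60_000, True),
    (88_000, 55_000, False),
]

rates = [(ddd_per_1000_patient_days(d, pd), audit) for d, pd, audit in hospitals]
mean = lambda xs: sum(xs) / len(xs)
with_element = mean([r for r, a in rates if a])
without_element = mean([r for r, a in rates if not a])
crude_rr = with_element / without_element          # crude rate ratio for the element
print(round(crude_rr, 2))
```

A ratio below 1 would indicate lower use among hospitals with the element, which is the direction the study reports for funding, audit-and-feedback, and IV-to-oral conversion.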
Electrocardiographic changes resulting from apical hypertrophic cardiomyopathy may mimic an acute coronary syndrome. A 67-year-old Sudanese male without cardiac risk factors presented to hospital with chest pain and electrocardiographic findings of septal ST-segment elevation, ST-segment depression in V4-V6, and diffuse T-wave inversion. He was treated as an acute ST-elevation myocardial infarction with thrombolytics. There was no cardiac biomarker rise and coronary angiography did not reveal evidence of significant coronary arterial disease. Ventriculography, transthoracic echocardiography, and cardiac magnetic resonance imaging were consistent with apical hypertrophic cardiomyopathy. The patient was discharged three days later with outpatient cardiology follow-up. We highlight the clinical and electrocardiographic findings of apical hypertrophic cardiomyopathy, with an emphasis on distinguishing this from acute myocardial infarction.
Prior data suggest that vancomycin-resistant Enterococcus (VRE) bacteremia is associated with worse outcomes than vancomycin-sensitive Enterococcus (VSE) bacteremia. However, many studies evaluating such outcomes were conducted prior to the availability of effective VRE therapies.
To systematically review VRE and VSE bacteremia outcomes among hospital patients in the era of effective VRE therapy.
Electronic databases and grey literature published between January 1997 and December 2014 were searched to identify all primary research studies comparing outcomes of VRE and VSE bacteremias among hospital patients, following the availability of effective VRE therapies. The primary outcome was all-cause, in-hospital mortality, while total hospital length of stay (LOS) was a secondary outcome. All meta-analyses were conducted in Review Manager 5.3 using random-effects, inverse variance modeling.
Among all the studies reviewed, 12 cohort studies and 1 case-control study met inclusion criteria. Similar study designs were combined in meta-analyses for mortality and LOS. VRE bacteremia was associated with increased mortality compared with VSE bacteremia among cohort studies (odds ratio [OR], 1.80; 95% confidence interval [CI], 1.38–2.35; I2=0%; n=11); the case-control study estimate was similar, but not significant (OR, 1.93; 95% CI, 0.97–3.82). LOS was greater for VRE bacteremia patients than for VSE bacteremia patients (mean difference, 5.01 days; 95% CI, 0.58–9.44; I2=0%; n=5).
Despite the availability of effective VRE therapy, VRE bacteremia remains associated with an increased risk of in-hospital mortality and LOS when compared to VSE bacteremia.
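Review Manager's random-effects, inverse-variance option corresponds to the DerSimonian-Laird estimator; the pooling step can be sketched as follows. The study values below are made up for illustration and are not the reviewed studies' data.

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects inverse-variance pooling (DerSimonian-Laird) of
    per-study effect estimates (here, log odds ratios)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, se, i2

# made-up study log-ORs and variances (not the reviewed studies' data)
log_ors = [math.log(1.6), math.log(2.1), math.log(1.7)]
variances = [0.05, 0.08, 0.06]
pooled, se, i2 = dersimonian_laird(log_ors, variances)
print(math.exp(pooled))                            # pooled OR
print(math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))  # 95% CI
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, the between-study variance estimate is zero and the result reduces to the fixed-effect pooled estimate, consistent with the I2=0% reported in the abstract.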
Distances to common production and marketing supply chain destinations may vary, and this has economic and animal health implications for small-scale food animal operations. Proximity to these destinations can affect the economic viability and marketing decisions of small-scale operations and may represent significant barriers to sustainability. Data were collected using a cross-sectional survey conducted by the US Department of Agriculture's (USDA) National Animal Health Monitoring System in 2011 using a stratified systematic sample of 16,000 small-scale (gross annual farm sales between US$10,000 and 499,999) operations from all 50 states. A total of 7925 food-animal operations were asked about the farthest one-way distance (in miles) to slaughter facilities, destinations where they sold animals or products, and feed sources. Across all small-scale operations, 95% of operations reported the farthest distance animals or products were transported for sale was 241 km (150 miles) or less. For distance to slaughter facilities, 95% of operations reported the farthest distance was 145 km (90 miles) or less. For feed shipped by a supplier, 95% of operations reported the farthest distance was 322 km (200 miles) or less. The 95th percentile for distance increased as farm sales increased, indicating larger operations were more likely to travel long distances. The results of this study are an important benchmark for understanding the economic and animal health implications of long transportation distances for operations that are small and/or focused on direct marketing.
The stress generation hypothesis was tested in two different longitudinal studies examining relations between weekly depression symptom ratings and stress levels in adolescents and emerging adults at varied risk for depression. The participants in Study 1 included 240 adolescents who differed with regard to their mothers' history of depressive disorders. Youth were assessed annually across 6 years (Grades 6–12). Consistent with the depression autonomy model, higher numbers of prior major depressive episodes (MDEs) were associated with weaker stress generation effects, such that higher levels of depressive symptoms predicted increases in levels of dependent stressors for adolescents with two or more prior MDEs, but depressive symptoms were not significantly related to dependent stress levels for youth with three or more prior MDEs. In Study 2, the participants were 32 remitted-depressed and 36 never-depressed young adults who completed a psychosocial stress task to determine cortisol reactivity and were reassessed for depression and stress approximately 8 months later. Stress generation effects were moderated by cortisol responses to a laboratory psychosocial stressor, such that individuals with higher cortisol responses exhibited a pattern consistent with the depression autonomy model, whereas individuals with lower cortisol responses showed a pattern more consistent with the depression sensitization model. Finally, comparing across the two samples, stress generation effects were weaker for older participants and for those with more prior MDEs. The complex, multifactorial relation between stress and depression is discussed.