Chronic suppurative otitis media is a major cause of disabling childhood hearing loss, especially in low-income countries. Estimates of its prevalence in sub-Saharan Africa range from the lowest to the highest in the world (less than one per cent to more than five per cent). However, the prevalence of chronic suppurative otitis media in Zimbabwe is largely unknown. This study aimed to determine the prevalence of paediatric chronic suppurative otitis media and other middle-ear pathology in rural Zimbabwe.
A cross-sectional study was performed in primary school children aged 4–13 years from the rural province of Mashonaland East. Participants underwent video otoscopy and tympanometry.
Out of 451 examined children, two (0.4 per cent) had chronic suppurative otitis media. Acute otitis media was present in one (0.2 per cent), otitis media with effusion was present in five (1.1 per cent) and scarring was present in 69 (15.3 per cent).
Chronic suppurative otitis media and otitis media sequelae were surprisingly uncommon in this sample of rural primary school children in Zimbabwe. More studies, preferably population-based, are needed to enable more precise estimates of chronic suppurative otitis media prevalence in Zimbabwe.
Many cognitive functions are under strong genetic control and twin studies have demonstrated genetic overlap between some aspects of cognition and schizophrenia. How the genetic relationship between specific cognitive functions and schizophrenia is influenced by IQ is currently unknown.
We applied selected tests from the Cambridge Neuropsychological Test Automated Battery (CANTAB) to examine the heritability of specific cognitive functions and their associations with schizophrenia liability. Verbal and performance IQ were estimated using the Wechsler Adult Intelligence Scale-III and the Danish Adult Reading Test. In total, 214 twins, including monozygotic (MZ = 32) and dizygotic (DZ = 22) pairs concordant or discordant for a schizophrenia spectrum disorder and healthy control pairs (MZ = 29, DZ = 20), were recruited through the Danish national registers. Additionally, eight twins from affected pairs participated without their sibling.
Significant heritability was observed for planning/spatial span (h2 = 25%), self-ordered spatial working memory (h2 = 64%), sustained attention (h2 = 56%), and movement time (h2 = 47%), whereas only unique environmental factors contributed to set-shifting, reflection impulsivity, and thinking time. Schizophrenia liability was associated with planning/spatial span (rph = −0.34), self-ordered spatial working memory (rph = −0.24), sustained attention (rph = −0.23), and set-shifting (rph = −0.21). The association with planning/spatial span was not driven by either performance or verbal IQ. The remaining associations were shared with performance, but not verbal IQ.
This study provides further evidence that some cognitive functions are heritable and associated with schizophrenia, suggesting a partially shared genetic etiology. These functions may constitute endophenotypes for the disorder and provide a basis to explore genes common to cognition and schizophrenia.
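As background on method: twin-based heritability estimates of the kind reported above are often first approximated with the classical Falconer formula, h2 ≈ 2(rMZ − rDZ). The sketch below is illustrative only; the study fitted full biometric twin models rather than this shortcut, and the example correlations are hypothetical, not taken from its data.

```python
# Classical Falconer approximation to heritability from twin correlations.
# Illustrative only: the study used full biometric model fitting, and the
# correlations below are hypothetical.
def falconer_h2(r_mz, r_dz):
    """Approximate heritability: twice the excess of the MZ over the DZ correlation."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical MZ/DZ intra-pair correlations yielding h2 = 0.56,
# in the range reported above for sustained attention.
h2 = falconer_h2(r_mz=0.58, r_dz=0.30)
```

The intuition: MZ twins share all segregating genes and DZ twins on average half, so the excess MZ resemblance, doubled, approximates the genetic share of trait variance.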
Gender equity in the workplace is not merely a moral imperative; it also affects the success of businesses and our ability to solve the world’s grand challenges. The gender gap in leadership is a global phenomenon rooted in cultural role expectations of men, women, and leaders. Although these expectations vary across cultures, women consistently face barriers from laws, socialization, formal organizational policies, and informal organizational practices that limit their opportunities to become leaders and inhibit their ability to be effective when they do obtain such positions. To address these barriers, we discuss how societies, organizations, and individual women and men around the world can facilitate women’s leadership through culturally contextual leadership development strategies. We frame our discussion around the intersection of culture, gender, and leadership to understand how the interaction of these variables informs local considerations as to what barriers, and therefore interventions, are most relevant in their respective contexts.
In this Research Reflection we describe a common standpoint on suitable methodology for controlled and observational studies in cow-calf contact systems in dairy production. Different methods to assess behaviour, health and production in cow-calf contact systems are outlined. Knowledge and experience from researchers working in this field supplement scientific literature whenever relevant. Specific methods including study design, early behaviour of cow and calf, social behaviour relevant to cow-calf contact systems, human-animal relationships and aspects related to management (milking, weaning and separation, health) are reviewed, and recommendations formed. We expect that this paper can contribute to a better understanding of the complexity of cow-calf contact systems and help to advance research in this area of dairy production.
Due to increasing public concern regarding separation of the dairy cow and calf within the first days after birth, alternative systems, where cows and calves stay in contact for an extended period, are receiving increasing interest from a broad array of researchers and other stakeholders. With more research in the area, there is a risk of inconsistencies emerging in the use of terminology. To create a better consensus in further discussions, the aim of this Research Reflection is to provide definitions and propose a common terminology for cow-calf contact in dairy production. We also suggest definitions for various systems allowing cow-calf contact and describe the distinct phases of cow-calf contact systems.
Leukoaraiosis, or white matter rarefaction, is a common imaging finding in aging and is presumed to reflect vascular disease. When the presentation is severe, potential congenital or acquired etiologies are investigated, often prompting referral for neuropsychological evaluation in addition to neuroimaging. T2-weighted imaging is the most common magnetic resonance imaging (MRI) approach to identifying white matter disease. However, more advanced diffusion MRI techniques may provide additional insight into the mechanisms that influence the abnormal T2 signal, especially when clinical presentations are discrepant with imaging findings.
We present a case of a 74-year-old woman with severe leukoaraiosis. She was examined by a neurologist, neuropsychologist, and rheumatologist, and completed conventional (T1, T2-FLAIR) MRI, diffusion tensor imaging (DTI), and advanced single-shell, high b-value diffusion MRI (i.e., fiber ball imaging [FBI]).
The patient was found to have few neurological signs, no significant cognitive impairment, a negative workup for leukoencephalopathy, and a positive antibody for Sjogren’s disease, for which her degree of leukoaraiosis would be highly atypical. Tractography results indicated intact axonal architecture that was better resolved with FBI than with DTI.
This case illustrates exceptional cognitive resilience in the face of severe leukoaraiosis and the potential for advanced diffusion MRI to identify brain reserve.
Introduction: The novel Paramedics Providing Palliative Care at Home program has been developed to address the mismatch between traditional paramedic practice and patients' goals of care. Case-finding is key to estimating potential impact for systems looking to establish such programs, to continuous quality improvement once operational, and to prospective identification of patients who might benefit from referral to palliative care. Typical paramedic charting templates do not provide direct identification of these cases. Our objective was to test the validity of a previously derived Palliative Support Composite Measure (PSCM) and two modifications. Methods: A priori Gold Standard criteria for determining whether a response was appropriate for a paramedic palliative care approach were identified by expert consensus. Excluding chief complaints and clinical conditions that were universally identified as not appropriate for paramedic palliative support, these criteria were applied by two trained chart abstractors to 500 consecutive charts to classify calls as appropriate for paramedic palliative support, or not. The PSCM and modifications (added criteria: call location type, registration in a palliative care program, and text-mining terms) were applied to the same cohort, and sensitivity, specificity, and positive and negative predictive values (PPV/NPV) were calculated. Results: Of the 500 cases, 21 (4.2%) were classified as appropriate for paramedic palliative support by the Gold Standard (kappa 0.734). Nine cases with initial disagreement were reviewed, with eight ultimately deemed to fit the palliative support criteria. The PSCM performed poorly (using the “potential palliative” cut point): sensitivity 71.4% (95% CI: 47.8-88.7), specificity 71.4% (95% CI: 67.1-75.4), PPV 9.9% (95% CI: 7.5-12.9) and NPV 98.3% (95% CI: 96.7-99).
The modified PSCM achieved: sensitivity 61.9% (95% CI: 38.4-81.9), specificity 99% (95% CI: 97.6-99.7), PPV 72.2% (95% CI: 50.5-86.9) and NPV 98.3% (95% CI: 97.2-99). The modified PSCM plus the pall* text term: sensitivity 100% (95% CI: 83.9-100), specificity 97.3% (95% CI: 95.4-98.5), PPV 61.8% (95% CI: 48.6-73.4) and NPV 100%. Conclusion: A modified PSCM provides moderate sensitivity, specificity and PPV, improved by the text term pall* where feasible. This query will be helpful to systems considering a paramedic palliative care program or where one is already operational.
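For readers less familiar with these screening metrics, the sketch below shows how sensitivity, specificity, PPV and NPV follow from a 2×2 confusion matrix. The counts are hypothetical back-calculations chosen to roughly reproduce the original PSCM row (21 Gold Standard positives, 479 negatives); they are not the study's data.

```python
# How sensitivity, specificity, PPV and NPV are derived from 2x2 counts.
# The example counts are hypothetical, not the study's actual data.
def screening_metrics(tp, fp, fn, tn):
    """Return (sensitivity, specificity, PPV, NPV) as fractions."""
    sensitivity = tp / (tp + fn)  # detected among all truly appropriate calls
    specificity = tn / (tn + fp)  # correctly excluded among all other calls
    ppv = tp / (tp + fp)          # flagged calls that were truly appropriate
    npv = tn / (tn + fn)          # unflagged calls that were truly not
    return sensitivity, specificity, ppv, npv

# Hypothetical counts: 21 Gold Standard positives, 479 negatives.
sens, spec, ppv, npv = screening_metrics(tp=15, fp=137, fn=6, tn=342)
# sens and spec ~0.714, ppv ~0.099, npv ~0.983
```

Note how the low PPV arises: with only 4.2% prevalence, even a moderately specific measure flags far more false than true positives.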
Introduction: The Prehospital Evidence-based Practice (PEP) program is an online, freely accessible, continuously updated repository of appraised EMS research evidence. This report is an analysis of published evidence for EMS interventions used to assess and treat patients suffering from hypoglycemia. Methods: PubMed was systematically searched in June 2019. One author screened titles, abstracts and full texts for relevance. Trained appraisers reviewed full-text articles, scored each on a three-point Level of Evidence (LOE) scale (based on study design and quality) and a three-point Direction of Evidence (DOE) scale (supportive, neutral, or opposing findings for each intervention's primary outcome), abstracted the primary outcome and setting, and assigned an outcome category (patient or process). Second-party appraisal was conducted for all included studies. The level and direction of each intervention was plotted in an evidence matrix, based on appraisals. Results: Twenty-nine studies were included and appraised for seven interventions: five drugs (Dextrose 50% (D50), Dextrose 10% (D10), glucagon, oral glucose and thiamine), one assessment tool (point-of-care (POC) glucose testing) and one call disposition (treat-and-release). The most frequently reported study primary outcomes were related to clinical improvement (n = 15, 51.7%), feasibility/safety (n = 8, 27.6%), and diagnostics (n = 6, 20.7%). The majority of outcomes were patient focused (n = 18, 62.0%). Conclusion: EMS interventions for treating hypoglycemia are informed by high-quality supportive evidence. Both D50 and D10 are supported by high-quality evidence, suggesting D10 may be an effective alternative to the standard D50. “Treat-and-release” practices for hypoglycemia are supported by moderate-quality evidence for the patient-related outcomes of relapse, patient preference and complications.
This body of evidence is high quality, patient focused and conducted in the prehospital setting, and thus generalizable to paramedic practice.
In the mink industry, feed costs are the largest variable expense and breeding for feed efficient animals is warranted. Implementation of selection for feed efficiency must consider the relationships between feed efficiency and the current selection traits BW and litter size. Often, feed intake (FI) is recorded on a cage with a male and a female and there is sexual dimorphism that needs to be accounted for. Study aims were to (1) model group recorded FI accounting for sexual dimorphism, (2) derive genetic residual feed intake (RFI) as a measure of feed efficiency, (3) examine the relationship between feed efficiency and BW in males (BWM) and females (BWF) and litter size at day 21 after whelping (LS21) in Danish brown mink and (4) investigate direct and correlated response to selection on each trait of interest. Feed intake records from 9574 cages, BW records on 16 782 males and 16 875 females and LS21 records on 6446 yearling females were used for analysis. Genetic parameters for FI, BWM, BWF and LS21 were obtained using a multivariate animal model, yielding sex-specific additive genetic variances for FI and BW to account for sexual dimorphism. The analysis was performed in a Bayesian setting using Gibbs sampling, and genetic RFI was obtained from the conditional distribution of FI given BW using genetic regression coefficients. Responses to single trait selection were defined as the posterior distribution of genetic superiority of the top 10% of animals after conditioning on the genetic trends. The heritabilities ranged from 0.13 for RFI in females and LS21 to 0.59 for BWF. Genetic correlations between BW in both sexes and LS21 and FI in both sexes were unfavorable, and single trait selection on BW in either sex showed increased FI in both sexes and reduced litter size. Due to the definition of RFI and high genetic correlation between BWM and BWF, selection on RFI did not significantly alter BW. In addition, selection on RFI in either sex did not affect LS21. 
Genetic correlation between sexes for FI and BW was high but significantly lower than unity. The high correlations across sex allowed for selection on standardized averages of animals’ breeding values (BVs) for RFI, FI and BW, which yielded selection responses approximately equal to the responses obtained using the sex-specific BVs. The results illustrate the possibility of selecting against RFI in mink with no negative effects on BW and litter size.
The Danish Longitudinal Study on Alcoholism was designed to identify predictors of adult male alcoholism. The present study examines the predictive value of premorbid personality disorders.
Subjects were selected from a Danish birth cohort (n = 9125, born 1959–61) that included 223 sons of alcoholic fathers (high risk = HR) and 106 matched sons of non-alcoholics (low risk = LR). These subjects have been studied systematically over the past 40 years. Most recently, they were evaluated at age 40 (n = 202) by a psychiatrist using structured interviews and DSM-III-R criteria to diagnose an Alcohol Use Disorder.
HR subjects were more likely than LR subjects to develop alcohol dependence over the past 40 years (31% vs. 16%, p < .03). However, HR subjects were not more likely to develop alcohol abuse (17% vs. 15%). Both ADHD (as measured by school teachers) and ASPD (onset before age 15) predicted alcoholism independently at age 40. ADHD and ASPD were much stronger independent predictors of adult alcoholism than parental risk status. Other personality and anxiety disorders did not predict an alcoholic outcome.
Paternal alcoholism predicted alcohol dependence in sons at age 40. But the most predictive premorbid variables were ASPD and ADHD, both with onset in childhood and adolescence.
In London, Ontario, discharges from psychiatric wards to shelters or no fixed address (NFA) occurred 194 times per year. This discovery led to the creation of a pilot project that provided immediate access to a housing advocate and changed normal policies related to housing and start-up fees for a select group of income support recipients. The intervention was successful; seven participants who received this additional assistance were still housed six months later, whereas six of seven who received usual care were still homeless. The goal of the current study was to determine the strengths and areas for improvement of a method to prevent discharge from hospital to NFA and to suggest improvements in preparation for wider implementation.
Phase 2: the intervention was provided to all acute psychiatric patients within a general hospital.
Phase 3: the intervention was provided to all patients within a specialized tertiary care psychiatric hospital. The intervention included on-ward access to a housing advocate and income support staff, facilitated through computer linkages to housing and income databases.
Findings revealed the success of the intervention across both acute and tertiary sites. All hypotheses were supported: the rate of discharge to homelessness decreased; those accessing the service were poor; and the cost savings from the program exceeded the cost of implementation. Advantages of the approach included: accessibility and convenience of services on site, positive influence on overall treatment plan and feelings of independence and support. Results reveal the positive influence a cross-sectoral approach has on preventing discharge from psychiatric wards to the streets and shelters.
Risperidone long acting injection (RLAI) has in recent years gained widespread use in the treatment of schizophrenia and schizoaffective disorder. It is considered safe, tolerable, and the efficacy has been established in several studies. However, all these studies were funded by the manufacturer of RLAI.
To compare the effectiveness of RLAI vs. conventional depot antipsychotics (CAD).
This study includes 9197 patients with schizophrenia being treated with depot injections in Denmark from January 1996 until December 2007. Prescription data were obtained from the national prescription database, which contains information on sold defined daily doses (DDD).
Of the 9197 patients, 1056 had originally received a CAD treatment (zuclopenthixol, haloperidol, perphenazine, fluphenazine or flupenthixol) but were switched to RLAI. The two periods were compared with regard to number of admissions and time spent in hospital. A Cox regression analysis of any-cause discontinuation was performed on 9105 incident depot antipsychotic periods.
When receiving CAD treatment, patients had 0.08 (95% CI [0.07; 0.08]) admissions to hospital per month vs. 0.50 (95% CI [0.47; 0.54]) when treated with RLAI. On average, the patients spent 33 percent (95% CI [32; 35]) of the CAD period admitted to hospital vs. 48 percent (95% CI [46; 50]) of the RLAI period. Of the 1056 patients, 83% were eventually switched back to a CAD.
Furthermore, the RLAI patients had a higher hazard of discontinuing their treatment than the CAD patients (HR = 2.26, 95% CI [2.05; 2.48]).
RLAI was inferior to conventional depot antipsychotics.
Psychotic depression (PD) is classified as a subtype of severe depression in the current diagnostic manuals. Accordingly, it is a common conception that psychotic features in depression arise as a consequence of depressive severity.
To determine whether the severity of depression and psychosis correlate in accordance with the “severity-psychosis” hypothesis and to detect potential differences in clinical features of psychotic and non-psychotic depression (non-PD).
We aimed to answer the following questions:
(1) Does the clinical profile differ between patients with PD and non-PD?
(2) Is the severity of depression correlated with the severity of psychosis in patients with depression?
Quantitative analysis of Health of the Nation Outcome Scales (HoNOS) scores from all patients admitted to a Danish general psychiatric hospital between 2000 and 2010 due to a severe depressive episode.
A total of 357 patients with severe depression, of whom 125 (35%) were of the psychotic subtype, formed the study sample. Mean HoNOS scores at admission differed significantly between patients with non-PD and PD on the items hallucinations and delusions (non-PD = 0.33 vs. PD = 1.37, p < 0.001), aggression (non-PD = 0.20 vs. PD = 0.36, p = 0.044) and on the total score (non-PD = 10.55 vs. PD = 11.87, p = 0.024). The HoNOS scores on the two items “depression” and “hallucinations and delusions” were very weakly correlated (Spearman coefficient = 0.12).
The results suggest that the severity of depression is unlikely to be the key determinant for the development of psychosis and support the hypothesis that the psychotic and non-psychotic subtypes of depression are in fact distinct clinical syndromes.
Electric indoor lighting can disturb sleep and increase depressive symptoms, both of which are common complaints in psychiatric inpatients.
To improve quality of sleep in patients using an indoor hospital lighting environment simulating nature in intensity, color, and circadian timing.
Investigator-blinded parallel-group randomized controlled effectiveness trial, supplemented with qualitative interviews, in an inpatient psychiatric ward with fully automatic and adjustable lighting. Admitted patients received a room with a naturalistic lighting environment (intervention group) or lighting as usual (control group). The primary outcome was the Pittsburgh Sleep Quality Index; secondary outcomes included the Major Depression Inventory and the WHO-Five Well-Being Index.
In this ongoing trial, we included 28 patients (16 treated and 12 controls). Patients in the intervention group reported higher subjective sleep quality and sleep efficiency, lower use of sleep medication (mean difference, 4.68 mg; 95% CI, 0.54 to 53.5), fewer depressive symptoms (mean difference, 5; 95% CI, −2 to 13), but lower well-being (difference, −4 percentage points; 95% CI, −20 to 16), compared with the control group. At discharge, fewer patients in the intervention group had experienced use of involuntary treatment. Qualitative data indicated no side effects apart from issues in performing indoor leisure activities in dim light.
A naturalistic lighting environment was safe and improved sleep and mood in our small patient sample. The trial integrated well with routine clinical care, and our sample reflected the heterogeneity of the target population. (Funded by Region Midtjylland and others; ClinicalTrials.gov number, NCT02653040.)
Early changes in biomarker levels probably occur before bloodstream infection (BSI) is diagnosed. However, this issue has not been fully addressed. We aimed to evaluate the kinetics of C-reactive protein (CRP) and plasma albumin (PA) in the 30 days before community-acquired (CA) BSI diagnosis. From a population-based BSI database we identified 658 patients with at least one measurement of CRP or PA from day −30 (D–30) through day −1 (D–1) before the day of CA-BSI (D0) and a measurement of the same biomarker at D0 or D1. Among these, 502 had both CRP and PA measurements which fitted these criteria. CRP and PA concentrations began to change inversely some days before CA-BSI diagnosis, CRP increasing by day −3.1 and PA decreasing by day −1.3. From D–30 to D–4, the CRP kinetics (expressed as slopes, i.e. the rate of concentration change per day) was −1.5 mg/l/day. From D–3 to D1, the CRP slope increased to 36.3 mg/l/day. For albumin, the slope from D–30 to D–2 was 0.1 g/l/day and changed to −1.8 g/l/day between D–1 and D1. We showed that biomarker levels begin to change some days before CA-BSI diagnosis: CRP 3.1 days and PA 1.3 days before.
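The slope quantities in this abstract are rates of concentration change per day. A minimal sketch of how such a slope can be computed (assuming an ordinary least-squares fit of concentration against day, which the abstract does not specify), with hypothetical CRP values:

```python
# Minimal sketch: per-day biomarker slope as the least-squares fit of
# concentration vs. day. The fitting method and the CRP values below are
# assumptions for illustration, not taken from the study.
def slope_per_day(points):
    """Least-squares slope (concentration change per day) from (day, value) pairs."""
    n = len(points)
    mean_d = sum(d for d, _ in points) / n
    mean_v = sum(v for _, v in points) / n
    num = sum((d - mean_d) * (v - mean_v) for d, v in points)
    den = sum((d - mean_d) ** 2 for d, _ in points)
    return num / den

# Hypothetical CRP course (mg/l) rising steeply near diagnosis at day 0:
crp = [(-3, 10.0), (-2, 45.0), (-1, 80.0), (0, 120.0), (1, 155.0)]
slope = slope_per_day(crp)  # a steep late-phase rise, in mg/l/day
```

Fitting one slope to the early window and another to the late window, as the study reports, captures the inflection in biomarker kinetics shortly before diagnosis.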
A priority focus of palliative and supportive care is helping the 43.5 million caregivers who care for individuals with serious illness. A lack of support may lead to caregiver distress and poorer care delivery to patients with serious illness. We examined the potential of instrumental support (assistance with material and task performance) to mitigate distress among caregivers.
We analyzed data from the nationally representative Health Information National Trends Survey (HINTS V2, 2018). Informal/family caregivers were identified in HINTS V2 if they indicated they were caring for or making healthcare decisions for another adult with a health problem. We used the PROMIS® instrumental support four-item short-form T-scores and the Patient Health Questionnaire (PHQ-4) for distress. We examined multivariable linear regression models for associations between distress and instrumental support, adjusted for sampling weights, socio-demographics, and caregiving variables (care recipient health condition(s), years caregiving (≥2), relationship to care recipient, and caregiver burden). We examined interactions between burden and instrumental support on caregiver distress level.
Our analyses included 311 caregivers (64.8% female, 64.9% non-Hispanic White). The unweighted mean instrumental support T-score was 50.4 (SD = 10.6, range = 29.3–63.3); weighted mean was 51.2 (SE = 1.00). Lower instrumental support (p < 0.01), younger caregiver age (p < 0.04), higher caregiving duration (p = 0.008), and caregiver unemployment (p = 0.006) were significantly associated with higher caregiver distress. Mean instrumental support scores by distress levels were 52.3 (within normal limits), 49.4 (mild), 48.9 (moderate), and 39.7 (severe). The association between instrumental support and distress did not differ by caregiver burden level.
Poor instrumental support is associated with high distress among caregivers, suggesting the need for palliative and supportive care interventions to help caregivers leverage instrumental support.
Weight gain among psychiatric inpatients is a widespread phenomenon. This change in body mass index (BMI) can be caused by several factors. Based on recent research, we hypothesized that the following factors are related to weight gain during psychiatric inpatient treatment: psychiatric medication, psychiatric diagnosis, sex, age, weight on admission and geographic region of treatment.
Of the 2328 patients originally recruited, 876 met the criteria for our analysis. Patients were recruited and examined in mental health care centres in Nigeria (N=265), Japan (N=145) and Western Europe (Denmark, Germany and Switzerland; N=466).
Psychiatric medication, psychiatric diagnosis and geographic region, but not age or sex, had significant effects on BMI change: Nigerian patients gained significantly more weight than Japanese and Western European patients. Moreover, geographic region influenced both the type of psychiatric medication prescribed and the distribution of psychiatric diagnoses, and medication and diagnosis in turn each had a significant effect on BMI change.
In conclusion, we consider weight gain as a multifactorial phenomenon that is influenced by several factors. One can discuss a number of explanations for our findings, such as different clinical practices in the geographical regions (prescribing or admission strategies and access-to-care aspects), as well as socio-economic and cultural differences.