The main objective was to examine agreement between diagnoses generated by the internet-based Development and Well-Being Assessment (DAWBA) and clinical diagnoses. Second, we aimed to explore how disclosure of the DAWBA diagnosis before clinical decision making influenced the clinician's diagnosis. Third, we asked whether this influence differed across categories of disorders. Last, we examined how the use of DAWBA information affected the identification of comorbidities.
A total of 315 patients from outpatient clinics were randomised into two groups. In 177 cases the clinician was informed about the DAWBA diagnosis; in 155 cases the clinician was blind to DAWBA information. The DAWBA is an internet-based package of questionnaires and rating techniques designed to generate psychiatric ICD-10 or DSM-IV diagnoses for children and adolescents aged 5–17 years. Information from parents, teachers and self-reports is brought together by a computer programme that predicts likely diagnoses. An expert rater decides on the diagnosis by synopsis of these different inputs.
DAWBA diagnoses and clinical diagnoses made without DAWBA information showed acceptable agreement, with a Cohen's kappa of 0.26 for emotional disorders, 0.29 for hyperactivity disorders and 0.31 for disruptive disorders.
Disclosure of the DAWBA diagnosis had a significant effect on clinical diagnoses of emotional disorders (kappa of 0.26 without DAWBA information versus 0.52 with it; Fisher's z, p < 0.05).
Information about the DAWBA diagnosis had no significant effect on the identification of comorbidities.
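Cohen's kappa, the agreement statistic used here, corrects raw agreement for the agreement expected by chance. A minimal pure-Python sketch of the computation — the rater labels below are invented for illustration, not the study data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters: (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    # Observed agreement: fraction of cases where both raters give the same label.
    p_o = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[k] * freq_b[k] for k in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical labels (1 = disorder present, 0 = absent):
dawba     = [1, 1, 0, 0, 1, 0, 1, 1]
clinician = [1, 0, 0, 1, 1, 0, 1, 1]
print(round(cohens_kappa(dawba, clinician), 3))  # → 0.467
```

Values around 0.2–0.4, as in this study, are conventionally read as fair agreement; 1.0 is perfect agreement and 0 is chance-level.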
DAWBA showed the most pronounced effect on clinical diagnoses for emotional disorders in children and adolescents.
To explore tolerability, safety and efficacy of flexible doses of oral paliperidone ER in adult non-acute patients with schizophrenia requiring a change in their medication due to lack of efficacy with their previous oral antipsychotic.
Interim analysis of a prospective 6-month, open-label, international study. Patients completing the first 3 months of this study were analyzed. Endpoints were the change in the Positive and Negative Syndrome Scale (PANSS) from baseline to endpoint, Clinical Global Impression-Severity Scale (CGI-S), weight change and adverse events (AEs).
81 patients were included (57% male, mean age 41.3±13.6 years, 85% paranoid schizophrenia); 89% of the 81 patients completed the first 3 months of the study. Reasons for early discontinuation were lack of efficacy (3.7%), subject choice (2.5%), loss to follow-up (2.5%) and AEs (1.2%). The mean mode dose of paliperidone ER was 6 mg/day. Mean total PANSS decreased from 82.8±16.0 at baseline to 69.2±19.1 at endpoint (mean change -13.6±15.6; 95% confidence interval [CI] -17.0;-10.1, p<0.0001). The percentage of patients rated mildly ill or less on the CGI-S increased from 19.8% to 49.4%. AEs reported in ≥3% of patients were insomnia (4.9%), somnolence (4.9%), extrapyramidal disorder (3.7%), restlessness (3.7%) and psychotic disorder (3.7%). Mean weight change from baseline to endpoint was 0.34 kg (95% CI -0.35;1.03, p=0.71).
These interim open-label data support results from recent randomized controlled studies that flexibly dosed paliperidone ER is safe, well tolerated and effective in patients with schizophrenia requiring a change in medication due to lack of efficacy with their previous oral antipsychotic treatment.
Schizophrenia patients exhibit impairments in prepulse inhibition (PPI) of the acoustic startle response (ASR). PPI is commonly used as an index of sensorimotor gating. Results of animal studies and some human data suggest that PPI deficits are in part genetically determined, such that PPI could be an endophenotypic indicator of risk for schizophrenia. Thus, PPI deficits should already be present prior to the onset of psychosis. To test this assumption, we investigated PPI in individuals with prodromal symptoms of schizophrenia and in patients with first-episode schizophrenia.
Startle reactivity, habituation and PPI of the ASR were assessed in 54 subjects with prodromal symptoms of schizophrenia (35 at an early prodromal stage, 19 at a late prodromal stage), 31 patients with first-episode schizophrenia (14 unmedicated, 17 medicated) and 28 healthy controls. Patients were also examined with the Positive and Negative Syndrome Scale and the Global Assessment of Functioning Scale.
Prodromal subjects and unmedicated patients with first-episode schizophrenia showed significant PPI deficits, whereas patients treated with risperidone had almost normal PPI. In contrast, startle reactivity decreased with the severity of symptoms but was relatively unimpaired in the medicated patients. With respect to habituation, prodromal subjects and schizophrenia patients did not differ from healthy controls.
PPI disruption is present in subjects in a prodromal state likely to proceed to schizophrenia, supporting the hypothesis that PPI disruption is an endophenotype of schizophrenia. In contrast, startle reactivity and habituation deficits were not evident in the prodromal subjects, but only in unmedicated patients with a diagnosis of schizophrenia.
Deficits in behavioral inhibition leading to impulsivity occur frequently across otherwise different psychiatric disorders, chiefly ADHD and borderline personality disorder (BPD). However, research is complicated by the use of different tests and task parameters. Furthermore, the role of the frontoparietal network in behavioral inhibition has recently been questioned.
The aims of our studies were:
– to assess the influence of differences in inhibition task parameters;
– to describe neural correlates of behavioral inhibition in healthy people;
– to compare them with BPD and ADHD patients.
We implemented two variants of the Go/NoGo task, one designed for behavioral research and the other for neuroimaging. Thirty healthy participants (37% women, age range 15 to 33 years) underwent behavioral and fMRI measurement. Furthermore, groups of patients with BPD, ADHD and their healthy controls performed the Go/NoGo task during both fMRI and EEG recording.
The results show differences in behavioral performance based on different task parameters. The fMRI results in healthy people show specific activation patterns within the frontoparietal network associated with inhibition trials (mainly the inferior frontal gyrus, insula, cingulate gyrus, SMA and inferior parietal lobule). Furthermore, we present differences between patients with BPD, ADHD and controls in BOLD signal and ERPs.
Go/NoGo task design substantially influences subjects' behavioral performance. Our results with a methodologically upgraded Go/NoGo task design support the existence of a frontoparietal inhibition network and its differential activation in BPD and ADHD patients. The research was supported by the Ministry of Health of the Czech Republic, grant no. 15-30062A.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Postoperative cognitive impairment is among the most common medical complications associated with surgical interventions – particularly in elderly patients. In our aging society, it is an urgent medical need to determine preoperative individual risk prediction to allow more accurate cost–benefit decisions prior to elective surgeries. So far, risk prediction is mainly based on clinical parameters. However, these parameters only give a rough estimate of the individual risk. At present, there are no molecular or neuroimaging biomarkers available to improve risk prediction and little is known about the etiology and pathophysiology of this clinical condition. In this short review, we summarize the current state of knowledge and briefly present the recently started BioCog project (Biomarker Development for Postoperative Cognitive Impairment in the Elderly), which is funded by the European Union. It is the goal of this research and development (R&D) project, which involves academic and industry partners throughout Europe, to deliver a multivariate algorithm based on clinical assessments as well as molecular and neuroimaging biomarkers to overcome the currently unsatisfying situation.
How do international observers decide whether to criticize or condone electoral fraud in a country? We argue that this decision depends on the identity of the victims of electoral fraud. A monitoring organization is more likely to overlook fraud committed against groups that are deemed dangerous by its sponsor. Based on this insight, we hypothesize that in the post-Cold War era election monitors are more tolerant of fraud against Islamic challengers, especially when Islamic movements are perceived as a threat to political stability. In support of our hypothesis, we find that outside monitors are more likely to endorse an election in countries with an Islamic opposition party and an ongoing Islamist terrorist campaign. Furthermore, we find that the effect is driven by Western monitoring organizations and becomes stronger after the September 11 attacks. Our findings provide a simple yet powerful insight: the calculus of outside observers depends not only on who they wish to see in power, but also on whom they want to keep from power.
Experimental studies have reported on the anti-inflammatory properties of polyphenols. However, results from epidemiological investigations have been inconsistent, and studies using biomarkers for the assessment of polyphenol intake have been especially scant. We aimed to characterise the association between plasma concentrations of thirty-five polyphenol compounds and low-grade systemic inflammation state as measured by high-sensitivity C-reactive protein (hsCRP). A cross-sectional data analysis was performed based on 315 participants in the European Prospective Investigation into Cancer and Nutrition cohort with available measurements of plasma polyphenols and hsCRP. In logistic regression analysis, the OR and 95 % CI of elevated serum hsCRP (>3 mg/l) were calculated within quartiles and per standard deviation higher level of plasma polyphenol concentrations. In a multivariable-adjusted model, the sum of plasma concentrations of all polyphenols measured (per standard deviation) was associated with 29 (95 % CI 50, 1) % lower odds of elevated hsCRP. In the class of flavonoids, daidzein was inversely associated with elevated hsCRP (OR 0·66, 95 % CI 0·46, 0·96). Among phenolic acids, statistically significant associations were observed for 3,5-dihydroxyphenylpropionic acid (OR 0·58, 95 % CI 0·39, 0·86), 3,4-dihydroxyphenylpropionic acid (OR 0·63, 95 % CI 0·46, 0·87), ferulic acid (OR 0·65, 95 % CI 0·44, 0·96) and caffeic acid (OR 0·69, 95 % CI 0·51, 0·93). The odds of elevated hsCRP were significantly reduced for hydroxytyrosol (OR 0·67, 95 % CI 0·48, 0·93). The present study showed that polyphenol biomarkers are associated with lower odds of elevated hsCRP. Whether a diet rich in bioactive polyphenol compounds could be an effective strategy to prevent or modulate deleterious health effects of inflammation should be addressed by further well-powered longitudinal studies.
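The "per standard deviation" odds ratios reported in analyses of this kind come from scaling a logistic-regression coefficient (estimated on the exposure's original units) by the exposure's standard deviation before exponentiating. A sketch with entirely hypothetical numbers, not the study's estimates:

```python
import math

def odds_ratio_per_sd(beta_per_unit, sd):
    """OR for a one-SD increase in the exposure: exp(beta * SD)."""
    return math.exp(beta_per_unit * sd)

# Hypothetical: the log-odds of elevated hsCRP fall by 0.002 per nmol/l of
# total plasma polyphenols, and the cohort SD is 170 nmol/l.
or_per_sd = odds_ratio_per_sd(-0.002, 170)
print(round(or_per_sd, 2))  # → 0.71, i.e. ~29% lower odds per SD
```

The same exponentiation applies to the confidence limits, which is why the interval around an OR is asymmetric on the natural scale.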
Intra-uterine growth restriction (IUGR) is associated with adverse metabolic outcome later in life. Healthy mice challenged with a Western-style diet (WSD) accumulated less body fat when previously fed a diet containing large lipid globules (complex lipid matrix (CLM)). This study was designed to clarify whether an early-life CLM diet mitigates ‘programmed’ visceral adiposity and associated metabolic sequelae after IUGR. In rats, IUGR was induced either by bilateral uterine vessel ligation (LIG) or sham operation (i.e. intra-uterine stress) of the dam on gestational day 19. Offspring from non-operated (NOP) dams served as controls. Male offspring of all groups were either fed CLM or ‘normal matrix’ control diet (CTRL) from postnatal days (PND) 15 to 42. Thereafter, animals were challenged with a mild WSD until dissection (PND 98). Fat mass (micro-computed tomography scan; weight of fat compartments), circulating metabolic markers and expression of ‘metabolic’ genes (quantitative real-time PCR) were assessed. CLM diet significantly reduced visceral fat mass in LIG at PND 40. At dissection, visceral fat mass, fasted blood glucose, TAG and leptin concentrations were significantly increased in LIG-CTRL v. NOP-CTRL, and significantly decreased in LIG-CLM v. LIG-CTRL. Gene expression levels of leptin (mesenteric fat) and insulin-like growth factor 1 (liver) were significantly reduced in LIG-CLM v. LIG-CTRL. In conclusion, early-life CLM diet mitigated the adverse metabolic phenotype after utero-placental insufficiency. The supramolecular structure of dietary lipids may be a novel aspect of nutrient quality that has to be considered in the context of primary prevention of obesity and metabolic disease in at-risk populations.
Objectives: This study investigated the relationship between on-field, objective signs immediately following sport-related concussion and self-reported symptom endorsement within 1 day post injury. Methods: A retrospective case series of 237 concussed high school athletes was performed. On-field signs were evaluated immediately post injury. Self-reported symptoms (2 clusters) were collected within 1 day post injury. A two-step structural equation model and follow-up bivariate regression analyses of significant on-field signs and symptom clusters were performed. Results: Signs of immediate memory, β=0.20, p=.04, and postural instability, β=0.19, p < .01, significantly predicted a greater likelihood of endorsing the cognitive-migraine-fatigue symptom cluster within 1 day post injury. Regarding signs correlated with specific symptoms, immediate memory was associated with symptoms of trouble remembering, χ2=37.92, p < .001, odds ratio (OR)=3.89 (95% confidence interval (CI) [2.47, 6.13]), and concentration difficulties, χ2=10.84, p=.001, OR=2.13 (95% CI [1.37, 3.30]). Postural instability was associated with symptom endorsement of trouble remembering, χ2=12.08, p < .001, OR=1.76 (95% CI [1.29, 2.40]). Conclusions: Certain post-concussion on-field signs exhibited after injury were associated with specific symptom endorsement within 1 day post injury. Based on these associations, individualized education-based interventions and academic accommodations may help reduce unanticipated worry from parents, students, and teachers following a student-athlete’s sport-related concussion, especially in cases of delayed onset symptoms. (JINS, 2018, 24, 476–485)
The mainstay of management of epistaxis refractory to first aid and cautery is intranasal packing. This review aimed to identify evidence surrounding nasal pack use.
A systematic review of the literature was performed using standardised methodology.
Twenty-seven eligible articles were identified relating to non-dissolvable packs and nine to dissolvable packs. Nasal packing appears to be more effective when applied by trained professionals. For non-dissolvable packs, the re-bleed rates for Rapid Rhino and Merocel were similar, but were higher with bismuth iodoform paraffin paste packing. Rapid Rhino packs were the most tolerated non-dissolvable packs. Evidence indicates that 96 per cent of re-bleeding occurs within the first 4 hours after nasal pack removal. Limited evidence suggests that dissolvable packs are effective and well tolerated by patients. There was a lack of evidence relating to: the duration of pack use, the economic effects of pack choice and the appropriate care setting for non-dissolvable packs.
Rapid Rhino packs are the best tolerated, with efficacy equivalent to nasal tampons. FloSeal is easy to use, causes less discomfort and may be superior to Merocel in anterior epistaxis cases. There is no strong evidence to support prophylactic antibiotic use.
The initial assessment of epistaxis patients commonly includes: first aid measures, observations, focused history taking, and clinical examinations and investigations. This systematic review aimed to identify evidence that informs how the initial assessment of these patients should be conducted.
A systematic review of the literature was performed using a standardised methodology and search strategy.
Seventeen articles were included. Factors identified were: co-morbidity, intrinsic patient factors, coagulation screening and ice pack use. Hypertension and anticoagulant use were demonstrated to adversely affect outcomes. Coagulation screening is useful in patients on anticoagulant medication. Four studies could not be accessed. Retrospective methodology and insufficient statistical analysis limit several studies.
Sustained ambulatory hypertension, anticoagulant therapy and posterior bleeding may be associated with recurrent epistaxis, and should be recorded. Oral ice pack use may decrease severity and can be considered as first aid. Coagulation studies are appropriate for patients with a history of anticoagulant use or bleeding diatheses.
There is variation regarding the use of surgery and interventional radiological techniques in the management of epistaxis. This review evaluates the effectiveness of surgical artery ligation compared to direct treatments (nasal packing, cautery), and that of embolisation compared to direct treatments and surgery.
A systematic review of the literature was performed using a standardised published methodology and custom database search strategy.
Thirty-seven studies were identified relating to surgery, and 34 articles relating to interventional radiology. For patients with refractory epistaxis, endoscopic sphenopalatine artery ligation had the most favourable adverse effect profile and success rate compared to other forms of surgical artery ligation. Endoscopic sphenopalatine artery ligation and embolisation had similar success rates (73–100 per cent and 75–92 per cent, respectively), although embolisation was associated with more serious adverse effects (risk of stroke, 1.1–1.5 per cent). No articles directly compared the two techniques.
Trials comparing endoscopic sphenopalatine artery ligation to embolisation are required to better evaluate the clinical and economic effects of intervention in epistaxis.
Cardiovascular fitness in late adolescence is associated with future risk of depression. Relationships with other mental disorders need elucidation. This study investigated whether fitness in late adolescence is associated with future risk of serious non-affective mental disorders. Further, we examined how having an affected brother might impact the relationship.
Prospective, population-based cohort study of 1 109 786 Swedish male conscripts with no history of mental illness, who underwent conscription examinations at age 18 between 1968 and 2005. Cardiovascular fitness was objectively measured at conscription using a bicycle ergometer test. During the follow-up (3–42 years), incident cases of serious non-affective mental disorders (schizophrenia and schizophrenia-like disorders, other psychotic disorders, and neurotic, stress-related and somatoform disorders) were identified through the Swedish National Hospital Discharge Register. Cox proportional hazards models were used to assess the association between cardiovascular fitness at conscription and the risk of serious non-affective mental disorders later in life.
Low fitness was associated with increased risk for schizophrenia and schizophrenia-like disorders [hazard ratio (HR) 1.44, 95% confidence interval (CI) 1.29–1.61], other psychotic disorders (HR 1.41, 95% CI 1.27–1.56), and neurotic or stress-related and somatoform disorders (HR 1.45, 95% CI 1.37–1.54). Relationships persisted in models that included illness in brothers.
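Hazard ratios of the kind reported above are exponentiated Cox regression coefficients, and their 95% confidence limits are symmetric on the log scale. As a consistency check, the standard error implied by a reported interval can be recovered and the interval rebuilt — a sketch using the schizophrenia estimate (HR 1.44, 95% CI 1.29–1.61):

```python
import math

def rebuild_ci(hr, ci_low, ci_high, z=1.96):
    """Recover the SE of log(HR) implied by a reported 95% CI, then rebuild the CI."""
    b = math.log(hr)
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * z)  # CI width on log scale
    return math.exp(b - z * se), math.exp(b + z * se)

low, high = rebuild_ci(1.44, 1.29, 1.61)
print(round(low, 2), round(high, 2))  # → 1.29 1.61
```

If the rebuilt interval did not match the reported one, it would flag a transcription error or a non-Wald interval.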
Lower fitness in late adolescent males is associated with increased risk of serious non-affective mental disorders in adulthood.
An outbreak of respiratory diphtheria occurred in two health districts in the province of KwaZulu-Natal in South Africa in 2015. A multidisciplinary outbreak response team was involved in the investigation and management of the outbreak. Fifteen cases of diphtheria were identified, with ages ranging from 4 to 41 years. Of the 12 cases that were under the age of 18 years, 9 (75%) were not fully immunized for diphtheria. The case fatality was 27%. Ninety-three household contacts, 981 school or work contacts and 595 healthcare worker contacts were identified and given prophylaxis against Corynebacterium diphtheriae infection. A targeted vaccination campaign for children aged 6–15 years was carried out at schools in the two districts. The outbreak highlighted the need to improve diphtheria vaccination coverage in the province and to investigate the feasibility of offering diphtheria vaccines to healthcare workers.
To validate the ovine model of profound oropharyngeal dysphagia and compare swallowing outcomes of laryngotracheal separation with those of total laryngectomy.
Under real-time fluoroscopy, swallowing trials were conducted using the head and neck of two Dorper cross ewes and one human cadaver, secured in lateral fluoroscopic orientation. Barium trials were administered at baseline, pre- and post-laryngohyoid suspension, following laryngotracheal separation, and following laryngectomy in the ovine model.
Mean pre-intervention Penetration Aspiration Scale and National Institutes of Health Swallow Safety Scale scores were 8 ± 0 and 6 ± 0 respectively in sheep and human cadavers, with 100 per cent intra- and inter-species reproducibility. These scores improved to 1 ± 0 and 2 ± 0 post-laryngohyoid suspension (p < 0.01). Aerodigestive tract residue was 18.6 ± 2.4 ml at baseline, 15.4 ± 3.8 ml after laryngotracheal separation and 3.0 ± 0.7 ml after total laryngectomy (p < 0.001).
The ovine model displayed perfect intra- and inter-species reliability for the Penetration Aspiration Scale and Swallow Safety Scale. Less aerodigestive tract residue after narrow-field laryngectomy suggests that swallowing outcomes after total laryngectomy are superior to those after laryngotracheal separation.
Following an unusually heavy rainfall in June 2009, a community-wide outbreak of Campylobacter gastroenteritis occurred in a small Danish town. The outbreak investigation consisted of (1) a cohort study using an e-questionnaire of disease determinants, (2) microbiological study of stool samples, (3) serological study of blood samples from cases and asymptomatic members of case households, and (4) environmental analyses of the water distribution system. The questionnaire study identified 163 cases (respondent attack rate 16%). Results showed a significant dose-response relationship between consumption of tap water and risk of gastroenteritis. Campylobacter jejuni belonging to two related flaA types were isolated from stool samples. Serum antibody levels against Campylobacter were significantly higher in cases than in asymptomatic persons. Water samples were positive for coliform bacteria, and the likely mode of contamination was found to be surface water leaking into the drinking-water system. This geographically constrained outbreak presented an ideal opportunity to study the serological response in persons involved in a Campylobacter outbreak. The serology indicated that asymptomatic persons from the same household may have been exposed, during the outbreak period, to Campylobacter at doses that did not elicit symptoms or alternatively had been exposed to Campylobacter at a time prior to the outbreak, resulting in residual immunity and thus absence of clinical signs.
Several studies on the effect of physical exercise on activities of daily living (ADL) for people with dementia exist; yet, data concerning the specific context of acute psychiatric hospitals remain scant. This study measured the effect of a physical exercise program on ADL scores in patients with moderate to severe dementia hospitalized in an acute psychiatric ward.
A multicenter clinical trial was conducted in five Swiss and Belgian psychiatric hospitals. Participants were randomly allocated to either an experimental group (EG) or a control group (CG). Members of the EG received 20 physical exercise sessions (strengthening, balance, and walking) over a four-week period while members of the CG participated in social interaction sessions of equivalent duration and frequency, but without physical exercise. The effect of exercise on ADL was measured by comparing scores of the Barthel Index and the Functional Independence Measure in the EG and CG before and after the intervention, and two weeks later.
One hundred and sixty patients completed the program. The characteristics of participants in both groups were similar at the inception of the study. The mean ADL score of the EG decreased slightly over time, whereas that of the CG decreased significantly compared to initial scores. Overall differences between groups were not significant; however, significant differences were found for mobility-related items.
ADL scores in elderly with moderate to severe dementia deteriorate during acute psychiatric hospitalization. An exercise program delays the loss of mobility but does not have a significant impact on overall ADL scores.
To characterize meal patterns across ten European countries participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) calibration study.
Cross-sectional study utilizing dietary data collected through a standardized 24 h diet recall during 1995–2000. Eleven predefined intake occasions across a 24 h period were assessed during the interview. In the present descriptive report, meal patterns were analysed in terms of daily number of intake occasions, the proportion reporting each intake occasion and the energy contributions from each intake occasion.
Twenty-seven centres across ten European countries.
Women (64 %) and men (36 %) aged 35–74 years (n 36 020).
Pronounced differences in meal patterns emerged both across centres within the same country and across different countries, with a trend for fewer intake occasions per day in Mediterranean countries compared with central and northern Europe. Differences were also found for daily energy intake provided by lunch, with 38–43 % for women and 41–45 % for men within Mediterranean countries compared with 16–27 % for women and 20–26 % for men in central and northern European countries. Likewise, a south–north gradient was found for daily energy intake from snacks, with 13–20 % (women) and 10–17 % (men) in Mediterranean countries compared with 24–34 % (women) and 23–35 % (men) in central/northern Europe.
We found distinct differences in meal patterns with marked diversity for intake frequency and lunch and snack consumption between Mediterranean and central/northern European countries. Monitoring of meal patterns across various cultures and populations could provide critical context to the research efforts to characterize relationships between dietary intake and health.