Severe acute respiratory syndrome-coronavirus-2 (SARS-CoV-2) led to a significant disease burden and disruptions in health systems. We describe the epidemiology and transmission characteristics of early coronavirus disease 2019 (COVID-19) cases in Bavaria, Germany. Cases were reverse transcription polymerase chain reaction (RT-PCR)-confirmed SARS-CoV-2 infections, reported from 20 January−19 March 2020. The incubation period was estimated using travel history and date of symptom onset. To estimate the serial interval, we identified pairs of index and secondary cases. By 19 March, 3546 cases were reported. A large proportion was exposed abroad (38%), causing further local transmission. Median incubation period of 256 cases with exposure abroad was 3.8 days (95%CI: 3.5–4.2). For 95% of infected individuals, symptom onset occurred within 10.3 days (95%CI: 9.1–11.8) after exposure. The median serial interval, using 53 pairs, was 3.5 days (95%CI: 3.0–4.2; mean: 3.9, s.d.: 2.2). Travellers returning to Germany had an important influence on the spread of SARS-CoV-2 infections in Bavaria in early 2020. Especially in times of low incidence, public health agencies should identify holiday destinations, and areas with ongoing local transmission, to monitor potential importation of SARS-CoV-2 infections. Travellers returning from areas with ongoing community transmission should be advised to quarantine to prevent re-introductions of COVID-19.
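The incubation-period summary above comes down to the distribution of delays between exposure and symptom onset. As a hedged illustration (the study fitted a parametric distribution to interval-censored travel data; the dates and the simple empirical summary below are invented for demonstration only):

```python
# Hedged sketch: summarizing incubation delays from exposure and symptom-onset
# dates, as described in the abstract. NOT the authors' code or data.
from datetime import date
import statistics

# Hypothetical (end_of_exposure, symptom_onset) pairs for returning travellers
pairs = [
    (date(2020, 2, 1), date(2020, 2, 4)),
    (date(2020, 2, 3), date(2020, 2, 8)),
    (date(2020, 2, 5), date(2020, 2, 7)),
    (date(2020, 2, 6), date(2020, 2, 16)),
    (date(2020, 2, 8), date(2020, 2, 11)),
]

# Delay in days between exposure and onset, sorted for inspection
delays = sorted((onset - exposure).days for exposure, onset in pairs)
median_incubation = statistics.median(delays)  # central estimate
print(delays, median_incubation)
```

The study's 95%-of-onsets bound (10.3 days) would come from the upper tail of the fitted distribution rather than this crude empirical median.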
Laboratory data increasingly suggest that Salmonella infection may contribute to colon cancer (CC) development. Here, we examined epidemiologically the potential risk of CC associated with salmonellosis in the human population. We conducted a population-based cohort study using four health registries in Denmark. Person-level demographic data of all residents were linked to laboratory-confirmed non-typhoidal salmonellosis and to CC diagnoses in 1994–2016. Hazard ratios (HRs) for CC (overall/proximal/distal) associated with reported salmonellosis were estimated using Cox proportional hazard models. Potential effects of serovar, age, sex, inflammatory bowel disease and follow-up time post-infection were also assessed. We found no increased risk of CC ≥1 year post-infection (HR 0.99; 95% confidence interval (CI) 0.88–1.13). When stratifying by serovar, there was a significantly increased risk of proximal CC ≥1 year post-infection with serovars other than Enteritidis and Typhimurium (HR 1.40; 95% CI 1.03–1.90). CC risk was significantly increased in the first year post-infection (HR 2.08; 95% CI 1.48–2.93). The association between salmonellosis and CC in the first year post-infection can be explained by increased stool testing around the time of CC diagnosis. The association between proximal CC and non-Enteritidis/non-Typhimurium serovars is unclear and warrants further investigation. Overall, this study provides epidemiological evidence that notified Salmonella infections do not contribute significantly to CC risk in the studied population.
To investigate how individuals with a history of affective disorder use and perceive their use of social media and online dating.
A questionnaire focusing on affective disorders and the use of social media and online dating was handed out to outpatients from unipolar depression and bipolar disorder clinics and to general practice patients with or without a history of affective disorders (the latter serving as controls). The association between affective disorders and use of social media and online dating was analysed using linear/logistic regression.
A total of 194 individuals with a history of unipolar depression, 124 individuals with a history of bipolar disorder and 196 controls were included in the analysis. Having a history of unipolar depression or bipolar disorder was not associated with the time spent on social media compared with controls. Using the controls as reference, having a history of bipolar disorder was associated with use of online dating (adjusted odds ratio: 2.2 (95% CI: 1.3; 3.7)). The use of social media and online dating had a mood-congruent pattern with decreased and more passive use during depressive episodes, and increased and more active use during hypomanic/manic episodes. Among the respondents with a history of affective disorder, 51% reported that social media use had an aggravating effect on symptoms during mood episodes, while 10% reported a beneficial effect. For online dating, the equivalent proportions were 49% (aggravation) and 20% (benefit).
The use of social media and online dating seems related to symptom deterioration among individuals with affective disorder.
Background: Infection prevention surveillance for cross transmission is often performed by manual review of microbiologic culture results to identify geotemporally related clusters. However, the sensitivity and specificity of this approach remain uncertain. Whole-genome sequencing (WGS) analysis can help provide a gold standard for identifying cross-transmission events. Objective: We employed a published WGS program, the Philips IntelliSpace Epidemiology platform, to compare the accuracy of two surveillance methods: (i) a virtual infection practitioner (VIP) with perfect recall and automated analysis of antibiotic susceptibility testing (AST), sample collection timing, and patient location data; and (ii) a novel clinical matching (CM) algorithm that provides cluster suggestions based on a nuanced weighted analysis of AST data, timing of sample collection, and shared location stays between patients. Methods: WGS was performed routinely on inpatient and emergency department isolates of Enterobacter cloacae, Enterococcus faecium, Klebsiella pneumoniae, and Pseudomonas aeruginosa at an academic medical center. Single-nucleotide variants (SNVs) were compared within core genome regions on a per-species basis to determine cross-transmission clusters. Moreover, one unique strain per patient was included within each analysis, and duplicates were excluded from the final results. Results: Between May 2018 and April 2019, clinical data from 121 patients were paired with WGS data from 28 E. cloacae, 21 E. faecium, 61 K. pneumoniae, and 46 P. aeruginosa isolates. Previously published SNV relatedness thresholds were applied to define genomically related isolates. Mapping of genomic relatedness defined clusters as follows: 4 patients in 2 E. faecium clusters and 2 patients in 1 P. aeruginosa cluster.
The VIP method identified 12 potential clusters involving 28 patients, all of which were “pseudoclusters.” Importantly, the CM method identified 7 clusters consisting of 27 patients, which included 1 true E. faecium cluster of 2 patients with genomically related isolates. Conclusions: In light of the WGS data, all of the potential clusters identified by the VIP were pseudoclusters, lacking sufficient genomic relatedness. In contrast, the CM method showed increased sensitivity and specificity: it decreased the percentage of pseudoclusters by 14% and it identified a related genomic cluster of E. faecium. These findings suggest that integrating clinical data analytics and WGS is likely to benefit institutions in limiting expenditure of resources on pseudoclusters. Therefore, WGS combined with more sophisticated surveillance approaches, over standard methods as modeled by the VIP, are needed to better identify and address true cross-transmission events.
Funding: This study was supported by Philips Healthcare.
User requirements for maintenance scheduling design in large asset-intensive industries have received little academic and empirical study. Therefore, using a representative case study, this paper aims to: (1) identify the current practices and complex scheduling requirements; (2) propose a design support tool to optimize the maintenance scheduling process; and (3) report the gained benefits. The results reveal that the proposed tool can decrease resource requirements, increase capacity utilization, and reduce cost while addressing the complex user requirements.
It is important to be able to compare and evaluate different solutions early in development. This paper proposes a method for structuring historical data into a data model that can support the evaluation of new design concepts. The data are contextualized by linking them to a hierarchical decomposition of existing products. Two case studies were conducted to assess the value of using historical data when evaluating new concepts. Both cases confirm that the proposed method is useful for evaluating new concepts.
Help-seeking and service utilization depend on patients’ interpretation of their illness and treatment needs. Worry, denial of illness, need for treatment and need for hospitalization in first-time admitted patients were studied.
New patients in two mental hospitals were consecutively recruited. Three hundred and thirty-four satisfied the inclusion criteria and 251 gave informed consent. One hundred and ninety-six had complete datasets (56% of those eligible).
Demography was recorded with the Minimal Basic Dataset by Ruud et al. (1993). Experiences of hospitalisation were measured with the Patient's Experience of Hospitalisation Questionnaire by Carskey et al. (1992). MINI was used for diagnosing and SCL-90-R by Derogatis (1997) for subjective symptoms. Standard multiple regressions were performed with the PEH subscales (Denial, Worry, Need for treatment and Need for hospitalisation) as dependent variables and demography, diagnosis and SCL-90-R subscales as explanatory variables.
(a) Psychoticism and the diagnosis of schizophrenia were associated with little worrying, denial of illness, of treatment needs and of need for hospitalisation. (b) Anxiety and affective disorders were related to worries, acknowledgement of illness, need for treatment and for hospitalisation.
In contrast to patients with mainly anxiety and affective disorders, psychotic patients tended to deny illness-related worries, to deny that they had an illness, and to deny that they needed treatment and hospitalisation. An affective disorder together with suicidal thoughts (not attempts) was a strong drive towards hospital admission.
The validity of the Seasonal Pattern Assessment Questionnaire (SPAQ) in diagnosing Seasonal Affective Disorder (SAD) is questionable. In 2004 the Seasonal Health Questionnaire (SHQ) was proposed as a more appropriate screening instrument for depression with a seasonal pattern.
To compare the performance of the SPAQ, the most commonly used tool for assigning a diagnosis of SAD, with the SHQ, which uses the DSM-IV criteria for recurrent depression with seasonal pattern.
Two samples of approximately 200 medical students in Tromsø, Norway (69° north) and Ferrara, Italy (44° north), filled in both questionnaires. Prevalence of recurrent depression with seasonal pattern was calculated according to gender and latitude of living, with both instruments. Using the SHQ diagnosis as the gold standard, the sensitivity and specificity of the SPAQ as a diagnostic instrument were ascertained.
The prevalence of depression with seasonal pattern measured by the SPAQ was 12% in Norway and 14.5% in Italy; the difference was not significant. Prevalence was highest in females in both countries (Norway: males 4.2%, females 14.7%; Italy: males 2.3%, females 18.8%), but the difference was only significant in Italy (p = 0.007). According to the SHQ, the corresponding figures were 5.9% and 7.1% in Norway (p = 0.77) and 3.9% and 3% in Italy (p = 0.75). The specificity of the SPAQ was 88.8% and the sensitivity was 47.3%.
Compared to a DSM-IV diagnosis of depression with seasonal pattern as measured by the SHQ, the SPAQ seriously overestimates the prevalence of seasonal depression, especially in women, and the sensitivity is far too low.
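The sensitivity and specificity figures reported for the SPAQ follow from a standard 2×2 comparison against the SHQ gold standard. A minimal sketch with invented counts (not the study's data):

```python
# Hedged illustration: sensitivity/specificity of a screening instrument
# (SPAQ) against a gold standard (SHQ). Counts below are invented so that
# the results roughly mirror the abstract's figures; they are NOT study data.
tp, fn = 9, 10    # SHQ-positive students: SPAQ positive / SPAQ negative
tn, fp = 160, 20  # SHQ-negative students: SPAQ negative / SPAQ positive

sensitivity = tp / (tp + fn)  # fraction of true cases the SPAQ detects
specificity = tn / (tn + fp)  # fraction of non-cases the SPAQ clears
print(round(sensitivity, 3), round(specificity, 3))
```

A low sensitivity with moderate specificity, as here, is exactly the pattern that produces the overestimated prevalence the conclusion describes.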
Atopic disorders, including asthma, dermatitis and rhinoconjunctivitis are the most common chronic diseases of childhood. Numerous studies have investigated the relationship between immune dysregulation and prenatal factors, including psychological stress. The association between prenatal maternal stress and atopy, however, has never been systematically reviewed.
To systematically review all observational studies on the association between prenatal maternal stress and atopic disorders or predisposition in childhood.
To identify all observational studies in humans that compared the prevalence of one or more atopic disorders or predispositions in children of exposed and unexposed mothers. To critically evaluate the quality and validity of the published literature.
PubMed, EMBASE, PsycINFO and Scopus databases were searched, and relevant studies were identified and assessed according to the PRISMA criteria.
Fifteen studies met the inclusion criteria, many of which examined the association between prenatal stress and multiple disorders. Preliminary results suggest that children of mothers who experienced stress during pregnancy have a higher risk of developing asthma, dermatitis and rhinoconjunctivitis than children of unexposed mothers.
The impact of psychological stress on immune function appears consistent regardless of stress definition. The varying stress and outcome measures make it difficult to compare results across studies. Future research should focus on whether certain disorders are more susceptible than others, and on whether certain stressor types or periods during pregnancy are more critical.
There is a need for systematic evaluation of treatment of out-patients in mental health services. Very few instruments for such evaluation that have been translated into Norwegian have been properly validated.
To document the psychometric properties of the Norwegian translation of the outcome-measure Clinical Outcomes in Routine Evaluation (CORE-OM).
A clinical sample (N = 528) was collected from out-patient mental health services. A non-clinical convenience sample (N = 473) was also collected. A sub-sample of 81 from the non-clinical sample filled in the questionnaire four times, with one week between tests, alternating between the Norwegian translation and the original English version. The same sub-sample of 81 students was posed a question to measure psychological strain.
There were no significant differences in mean scores between the sexes, in either the non-clinical or the clinical sample. All mean scores were significantly higher in the clinical than in the non-clinical sample. The cut-off point for the Well-being items was higher for women, while the cut-off point for risk of suicide/harm to others was higher for men. Cut-off scores for Problem items and Function items were similar for men and women. The scores increased after exposure to psychological strain. Acceptability, internal consistency, test-retest stability, and the differences between scores in non-clinical and clinical samples were at the same level as in English data. The cut-off scores were also very similar.
The Norwegian translation of the CORE-OM has psychometric properties very similar to those of the English original and can be recommended for general use.
To study gender differences in mortality among patients with schizophrenia over a period of 27 years.
A longitudinal psychiatric admission register in Northern Norway linked to the National Norwegian Cause of Death Register.
1111 patients with schizophrenia admitted from 1980 to 2006.
We confirm the persisting mortality gap between patients with schizophrenia and the general population over a period of 27 years, with a tendency towards increasing standardized mortality ratios (SMRs). Male and female schizophrenic patients had 3.5 (95% CI: 3.1 to 4.1) and 2.6 (95% CI: 2.1 to 3.2) times higher mortality, respectively (p = 0.01 for the difference between the genders). Age-adjusted mortality rates in female schizophrenic patients admitted for the first time after 1992 were 70% higher than for those admitted previously, and female patients admitted for the first time after 1992 had significantly higher SMRs (4.6, 95% CI: 2.9 to 7.2) than women who were admitted earlier (SMR = 2.4, 95% CI: 1.9 to 2.9) (p = 0.009). Thus, the absolute mortality also increased. The SMRs for women admitted after 1992 were higher than for men. Men who had always been admitted voluntarily had non-significantly lower mortality than women in the same situation, while for women there was a linear, statistically significant trend towards higher SMRs with less use of coercion.
There is a persisting mortality gap between patients with schizophrenia and the general population over a period of 27 years.
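The SMRs reported above are ratios of observed deaths in the cohort to deaths expected from general-population rates. A hedged sketch with hypothetical numbers, using a common Poisson-based approximate confidence interval (the study's exact age/sex/period standardization is more involved):

```python
# Hedged sketch of a standardized mortality ratio (SMR) calculation.
# Numbers are hypothetical and the CI is a textbook log-scale Poisson
# approximation, not the study's method.
import math

observed = 70    # deaths observed in the patient cohort
expected = 20.0  # deaths expected from general-population rates

smr = observed / expected
# Approximate 95% CI: treat observed deaths as Poisson, work on the log scale
lower = smr * math.exp(-1.96 / math.sqrt(observed))
upper = smr * math.exp(1.96 / math.sqrt(observed))
print(round(smr, 2), round(lower, 2), round(upper, 2))
```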
Over the past decade, the World Health Summit (WHS) has provided a global platform for policy-makers and decision-makers to interact with academics and practitioners on global health. Recently the WHS adopted health security into its agenda for transnational disease risks (e.g., Ebola and antimicrobial resistance) that increasingly threaten multiple sectors. Global health engagement (GHE) focuses efforts across interdisciplinary and interorganizational lines to identify critical threats and provide rapid deployment of key resources at the right time for addressing health security risks. As a product of subject matter experts convening at the WHS, a special side-group has organically arisen with leadership and coordination from the German Institute for Defense and Strategic Studies in support of GHE activities across governmental, academic, and industry partners. Through novel approaches and targeted methodology that maximize outcomes and streamline the global health operational process, the Global Health Security Alliance (GloHSA) was born. This short conference report describes the GloHSA in more detail.
Hyper-prolific sows nurse more piglets than less productive sows, putting a high demand on the nutrient supply for milk production. In addition, the high production level can increase mobilization from body tissues. The effect of increased dietary protein (104, 113, 121, 129, 139 and 150 g standardized ileal digestible (SID) CP/kg) on sow body composition, milk production and plasma metabolite concentrations was investigated from litter standardization (day 2) until weaning (day 24). Sow body composition was determined using the deuterium oxide dilution technique on days 3 and 24 postpartum. Blood samples were collected weekly, and milk samples were obtained on days 3, 10 and 17 of lactation. Litter average daily gain (ADG) peaked at 135 g SID CP/kg (P < 0.001). Sow BW and back fat loss reached a breakpoint at 143 and 127 g SID CP/kg (P < 0.001). Milk fat increased linearly with increasing dietary SID CP (P < 0.05), and milk lactose decreased until a breakpoint at 124 g SID CP/kg and 5.3% (P < 0.001) on day 17. The concentration of milk protein on day 17 increased until a breakpoint at 136 g SID CP/kg (5.0%; P < 0.001). The loss of body protein from day 3 until weaning decreased with increased dietary SID CP until it reached a breakpoint at 128 g SID CP/kg (P < 0.001). The body ash loss declined linearly with increasing dietary SID CP (P < 0.01), and the change in body fat was unaffected by dietary treatment (P=0.41). In early lactation (day 3 + day 10), plasma urea N (PUN) increased linearly after the breakpoint at 139 g SID CP/kg at a concentration of 3.8 mmol/l, and in late lactation (day 17 + day 24), PUN increased linearly after a breakpoint at 133 g SID CP/kg (P < 0.001) at a concentration of 4.5 mmol/l. In conclusion, the SID CP requirement for sows was estimated to 135 g/kg based on litter ADG, and this was supported by the breakpoints of other response variables within the interval 124 to 143 g/kg.
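The requirement estimates above come from broken-line (linear-plateau) models fitted to each response variable, with the breakpoint marking the dietary CP level beyond which the response flattens. A minimal sketch of the idea with invented data and a simple grid search over candidate breakpoints (not the authors' actual fitting procedure):

```python
# Hedged sketch of a linear-plateau ("broken-line") fit, the kind of
# breakpoint analysis the abstract describes. Data are invented.
def plateau_sse(x, y, bp):
    # Model: y = a + b * min(x, bp); fit a, b by ordinary least squares
    z = [min(xi, bp) for xi in x]
    n = len(x)
    zbar = sum(z) / n
    ybar = sum(y) / n
    sxx = sum((zi - zbar) ** 2 for zi in z)
    b = sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y)) / sxx
    a = ybar - b * zbar
    return sum((yi - (a + b * zi)) ** 2 for zi, yi in zip(z, y))

x = [104, 113, 121, 129, 139, 150]    # dietary SID CP, g/kg (from abstract)
y = [2.4, 2.6, 2.8, 3.0, 3.05, 3.05]  # litter ADG, kg/day (invented)

# Grid search: the breakpoint minimizing the residual sum of squares
best_bp = min(range(110, 149), key=lambda bp: plateau_sse(x, y, bp))
print(best_bp)
```

The same machinery, applied to BW loss, back fat, milk components and PUN, is what yields the family of breakpoints (124 to 143 g/kg) summarized in the conclusion.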
In situ transmission electron microscope (TEM) characterization techniques provide valuable information on structure–property correlations to understand the behavior of materials at the nanoscale. However, understanding nanoscale structures and their interaction with the electron beam is pivotal for the reliable interpretation of in situ/ex situ TEM studies. Here, we report that oxides commonly used in nanoelectronic applications, such as transistor gate oxides or memristive devices, are prone to electron beam induced damage that causes small structural changes even under very low dose conditions, eventually changing their electrical properties as examined via in situ measurements. In this work, silicon, titanium, and niobium oxide thin films are used for in situ TEM electrical characterization studies. The electron beam induced reduction of the oxides turns these insulators into conductors. The conductivity change is reversible by exposure to air, supporting the idea of electron beam reduction of oxides as the primary damage mechanism. Through these measurements we propose a limit for the critical dose to be considered for in situ scanning electron microscopy and TEM characterization studies.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections, to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
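The coinfection proportions above rest on deterministic matching of registry records. A toy sketch assuming a single shared person identifier (real jurisdictional linkages match on several identifiers such as name and date of birth; the IDs here are hypothetical):

```python
# Hedged sketch of deterministic record linkage between two surveillance
# registries, as described in the abstract. IDs and counts are invented.
hiv_registry = {"p1", "p2", "p3", "p4"}  # persons reported with HIV
hcv_registry = {"p2", "p4", "p5"}        # persons reported with HCV

# Exact-key deterministic match: the intersection is the coinfected group
coinfected = hiv_registry & hcv_registry
pct_of_hiv = 100 * len(coinfected) / len(hiv_registry)
print(sorted(coinfected), pct_of_hiv)
```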
Based on findings from the literature on campaign effects on the one hand, and the literature on European Parliament elections on the other, we propose a model of European Parliament elections in which the campaign shifts the calculus of electoral support, making differences in national political allegiances less important and attitudes about the European project more important by informing voters about, and getting them interested in, European politics. In effect, we argue that the political campaign leading up to the election makes European Parliament elections less second order. While previous studies have demonstrated that EU issues can matter for voting behavior in European Parliament elections, existing research has drawn on post-election surveys that do not enable us to capture campaign effects. Our contribution is to assess the impact of a campaign by utilizing a rolling cross-sectional survey that enables us to track how voters were affected by the campaign. Our findings show that campaigns do have an effect on European Parliament election outcomes, in that they provide information that enables voters to make decisions based on their attitudes on European issues, making voter decision-making more dominated by EU issue voting.
Selection for increased litter size has generated hyper-prolific sows that nurse large litters; however, limited knowledge is available regarding the connection between milk production, feed intake and body mobilization in these modern sows. The aim of the current study was to determine what characterizes sows with high milk production nursing large litters, differences between sows of different parities, and effects of lactational performance on the next reproductive cycle. In total, 565 sows (parity 1 to 4) were studied from 7 days before farrowing until weaning. On day 2 postpartum, litters were standardized to 14 piglets. Weight and back fat thickness of sows were measured at day 7 prepartum, day 2 postpartum and at weaning. Litters were weighed at day 2 and at weaning. Pearson correlation coefficients between variables were calculated and regression models were developed. The average daily feed intake (ADFI) of the sows was 6.1±1.1 kg/day, average daily gain (ADG) of the litter was 2.92±0.53 kg/day and sows weaned 13.0±1.1 piglets. First-parity sows generally had a lower ADFI and milk production and a decrease in total born piglets in the next litter compared with parity 2 to 4 sows, which could be explained by a relatively higher proportion of their body reserves being mobilized compared with multiparous sows. The ADG of the litter was positively related to the ADFI of the sows, litter size and BW loss, and increasing the ADFI by 1 kg/day throughout lactation likely increased the ADG of the litter by 220 to 440 g/day in parity 1 to 4, respectively. Increasing the ADFI by 1 kg/day reduced the BW loss by 6.6 to 13.9 kg in parity 1 to 4 sows, respectively, during lactation, whereas increasing the average milk yield by 1 kg/day raised the BW loss by 4.3 to 21.0 kg across the four parities during lactation. The number of total born piglets in the next litter was positively related to the number of piglets born in the previous litter.
In conclusion, both a high feed intake and a high mobilization of body reserves were prerequisites for a high milk production. The sows might be very close to the physical limit of what they can ingest, and future research should therefore focus on optimizing the dietary energy and nutrient concentrations of diets for lactating hyper-prolific sows, and herein distinguish between primiparous and multiparous sows.
To characterize meal patterns across ten European countries participating in the European Prospective Investigation into Cancer and Nutrition (EPIC) calibration study.
Cross-sectional study utilizing dietary data collected through a standardized 24 h diet recall during 1995–2000. Eleven predefined intake occasions across a 24 h period were assessed during the interview. In the present descriptive report, meal patterns were analysed in terms of daily number of intake occasions, the proportion reporting each intake occasion and the energy contributions from each intake occasion.
Twenty-seven centres across ten European countries.
Women (64 %) and men (36 %) aged 35–74 years (n 36 020).
Pronounced differences in meal patterns emerged both across centres within the same country and across different countries, with a trend for fewer intake occasions per day in Mediterranean countries compared with central and northern Europe. Differences were also found for daily energy intake provided by lunch, with 38–43 % for women and 41–45 % for men within Mediterranean countries compared with 16–27 % for women and 20–26 % for men in central and northern European countries. Likewise, a south–north gradient was found for daily energy intake from snacks, with 13–20 % (women) and 10–17 % (men) in Mediterranean countries compared with 24–34 % (women) and 23–35 % (men) in central/northern Europe.
We found distinct differences in meal patterns with marked diversity for intake frequency and lunch and snack consumption between Mediterranean and central/northern European countries. Monitoring of meal patterns across various cultures and populations could provide critical context to the research efforts to characterize relationships between dietary intake and health.
Two experiments studied the effects of dietary chicory against gastrointestinal nematodes in cattle. In Experiment (Exp.) 1, stabled calves were fed chicory silage (CHI1; n = 9) or ryegrass/clover hay (CTL1; n = 6) with balanced protein/energy intakes between groups. After 16 days, all calves received 10 000 Ostertagia ostertagi and 66 000 Cooperia oncophora third-stage larvae (L3) [day (D) 0 post-infection (p.i.)]. In Exp. 2, calves were assigned to pure chicory (CHI2; n = 10) or ryegrass/clover (CTL2; n = 10) pastures. After 7 days, animals received 20 000 O. ostertagi L3/calf (D0 p.i.) and were moved regularly to prevent pasture-borne infections. Due to poor regrowth of the chicory pasture, CHI2 was supplemented with chicory silage. At D40 p.i. (Exp. 1) and D35 p.i. (Exp. 2), calves were slaughtered for worm recovery. In Exp. 1, fecal egg counts (FEC) were similar between groups. However, O. ostertagi counts were significantly reduced in CHI1 by 60% (geometric mean; P < 0·01), whereas C. oncophora burdens were unaffected (P = 0·12). In Exp. 2, FEC were markedly lowered in CHI2 from D22 p.i. onwards (P < 0·01). Ostertagia ostertagi adult burdens were significantly reduced in CHI2 by 66% (P < 0·001). Sesquiterpene lactones were identified only in chicory (fresh/silage). Chicory shows promise as an anti-Ostertagia feed for cattle and further studies should investigate its on-farm use.