Post-traumatic stress disorder (PTSD) is a serious mental disorder that develops in some individuals following exposure to severe psychological stressors. In this chapter, we provide an overview of the conceptual issues, specific methods, and practical considerations in evidence-based assessment of PTSD. First, we outline the conceptual issues and practical components of a comprehensive assessment of PTSD. Second, we provide an overview of the most widely used self-rated and clinician-rated measures of trauma exposure and PTSD, comorbid disorders, and response bias. Third, we discuss cultural considerations in assessing PTSD. Fourth, we offer practical guidelines for conducting a clinically sensitive assessment of PTSD, highlighting some of the unique considerations in engaging trauma survivors in the assessment process and optimizing the information obtained. Last, we briefly summarize conceptual considerations and specific measures for other trauma- and stressor-related disorders.
New dietary-based concepts are needed for effective treatment and prevention of overweight and obesity. The primary objective was to investigate whether a reduction in appetite is associated with improved weight loss maintenance. This cohort study was nested within the European Commission project Satiety Innovation (SATIN). Participants achieving ≥8% weight loss during an initial 8-week low-energy formula diet were included in a 12-week randomised double-blind parallel weight loss maintenance intervention. The intervention included food products designed to reduce appetite or matching controls, along with instructions to follow national dietary guidelines. Appetite was assessed by ad libitum energy intake and self-reported appetite evaluations using visual analogue scales during standardised appetite probe days. These were evaluated on the first day of the maintenance period compared with baseline (acute effects after a single exposure to intervention products) and post-maintenance compared with baseline (sustained effects after repeated exposures to intervention products), regardless of randomisation. A total of 181 participants (47 men and 134 women) completed the study. Sustained reduction in 24-h energy intake was associated with improved weight loss maintenance (R 0·37; P = 0·001), whereas the association was not found acutely (P = 0·91). Suppression in self-reported appetite was associated with improved weight loss maintenance both acutely (R −0·32; P = 0·033) and sustained (R −0·33; P = 0·042). Reduction in appetite thus appears to be associated with improved body weight management, making appetite-reducing food products an interesting strategy for dietary-based concepts.
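The associations above are correlation coefficients between appetite change and weight loss maintenance. As a minimal sketch of the statistic involved (Pearson's R, computed here on invented numbers, not the SATIN measurements):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: change in 24-h ad libitum energy intake (kJ) vs.
# weight regained during maintenance (kg) for six invented participants.
delta_intake = [-900, -400, -100, 200, 600, 1100]
weight_regain = [-1.5, -0.8, -0.2, 0.1, 0.9, 1.6]
r = pearson_r(delta_intake, weight_regain)
```

In the study, a positive sustained R between energy-intake reduction and maintenance indicates that participants whose intake stayed suppressed regained less weight.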
The aim of this study was to translate, culturally adapt, and psychometrically evaluate the Brazilian version of the “End-of-Life Professional Caregiver Survey” (BR-EPCS).
This is an observational cross-sectional study. The sample comprised 285 Brazilian healthcare professionals who work or have worked in palliative care. A minimum of 280 participants was established, following the recommendation of 10 subjects per instrument item. The European Organisation for Research and Treatment of Cancer — Quality of Life Group Translation Procedure protocol was used for the translation and cultural adaptation. The reliability of the factors measured by the BR-EPCS was evaluated with Cronbach's alpha (α) and composite reliability coefficients. Factor analyses were conducted using exploratory structural equation modeling and confirmatory factor analysis. A multiple linear regression analysis was conducted to evaluate how well sociodemographic variables predict the scores measured by the BR-EPCS factors.
The factor analysis supported a two-factor solution: Factor 1 — “Given care effectiveness” (18 items; Cronbach's α = 0.94; composite reliability = 0.95) and Factor 2 — “Mourning and ethical and cultural values” (10 items; Cronbach's α = 0.89; composite reliability = 0.88). Multiple linear regression analyses revealed that working time, sex, palliative care training, and having one's own advance directives are predictors of the constructs assessed by the BR-EPCS.
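For readers unfamiliar with the reliability coefficients reported here, Cronbach's alpha can be computed directly from an item-score matrix: α = k/(k−1) · (1 − Σ item variances / variance of scale total). A minimal sketch with an invented 5 × 4 Likert-style score matrix (not the BR-EPCS data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented 5-respondent x 4-item matrix for illustration only.
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
alpha = cronbach_alpha(scores)  # high, since the items co-vary strongly
```

Values around 0.9, as reported for both BR-EPCS factors, indicate strong internal consistency.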
Significance of results
The BR-EPCS is a reliable, valid, and culturally appropriate tool to identify the educational needs of healthcare professionals who work with palliative care. This instrument can be used for educational and research reasons.
Telomeres are nucleoprotein complexes that form the ends of eukaryotic chromosomes, where they protect DNA from genomic instability, prevent end-to-end fusion and limit cellular replicative capacity. Increased telomere attrition rates, and relatively shorter telomere length, are associated with genomic instability and have been linked with several chronic diseases, malignancies and reduced longevity. Telomeric DNA is highly susceptible to oxidative damage, and dietary habits may affect telomere attrition rates through the mediation of oxidative stress and chronic inflammation. The aim of this study was to examine the association of leucocyte telomere length (LTL) with both the Dietary Inflammatory Index® 2014 (DII®) and the Alternative Healthy Eating Index 2010 (AHEI-2010). This is a cross-sectional analysis using baseline data from 263 postmenopausal women from the Alberta Physical Activity and Breast Cancer Prevention (ALPHA) Trial, in Calgary and Edmonton, Alberta, Canada. No statistically significant association was detected between LTL z-score and the AHEI-2010 (P = 0·20) or DII® (P = 0·91) in multivariable adjusted models. An exploratory analysis of AHEI-2010 and DII® parameters and LTL revealed that anthocyanidin intake was associated with LTL (P < 0·01); however, this association was non-significant after a Bonferroni correction was applied (P = 0·27). No effect modification by age, smoking history, or recreational physical activity was detected for either relationship. Increased dietary antioxidant and decreased oxidant intake were not associated with LTL in this analysis.
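The Bonferroni correction mentioned above simply multiplies each raw p-value by the number of comparisons (capped at 1). A sketch, using 30 hypothetical comparisons chosen only so that a raw P of 0.009 adjusts to 0.27, matching the scale of the anthocyanidin result; the study's actual number of tests is not stated here:

```python
def bonferroni(p_values, alpha=0.05):
    """Bonferroni-adjusted p-values and rejection flags at level alpha."""
    m = len(p_values)
    adjusted = [min(p * m, 1.0) for p in p_values]
    reject = [p_adj < alpha for p_adj in adjusted]
    return adjusted, reject

# 30 hypothetical exploratory tests: one small raw p-value among many nulls.
adj, rej = bonferroni([0.009] + [0.5] * 29)
# adj[0] = 0.009 * 30 = 0.27, no longer significant at alpha = 0.05
```

This is why an exploratory P < 0·01 can become non-significant once the family of tests is accounted for.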
Delirium is often missed in older outpatients. Caregivers can give valuable information that might improve identification rates. The aim of this study was to develop a short and sensitive delirium caregiver questionnaire (DCQ) for telephone-based triage of elderly outpatients with cognitive impairment.
Design, setting, and participants:
The pilot questionnaire was administered to 112 caregivers of patients who were referred for dementia screening to our clinic for geriatric psychiatry, and the final DCQ to 234 other caregivers.
In phase I (2013–2014), we tested a pilot questionnaire with 17 items. Health professionals who established delirium diagnoses were blinded to the results. We then used the results and other information available at referral to construct the final DCQ with seven items. During phase II (2015–2016), we investigated the test accuracy of the final DCQ in a subsequent cohort. In both phases, the patients received a structured diagnostic workup. Time between referral and first visit was a secondary outcome.
The final DCQ consisted of the following items: emergency visit required, sleeping disorder, fluctuating course, hallucinations, suspicious thoughts, previous delirium, and recent discharge from hospital. DCQ results indicated that urgent intake was required in 85 of 234 patients. Sensitivity was 73.5% (95% CI: 58.9–85.1%) and specificity 73.5% (95% CI: 66.5–79.7%). The mean number of days to first visit dropped from 31.6 to 11.2 in delirious patients (p = 0.001).
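Sensitivity and specificity follow directly from confusion-matrix counts, and confidence intervals like those reported can be approximated with a Wilson score interval (the study's exact CI method is not stated here). A sketch using hypothetical counts chosen only to be consistent with the reported percentages and the 85-of-234 flag rate, not the study's raw data:

```python
import math

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

def wilson_ci(successes, n, z=1.96):
    """Approximate 95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# Hypothetical counts (not the study's raw data): 36 of 49 delirious
# patients flagged (sensitivity ~73.5%), 136 of 185 non-delirious
# patients correctly not flagged (specificity ~73.5%).
sens, spec = sens_spec(tp=36, fn=13, tn=136, fp=49)
lo, hi = wilson_ci(36, 49)
```

The wider sensitivity CI reflects the smaller number of delirious patients relative to non-delirious ones.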
Triage with the easy-to-use DCQ among patients referred for cognitive screening leads to earlier assessment and higher detection rates of delirium.
Conservation researchers are increasingly drawing on a wide range of philosophies, methods and values to examine conservation problems. Here we adopt methods from social psychology to develop a questionnaire with the dual purpose of illuminating diversity within conservation research communities and providing a tool for use in cross-disciplinary dialogue workshops. The questionnaire probes the preferences that different researchers have with regards to conservation science. It elicits insight into their motivations for carrying out research, the scales at which they tackle problems, the subjects they focus on, their beliefs about the connections between nature and society, their sense of reality as absolute or socially constituted, and their propensity for collaboration. Testing the questionnaire with a group of 204 conservation scientists at a student conference on conservation science, we illustrate the latent and multidimensional diversity in the research preferences held by conservation scientists. We suggest that creating opportunities to further explore these differences and similarities using facilitated dialogue could enrich the mutual understanding of the diverse research community in the conservation field.
Medical responders are at-risk of experiencing a wide range of negative psychological health conditions following a disaster.
Published literature was reviewed on the adverse psychological health outcomes in medical responders to various disasters and mass casualties in order to: (1) assess the psychological impact of disasters on medical responders; and (2) identify the possible risk factors associated with psychological impacts on medical responders.
A literature search of PubMed, Discovery Service, Science Direct, Google Scholar, and Cochrane databases for studies on the prevalence of, and risk factors for, posttraumatic stress disorder (PTSD) and other mental disorders in medical responders to disasters and mass casualties was carried out using pre-determined keywords. Two reviewers screened the 3,545 resulting abstracts; 28 full-length articles were included in the final review.
Depression and PTSD were the most studied outcomes in medical responders. Nurses reported higher levels of adverse outcomes than physicians. Lack of social support and communication, maladaptive coping, and lack of training were important risk factors for developing negative psychological outcomes across all types of disasters.
Disasters have significant adverse effects on the mental well-being of medical responders. The prevalence rates and presumptive risk factors varied among the three different types of disasters. There are certain high-risk, vulnerable groups among medical responders, as well as certain risk factors for adverse psychological outcomes. Adopting preventive measures and mitigation strategies aimed at high-risk groups would be beneficial in decreasing negative outcomes.
Population ethics is widely considered to be exceptionally important and exceptionally difficult. One key source of difficulty is the conflict between certain moral intuitions and analytical results identifying requirements for rational (in the sense of complete and transitive) social choice over possible populations. One prominent such intuition is the Asymmetry, which jointly proposes that the fact that a possible child’s quality of life would be bad is a normative reason not to create the child, but the fact that a child’s quality of life would be good is not a reason to create the child. This paper reports a set of questionnaire experiments about the Asymmetry in the spirit of economists’ empirical social choice. Few survey respondents show support for the Asymmetry; instead respondents report that expectations of a good quality of life are relevant. Each experiment shows evidence (among at least some participants) of dual-process moral reasoning, in which cognitive reflection is statistically associated with reporting expected good quality of life to be normatively relevant. The paper discusses possible implications of these results for the economics of population-sensitive social welfare and for the conflict between moral mathematics and population intuition.
As a quasi-judicial body, the WGAD operates without a formal set of rules but is instead guided in implementing its mandate by its Methods of Work, which explain the overall process by which to submit a case for consideration.1 That said, the WGAD considers individual cases brought to it in closed sessions of its members and staff. The day-to-day operations of the WGAD are run by a small secretariat of staff of the Office of the UN High Commissioner for Human Rights (OHCHR) in its Special Procedures Branch, led by its Secretary. This chapter explains how an individual case can most effectively be brought to the WGAD, supplementing the procedures described in the Methods of Work with the author’s practical experience gained from having taken more than forty-five cases to the WGAD and having interviewed many current and former WGAD members and staff.
Malnutrition risk screening in cirrhotic patients is crucial, as poor nutritional status negatively affects disease prognosis and survival. Given that a variety of malnutrition screening tools is used in routine clinical practice, the effectiveness of eight screening tools in detecting malnutrition risk in cirrhotic patients was evaluated. A total of 170 patients (57·1 % male, 59·4 (sd 10·5) years, 50·6 % decompensated) with cirrhosis of various aetiologies were enrolled. Nutritional screening was performed using the Malnutrition Universal Screening Tool, Nutritional Risk Index, Malnutrition Screening Tool, Nutritional Risk Screening (NRS-2002), Birmingham Nutritional Risk Score, Short Nutritional Assessment Questionnaire, Royal Free Hospital Nutritional Prioritizing Tool (RFH-NPT) and Liver Disease Undernutrition Screening Tool (LDUST). Malnutrition diagnosis was defined using the Subjective Global Assessment (SGA). Data on 1-year survival were available for 145 patients. The prevalence of malnutrition risk varied according to the screening tool used, ranging from 13·5 to 54·1 %. RFH-NPT and LDUST were the most accurate in detecting malnutrition (AUC = 0·885 and 0·892, respectively) with high sensitivity (97·4 and 94·9 %, respectively) and fair specificity (73·3 and 58 %, respectively). Malnutrition according to SGA was an independent prognostic factor of within 1-year mortality (relative risk 2·17 (95 % CI 1·0, 4·7), P = 0·049) after adjustment for sex, age, disease aetiology and Model for End-stage Liver Disease score, whereas nutrition risk according to RFH-NPT, LDUST and NRS-2002 showed no association. RFH-NPT and LDUST were the only screening tools that proved to be accurate in detecting malnutrition in cirrhotic patients.
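The AUC values reported for RFH-NPT and LDUST can be read as a Mann-Whitney probability: the chance that a randomly chosen malnourished patient receives a higher screening score than a randomly chosen well-nourished one. A sketch on invented scores (not the study's data):

```python
def auc(pos_scores, neg_scores):
    """AUC as the probability that a positive case outscores a negative
    case, counting ties as half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Invented screening-tool scores for five malnourished and five
# well-nourished patients, for illustration only.
malnourished = [4, 5, 3, 5, 4]
well_nourished = [1, 2, 2, 3, 1]
a = auc(malnourished, well_nourished)  # close to 1: strong discrimination
```

An AUC near 0·89, as for RFH-NPT and LDUST, means the tool ranks a malnourished patient above a well-nourished one roughly nine times out of ten.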
The risk of undernutrition in older community-dwelling adults increases when they are no longer able to shop or cook themselves. Home-delivered products could then possibly prevent them from becoming undernourished. This single-blind randomised trial tested the effectiveness of home-delivered protein-rich ready-made meals and dairy products in reaching the recommended intake of 1·2 g protein/kg body weight (BW) per d and ≥25 g of protein per meal. Community-dwelling older adults (n 98; mean age 80·4 (sd 6·8) years) switched from self-prepared to home-delivered hot meals and dairy products for 28 d. The intervention group received ready-made meals and dairy products high in protein; the control group received products lower in protein. Dietary intake was measured at baseline, after 2 weeks (T1), and after 4 weeks (T2). Multilevel analyses (providing one combined outcome for T1 and T2) and logistic regressions were performed. Average baseline protein intake was 1·09 (se 0·05) g protein/kg BW per d in the intervention group and 0·99 (se 0·05) g protein/kg BW per d in the control group. During the trial, protein intake of the intervention group was 1·12 (se 0·05) g protein/kg BW per d compared with 0·87 (se 0·03) g protein/kg BW per d in the control group (between-group differences P < 0·05). More participants of the intervention group reached the threshold of ≥25 g protein at dinner compared with the control group (intervention T1: 84·8 %, T2: 88·4 % v. control T1: 42·9 %, T2: 40·5 %; P < 0·05), but not at breakfast and lunch. Our findings suggest that switching from self-prepared meals to ready-made meals carries the risk of a decreasing protein intake, unless extra attention is given to protein-rich choices.
The first positive genome-wide association study on gestational length and preterm delivery showed associations with a gene involved in selenium metabolism. In this study we examine the associations of maternal selenium intake and selenium status with gestational length and preterm delivery in 72,025 women with singleton live births from the population-based, prospective Norwegian Mother, Father and Child Cohort Study (MoBa). A self-reported, semi-quantitative food-frequency questionnaire answered in pregnancy week 22 was used to estimate selenium intake during the first half of pregnancy. Associations were analysed with adjusted linear and Cox regressions. Selenium status was assessed in whole blood collected in gestational week 17 (n=2,637). Median dietary selenium intake was 53 (IQR: 44-62) µg/day; supplements provided an additional 50 (IQR: 30-75) µg/day for supplement users (n=23,409). Maternal dietary selenium intake was significantly associated with prolonged gestational length (β per SD=0.25, 95% CI=0.07-0.43) and decreased risk for preterm delivery (n=3,618, HR per SD=0.92, 95% CI=0.87-0.98). Neither selenium intake from supplements nor maternal blood selenium status was associated with gestational length or preterm delivery. Hence, this study showed that maternal dietary selenium intake, but not intake of selenium-containing supplements, during the first half of pregnancy was significantly associated with decreased risk for preterm delivery. Further investigations, preferably in the form of a large RCT, are needed to elucidate the impact of selenium on pregnancy duration.
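Hazard ratios "per SD" come from scaling a Cox log-hazard coefficient by the exposure's standard deviation before exponentiating. A sketch with invented values for β and the SD (the study's fitted coefficient and the intake SD are not given here), chosen only so the result lands near the reported HR of 0.92:

```python
import math

def hr_per_sd(beta_per_unit, sd):
    """Convert a Cox log-hazard coefficient per unit of exposure into a
    hazard ratio per one standard deviation of exposure."""
    return math.exp(beta_per_unit * sd)

# Invented numbers: if each extra microgram/day of dietary selenium
# changed the log-hazard by -0.006, and the intake SD were 14 ug/day,
# the per-SD hazard ratio would be exp(-0.084), roughly 0.92.
hr = hr_per_sd(beta_per_unit=-0.006, sd=14)
```

Reporting per-SD effects lets readers compare exposures measured on different scales.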
Farmer training is important for improving weed management practices in tea cultivation. To explore the group characteristics of tea growers, we interviewed 354 growers in Guizhou Province, China. Sixty-one percent of the respondents planted tea for companies or cooperative groups, and 56% managed tea gardens larger than 10 ha. Self-employed tea growers tended to be older, to be smallholders, and to apply herbicides and conduct weed control less frequently (P < 0.05). Approximately 87% of the respondents conducted weed control two to four times yr−1, 83% spent between $200 and $2,000 ha−1 yr−1 on weed control, and 42% expected weed control costs to decrease within 5 years of this study. Twenty-eight species were mentioned by the respondents as the most troublesome weeds. According to canonical correspondence analysis, latitude, altitude, being self-employed or a member of a cooperative, having training experience in tea-garden weed management, and the frequency and cost of weed control in tea gardens had a significant (P < 0.05) influence on the composition of the most troublesome weed species listed by respondents. Among the respondents, 60% had received training on weed management in tea gardens; these growers were significantly (P < 0.05) more likely to think weed control costs would decrease, and tended (non-significantly, P > 0.05) to conduct weed control more frequently and to have lower weed management costs in their tea gardens.
To evaluate differences in children’s eating behaviour in relation to their weight status.
Prospective, cross-sectional study. Anthropometric measures were taken and age- and sex-adjusted BMI percentiles and Z-scores were calculated according to the Centers for Disease Control and Prevention recommendations to assess weight status. Parents completed a questionnaire which included demographic data and the Children’s Eating Behaviour Questionnaire (CEBQ) to assess eating behaviour.
Tuzla Canton, Bosnia and Herzegovina (September 2016–September 2017).
Male and female children aged 3–10 years and one of their parents.
The study sample comprised 2500 children: 6·8 % were underweight, 14·4 % overweight and 14·8 % obese, while 64·0 % had normal weight. The factor analysis of the CEBQ revealed an eight-factor solution. Significant differences in CEBQ subscale scores were found across BMI categories for all CEBQ subscales except Food Fussiness. Child BMI Z-scores increased linearly with the ‘food approach’ subscales of the CEBQ (except the Desire to Drink subscale, which was excluded from analysis) and decreased with the ‘food avoidant’ subscales.
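The CDC-recommended BMI-for-age z-scores used here are computed with the LMS method: Z = ((x/M)^L − 1)/(L·S) when L ≠ 0, and Z = ln(x/M)/S when L = 0, where L, M and S are age- and sex-specific reference parameters. A sketch with invented parameters (not actual CDC reference values, which are looked up in published tables):

```python
import math

def lms_zscore(x, L, M, S):
    """LMS z-score: ((x/M)**L - 1)/(L*S) for L != 0, ln(x/M)/S for L == 0."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Invented LMS parameters for a single age/sex stratum, applied to a
# BMI of 19.5 kg/m2; real analyses use the CDC reference tables.
z = lms_zscore(19.5, L=-2.0, M=16.0, S=0.11)
```

M is the reference median, so a BMI equal to M always yields Z = 0; L and S capture the skewness and spread of the reference distribution.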
The present study suggests that the CEBQ is valuable for identifying specific eating styles that are associated with weight status and can be seen as important and modifiable determinants implicated in the development and maintenance of overweight/obesity as well as underweight.
Previous models suggest biological and behavioral continua among healthy individuals (HC), at-risk conditions, and full-blown schizophrenia (SCZ). Part of these continua may be captured by schizotypy, which shares subclinical traits and biological phenotypes with SCZ, including thalamic structural abnormalities. In this regard, previous findings have suggested that multivariate volumetric patterns of individual thalamic nuclei discriminate HC from SCZ. These results were obtained using machine learning, which allows case–control classification at the single-subject level. However, machine learning accuracy is often unsatisfactory, possibly because of phenotype heterogeneity. Indeed, one source of misclassification may be the thalamic structural characteristics of those HC with high schizotypy, which may resemble the structural abnormalities of SCZ. We hypothesized that thalamic structural heterogeneity is related to schizotypy, such that a high schizotypal burden would lead to misclassification of those HC whose thalamic patterns resemble SCZ abnormalities.
Following a previous report, we used Random Forests to predict diagnosis in a case–control sample (SCZ = 131, HC = 255) based on thalamic nuclei gray matter volumes estimates. Then, we investigated whether the likelihood to be classified as SCZ (π-SCZ) was associated with schizotypy in 174 HC, evaluated with the Schizotypal Personality Questionnaire.
Prediction accuracy was 72.5%. Misclassified HC had higher positive schizotypy scores, which were correlated with π-SCZ. Results were specific to thalamic rather than whole-brain structural features.
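The π-SCZ score is the class probability produced by the trained Random Forest. A minimal sketch of the idea using synthetic data and scikit-learn (the feature values, group sizes, mean shift, and hyperparameters below are all invented, not the study's):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for thalamic nuclei volumes: 60 "SCZ" and 60 "HC"
# subjects with 7 features; the SCZ group gets a small mean shift.
X_hc = rng.normal(0.0, 1.0, size=(60, 7))
X_scz = rng.normal(0.5, 1.0, size=(60, 7))
X = np.vstack([X_hc, X_scz])
y = np.array([0] * 60 + [1] * 60)  # 0 = HC, 1 = SCZ

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# pi_scz: each subject's likelihood of being classified as SCZ,
# analogous to the paper's pi-SCZ score.
pi_scz = clf.predict_proba(X)[:, 1]
```

In the study, this continuous probability (rather than the hard label) was correlated with schizotypy scores in the HC group.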
These findings strengthen the relevance of thalamic structural abnormalities to SCZ and suggest that multivariate thalamic patterns are correlates of the continuum between schizotypy in HC and the full-blown disease.
This trial compared weight loss outcomes over 14 weeks in women showing low- or high-satiety responsiveness (low- or high-satiety phenotype (LSP, HSP)) measured by a standardised protocol. Food preferences and energy intake (EI) after low and high energy-density (LED, HED) meals were also assessed. Ninety-six women (n 52 analysed; 41·24 (sd 12·54) years; 34·02 (sd 3·58) kg/m2) enrolled in one of two weight loss programmes underwent LED and HED laboratory test days during weeks 3 and 12. Preferences for LED and HED foods (Leeds Food Preference Questionnaire) and ad libitum evening meal and snack EI were assessed in response to equienergetic LED and HED breakfasts and lunches. Weekly questionnaires assessed control over eating and ease of adherence to the programme. Satiety quotients based on subjective fullness ratings after the LED and HED breakfasts determined LSP (n 26) and HSP (n 26) by tertile splits. The LSP lost less weight and had smaller reductions in waist circumference than the HSP. The LSP showed greater preferences for HED foods and, under HED conditions, consumed more snacks (kJ) than the HSP. Snack EI did not differ under LED conditions. The LSP reported less control over eating and more difficulty adhering to the programme. In conclusion, low-satiety responsiveness is detrimental to weight loss. LED meals can improve self-regulation of EI in the LSP, which may be beneficial for longer-term weight control.
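The satiety quotient standardises a post-meal change in a visual-analogue rating by the energy consumed. One common formulation (assumed here; the trial's exact formula is not given) is the change in fullness per 100 kJ of the meal:

```python
def satiety_quotient(fullness_pre, fullness_post, energy_kj):
    """Assumed formulation: change in VAS fullness (mm, 0-100 scale)
    per 100 kJ of the meal consumed. Higher = stronger satiety response."""
    return (fullness_post - fullness_pre) / energy_kj * 100

# Invented VAS ratings around a hypothetical 2000 kJ breakfast.
sq = satiety_quotient(fullness_pre=20, fullness_post=75, energy_kj=2000)
```

Tertile splits on such quotients are one way to divide participants into low- and high-satiety phenotypes, as done in the trial.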
Cancer diagnosis affects patients, their families, and their caregivers in particular. This study focused on the validation of the CareGiver Oncology Quality of Life (CarGOQoL) questionnaire in Portuguese caregivers of patients with multiple myeloma, from the caregiver's point of view.
This was a cross-sectional study with 146 caregivers of patients with multiple myeloma from outpatient medical oncology and clinical hematology consultations from five hospitals in north and central Portugal. Participants were assessed on quality of life (QoL), psychological morbidity and social support.
The Portuguese version retains 17 of the original 29 items, maintaining general coherence and a dimensional structure that is clinically interpretable. Reliability findings indicated good internal consistency for the total scale (0.86) and the respective subscales (0.75 to 0.88), in agreement with the alpha values from the previous CarGOQoL validation study for the corresponding subscales (0.74 to 0.89) and total scale (0.90).
Significance of results
The CarGOQoL is a reliable and valid tool for clinical trials and intervention programs to assess QoL in caregivers of myeloma patients. Future studies should validate the adapted version in caregivers of other types of cancer patients including other chronic diseases.
To describe the impact of CHD surgery in early childhood on quality of life in children aged 10–16 years with surgically corrected Ventricular Septal Defect, Transposition of the Great Arteries, and Tetralogy of Fallot.
A cross-sectional quality of life survey of 161 children and adolescents aged 10–16 years with surgically corrected Ventricular Septal Defect, Transposition of the Great Arteries, and Tetralogy of Fallot. The international Paediatric Quality of Life 4.0 questionnaires were administered to and collected from both patients and parents. The endpoints were total, physical, emotional, social, and school quality of life scores.
The quality of life total and school scores were significantly lower in children with CHD than in their healthy peers. There was no significant difference in quality of life between the three CHD groups. All three CHD groups had significantly lower total (7.7–13.2%, p<0.001) and school scores (21.1–31.6%, p<0.001) than the control group. The Tetralogy of Fallot group was the only group with significantly lower scores on the physical subscale (p<0.001) than the controls.
Children and adolescents with surgically corrected CHD show reduced total and school quality of life scores compared with healthy controls. The Tetralogy of Fallot group was the only CHD group with a significantly lower physical score than the controls.