Despite the clinical use of dignity therapy (DT) to enhance end-of-life experiences and promote an increased sense of meaning and purpose, little is known about its cost in practice settings. The aim of this study was to examine the costs of implementing DT, including transcription, editing of the legacy document, and dignity therapists’ time for interviews and patient validation.
We analyzed data from a prior six-site randomized controlled trial with a stepped-wedge design in which chaplains or nurses delivered the DT.
The mean cost per transcript was $84.30 (SD = 24.0), and the mean time required for transcription was 52.3 minutes (SD = 14.7). Chaplain interviews were more expensive and longer than nurse interviews. The mean cost and time required for transcription varied across the study sites. The typical total cost for each DT protocol was $331–$356.
Significance of results
DT implementation costs varied by provider type and study site. The study’s findings will be useful for translating DT into clinical practice and for future research.
Dignity therapy (DT) is a guided life-review process conducted by a health professional to promote dignity through the illness process. Empathic communication has been shown to be important in clinical interactions but has yet to be examined in the DT interview session. The Empathic Communication Coding System (ECCS) is a validated, reliable instrument for coding empathic communication in clinical interactions. The aims of this study were (1) to assess the feasibility of the ECCS in DT sessions and (2) to describe the process of empathic communication during DT sessions.
We conducted a secondary analysis of 25 transcripts of DT sessions with older cancer patients. These DT sessions were collected as part of a larger randomized controlled trial. We revised the ECCS and then coded the transcripts using the new ECCS-DT. Two coders achieved inter-rater reliability (κ = 0.84) on 20% of the transcripts and then independently coded the remaining transcripts.
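As a hedged illustration of the reliability check described above, the sketch below computes Cohen's kappa for two coders' category assignments in Python; the scikit-learn call is standard, but the labels and ratings are invented for illustration and are not the study's data.

```python
# A minimal sketch of inter-rater reliability, assuming each coder assigns
# one empathic-response category (or "none") per idea unit. Ratings are illustrative.
from sklearn.metrics import cohen_kappa_score

coder_a = ["acknowledgment", "reflection", "validation", "reflection", "none"]
coder_b = ["acknowledgment", "reflection", "validation", "acknowledgment", "none"]

# Cohen's kappa corrects raw percent agreement for agreement expected by chance.
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"kappa = {kappa:.2f}")  # by convention, kappa >= 0.80 indicates strong agreement
```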
Participants were individuals with cancer between the ages of 55 and 75. We developed the ECCS-DT with four empathic response categories: acknowledgment, reflection, validation, and shared experience. We found that of the 235 idea units, 198 had at least one of the four empathic responses present. Of the total 25 DT sessions, 17 had at least one empathic response present in all idea units.
Significance of results
This feasibility study is an essential first step in our larger program of research to understand how empathic communication may play a role in DT outcomes. We aim to replicate these findings in a larger sample and to investigate how empathic communication during the DT session may be linked to positive patient outcomes. These findings, in turn, may lead to further refinement of training for dignity therapists, research into empathy as a mediator of outcomes, and generation of new interventions.
Dignity Therapy (DT) has been implemented over the past 20 years, but a detailed training protocol is not available to facilitate consistency of its implementation. Consistent training positively impacts intervention reproducibility.
The objective of this article is to describe a detailed method for DT therapist training.
Chochinov's DT training seminars included preparatory reading of the DT textbook, in-person training, and practice interview sessions. Building on this training plan, we added feedback on practice and actual interview sessions, a tracking form to guide the process, a written training manual with an annotated model DT transcript, and quarterly support sessions. Using this training method, 18 DT therapists were trained across 6 sites.
The DT experts’ verbal and written feedback on the practice and actual sessions encouraged the trainees to pay additional attention to eight components: (1) initial framing (i.e., clarifying and organizing the patient's own goals for creating the legacy document), (2) verifying the patient's understanding of DT, (3) gathering the patient's biographical information, (4) using probing questions, (5) exploring the patient's story thread, (6) refocusing toward the legacy document creation, (7) inviting the patient's expression of meaningful messages, and (8) general DT processes. Ongoing individual mentoring of trainees showed that adherence to the DT protocol was achieved and maintained.
The DT training protocol enables consistent training across waves of trainees, toward the goal of maintaining consistent DT implementation. This training protocol will enable future DT researchers and clinicians to consistently train therapists across various disciplines and locales. Furthermore, we anticipate that this training protocol could serve as a generalizable roadmap for implementers of other life-review and palliative care interview-based interventions.
Diet modifies the risk of colorectal cancer (CRC), and inconclusive evidence suggests that yogurt may protect against CRC. We analysed data collected from two separate colonoscopy-based case–control studies. The Tennessee Colorectal Polyp Study (TCPS) and Johns Hopkins Biofilm Study included 5446 and 1061 participants, respectively, diagnosed with hyperplastic polyp (HP), sessile serrated polyp, adenomatous polyp (AP) or without any polyps. Multinomial logistic regression models were used to derive OR and 95 % CI for comparisons between cases and polyp-free controls and for case–case comparisons between different polyp types. We evaluated the association of frequency of yogurt intake and probiotic use with the diagnosis of colorectal polyps. In the TCPS, daily yogurt intake v. no/rare intake was associated with decreased odds of HP (OR 0·54; 95 % CI 0·31, 0·95), and weekly yogurt intake was associated with decreased odds of AP among women (OR 0·73; 95 % CI 0·55, 0·98). In the Biofilm Study, weekly yogurt intake (OR 0·75; 95 % CI 0·54, 1·04) and probiotic use (OR 0·72; 95 % CI 0·49, 1·06) were each associated with a non-significant reduction in the odds of overall AP in comparison with no use. In summary, yogurt intake may be associated with decreased odds of HP and AP, and probiotic use may be associated with decreased odds of AP. Further prospective studies are needed to verify these associations.
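To make the modelling step concrete, here is a minimal Python sketch of a multinomial logistic regression that yields OR and 95 % CI by exponentiating coefficients. The simulated data and variable names (yogurt_weekly, polyp) are assumptions for illustration, not fields from the TCPS or Biofilm datasets.

```python
# A hedged sketch of the case-control analysis: multinomial logistic
# regression of polyp type on yogurt intake, adjusted for age.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "yogurt_weekly": rng.integers(0, 2, n),  # 1 = weekly or more frequent intake
    "age": rng.normal(60, 8, n),
    "polyp": rng.integers(0, 3, n),          # 0 = polyp-free control, 1 = HP, 2 = AP
})

X = sm.add_constant(df[["yogurt_weekly", "age"]])
fit = sm.MNLogit(df["polyp"], X).fit(disp=0)

# Each non-reference outcome (HP, AP) gets its own coefficient column;
# exp(beta) is the odds ratio relative to polyp-free controls.
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```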
The present study compared executive dysfunction among children with attention-deficit/hyperactivity disorder (ADHD) after traumatic brain injury (TBI), also called secondary ADHD (S-ADHD), children with pre-injury ADHD, and children with TBI only (i.e., no ADHD). Youth aged 6–16 years admitted for TBI to five trauma centers were enrolled (n=177) and evaluated with a semi-structured psychiatric interview on three occasions (within 2 weeks of TBI, i.e., a baseline assessment of pre-injury status; 6 months post-TBI; and 12 months post-TBI). This permitted the determination of 6- and 12-month post-injury classifications of membership in three mutually exclusive groups (S-ADHD, pre-injury ADHD, TBI-only). Several executive control measures were administered. Unremitted S-ADHD was present in 17/141 (12%) children at the 6-month assessment and in 14/125 (11%) children at 12 months post-injury. Children with S-ADHD exhibited deficient working memory, attention, and psychomotor speed compared with children with pre-injury ADHD. Furthermore, the children with S-ADHD and the children with TBI-only were impaired relative to the children with pre-injury ADHD with regard to planning. No group differences in response inhibition emerged. Age, but not injury severity, gender, or adaptive functioning, was related to executive function outcome. Neuropsychological sequelae distinguish children who develop S-ADHD following TBI from those with TBI only. Moreover, children who develop S-ADHD show a different pattern of executive control performance than children with pre-injury ADHD, suggesting that differences exist in the underlying neural mechanisms that define each disorder and underscoring the need to identify targeted treatment interventions. (JINS, 2014, 20, 971–981)
The implementation in Ontario of 15 primary-care–based interprofessional memory clinics represented a unique model of team-based case management aimed at increasing capacity for dementia care at the primary-care level. Each clinic tracked referrals; in a subset of clinics, charts were audited by geriatricians, clinic members were interviewed, and patients, caregivers, and referring physicians completed satisfaction surveys. Across all clinics, 582 patients were assessed, and 8.9 per cent were referred to a specialist. Patients and caregivers were very satisfied with the care received, as were referring family physicians, who reported increased capacity to manage dementia. Geriatricians’ chart audits revealed a high level of agreement with diagnosis and management. This study demonstrated the acceptability, feasibility, and preliminary effectiveness of the primary-care memory clinic model. Led by specially trained family physicians, the model provided timely access to high-quality collaborative dementia care and improved health service utilization through more efficient use of scarce geriatric specialist resources.
Dietary reference values for essential trace elements are designed to meet requirements with minimal risk of deficiency and toxicity. Risk–benefit analysis requires data on habitual dietary intakes, an estimate of their variation, and the effects of deficiency and excess on health. For some nutrients, the range between the upper and lower limits may be extremely narrow and may even overlap, which creates difficulties when setting safety margins. A new approach for estimating optimal intakes, taking into account several health biomarkers, has been developed and applied to selenium, but at present there are insufficient data to extend this technique to other micronutrients. The existing methods for deriving reference values for Cu and Fe are described. For Cu, there are no sensitive biomarkers of status or health relating to marginal deficiency or toxicity, despite the well-characterised genetic disorders Menkes disease and Wilson's disease, which, if untreated, lead to lethal deficiency and overload, respectively. For Fe, the wide variation in bioavailability confounds the relationship between intake and status and complicates risk–benefit analysis. As with Cu, health effects associated with deficiency or toxicity are not easy to quantify; therefore status is the most accessible variable for risk–benefit analysis. Serum ferritin reflects Fe stores but is affected by infection/inflammation, and therefore additional biomarkers are generally employed to measure and assess Fe status. Characterising the relationship between health and dietary intake is problematic for both these trace elements due to the confounding effects of bioavailability, inadequate biomarkers of status and a lack of sensitive and specific biomarkers for health outcomes.
Tourette Syndrome (TS) in children is associated with various neurobehavioral disorders, including attention deficit hyperactivity disorder (ADHD). Children with TS and ADHD show some difficulties with neuropsychological tasks, but it is not known whether children with TS alone have neuropsychological deficits. To assess specific cognitive differences among children with TS and/or ADHD, we administered a battery of neuropsychological tests, including 10 tasks related to executive function (EF), to 10 children with TS-only, 48 with ADHD-only, and 32 with TS+ADHD. Children in all groups could not efficiently produce output on a timed continuous performance task [Test of Variables of Attention (TOVA) mean reaction time and reaction time variability]. Children with TS-only appeared to have fewer EF impairments and significantly higher perceptual organization scores than children with TS+ADHD or ADHD-only. These findings suggest that deficiencies in choice reaction time and consistency of timed responses are common to all three groups, but that children with TS-only have relatively less EF impairment than children with TS+ADHD or ADHD-only. (JINS, 1995, 1, 511–516.)
The essentiality of copper (Cu) in humans is demonstrated by various clinical features associated with deficiency, such as anaemia, hypercholesterolaemia and bone malformations. Despite significant effort over several decades, a sensitive and specific Cu status biomarker has yet to be identified. The present article updates a comprehensive review recently published by the authors which assesses the reliability and robustness of current biomarkers and outlines the ongoing search for novel indicators of status(1). The essential features of that earlier review are reiterated whilst considering whether there are other approaches, not yet tested, that may provide valuable information in the quest for an appropriate measure of copper status. Current biomarkers include a range of cuproenzymes, such as the acute-phase protein caeruloplasmin and Cu-Zn superoxide dismutase, all of which are influenced by a range of other dietary and environmental factors. A recent development is the identification of the Cu chaperone CCS as a potential biomarker, although its reliability has yet to be established; it appears to be the most promising candidate, responding to both Cu deficiency and excess. The potential for identifying a ‘suite’ of biomarkers using high-throughput technologies such as transcriptomics and proteomics is only now being examined. A combination of these technologies, in conjunction with a range of innovative metal-detection techniques, will be essential if the search for robust copper biomarkers is to be successful.
Relatively little research has focused on everyday memory function in childhood, possibly reflecting the limited number of measures available. This study introduces the Observer Memory Questionnaire—Parent Form (OMQ-PF), which assesses parental beliefs about their child's everyday memory. The OMQ-PF and a selection of neuropsychological measures were administered to a cohort of healthy children in Study 1 (n = 376; 5–16 years old) and to a temporal lobe epilepsy (TLE) group in Study 2 (n = 44; 6–16 years old). Study 1 found that the OMQ-PF had sound internal consistency and was significantly correlated with performance on a learning task. Study 2 found that the TLE group was impaired on the OMQ-PF relative to the healthy cohort; everyday memory ratings were related to a wider range of neuropsychological measures in this group. These findings are encouraging with regard to the properties of the OMQ-PF and suggest that further development of the scale is warranted. (JINS, 2008, 14, 337–342.)
The post-genomic technologies are generating vast quantities of data, but many nutritional scientists are not trained or equipped to analyse them. In high-resolution NMR spectra of urine, for example, the number and complexity of spectral features mean that computational techniques are required to interrogate and display the data in a manner intelligible to the researcher. In addition, there are often multiple underlying biological factors influencing the data, and it is difficult to pinpoint which are having the most significant effect. This is especially true in nutritional studies, where small variations in diet can trigger multiple changes in gene expression and metabolite concentration. One class of computational tools useful for analysing such highly multivariate data comprises the well-known ‘whole spectrum’ methods of principal component analysis and partial least squares. In this work, we present a nutritional case study in which NMR data generated from a human dietary Cu intervention study are analysed using multivariate methods, and the advantages and disadvantages of each technique are discussed. It is concluded that an alternative approach, called feature subset selection, will be important in this type of work; here we have used a genetic algorithm to identify the small peaks (arising from metabolites of low concentration) that have been altered significantly following a dietary intervention.
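The sketch below illustrates, on simulated spectra, the ‘whole spectrum’ methods discussed above; the dimensions, the dietary grouping variable, and the injected peak are assumptions for demonstration, and the genetic-algorithm search is only indicated in a comment.

```python
# A minimal sketch of 'whole spectrum' analyses on simulated NMR-like data:
# PCA for unsupervised structure, PLS to relate spectra to a dietary factor.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_samples, n_bins = 40, 500          # spectra binned into 500 intensity bins
X = rng.normal(0, 1, (n_samples, n_bins))
y = rng.integers(0, 2, n_samples)    # e.g. 0 = baseline diet, 1 = Cu intervention
X[y == 1, 100] += 0.8                # a small metabolite peak shifted by diet

# PCA: project each whole spectrum onto a few orthogonal components.
scores = PCA(n_components=2).fit_transform(X)

# PLS: find spectral directions that covary with the dietary factor.
pls = PLSRegression(n_components=2).fit(X, y)

# Feature subset selection (e.g. via a genetic algorithm, as in the text)
# would instead search for the small set of bins -- like bin 100 here --
# whose intensities best separate the groups.
print(scores[:3])
print(np.abs(pls.x_weights_[:, 0]).argmax())  # bin carrying the largest PLS weight
```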
Hepcidin plays a major role in iron homeostasis, but understanding of its role has been hampered by the absence of analytical methods for its quantification in blood. A commercial ELISA has been developed for serum prohepcidin, a hepcidin precursor, and there is interest in its potential use in the clinical and research arena. We investigated the association between serum prohepcidin concentration and iron absorption in healthy men, and its relationship with iron status in men carrying HFE mutations, hereditary haemochromatosis patients, and pregnant women. Iron absorption was determined in thirty healthy men (fifteen wild-type, fifteen C282Y heterozygote) using the stable isotope red cell incorporation technique. Iron status was measured in 138 healthy men (ninety-one wild-type, forty-seven C282Y heterozygote), six hereditary haemochromatosis patients, and thirteen pregnant women. Mean serum prohepcidin concentrations were 214 (sd 118) ng/ml in healthy men [208 (sd 122) ng/ml in wild-type and 225 (sd 109) ng/ml in C282Y heterozygotes], 177 (sd 36) ng/ml in haemochromatosis patients, and 159 (sd 59) ng/ml in pregnant women. There was no relationship between serum prohepcidin concentration and serum ferritin in any subject group, nor was prohepcidin associated with the efficiency of iron absorption. Serum prohepcidin is not a useful biomarker for clinical or research purposes.
Women of childbearing age are at risk of Fe deficiency if insufficient dietary Fe is available to replace menstrual and other Fe losses. Haem Fe represents 10–15 % of dietary Fe intake in meat-rich diets but may contribute 40 % of the total absorbed Fe. The aim of the present study was to determine the relative effects of type of diet and menstrual Fe loss on Fe status in women. Ninety healthy premenopausal women were recruited according to their habitual diet: red meat, poultry/fish or lacto-ovo-vegetarian. Fe intake was determined by analysing 7 d duplicate diets, and menstrual Fe loss was measured using the alkaline haematin method. A substantial proportion of women (60 % red meat, 40 % lacto-ovo-vegetarian, 20 % poultry/fish) had low Fe stores (serum ferritin <10 μg/l), but the median serum ferritin concentration was significantly lower in the red meat group (6·8 μg/l; interquartile range 3·3, 16·25) than in the poultry/fish group (17·5 μg/l; interquartile range 11·3, 22·4) (P<0·01). Mean dietary Fe intake differed significantly between the groups (P=0·025); the red meat group had a significantly lower intake (10·9 (sd 4·3) mg/d) than the lacto-ovo-vegetarians (14·5 (sd 5·5) mg/d), whereas the intake of the poultry/fish group (12·8 (sd 5·1) mg/d) was not significantly different from that of the other groups. There was no relationship between total Fe intake and Fe status, but menstrual Fe loss (P=0·001) and dietary group (P=0·040) were significant predictors of Fe status: poultry/fish diets were associated with higher Fe stores than lacto-ovo-vegetarian diets. Identifying individuals with high menstrual losses should be a key component of strategies to prevent Fe deficiency.
The study of Cu metabolism is hampered by a lack of sensitive and specific biomarkers of status and of suitable isotopic labels, but limited information suggests that Cu homeostasis is maintained through changes in absorption and endogenous loss. The aim of the present study was to employ stable-isotope techniques to measure Cu absorption and endogenous losses in adult men adapted to low, moderate and high Cu-supplemented diets. Twelve healthy men, aged 20–59 years, were given diets containing 0·7, 1·6 and 6·0 mg Cu/d for 8 weeks each, with intervening washout periods of at least 4 weeks. After 6 weeks of adaptation, apparent and true absorption of Cu were determined by measuring luminal loss and endogenous excretion of Cu following oral administration of 3 mg of a highly enriched 65Cu stable-isotope label. Apparent and true absorption on the low-Cu diet (41 and 48 % respectively) were not significantly different from those on the high-Cu diet (45 and 48 % respectively). Endogenous losses were significantly reduced on the low-Cu (0·45 mg/d; P<0·001) and medium-Cu (0·81 mg/d; P=0·001) diets compared with the high-Cu diet (2·46 mg/d). No biochemical changes resulting from the dietary intervention were observed. Cu homeostasis was maintained over a wide range of intakes, mainly through changes in endogenous excretion, and was achieved more rapidly at the lower intake.
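As a worked illustration of how apparent and true absorption can be separated, the following sketch applies standard mass-balance definitions (an assumption; the paper's exact equations are not reproduced here), with illustrative label masses chosen to match the low-Cu diet figures above.

```python
# A worked sketch under assumed mass-balance definitions: after an oral dose
# of 65Cu label, faecal label is split into unabsorbed (luminal) label and
# re-excreted endogenous label. All masses below are illustrative.
dose_mg = 3.0            # oral 65Cu label, as in the study
luminal_loss_mg = 1.56   # illustrative: label that was never absorbed
endogenous_mg = 0.21     # illustrative: absorbed label re-excreted into the gut

true_abs = (dose_mg - luminal_loss_mg) / dose_mg                      # ignores re-excretion
apparent_abs = (dose_mg - luminal_loss_mg - endogenous_mg) / dose_mg  # net retention

print(f"true absorption     = {true_abs:.0%}")      # 48%
print(f"apparent absorption = {apparent_abs:.0%}")  # 41%
```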
Neuropsychological outcome was evaluated in a prospective, longitudinal follow-up study of children age 4 months to 7 years at injury with either mild-to-moderate (N = 35) or severe (N = 44) traumatic brain injury (TBI). Age-appropriate tests were administered at baseline, 6 months, 12 months, and 24 months after the injury. Performance was compared on (1) composite IQ and motor, (2) receptive and expressive language, and (3) Verbal and Perceptual–Performance IQ scores. In comparison to mild-to-moderate TBI, severe TBI in infants and preschoolers produced deficits in all areas. Interactions between task and severity of injury were obtained. Motor scores were lower than IQ scores, particularly after severe TBI. Both receptive and expressive scores were reduced following severe TBI. Expressive language scores were lower than receptive language scores for children sustaining mild-to-moderate TBI. While severe TBI lowered both Verbal and Perceptual–Performance IQ scores, Verbal IQ scores were significantly lower than Perceptual–Performance IQ scores after mild-to-moderate TBI. Mild injuries may produce subtle linguistic changes adversely impacting estimates of Verbal IQ and expressive language. Within the limited age range evaluated within this study, age at injury was unrelated to test scores: The impact of TBI was comparable in children ages 4 to 41 months versus 42 to 72 months at the time of injury. All neuropsychological scores improved significantly from baseline to the 6-month follow-up. However, no further change in scores was observed from 6 to 24 months after the injury. The persistent deficits and lack of catch-up over time suggest a reduction in the rate of acquisition of new skills after severe TBI. Methodological issues in longitudinal studies of young children were discussed. (JINS, 1997, 3, 581–591.)