Heat shock proteins (HSPs) are highly conserved stress proteins expressed in response to stress. Two studies were carried out to investigate whether HSP genes in hair follicles from beef calves can serve as indicators of heat stress (HS). In study 1, hair follicles were harvested from three male Hanwoo calves (aged 172.2 ± 7.20 days) on six dates between 10 April and 9 August 2017, providing varying temperature–humidity indices (THIs). In study 2, 16 Hanwoo male calves (aged 169.6 ± 4.60 days, with a BW of 136.9 ± 6.23 kg) were maintained (4 calves per experiment) in environmentally controlled chambers. A completely randomized design was used, with a 2 × 4 factorial arrangement involving two periods (thermoneutral: TN; HS) and four THI treatment groups (threshold: THI = 68 to 70; mild: THI = 74 to 76; moderate: THI = 81 to 83; severe: THI = 88 to 90). The calves in each group were subjected to ambient temperature (22°C) for 7 days (TN) and subsequently to the temperature and humidity corresponding to the target THI level for 21 days (HS). Every three days (at 1400 h) during both the TN and HS periods, the heart rate (HR) and rectal temperature (RT) of each individual were measured, and hair follicles were collected from their tails. In study 1, the high variation (P < 0.0001) in THI indicated that the external environment influenced HS to different extents. The expression levels of the HSP70 and HSP90 genes at the high-THI level were higher (P = 0.0120 and P = 0.0002, respectively) than those at the low-THI level. In study 2, no differences in THI (P = 0.2638), HR (P = 0.2181) or RT (P = 0.3846) were found among the groups during the TN period, whereas differences in these indices (P < 0.0001 for each) were observed during the HS period.
The expression levels of the HSP70 (P = 0.0010, moderate; P = 0.0065, severe) and HSP90 (P = 0.0040, severe) genes were increased after rapid exposure to heat-stress conditions (moderate and severe levels). We conclude that HSP gene expression in hair follicles provides precise and accurate data for evaluating HS and can be considered a novel indicator of HS in Hanwoo calves maintained in both external and climatic chambers.
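Several THI formulations exist and the abstract does not specify which one the authors used; as an illustration, a common dry-bulb formulation (NRC, 1971) can be sketched in Python, with hypothetical temperature/humidity pairings for the chamber bands:

```python
def thi(temp_c: float, rel_humidity_pct: float) -> float:
    """Temperature-humidity index from dry-bulb temperature (deg C) and
    relative humidity (%), using the common NRC (1971) formulation."""
    return (1.8 * temp_c + 32) - (0.55 - 0.0055 * rel_humidity_pct) * (1.8 * temp_c - 26)

# Hypothetical settings: 22 deg C / 50% RH sits near the 'threshold' band
# (THI 68-70), while 34 deg C / 80% RH falls in the 'severe' band (88-90).
print(round(thi(22.0, 50.0), 1))  # 67.9
print(round(thi(34.0, 80.0), 1))  # 89.3
```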
The practice of foodborne illness outbreak investigations has evolved, shifting away from large-scale community case-control studies towards more focused case exposure assessments and sub-cluster investigations to identify contaminated food sources. Criteria to include or exclude cases are established to increase the efficiency of epidemiological analyses and traceback activities, but these criteria can also affect the investigator's ability to implicate a suspected food vehicle. A 2010 outbreak of Salmonella ser. Hvittingfoss infections associated with a chain of quick-service restaurants (Chain A) provided a useful case study on the impact of exclusion criteria on the ability to identify a food vehicle. In the original investigation, a case-control study of restaurant-associated cases and well meal companions was conducted at the ingredient level to identify a suspected food vehicle; however, 21% of cases and 22% of well meal companions were excluded for eating at Chain A restaurants more than once during the outbreak. The objective of this study was to explore how this decision affected the results of the outbreak investigation.
UK Biobank is a well-characterised cohort of over 500 000 participants, with genetic, environmental and imaging data. An online mental health questionnaire was designed for UK Biobank participants to expand its potential.
To describe the development, implementation and results of this questionnaire.
An expert working group designed the questionnaire, using established measures where possible, and consulting a patient group. Operational criteria were agreed for defining likely disorder and risk states, including lifetime depression, mania/hypomania, generalised anxiety disorder, unusual experiences and self-harm, and current post-traumatic stress and hazardous/harmful alcohol use.
A total of 157 366 completed online questionnaires were available by August 2017. Participants were aged 45–82 years (53% were ≥65 years) and 57% were women. Comparison of self-reported diagnosed mental disorder with a contemporary study shows a similar prevalence, despite respondents being of higher average socioeconomic status. Lifetime depression was common, with 24% (37 434) of participants meeting criteria, and current hazardous/harmful alcohol use criteria were met by 21% (32 602), whereas each of the other criteria was met by less than 8% of participants. There was extensive comorbidity among the syndromes. Mental disorders were associated with a high neuroticism score, adverse life events and long-term illness; addiction and bipolar affective disorder in particular were associated with measures of deprivation.
The UK Biobank questionnaire represents a very large mental health survey in itself, and the results presented here show high face validity, although caution is needed because of selection bias. Built into UK Biobank, these data intersect with other health data to offer unparalleled potential for cross-cutting biomedical research involving mental health.
To analyse the results of treatment for nasolabial cysts according to whether an intraoral sublabial or endoscopic transnasal approach was used, and to determine the recent surgical trend in our hospital.
Twenty-four patients with histopathologically and radiologically confirmed nasolabial cysts, treated between January 2010 and December 2017, were enrolled in this study.
Nasolabial cysts were predominant in females (91.7 per cent) and on the left side (54.2 per cent). Treatment involved an intraoral sublabial approach in 12 cases (48.0 per cent) and a transnasal endoscopic approach in 13 cases (52.0 per cent). In 13 cases (52.0 per cent) surgery was performed under local anaesthesia, while in 12 cases (48.0 per cent) it was conducted under general anaesthesia. The most common post-operative complication was numbness of the upper lip or teeth (n = 9, 36.0 per cent). Only one patient (4.0 per cent), who underwent a transnasal endoscopic approach, experienced a recurrence.
Surgical resection through an intraoral sublabial or transnasal endoscopic approach is the best treatment for a nasolabial cyst, showing very good results and a low recurrence rate. The recent surgical trend in our hospital is to treat nasolabial cysts using a transnasal endoscopic approach under local anaesthesia.
The Minnesota Center for Twin and Family Research (MCTFR) comprises multiple longitudinal, community-representative investigations of twin and adoptive families that focus on psychological adjustment, personality, cognitive ability and brain function, with a special emphasis on substance use and related psychopathology. The MCTFR includes the Minnesota Twin Registry (MTR), a cohort of twins who have completed assessments in middle and older adulthood; the Minnesota Twin Family Study (MTFS) of twins assessed from childhood and adolescence into middle adulthood; the Enrichment Study (ES) of twins oversampled for high risk for substance-use disorders assessed from childhood into young adulthood; the Adolescent Brain (AdBrain) study, a neuroimaging study of adolescent twins; and the Siblings Interaction and Behavior Study (SIBS), a study of adoptive and nonadoptive families assessed from adolescence into young adulthood. Here we provide a brief overview of key features of these established studies and describe new MCTFR investigations that follow up and expand upon existing studies or recruit and assess new samples, including the MTR Study of Relationships, Personality, and Health (MTR-RPH); the Colorado-Minnesota (COMN) Marijuana Study; the Adolescent Brain Cognitive Development (ABCD) study; the Colorado Online Twins (CoTwins) study and the Children of Twins (CoT) study.
Several life-threatening diseases of the kidney have their origins in mutational events that occur during embryonic development. In this study, we investigate the role of the Wolffian duct (WD), the earliest embryonic epithelial progenitor of renal tubules, in the etiology of autosomal dominant polycystic kidney disease (ADPKD). ADPKD is associated with a germline mutation of one of the two Pkd1 alleles; for the disease to occur, a second event that disrupts the expression of the other inherited Pkd1 allele is required. We postulated that this secondary event can occur in the pronephric WD. Using Cre-Lox recombination, mice with WD-specific deletion of one or both Pkd1 alleles were generated. Homozygous Pkd1-targeted deletion in WD-derived tissues resulted in mice with large cystic kidneys and serologic evidence of renal failure. In contrast, heterozygous deletion of Pkd1 in the WD led to kidneys that were phenotypically indistinguishable from controls in the early postnatal period. High-throughput sequencing, however, revealed underlying gene and microRNA (miRNA) changes in these heterozygous mutant kidneys that suggest a strong predisposition toward developing ADPKD. Bioinformatic analysis of these data demonstrated an upregulation of several miRNAs that have been previously associated with PKD; pathway analysis further demonstrated that the differentially expressed genes in the heterozygous mutant kidneys were overrepresented in signaling pathways associated with maintenance and function of the renal tubular epithelium. These results suggest that the WD may be an early epithelial target for the genetic or molecular signals that can lead to cyst formation in ADPKD.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
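The heritability estimates mentioned above come from twin modeling; as a minimal illustration (using made-up correlations, not CODATwins estimates), the classical ACE variance decomposition can be computed from monozygotic and dizygotic twin correlations via Falconer's formulas:

```python
def falconer_ace(r_mz: float, r_dz: float) -> tuple:
    """Classical ACE variance decomposition from twin-pair correlations:
    A = additive genetic, C = shared environment, E = unique environment."""
    a2 = 2 * (r_mz - r_dz)   # heritability (Falconer's formula)
    c2 = 2 * r_dz - r_mz     # shared-environment component
    e2 = 1 - r_mz            # non-shared environment (plus measurement error)
    return a2, c2, e2

# made-up illustrative correlations, roughly typical for adult height
a2, c2, e2 = falconer_ace(r_mz=0.90, r_dz=0.50)
print(f"h2={a2:.2f} c2={c2:.2f} e2={e2:.2f}")  # h2=0.80 c2=0.10 e2=0.10
```

Changes in such estimates across age or region, as studied by CODATwins, amount to comparing these components across strata.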
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies the risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of ±10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLRs) with 95% CIs. Results: We prospectively enrolled 7,569 patients with a mean age of 68.4 ± 14.7 years, 52.4% female, of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%, iLR 0.20 [95% CI 0.091–0.44]); Moderate (probability 1.3%, iLR 0.79 [0.68–0.92]); High (probability 2.6%, iLR 2.2 [1.9–2.6]). Sensitivity analysis for stroke ≤7 days alone yielded similar results: Low iLR 0.17 [95% CI 0.056–0.52], Moderate iLR 0.89 [0.75–1.1], High iLR 2.0 [1.6–2.4].
Conclusion: The Canadian TIA Score accurately stratifies TIA patients' risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
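The interval likelihood ratios reported above are simple ratios of stratum proportions among patients with and without the outcome; a minimal sketch with purely illustrative counts (not the study's data):

```python
def interval_lr(cases_in_stratum: int, total_cases: int,
                noncases_in_stratum: int, total_noncases: int) -> float:
    """Interval (stratum-specific) likelihood ratio for a risk score:
    P(this stratum | outcome) / P(this stratum | no outcome)."""
    return (cases_in_stratum / total_cases) / (noncases_in_stratum / total_noncases)

# illustrative only: 50 of 100 outcomes but 25 of 100 non-outcomes fall in
# the high-risk stratum, so this stratum doubles the pre-test odds
print(interval_lr(50, 100, 25, 100))  # 2.0
```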
Introduction: It is recommended that seniors presenting to the emergency department (ED) undergo comprehensive geriatric screening, which is difficult for most EDs. Patient self-assessment using an electronic tablet could be an interesting solution to this issue. However, the acceptability of self-assessment by older ED patients remains unknown, and assessing acceptability is a fundamental step in evaluating new interventions. The main objective of this project was to compare the acceptability of older-patient self-assessment in the ED with that of a standard assessment made by a professional, according to seniors and their caregivers. Methods: Design: This randomized crossover cohort study took place between May and July 2018. Participants: 1) patients aged ≥65 years presenting to the ED, and 2) their caregiver, when present. Measurements: Patients performed a self-assessment of their frailty, cognitive and functional status using an electronic tablet. Acceptability was measured using the Treatment Acceptability and Preferences (TAP) questionnaire. Analyses: Descriptive analyses were performed for sociodemographic variables. Scores were adjusted for confounding variables using multivariate linear regression. Thematic content analysis was performed by two independent analysts for qualitative data collected in the TAP's open-ended question. Results: A total of 67 patients were included in this study. Mean age was 75.5 ± 8.0 years and 55.2% of participants were women. Adjusted mean TAP scores for RA evaluation and patient self-assessment were 2.36 and 2.20, respectively. We found no difference between the two types of evaluation (p = 0.0831). When patients were stratified by age group, those aged 85 and over (n = 11) showed a difference between TAP scores: 2.27 for RA evaluation and 1.72 for self-assessment (p = 0.0053). Our qualitative data suggest that this might be attributed to the use of technology rather than to the self-assessment itself.
Data from 9 caregivers showed a mean TAP score of 2.42 for RA evaluation and 2.44 for self-assessment; however, this relatively small sample prevented us from performing statistical tests. Conclusion: Our results show that older patients find self-assessment in the ED using an electronic tablet just as acceptable as a standard evaluation by a professional.
Introduction: Although acute gastroenteritis is an extremely common childhood illness, there is a paucity of literature characterizing the associated pain and its management. Our primary objective was to quantify the pain experienced by children with acute gastroenteritis in the 24 hours prior to emergency department (ED) presentation. Secondary objectives included describing maximum pain, analgesic use, discharge recommendations, and factors that influenced analgesic use in the ED. Methods: Study participants were recruited into this prospective cohort study by the Alberta Provincial Pediatric EnTeric Infection TEam between January 2014 and September 2017. The study was conducted at two Canadian pediatric EDs: the Alberta Children's Hospital (Calgary) and the Stollery Children's Hospital (Edmonton). Eligibility criteria included age <18 years, acute gastroenteritis (≥3 episodes of diarrhea or vomiting in the previous 24 hours), and symptom duration <7 days. The primary study outcome, caregiver-reported maximum pain in the 24 hours prior to presentation, was assessed using the 11-point Verbal Numerical Rating Scale. Results: We recruited 2136 patients, median age 20.8 months (IQR 10.4, 47.4); 45.8% (979/2136) were female. In the 24 hours prior to enrolment, 28.6% (610/2136) of caregivers reported that their child experienced moderate (4–6) pain and 46.2% (986/2136) severe (7–10) pain. During the emergency visit, 31.1% (664/2136) described pain as moderate and 26.7% (571/2136) as severe. In the ED, analgesia was provided to 21.2% (452/2131) of children. The most commonly administered analgesics in the ED were ibuprofen (68.1%, 308/452) and acetaminophen (43.4%, 196/452); at home, acetaminophen was most commonly administered (77.7%, 700/901), followed by ibuprofen (37.5%, 338/901).
Factors associated with analgesic use in the ED were higher pain scores during the visit, having a primary-care physician, shorter illness duration, fewer diarrheal episodes, presence of fever, and hospitalization. Conclusion: Although children presenting to the ED with acute gastroenteritis experience moderate to severe pain both prior to and during their emergency visit, analgesic use is limited. Future research should focus on appropriate pain management through the development of effective and safe pain treatment plans.
Introduction: Recognition rates of delirium in older ED patients of 13% to 25% were reported in studies conducted in the US in the 1990s. Recently, there has been increased attention to delirium in emergency medicine, with the development of geriatric curricula in Canada specifically focused on delirium. However, rates of delirium recognition have not been reassessed in Canadian EDs. OBJECTIVES: To assess the rate of delirium recognition by ED staff in a cohort of older ED patients assessed at a tertiary care Canadian ED. Methods: STUDY DESIGN: Prospective observational cohort study at a Canadian teaching ED. PARTICIPANTS: Eligible patients were aged ≥70 years and had stayed in the ED for a minimum of 4 hours. We excluded patients who were critically ill, visually impaired or otherwise unable to communicate. DATA COLLECTION: Trained research assistants (RAs) approached clinical staff before approaching patients, to confirm that staff considered the patients delirium free. They then assessed demographics, ED length of stay (LOS) and cognition at baseline using the validated Montreal Cognitive Assessment (MOCA), the Mini-Mental State Examination (MMSE), the delirium index (DI) and the Richmond Agitation-Sedation Scale (RASS). Delirium was assessed using the validated Confusion Assessment Method (CAM). We report descriptive statistics and 95% confidence intervals (CIs) where appropriate. Results: We enrolled 203 patients, of whom 102 (50.3%) were female. Their mean age was 81.0 years, mean LOS was 16.3 hours, mean MOCA score was 23.4 and mean MMSE score was 26.7. RAs detected delirium using the CAM in 16/203 patients (7.9%, 95% CI 4.6 to 12.5%). Mean MOCA and MMSE scores for delirious patients were 13.4 and 18.3, respectively, and their mean DI was 6.4. All CAM-positive patients had been deemed delirium free by clinical staff. RAs alerted clinical staff in all cases where patients had delirium, but 3/16 were discharged home (18.8%, 95% CI 4.1 to 45.7%).
Conclusion: Our findings confirm previously reported low delirium recognition rates, here in a Canadian tertiary ED. Future research should explore barriers and facilitators to recognizing delirium in the ED.
Introduction: According to the WHO, one third of people aged ≥65 years fall every year, and these falls account for 25% of all geriatric emergency department (ED) visits. Fear of falling (FOF) is common in older patients who have sustained a fall and is associated with declining mobility and health issues. We hypothesized that there is an association between FOF and both return to the ED (RTED) and future falls. Objective: To assess the relationship between FOF and both RTED and subsequent falls in older ED patients. Methods: This research was conducted as part of the Canadian Emergency Team Initiative in elderly (CETIe) multicenter prospective cohort study, from 2011 to 2016. Participants: Patients aged 65 years or older who were assessed and discharged from the ED following a minor trauma. They had to be independent in all basic activities of daily living and able to communicate in English or French. Measures: The primary outcome was RTED and the secondary outcome was subsequent falls; both were self-reported at 3 and 6 months. Patients were stratified according to the Short Falls Efficacy Scale International (SFES-I), which assesses FOF in different situations; a total score classifies FOF as mild, moderate or severe. Previous falls and the Timed Up and Go (TUG) test were used to evaluate patients' mobility, and the OARS, ISAR and SOF were used to evaluate patient frailty. Descriptive statistics were computed and multiple regression analyses were performed to assess the association between SFES-I score and outcomes. Results: FOF was measured in 2899 participants, of whom 2214 completed the 3-month follow-up and 2009 the 6-month follow-up. The odds ratio (OR) for RTED at 3 months was 1.10 for moderate FOF and 1.52 for severe FOF (type 3 test p = 0.11). At 6 months, the OR was 1.03 for moderate FOF and 1.25 for severe FOF (type 3 test p = 0.63). The OR for a subsequent fall at 3 months was 1.80 for moderate FOF and 2.18 for severe FOF (type 3 test p < 0.001).
At 6 months, the OR for a subsequent fall was 1.63 for moderate FOF and 2.37 for severe FOF (type 3 test p < 0.001). Conclusion: This multicenter cohort study showed that severe fear of falling is strongly associated with subsequent falls in the 6 months following ED discharge, but not significantly associated with RTED. Further research should analyze the association between severe FOF and RTED.
We illustrate the extraordinary potential of the (far-IR) Origins Survey Spectrometer (OSS) on board the Origins Space Telescope (OST) to address a variety of open issues on the co-evolution of galaxies and AGNs. We present predictions for blind surveys, each of 1000 h, with different mapped areas (a shallow survey covering an area of 10 deg² and a deep survey of 1 deg²) and two different concepts of the OST/OSS: with a 5.9 m telescope (Concept 2, our reference configuration) and with a 9.1 m telescope (Concept 1, previous configuration). In 1000 h, surveys with the reference concept will detect from ∼1.9×10⁶ to ∼8.7×10⁶ lines from ∼4.8×10⁵ to 2.7×10⁶ star-forming galaxies, and from ∼1.4×10⁴ to ∼3.8×10⁴ lines from ∼1.3×10⁴ to 3.5×10⁴ AGNs. The shallow survey will detect substantially more sources than the deep one; the advantage of the latter in pushing detections to lower luminosities/higher redshifts turns out to be quite limited. The OST/OSS will reach, in the same observing time, line fluxes more than one order of magnitude fainter than the SPICA/SMI and will cover a much broader redshift range. In particular, it will detect tens of thousands of galaxies at z ≥ 5, beyond the reach of that instrument. The polycyclic aromatic hydrocarbon lines are potentially bright enough to allow the detection of hundreds of thousands of star-forming galaxies up to z ∼ 8.5, i.e. all the way through the reionisation epoch. The proposed surveys will allow us to explore the galaxy–AGN co-evolution up to z ∼ 5.5−6 with very good statistics. OST Concept 1 does not offer significant advantages for the scientific goals presented here.
The present study evaluates the use of multiple correspondence analysis (MCA), a type of exploratory factor analysis designed to reduce the dimensionality of large categorical data sets, in identifying behaviours associated with measures of overweight/obesity in Vanuatu, a rapidly modernizing Pacific Island country.
Starting with seventy-three true/false questions covering a variety of behaviours, MCA identified the twelve most significantly associated with modernization status and transformed participants' aggregate binary responses to these twelve questions into a linear scale. Using this scale, individuals were separated into three modernization groups (tertiles), among which measures of body fat were compared and OR for overweight/obesity were computed.
Ni-Vanuatu adults (n 810) aged 20–85 years.
Among individuals in the tertile characterized by positive responses to most or all of the twelve modernization questions, weight, measures of body fat, and the likelihood that measures of body fat were above the US 75th percentile were all significantly greater than among individuals in the tertiles characterized by mostly or partly negative responses.
The study indicates that MCA can be used to identify individuals or groups at risk of overweight/obesity based on answers to simply worded questions. MCA may therefore be useful in areas where obtaining detailed information about modernization status is constrained by time, money or manpower.
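The core MCA computation described in the methods (scoring respondents from a binary indicator matrix and splitting them into tertiles) can be sketched minimally via SVD in Python; the simulated answers below are a stand-in, not the Vanuatu responses:

```python
import numpy as np

def mca_row_scores(responses: np.ndarray, n_dims: int = 1) -> np.ndarray:
    """Minimal multiple correspondence analysis: place each respondent on the
    leading MCA axes of a true/false questionnaire.
    responses: (n_people, n_questions) array of 0/1 answers."""
    z = np.concatenate([responses, 1 - responses], axis=1).astype(float)  # indicator coding
    p = z / z.sum()
    r, c = p.sum(axis=1), p.sum(axis=0)                  # row and column masses
    s = (p - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
    u, sv, _ = np.linalg.svd(s, full_matrices=False)
    return (u[:, :n_dims] * sv[:n_dims]) / np.sqrt(r)[:, None]  # principal coordinates

# simulated answers for 810 people on 12 binary questions, split into tertiles
rng = np.random.default_rng(0)
answers = rng.integers(0, 2, size=(810, 12))
scores = mca_row_scores(answers)[:, 0]
tertile = np.digitize(scores, np.quantile(scores, [1 / 3, 2 / 3]))
```

Group comparisons of body-fat measures would then be run across the three `tertile` labels.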
Porphyrins absorb light to initiate photocatalytic activity. The complex, asymmetric structures of natural porphyrins such as heme, chlorophyll, and their derivatives hold unique interest. A platform for biosynthesis of porphyrins in Escherichia coli is developed with the aim of producing a variety of porphyrins for examining their photocatalytic properties within a porous material. Bioderived protoporphyrin IX is tethered inside the highly porous metal-organic framework (MOF) NU-1000 via solvent-assisted ligand incorporation. This MOF catalyzes the photocatalytic oxidation of 2-chloroethyl ethyl sulfide with improved performance over an expanded range of the visible spectrum when compared to unmodified NU-1000.
Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks by the second patient, except for one outbreak involving more than one transmission route, which was detected at the eighth patient. Up to 40 infections (78% of potentially preventable infections) could have been prevented had data mining been running in real time, assuming initiation of an effective intervention within 7 days of identification of the transmission route, or up to 34 infections (66%) with intervention within 14 days.
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
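The exposure-intersection idea behind such EHR mining can be illustrated as a toy ranking of candidate routes by how many suspected outbreak cases share them; this is a sketch of the concept, not the authors' system, and all identifiers are made up:

```python
from collections import Counter

def shared_exposures(case_exposures: dict, min_cases: int = 2) -> list:
    """Rank candidate transmission routes by the number of outbreak cases
    sharing them. case_exposures maps patient id -> set of EHR-derived
    exposures (wards, procedures, devices, staff, etc.)."""
    counts = Counter(e for exposures in case_exposures.values() for e in exposures)
    return [(e, n) for e, n in counts.most_common() if n >= min_cases]

# toy data: ward_B is common to all three cases and tops the ranking
cases = {
    "pt1": {"ward_B", "OR_3"},
    "pt2": {"ward_B", "CT_scanner"},
    "pt3": {"ward_B", "OR_3"},
}
print(shared_exposures(cases))  # [('ward_B', 3), ('OR_3', 2)]
```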
The impact of dementia-related stressors and strains has been examined for its potential to threaten the well-being of either the person with dementia or the family care partner, but studies have rarely considered the dyadic nature of well-being in dementia. The purpose of this study was to examine the dyadic effects of multiple dimensions of strain on the well-being of dementia care dyads.
Using multilevel modeling to account for the inter-relatedness of individual well-being within dementia care dyads, we examined cross-sectional responses collected from 42 dyads, each comprising a hospitalized patient diagnosed with a primary progressive dementia (PWD) and their family care partner (CP). Both PWDs and CPs self-reported on their own well-being using measures of quality of life (QOL-Alzheimer's Disease scale) and depressive symptoms (Center for Epidemiological Studies Depression Scale).
In adjusted models, the PWD's well-being (higher QOL and fewer depressive symptoms) was associated with significantly less strain in the dyad's relationship. The CP's well-being was associated with significantly less care-related strain and (for the QOL scale) less relationship strain.
Understanding the impact of dementia on the well-being of PWDs or CPs may require an assessment of both members of the dementia care dyad in order to gain a complete picture of how dementia-related stressors and strains impact individual well-being. These results underscore the need to assess and manage dementia-related strain as a multi-dimensional construct that may include strain related to the progression of the disease, strain from providing care, and strain on the dyad’s relationship quality.
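The dyadic non-independence described above is typically handled with a random intercept per dyad; a sketch using statsmodels on simulated data (variable names and effect sizes are illustrative, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_dyads = 42                                # matches the study's 42 dyads
dyad = np.repeat(np.arange(n_dyads), 2)     # two members (PWD, CP) per dyad
dyad_effect = np.repeat(rng.normal(0, 1, n_dyads), 2)         # shared dyad-level variation
relationship_strain = np.repeat(rng.normal(0, 1, n_dyads), 2)  # dyad-level strain measure
qol = 30 - 2.0 * relationship_strain + dyad_effect + rng.normal(0, 1, 2 * n_dyads)

df = pd.DataFrame({"dyad": dyad, "role": np.tile(["PWD", "CP"], n_dyads),
                   "relationship_strain": relationship_strain, "qol": qol})

# random intercept per dyad accounts for the correlation between the two members
fit = smf.mixedlm("qol ~ relationship_strain + role", df, groups=df["dyad"]).fit()
print(fit.params["relationship_strain"])    # recovers a negative strain effect
```

The random intercept absorbs the within-dyad correlation that an ordinary regression would ignore, which is the point of the multilevel approach in the study.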