Data suggest poorer bereavement outcomes for lesbian, gay and bisexual people, but this has not been estimated in population-based research. This study compared bereavement outcomes for partners of same-gender and different-gender decedents.
Methods
In this population-based, cross-sectional survey of people bereaved of a civil partner or spouse 6–10 months previously, we used adjusted logistic and linear regression to investigate outcomes of interest: (1) positive screen on Inventory of Complicated Grief (ICG), (2) positive screen on General Health Questionnaire (GHQ), (3) grief intensity (ICG) and (4) psychiatric symptoms (GHQ-12).
Results
Among 233 same-gender and 329 different-gender bereaved partners, 66.1% [95% confidence interval (CI) 60.0–72.2] and 59.2% [95% CI 53.9–64.6], respectively, screened positive for complicated grief on the ICG, whilst 76.0% [95% CI 70.5–81.5] and 69.3% [95% CI 64.3–74.3], respectively, screened positive on the GHQ-12. Same-gender bereaved partners were not significantly more likely to screen positive for complicated grief than different-gender partners [adjusted odds ratio (aOR) 1.56, 95% CI 0.98–2.47, p = 0.059], but they were significantly more likely to screen positive for psychiatric caseness [aOR 1.67, 95% CI 1.02–2.71, p = 0.043]. We similarly found no significant association of partner gender with grief intensity [B = 1.86, 95% CI −0.91 to 4.63, p = 0.188], but significantly greater psychological distress for same-gender partners [B = 1.54, 95% CI −0.69 to 2.40, p < 0.001].
Conclusions
Same-gender bereaved partners report significantly more psychological distress. In view of their poorer sub-clinical mental health, clinical and bereavement services should refine screening processes to identify those at risk of poor mental health outcomes.
To compare 2 methods of communicating polymerase chain reaction (PCR) blood-culture results: active approach utilizing on-call personnel versus passive approach utilizing notifications in the electronic health record (EHR).
Design:
Retrospective observational study.
Setting:
A tertiary-care academic medical center.
Patients:
Adult patients hospitalized with ≥1 positive blood culture containing a gram-positive organism identified by PCR between October 2014 and January 2018.
Methods:
The standard protocol for reporting PCR results at baseline included a laboratory technician calling the patient’s nurse, who would report the critical result to the medical provider. The active intervention group consisted of an on-call pager system utilizing trained pharmacy residents, whereas the passive intervention group combined standard protocol with real-time in-basket notifications to pharmacists in the EHR.
Results:
Of 209 patients, 105, 61, and 43 were in the control, active, and passive groups, respectively. Median time to optimal therapy was shorter in the active group than in the passive and control groups (23.4 hours vs 42.2 hours vs 45.9 hours, respectively; P = .028). De-escalation occurred 12 hours sooner in the active group. Among patients with contaminant organisms, empiric antibiotics were discontinued fastest in the active group (0 hours), compared with 17.7 hours in the control group and 7.2 hours in the passive group (P = .007). Time to active therapy and days of therapy were similar across groups.
Conclusions:
A passive, electronic method of reporting PCR results to pharmacists was not as effective in optimizing stewardship metrics as an active, real-time method utilizing pharmacy residents. Further studies are needed to determine the optimal method of communicating time-sensitive information.
OBJECTIVES/GOALS: Perioperative surgical care is team-based, with close partnership between surgeons, residents, advanced practice professionals (APPs), and others. The objective is to develop an understanding of the current state and the implementation needs required for APPs to engage surgical patients in advance care planning (ACP) to promote goal-concordant care. METHODS/STUDY POPULATION: We will conduct a mixed-methods evaluation of ACP knowledge, attitudes, and beliefs amongst surgical APPs to identify barriers to and facilitators of a team-based approach in which APPs engage surgical patients in ACP. We will conduct an online survey and qualitative interviews covering four domains: 1) knowledge, skills, and attitudes about engaging in ACP with a patient or their surrogate decision maker during perioperative care; 2) prior ACP-specific education; 3) experiences conducting ACP discussions with patients; and 4) perceived training needs to increase ACP uptake and documentation. The findings will provide the foundation for designing team-based interventions that address these barriers and will inform the training and coaching needed to develop expertise and comfort in the ACP process. RESULTS/ANTICIPATED RESULTS: We expect variability in knowledge, skills, attitudes, and experiences with the ACP process. We anticipate gaining a better understanding of the educational materials best suited to support APPs as they begin engaging patients in ACP. Possible barriers to APP-led ACP discussions include inconsistent role delineation, uncertainty about the value of pre-operative vs. post-operative ACP discussions, lack of experience engaging in ACP discussions, and lack of familiarity with electronic health record ACP tools.
Possible facilitators of APP-led ACP discussions may be related to past work experience settings, exposure to ACP in educational preparation, hands-on observation of the value of ACP in surgical patients, and influences from attendings and residents. DISCUSSION/SIGNIFICANCE: While current ACP research in surgery focuses on physician-led patient engagement in ACP discussions, there is a paucity of research on how to develop a team-based approach to ACP discussions in surgery. This study will provide information necessary for the development of interventions that increase team-based ACP for surgical patients.
Increased frequency and occurrence of herbicide-resistant biotypes heighten the need for alternative wild oat management strategies. This study aimed to exploit the height differential between wild oat and crops by targeting wild oat between panicle emergence and the onset of seed shed. Two field studies were conducted from 2015 to 2017. In the first study, conducted at one location (Lacombe, AB) over 3 yr, we compared three panicle removal methods (hand clipping, a hedge trimmer, and a selective herbicide crop-topping application) with a weedy check and an industry-standard in-crop herbicide application in wheat. These treatments were applied early (at panicle emergence), late (at initiation of seed shed), or in combination. In the second study, conducted at two locations (Lacombe, AB, and Saskatoon, SK) over 3 yr, we investigated the optimal timing of panicle removal with a hedge trimmer, comparing weekly removals to a weedy check in wheat and lentil. Among all the tested methods, the early crop-topping treatment consistently had the largest impact on wild oat density, dockage, seedbank, and subsequent-year crop yield. The early treatment, or the combination of early and late treatments, tended to reduce wild oat populations the following season the most compared with the late treatments. Subsequent wild oat populations were not influenced by panicle removal timing, but only by crop and location interactions. Panicle removal timing did significantly affect wild oat dockage in the year of treatment, but no consistent optimal timing could be identified. Together, however, the two studies highlight additional questions to be investigated, as well as the opportunity to manage wild oat seedbank inputs at the panicle emergence stage of the wild oat life cycle.
To prioritise and refine a set of evidence-informed statements into advice messages to promote vegetable liking in early childhood, and to determine applicability for dissemination of advice to relevant audiences.
Design:
A nominal group technique (NGT) workshop and a Delphi survey were conducted to prioritise and achieve consensus (≥70 % agreement) on thirty evidence-informed maternal (perinatal and lactation stage), infant (complementary feeding stage) and early years (family diet stage) vegetable-related advice messages. Messages were validated via triangulation analysis against the strength of evidence from an umbrella review of strategies to increase children’s vegetable liking, and against gaps in advice identified by a desktop review of vegetable feeding advice.
Setting:
Australia.
Participants:
A purposeful sample of key stakeholders (NGT workshop, n 8 experts; Delphi survey, n 23 end users).
Results:
Participant consensus identified the most highly ranked priority messages as those associated with the strategies of ‘in-utero exposure’ (perinatal and lactation, n 56 points) and ‘vegetable variety’ (complementary feeding, n 97 points; family diet, n 139 points). Triangulation revealed two strategies (‘repeated exposure’ and ‘variety’) and their associated advice messages as suitable for policy and practice, twelve for research and four for the food industry.
Conclusions:
Supported by national and state feeding guideline documents and resources, the advice messages relating to ‘repeated exposure’ and ‘variety’ to increase vegetable liking can be communicated to families and caregivers by healthcare practitioners. The food industry provides a vehicle for advice promotion and product development. Further research, where stronger evidence is needed, could further inform strategies for policy and practice, and food industry application.
To determine which established diet quality indices best predict weight-related outcomes in young women.
Design:
In this cross-sectional analysis, we collected dietary information using the Harvard FFQ and measured body fat percentage (BF%) by dual-energy X-ray absorptiometry. We used FFQ data to derive five diet quality indices: Recommended Food Score (RFS), Healthy Eating Index 2015 (HEI-2015), Alternate Healthy Eating Index 2010 (AHEI-2010), alternate Mediterranean Diet Score (aMED) and Healthy Plant-Based Diet Index (HPDI).
Setting:
University of Massachusetts at Amherst.
Participants:
Two hundred sixty healthy women aged 18–30 years.
Results:
The AHEI-2010 and HPDI were associated with BMI and BF%, such that a ten-point increase in either diet score was associated with a 1·2 percentage-point lower BF% and a 0·5 kg/m2 lower BMI (P < 0·05). Odds of excess body fat (i.e. BF% > 32 %) were 50 % lower for those in the highest v. lowest tertile of the AHEI-2010 (P = 0·04). Neither the RFS nor HEI-2015 was associated with BMI or BF%; the aMED was associated with BMI but not BF%.
Conclusions:
These results suggest that diet quality tends to be inversely associated with BMI and BF% in young women, but that this association is not observed for all diet quality indices. Diet indices may have limited utility in populations where the specific healthful foods and food groups emphasised by the index are not widely consumed. Future research should aim to replicate these findings in longitudinal studies that compare body composition changes over time across diet indices in young women.
One of the most common concerns for parents is their child’s sleep behaviour. Inadequate sleep can impact cognitive, behavioural and social-emotional functioning. There are predictable developmental changes that occur in sleep behaviour. It is important to know that sleep problems throughout childhood and adolescence are common and that there is a spectrum from sleep problems through to diagnosed sleep disorders. This chapter starts with a brief overview of what sleep is, how it is regulated, steps for assessment and theoretical underpinnings that aid in further understanding treatment principles for behaviourally based sleep problems (e.g., cognitive and behavioural theories and the 4-P model). Then a developmental framework is used to outline common behaviourally based sleep problems experienced across developmental stages and the range of family-based behavioural interventions that can be applied from infancy through to adolescence. Throughout the chapter, the impact of behaviourally based sleep problems on the family is considered. Finally, the role of the therapist in working with children experiencing behaviourally based sleep problems and the importance of implementing a core competencies approach are discussed.
This study compared the level of education and tests from multiple cognitive domains as proxies for cognitive reserve.
Method:
The participants were educationally, ethnically, and cognitively diverse older adults enrolled in a longitudinal aging study. We examined independent and interactive effects of education, baseline cognitive scores, and MRI measures of cortical gray matter change on longitudinal cognitive change.
Results:
Baseline episodic memory was related to cognitive decline independent of brain and demographic variables and moderated (weakened) the impact of gray matter change. Education moderated (strengthened) the gray matter change effect. Non-memory cognitive measures did not incrementally explain cognitive decline or moderate gray matter change effects.
Conclusions:
Episodic memory showed strong construct validity as a measure of cognitive reserve. Education effects on cognitive decline were dependent upon the rate of atrophy, indicating education effectively measures cognitive reserve only when atrophy rate is low. Results indicate that episodic memory has clinical utility as a predictor of future cognitive decline and better represents the neural basis of cognitive reserve than other cognitive abilities or static proxies like education.
Identifying developmental endophenotypes on the pathway between genetics and behavior is critical to uncovering the mechanisms underlying neurodevelopmental conditions. In this proof-of-principle study, we explored whether early disruptions in visual attention are a unique or shared candidate endophenotype of autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD). We calculated the duration of the longest look (i.e., peak look) to faces in an array-based eye-tracking task for 335 14-month-old infants with and without first-degree relatives with ASD and/or ADHD. We leveraged parent-report and genotype data available for a proportion of these infants to evaluate the relation of looking behavior to familial (n = 285) and genetic liability (using polygenic scores, n = 185) as well as ASD and ADHD-relevant temperament traits at 2 years of age (shyness and inhibitory control, respectively, n = 272) and ASD and ADHD clinical traits at 6 years of age (n = 94).
Results showed that longer peak looks at the face were associated with elevated polygenic scores for ADHD (β = 0.078, p = .023), but not ASD (β = 0.002, p = .944), and with elevated ADHD traits in mid-childhood (F(1,88) = 6.401, p = .013, ηp² = 0.068; ASD: F(1,88) = 3.218, p = .076), but not in toddlerhood (ps > 0.2). This pattern of results did not emerge when considering mean peak look duration across face and nonface stimuli. Thus, alterations in attention to faces during spontaneous visual exploration may be more consistent with a developmental endophenotype of ADHD than of ASD. Our work shows that dissecting paths to neurodevelopmental conditions requires longitudinal data incorporating polygenic contribution, early neurocognitive function, and clinical phenotypic variation.
Background: A prolonged outbreak of carbapenemase-producing Serratia marcescens (CPSM) was identified in our quaternary healthcare center over a 2-year period from 2015 through 2017. A reservoir of IMP-4–producing S. marcescens in sink drains of clinical hand basins (CHB) was implicated in propagating transmission, supported by evidence from whole-genome sequencing (WGS). We assessed the impact of a manual bioburden reduction intervention on further transmission of CPSM. Methods: Environmental sampling of frequently touched wet and dry areas around CPSM clinical cases was undertaken to identify potential reservoirs and transmission pathways. After identifying CHB as a source of CPSM, a widespread annual CHB cleaning intervention involving manual scrubbing of sink drains and the proximal pipes was implemented. Pre- and postintervention point prevalence surveys (PPS) of CHB drains were performed to assess for CPSM colonization. Surveillance for subsequent transmission was conducted through weekly screening of patients, annual screening of CHB in transmission areas, and 6-monthly whole-hospital PPS of patients. All CPSM isolates were assessed by WGS. Results: In total, 6 patients were newly identified with CPSM from 2015 to 2017 (4.3 transmission events per 100,000 surveillance bed days [SBD]; 95% CI, 1.6–9.4). All clinical CPSM isolates were linked to CHB isolates by WGS. The CHB cleaning intervention reduced CPSM colonization of CHB in transmission areas from 72% to 28% (ARR, 0.44; 95% CI, 0.25–0.63). A single further clinical case of CPSM linked to the CHB isolates was detected over 2 years of surveillance from 2017 to 2019 following implementation of the annual CHB cleaning program (0.7 transmissions per 100,000 SBD; 95% CI, 0.0–3.9). No transmissions were linked to undertaking the cleaning intervention.
Conclusions: A simple intervention targeted at reducing the biological burden of CPSM in CHB drains at regular intervals was effective in preventing transmission of carbapenemase-producing Enterobacterales from the hospital environment to patients over a prolonged period of intensive surveillance. These findings highlight the importance of detailed cleaning for controlling the spread of multidrug-resistant organisms from healthcare environments.
Infants struggle to understand familiar words spoken in unfamiliar accents. Here, we examine whether accent exposure facilitates accent-specific adaptation. Two types of pre-exposure were examined: video-based (i.e., listening to pre-recorded stories; Experiment 1) and live interaction (reading books with an experimenter; Experiments 2 and 3). After video-based exposure, Canadian English-learning 15- to 18-month-olds failed to recognize familiar words spoken in an unfamiliar accent. However, after face-to-face interaction with a Mandarin-accented talker, infants showed enhanced recognition for words produced in Mandarin English compared to Australian English. Infants with live exposure to an Australian talker were not similarly facilitated, perhaps due to the lower vocabulary scores of the infants assigned to the Australian exposure condition. Thus, live exposure can facilitate accent adaptation, but this ability is fragile in young infants and is likely influenced by vocabulary size and the specific mapping between the speaker and the listener's phonological system.
Southeastern Appalachian Ohio has more than double the national average of diabetes and a critical shortage of healthcare providers. Paradoxically, there is limited research focused on primary care providers’ experiences treating people with diabetes in this region. This study explored providers’ perceived barriers to and facilitators for treating patients with diabetes in southeastern Appalachian Ohio.
Methods:
We conducted in-depth interviews with healthcare providers who treat people with diabetes in rural southeastern Ohio. Interviews were transcribed, coded, and analyzed via content and thematic analyses using NVivo 12 software (QSR International, Chadstone, VIC, Australia).
Results:
Qualitative analysis revealed four themes: (1) patients’ diabetes fatalism and helplessness: providers recounted story after story of patients believing that their diabetes was inevitable and that they were helpless to prevent or delay diabetes complications; (2) comorbid psychosocial issues: providers described high rates of depression, anxiety, incest, abuse, and post-traumatic stress disorder among people with diabetes in this region; (3) interconnected social determinants interfering with diabetes care: providers identified major barriers including lack of access to providers, lack of access to transportation, food insecurity, housing insecurity, and financial insecurity; and (4) providers’ cultural understanding and recommendations: providers emphasized the importance of understanding the values central to Appalachian culture and gave culturally attuned clinical suggestions for how to draw on these values when working with this population.
Conclusions:
Evidence-based interventions tailored to Appalachian culture and training designed to increase the cultural competency and cultural humility of primary care providers may be effective approaches to reduce barriers to diabetes care in Appalachian Ohio.
We examined whether change in added sugar intake is associated with change in δ13C, a novel sugar biomarker, in thirty-nine children aged 5–10 years selected from a Colorado (USA) prospective cohort of children at increased risk for type 1 diabetes. Reported added sugar intake via FFQ and δ13C in erythrocytes were measured at two time points a median of 2 years apart. Change in added sugar intake was associated with change in the δ13C biomarker: for every 1-g increase in added sugar intake between the two time points, there was an increase in δ13C of 0·0082 (P = 0·0053), independent of change in HbA1c and δ15N. The δ13C biomarker may be used as a measure of compliance in an intervention study of children under the age of 10 years who are at increased risk for type 1 diabetes, in which the goal is to reduce dietary sugar intake.
There is increasing evidence that both black and green tea are beneficial for prevention of cardiovascular disease (CVD). We conducted a systematic review and meta-analysis evaluating the effects of tea flavonoids on CVD and all-cause mortality outcomes. Searches across five databases, including PubMed and Embase, were conducted through November 2018 to identify randomized controlled trials (RCTs) and prospective cohort studies reporting cardiovascular and all-cause mortality outcomes. Two investigators independently conducted abstract and full-text screenings, data extractions, and risk-of-bias (ROB) assessments using the Nutrition Evidence Library Bias Assessment Tool (NEL BAT). Mixed-effects dose-response meta-regression and standard random-effects meta-analyses were performed for outcomes with ≥ 4 studies. No RCTs and 38 prospective cohort studies were included in the systematic review. NEL BAT scores ranged from 0 to 15 (0 being the lowest risk). Our linear meta-regression model showed that each cup increase in daily tea consumption (about 280 mg and 338 mg of total flavonoids for black and green tea, respectively) was associated with a 3–4% lower risk of CVD mortality [predicted adjusted relative risk (RR) = 0.96; 95% CI 0.93–0.99 for green tea and RR = 0.97; 95% CI 0.94–0.99 for black tea]. Furthermore, each cup increase in daily tea consumption was associated with a 2% lower risk of all-cause mortality (predicted adjusted RR = 0.98; 95% CI 0.97–0.99 for black tea and RR = 0.98; 95% CI 0.96–0.99 for green tea). Two studies reported multivariable Cox regression analysis results for the relationship between black tea intake and risk of all-cause mortality.
The results from these two studies were combined with our linear meta-regression result in a random-effects meta-analysis, which showed that each cup increase in daily black tea consumption was associated with an average 3% lower risk of all-cause mortality (pooled adjusted RR = 0.97; 95% CI 0.87–1.00), with large heterogeneity (I² = 81.4%; p = 0.005). Current evidence indicates that increased tea consumption may reduce cardiovascular and all-cause mortality in a dose-response manner. This systematic review was registered on PROSPERO.
The material in this volume ranges from Germanic epic and early Welsh saints’ lives to twenty-first century comic books. This is characteristic of the Arthurian Literature series which since its inception in 1981 has always cast its net very widely over Western European culture. We are delighted that the founding editor, Richard Barber, has contributed a characteristically stimulating interdisciplinary study of swords belonging to Arthurian and other heroes. He himself has heroic stature in the world of Arthurian studies, both as an historian and as an editor and publisher. Andrew Rabin's discussion of Caradog's Vita Gildae throws light on the complex attitudes to Arthur of contemporaries of Geoffrey of Monmouth in a time of political turmoil in England, the Anarchy: Arthur is represented both as a tyrannical ruler and a conciliator, an ambivalence which Rabin notes in other Latin accounts of the king produced at this time. Christopher Berard also considers the use of Arthurian material for political purposes: borrowings from Geoffrey's Historia appear in a chronicle of Anglo-Scottish relations in the time of Edward I, a well-known admirer of the Arthurian legend. Berard argues that these borrowings would have appealed to the clerical élite of the time. Usha Vishnuvajjala focuses on women and their friendships in Ywain and Gawain, the only known close English adaptation of a romance by Chrétien. She argues that this text does not align with received wisdom about medieval friendship, or with conventional binaries about stereotypical gendered behaviour. Natalie Goodison considers the mixture of sacred and secular in The Turke and Gawain, and finds fascinating alchemical parallels for a puzzling beheading episode. Mary Bateman discusses the views on native and foreign sources of three sixteenth-century defenders of Arthur, both English and Welsh – John Leland, John Prise and Humphrey Llwyd – and their responses to the criticisms of Polydore Vergil.
In twentieth-century reception history, John Steinbeck was an ardent Arthurian enthusiast: Elaine Treharne and William J. Fowler look at the significance of his annotations to his copy of Malory as he worked on a modern adaptation, the posthumously published The Acts of King Arthur and his Noble Knights.