The intrauterine environment and early-life nutrition are regulated by maternal biomarkers in the blood and breast milk. We aimed to explore epigenetic modifications that may contribute to differential chemerin expression in maternal plasma, colostrum, and breast milk, and to examine their association with fetal cord blood and infant weight at 6 weeks postpartum. Thirty-three mothers with gestational diabetes mellitus (GDM) and 33 normoglycemic (NGT) mothers were recruited. Two maternal blood samples (28th week of gestation and 6 weeks postpartum), cord blood, colostrum, and mature milk were collected. Methylation-specific polymerase chain reaction and enzyme-linked immunosorbent assay were conducted. Infant weight was measured at birth and at 6 weeks postpartum. Serum chemerin levels at the 28th gestational week and at 6 weeks postpartum were significantly lower in the NGT group than in the GDM group (P < 0.05). Higher colostrum chemerin concentrations were observed in the GDM group and remained elevated in mature milk as compared to NGT (P < 0.05). Colostrum and breast milk chemerin levels showed independent associations with infant weight at 6 weeks postpartum (r = 0.270, P = 0.034; r = 0.464, P < 0.001). Forty percent of GDM mothers expressed unmethylated chemerin, reflecting increased chemerin concentration in maternal blood. This pattern was also observed in newborn cord blood, where 52% of samples showed unmethylated chemerin, in contrast to none among babies born to normoglycemic mothers. These results highlight the critical importance of altered chemerin regulation in mothers with gestational diabetes and its effects during the early-life period, and suggest a possible role in contributing to childhood obesity.
A recent genome-wide association study (GWAS) identified 12 independent loci significantly associated with attention-deficit/hyperactivity disorder (ADHD). Polygenic risk scores (PRS), derived from the GWAS, can be used to assess genetic overlap between ADHD and other traits. Using ADHD samples from several international sites, we derived PRS for ADHD from the recent GWAS to test whether genetic variants that contribute to ADHD also influence two cognitive functions that show strong association with ADHD: attention regulation and response inhibition, captured by reaction time variability (RTV) and commission errors (CE).
The discovery GWAS included 19 099 ADHD cases and 34 194 control participants. The combined target sample included 845 people with ADHD (age: 8–40 years). RTV and CE were available from reaction time and response inhibition tasks. ADHD PRS were calculated from the GWAS using a leave-one-study-out approach. Regression analyses were run to investigate whether ADHD PRS were associated with CE and RTV. Results across sites were combined via random effect meta-analyses.
When combining the studies in meta-analyses, results were significant for RTV (R2 = 0.011, β = 0.088, p = 0.02) but not for CE (R2 = 0.011, β = 0.013, p = 0.732). No significant association was found between ADHD PRS and RTV or CE in any sample individually (p > 0.10).
We detected a significant association between PRS for ADHD and RTV (but not CE) in individuals with ADHD, suggesting that common genetic risk variants for ADHD influence attention regulation.
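The site-level pooling step described above can be sketched as follows. The DerSimonian–Laird estimator is one common choice for a random-effects meta-analysis of per-site regression coefficients; the effect sizes and standard errors below are hypothetical placeholders, not the study's data:

```python
import math

def dersimonian_laird(betas, ses):
    """Pool per-site effect estimates with a random-effects model.

    betas: per-site effect estimates (e.g., ADHD PRS -> RTV betas)
    ses:   their standard errors
    Returns the pooled estimate and its standard error.
    """
    w = [1.0 / se ** 2 for se in ses]                  # fixed-effect weights
    beta_fe = sum(wi * b for wi, b in zip(w, betas)) / sum(w)
    # Cochran's Q heterogeneity statistic and between-site variance tau^2
    q = sum(wi * (b - beta_fe) ** 2 for wi, b in zip(w, betas))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(betas) - 1)) / c)
    # Random-effects weights incorporate the between-site variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]
    beta_re = sum(wi * b for wi, b in zip(w_re, betas)) / sum(w_re)
    return beta_re, math.sqrt(1.0 / sum(w_re))

# Hypothetical site-level PRS -> RTV effects (standardized betas)
beta, se = dersimonian_laird([0.05, 0.10, 0.14], [0.05, 0.06, 0.07])
```

When between-site heterogeneity (Q) is no larger than its degrees of freedom, tau² is truncated at zero and the estimate reduces to the fixed-effect pooled value.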
Besides being a global health crisis, the COVID-19 pandemic has the potential to have a severe and long-lasting psychological impact on frontline healthcare workers such as paramedics. It is imperative to shed light on these mental health issues and to employ interventions that protect the mental wellness of this vulnerable group of healthcare workers.
The DESCANT (Dementia Early Stage Cognitive Aids New Trial) intervention provided a personalised care package to improve the cognitive abilities, function and well-being of people with early-stage dementia and their carers by providing a range of memory aids, with training and support for use. This presentation will explore findings from a goal attainment scaling exercise undertaken within a multi-site pragmatic randomised trial, part of an NIHR-funded research programme, ‘Effective Home Support in Dementia Care: Components, Impacts and Costs of Tertiary Prevention.’
The aim was to describe the Goal Attainment Scaling (GAS) approach developed; investigate the types of goals identified by people with dementia and their carers and subsequent attainment; and explore the role of Dementia Support Practitioners (DSPs) in the process. This GAS exercise was designed by researchers, a clinical psychologist, a clinician and a DSP. Goal setting and attainment were conducted with the person with dementia and their carer and recorded by DSPs. Data were obtained from 117 intervention records and semi-structured interviews with five DSPs delivering the intervention across seven NHS Trusts in England and Wales. The GAS exercise was conducted as planned with goals and extent of involvement in the exercise tailored to individual participants and engagement was high. Demographic characteristics from the trial baseline dataset were analysed. Measures were created from intervention records to permit quantification and descriptive analysis. Interviews were professionally transcribed and subject to thematic analysis to identify salient themes.
A total of 293 goals were identified across the 117 participants. From these, 17 goal types were distinguished across six domains: self-care; household tasks; daily occupation; orientation; communication; and well-being and safety. A measure of goal attainment appropriate to both the client group and a modest intervention was obtained. On average, participants showed some improvement on the goals set. Qualitative findings suggested that DSPs were, overall, positive about their experience of goal setting. Although several challenges were identified, if these were overcome, measuring goal attainment was generally viewed as straightforward. GAS can be used in the context of a psychosocial intervention for people with early-stage dementia to identify and measure attainment of personalised care goals.
With the growing number of adults requiring operations for CHD, prolonged length of stay adds an additional burden on healthcare systems, especially in developing countries. This study aimed to identify factors associated with prolonged length of stay in adult patients undergoing operations for CHD.
This retrospective study included all adult patients (≥18 years) who underwent cardiac surgery with cardiopulmonary bypass for their CHD from 2011 to 2016 at a tertiary-care private hospital in Pakistan. Prolonged length of stay was defined as hospital stay >75th percentile of the overall cohort (>8 days).
This study included 166 patients (53.6% males) with a mean age of 32.05 ± 12.11 years. Comorbid disease was present in 59.0% of patients. Most patients underwent atrial septal defect repair (42.2%). A total of 38 (22.9%) patients had a prolonged length of stay. Post-operative complications occurred in 38.6% of patients. Multivariable analysis showed that pre-operative body mass index (odds ratio: 0.779; 95% confidence interval: 0.620–0.980), intraoperative aortic cross-clamp time (odds ratio: 1.035; 95% confidence interval: 1.009–1.062), and post-operative acute kidney injury (odds ratio: 7.392; 95% confidence interval: 1.036–52.755) were associated with prolonged length of stay.
Predictors of prolonged length of stay include lower body mass index, longer aortic cross-clamp time, and development of post-operative acute kidney injury. Shorter operations, improved pre-operative nutritional optimisation, and timely management of post-operative complications could help prevent prolonged length of stay in patients undergoing operations for adult CHD.
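The odds ratios reported above are per-unit effects; because logistic regression is linear on the log-odds scale, the odds ratio for a k-unit change is the per-unit odds ratio raised to the k-th power. A quick sketch using the reported point estimates (the 30-minute and 5-unit increments are illustrative choices, not values from the study):

```python
# Point estimates from the multivariable model above
or_bmi_per_unit = 0.779     # per 1-unit higher pre-operative BMI
or_clamp_per_min = 1.035    # per 1 additional minute of cross-clamp time

# Odds ratio for a k-unit change = (per-unit OR) ** k
or_clamp_30min = or_clamp_per_min ** 30   # 30 extra clamp minutes
or_bmi_5_lower = or_bmi_per_unit ** -5    # BMI lower by 5 units

# At these point estimates, 30 extra minutes of cross-clamp time nearly
# triples the odds of prolonged stay, and a BMI lower by 5 units carries
# roughly 3.5-fold higher odds.
```

This scaling assumes linearity in the logit, as the fitted model itself does.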
The Cognitive Abilities Screening Instrument (CASI) is a screening test of global cognitive function used in research and clinical settings. However, the CASI was developed using face validity and has not been investigated via empirical tests such as factor analyses. Thus, we aimed to develop and test a parsimonious conceptualization of the CASI rooted in cognitive aging literature reflective of crystallized and fluid abilities.
Secondary data analysis implementing confirmatory factor analyses, in which we tested the proposed two-factor solution and an alternate one-factor solution, and conducted a χ2 difference test to determine which model fit the data significantly better.
Data came from 3,491 men from the Kuakini Honolulu-Asia Aging Study.
The Cognitive Abilities Screening Instrument.
Findings demonstrated that both models fit the data; however, the two-factor model had a significantly better fit than the one-factor model. Criterion validity tests indicated that participant age was negatively associated with both factors and that education was positively associated with both factors. Further tests demonstrated that fluid abilities were significantly and negatively associated with a later-life dementia diagnosis.
We encourage investigators to use the two-factor model of the CASI as it could shed light on underlying cognitive processes, which may be more informative than using a global measure of cognition.
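The χ² difference test for these nested models can be sketched as follows. The one-factor versus two-factor comparison typically differs by 1 degree of freedom (the one-factor model is the two-factor model with the factor correlation fixed at 1), for which the chi-square survival function has a closed form; the fit statistics below are hypothetical, not the study's values:

```python
import math

def chi2_sf_1df(x):
    """P(X > x) for a chi-square variable with 1 degree of freedom."""
    return math.erfc(math.sqrt(x / 2.0))

def chi2_difference_test(chi2_restricted, chi2_full):
    """Difference test for nested models differing by 1 df.

    chi2_restricted: fit chi-square of the more constrained model
                     (here, the one-factor solution)
    chi2_full:       fit chi-square of the less constrained model
                     (here, the two-factor solution)
    """
    delta = chi2_restricted - chi2_full
    return delta, chi2_sf_1df(delta)

# Hypothetical fit statistics, for illustration only
delta, p = chi2_difference_test(chi2_restricted=512.4, chi2_full=498.1)
# A p below the chosen alpha favors retaining the two-factor model
```

For differences of more than 1 degree of freedom, a general chi-square survival function (e.g., from a statistics library) would replace the closed form.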
OBJECTIVES/SPECIFIC AIMS: Rodent models can be used to study neonatal abstinence syndrome (NAS), but the applicability of findings from the models to NAS in humans is not well understood. The objective of this study was to develop a rat model of norbuprenorphine-induced NAS and validate its translational value by comparing blood concentrations in the norbuprenorphine-treated pregnant rat to those previously reported in pregnant women undergoing buprenorphine treatment. METHODS/STUDY POPULATION: Pregnant Long-Evans rats were implanted with 14-day osmotic minipumps containing vehicle, morphine (positive control), or norbuprenorphine (0.3–3 mg/kg/d) on gestation day 9. Within 12 hours of delivery, pups were tested for spontaneous or precipitated opioid withdrawal by injecting them with saline (10 mL/kg, i.p.) or naltrexone (1 or 10 mg/kg, i.p.), respectively, and observing them for well-validated neonatal withdrawal signs. Blood was sampled via indwelling jugular catheters from a subset of norbuprenorphine-treated dams on gestation days 8, 10, 13, 17, and 20. Norbuprenorphine concentrations in whole blood samples were quantified using LC/MS/MS. RESULTS/ANTICIPATED RESULTS: Blood concentrations of norbuprenorphine in rats exposed to 1–3 mg/kg/d of norbuprenorphine were similar to levels previously reported in pregnant women undergoing buprenorphine treatment. Pups born to dams treated with these doses exhibited robust withdrawal signs. Blood concentrations of norbuprenorphine decreased across gestation, which is similar to previous reports in humans. DISCUSSION/SIGNIFICANCE OF IMPACT: These results suggest that dosing dams with 1–3 mg/kg/d norbuprenorphine produces maternal blood concentrations and withdrawal severity similar to those previously reported in humans. This provides evidence that, at these doses, this model is useful for testing hypotheses about norbuprenorphine that are applicable to NAS in humans.
To evaluate the impact of discontinuing routine contact precautions (CP) for endemic methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE) on hospital adverse events.
Academic medical center with single-occupancy rooms.
We compared hospital reportable adverse events 1 year before and 1 year after discontinuation of routine CP for endemic MRSA and VRE (preintervention and postintervention periods, respectively). Throughout the preintervention period, daily chlorhexidine gluconate bathing was expanded to nearly all inpatients. Chart reviews were performed to identify which patients and events were associated with CP for MRSA/VRE in the preintervention period as well as the patients that would have met prior criteria for MRSA/VRE CP but were not isolated in the postintervention period. Adverse events during the 2 periods were compared using segmented and mixed-effects Poisson regression models.
There were 24,732 admissions in the preintervention period and 25,536 in the postintervention period. Noninfectious adverse events (ie, postoperative respiratory failure, hemorrhage/hematoma, thrombosis, wound dehiscence, pressure ulcers, and falls or trauma) decreased by 19% (12.3 to 10.0 per 1,000 admissions, P=.022) from the preintervention to the postintervention period. There was no significant difference in the rate of infectious adverse events after CP discontinuation (20.7 to 19.4 per 1,000 admissions, P=.33). Patients with MRSA/VRE showed the largest reduction in noninfectious adverse events after CP discontinuation, with a 72% reduction (21.4 to 6.08 per 1,000 MRSA/VRE admissions; P<.001).
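As a back-of-envelope check on the reported 19% reduction, the abstract's summary numbers can be compared directly. This is only a crude two-rate comparison, not the segmented and mixed-effects Poisson models the authors fit:

```python
import math

# Summary numbers from the abstract
pre_n, post_n = 24732, 25536        # admissions per period
pre_rate, post_rate = 12.3, 10.0    # noninfectious events per 1,000 admissions

relative_reduction = (pre_rate - post_rate) / pre_rate   # about 0.19

# Crude Wald test on the log rate ratio, assuming Poisson event counts
pre_events = pre_rate * pre_n / 1000
post_events = post_rate * post_n / 1000
log_rr = math.log(post_rate / pre_rate)
se = math.sqrt(1 / pre_events + 1 / post_events)
z = log_rr / se   # |z| > 1.96 is consistent with the reported P = .022
```

The crude z statistic lands close to the significance threshold the regression models report, as expected for a simple before–after rate contrast.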
After discontinuing routine CP for endemic MRSA/VRE, the rate of noninfectious adverse events declined, especially in patients who no longer required isolation. This suggests that elimination of CP may substantially reduce noninfectious adverse events.
From 2000 to 2009, rates of multidrug-resistant Acinetobacter baumannii increased 10-fold to 0.2 per 1,000 patient days. From 2010 to 2015, however, rates markedly declined and have stayed below 0.05 per 1,000 patient days. Herein, we present a 15-year trend analysis and discuss interventions that may have led to the decline.
An asymptomatic 6-year-old boy with a history of right lung hypoplasia was referred for cardiology evaluation. Echocardiography demonstrated right pulmonary artery hypoplasia with flow reversal in that vessel. The right pulmonary veins were not visualised on the echocardiogram. Cardiac catheterisation confirmed the diagnosis of scimitar syndrome with a characteristic large vertical vein; however, the right pulmonary veins were found to be atretic, with no connection to the heart and with decompression through the azygos vein. In all, four systemic-to-pulmonary arterial collaterals supplying the right lung were identified and occluded using embolisation coils. This case demonstrates the potential for progressive stenosis and atresia of the so-called “scimitar vein” without previous surgical instrumentation, and shows that this can occur without haemodynamic embarrassment or development of pulmonary vascular disease.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, δ = .011 (95 percent confidence interval, CI .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted δ = .029 (95 percent CI .026, .031). Both effects were consistent across risk groups.
Primary care activity also increased overall in the intervention phase (δ = .011; 95 percent CI .007, .014), except in the two highest risk groups, which showed a decrease in the number of days with recorded activity.
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted δ = GBP76; 95 percent confidence interval, CI GBP46, GBP106), an effect that was consistent across risk levels and generally increased with risk level.
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
Objectives: To summarize the clinical characteristics and outcomes of pediatric sports-related concussion (SRC) patients who were evaluated and managed at a multidisciplinary pediatric concussion program, and to examine the healthcare resources and personnel required to meet the needs of this patient population. Methods: We conducted a retrospective review of all pediatric SRC patients referred to the Pan Am Concussion Program from September 1st, 2013 to May 25th, 2015. Initial assessments and diagnoses were carried out by a single neurosurgeon. Return-to-play decision-making was carried out by the multidisciplinary team. Results: In all, 604 patients, including 423 pediatric SRC patients, were evaluated at the Pan Am Concussion Program during the study period. The mean age of study patients was 14.30 years (SD: 2.32; range: 7-19 years); 252 (59.57%) were male. Hockey (182; 43.03%) and soccer (60; 14.18%) were the most commonly played sports at the time of injury. Overall, 294 (69.50%) SRC patients met the clinical criteria for concussion recovery, while 75 (17.73%) were lost to follow-up and 53 (12.53%) remained in active treatment at the end of the study period. The median duration of symptoms among the 261 acute SRC patients with complete follow-up was 23 days (IQR: 15, 36). Overall, 25.30% of pediatric SRC patients underwent at least one diagnostic imaging test and 32.62% were referred to another member of our multidisciplinary clinical team. Conclusion: Comprehensive care of pediatric SRC patients requires access to appropriate diagnostic resources and the multidisciplinary collaboration of experts with nationally and provincially recognized training in TBI.
Stress-related pathophysiology drives comorbid trajectories that elude precise prediction. Allostatic load algorithms that quantify biological “wear and tear” represent a comprehensive approach to detecting multisystemic disease processes of the mind and body. However, the multiple morbidities directly or indirectly related to stress physiology remain enigmatic. Our aim in this article is to propose that biological comorbidities represent discrete pathophysiological processes captured by measuring allostatic load, with applications in research and clinical settings to predict physical and psychiatric comorbidities alike. The reader will be introduced to the concepts of allostasis, allostatic states, allostatic load, and allostatic overload as they relate to stress-related diseases and to the proposed prediction of biological comorbidities, which extends to the understanding of psychopathologies. In our transdisciplinary discussion, we integrate perspectives related to (a) mitochondrial biology as a key player in the allostatic load time course toward diseases that “get under the skin and skull”; (b) epigenetics related to child maltreatment and biological embedding that shapes stress perception throughout lifespan development; and (c) evolutionary drivers of distinct personality profiles and biobehavioral patterns that are linked to dimensions of psychopathology.
To evaluate the impact of discontinuation of contact precautions (CP) for methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE) and expansion of chlorhexidine gluconate (CHG) use on the health system.
We compared hospital-wide laboratory-identified clinical culture rates (as a marker of healthcare-associated infections) 1 year before and after routine CP for endemic MRSA and VRE were discontinued and CHG bathing was expanded to all units. Culture data from patients and cost data on material utilization were collected. Nursing time spent donning personal protective equipment was assessed and quantified using time-driven activity-based costing.
Average positive culture rates before and after discontinuing CP were 0.40 and 0.32 cultures/100 admissions for MRSA (P=.09) and 0.48 and 0.40 cultures/100 admissions for VRE (P=.14). When combining isolation gown and CHG costs, the health system saved $643,776 in 1 year. Before the change, 28.5% of intensive care unit beds and 19% of medicine/surgery beds were on CP for MRSA/VRE. On the basis of average room entries and donning time, estimated nursing time spent donning personal protective equipment for MRSA/VRE before the change was 45,277 hours/year (estimated cost, $4.6 million).
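The time-driven activity-based costing step multiplies an activity's unit time by its volume and a capacity cost rate. A minimal sketch of that arithmetic, using hypothetical placeholder inputs rather than the study's measured data:

```python
# Hypothetical TDABC inputs -- illustrative only, not the study's data
beds_on_cp = 200             # beds on MRSA/VRE contact precautions, average day
entries_per_bed_day = 25     # room entries per isolated bed per day
donning_minutes = 1.0        # minutes to don gown/gloves per entry
nurse_cost_per_hour = 100.0  # fully loaded nursing cost rate, $/hour

# Annual donning time = beds x entries x minutes per entry, converted to hours
donning_hours_per_year = (beds_on_cp * entries_per_bed_day
                          * donning_minutes / 60 * 365)
annual_donning_cost = donning_hours_per_year * nurse_cost_per_hour
```

Substituting a hospital's own census, entry counts, and cost rate into the same product reproduces figures of the order reported above.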
Discontinuing routine CP for endemic MRSA and VRE did not result in increased rates of MRSA or VRE after 1 year. With cost savings on materials, decreased healthcare worker time, and no concomitant increase in possible infections, elimination of routine CP may add substantial value to inpatient care delivery.
Sometime before 1 September 1664, Grigorii Karpovich Kotoshikhin, a middle-level chancery clerk (pod'iachii) in the Russian Foreign Office (Posol'skii prikaz), fled his homeland for the West. He went first to Vilna, which was then part of the Polish–Lithuanian Commonwealth (the so-called Rzeczpospolita), then to Poland proper, and then on to Silesia, Prussia, Lübeck and, by the autumn of 1665, to Sweden. Kotoshikhin fled for several possible reasons, all of them good, though historians remain unclear which, if any, was the most important. He fled, according to one view, because he and his father had made powerful enemies at court and their property and other possessions had been (apparently unjustly) confiscated. Kotoshikhin feared additional reprisals and found escape the best alternative in his predicament. Or Kotoshikhin fled, as he himself would later claim, because he had inadvertently manoeuvred himself into the hopeless position of being a pawn in the intrigues of powerful men over him, each trying to cajole him into supporting their feud against the other. Or, what is most likely, Kotoshikhin fled his homeland and made a break for Russia's western neighbours and rivals because he had been passing secret information to the Swedes off and on since the summer of 1663 and feared that his treason was about to be discovered.
Kotoshikhin was therefore one of the earliest and most famous of Russia's defectors. But his name may not have become so well known to us were it not for the account he wrote for his new Swedish masters in the late spring and summer of 1666, in which he described ‘the whole Muscovite state’. This account, entitled On Russia in the Reign of Aleksei Mikhailovich (O Rossii v tsarstvovanie Alekseia Mikhailovicha), is a broad yet penetrating description of how seventeenth-century Muscovy was run. It is divided into thirteen thematic chapters, each treating what Kotoshikhin considered to be a key element of the Muscovite government and political culture, though he surely would not have used these terms to describe them. He begins in chapter 1 with a detailed description of the tsar's family and of important moments in the life cycle of the tsar and his kin, including a lengthy explanation of royal wedding ceremonies.
The foliose lichens Pseudocyphellaria pilosella and P. piloselloides are characterized by a cyanobacterial photobiont, a tomentose upper surface, a yellow medulla and yellow pseudocyphellae. The latter species has long been recognized as the sorediate counterpart of the former. The morphological, anatomical, chemical, and molecular analyses performed for this study support their treatment as a single species.