An increasing number of patients are being prescribed direct oral anticoagulants (DOACs), while the patients who remain on warfarin are becoming more complex. There is currently no standardised anticoagulation review for patients in primary care, resulting in potentially preventable harm events. Our aim was to implement a new service in which a standardised review is carried out by a specialist multidisciplinary secondary care anticoagulation team. Overall, the implementation of a standardised review resulted in better optimisation of anticoagulation management for patients taking either a DOAC or warfarin. Of the 172 eligible patients prescribed warfarin, 47 (27%) chose to switch to a DOAC. The average time in therapeutic range for patients on warfarin increased from 73.5% before the pilot to 75% after it. Of 482 patients taking a DOAC, 35 (7%) were found to be on an incorrect dose. In 32 (91%) of these 35 patients, the dose was amended after notifying the patient’s general practitioner. We also found a significant number of patients inappropriately prescribed concomitant medication, such as antiplatelet or non-steroidal anti-inflammatory drugs, potentially putting them at an elevated risk of bleeding. While further research is needed, we believe the results of this pilot can be used to help build a case to influence the commissioning of anticoagulation services. Secondary care anticoagulation teams such as ours may be well placed to provide or support such services by working across the primary–secondary care interface alongside primary care colleagues.
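For illustration, time-in-therapeutic-range figures such as those above are typically derived by linear interpolation between successive INR measurements (the Rosendaal method); the abstract does not state which method was used, so the sketch below is purely illustrative, with assumed INR values and a 2.0–3.0 target range.

```python
# Sketch of the Rosendaal linear-interpolation method for time in therapeutic
# range (TTR). The dates, INR values and target range are illustrative
# assumptions, not data from the pilot described above.
from datetime import date

def ttr_rosendaal(measurements, low=2.0, high=3.0):
    """measurements: list of (date, INR) tuples sorted by date.
    Returns the fraction of days the interpolated INR lies within [low, high]."""
    in_range_days = 0.0
    total_days = 0.0
    for (d0, inr0), (d1, inr1) in zip(measurements, measurements[1:]):
        span = (d1 - d0).days
        if span == 0:
            continue
        total_days += span
        for step in range(span):
            # Linearly interpolate the INR for each day within the interval.
            inr = inr0 + (inr1 - inr0) * step / span
            if low <= inr <= high:
                in_range_days += 1
    return in_range_days / total_days if total_days else float("nan")

example = [(date(2019, 1, 1), 1.8), (date(2019, 1, 15), 2.6), (date(2019, 2, 5), 3.4)]
print(f"TTR: {ttr_rosendaal(example):.1%}")
```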
Introduction: Emergency department (ED) buprenorphine/naloxone inductions for opioid use disorder are an effective and safe way to initiate addictions care in the ED. Kelowna General Hospital's ED buprenorphine/naloxone (KEDSS) program was implemented in September 2018 to respond to a community need for accessible and evidence-based addictions care. The objective of our program evaluation study was to examine the implementation of the first five months of the KEDSS program by evaluating patient characteristics and service outcomes. Methods: The KEDSS treatment pathway consists of a standardized protocol (pre-printed order set) to facilitate buprenorphine/naloxone induction and stabilization in the acute care setting (ED and inpatient wards) at Kelowna General Hospital, a community academic hospital. All patients referred to the outpatient addictions clinic via the order set during September 2018-January 2019 (the first 5 months) were included in the study population. A retrospective descriptive chart review was completed. Outcome measures included population characteristics (sociodemographic information, clinical characteristics) and service outcomes (number of patients initiated, patient follow-up). Descriptive statistics and bivariate analyses using t-tests or Pearson's χ2 statistic, as appropriate, were conducted to compare the ED-initiated group with the inpatient-initiated group. Results: During the first five months of the KEDSS program, a total of 35 patients (26% female, mean age 36.6 years, 54% homeless) were started on the treatment pathway, 16 (46%) in the ED. Compared to the inpatient-initiated group, the ED-initiated group had fewer psychiatric comorbidities on average (ED 1.0 vs. inpatient 1.5, p = 0.002), were less likely to require methadone or sustained-release oral morphine (ED 13% vs. inpatient 37%, p = 0.048), and were less likely to have attended follow-up (ED 56% vs. inpatient 84%, p = 0.004). Conclusion: This study provides a preliminary look at a new opioid agonist therapy (OAT) treatment pathway (KEDSS) at Kelowna General Hospital, and provides insight into the population that is accessing the program. We found that the majority of patients started on buprenorphine/naloxone in the ED were seen in follow-up at the addictions clinic. Future work will examine ongoing follow-up and OAT adherence rates in the study population to quantify the program's impact on improving access to addictions treatment within this community hospital setting.
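The bivariate comparisons described above (t-tests for continuous measures, Pearson's chi-square for categorical ones) can be run along the following lines; the ages are hypothetical, and the contingency-table counts are only approximate reconstructions from the percentages reported above.

```python
# Minimal sketch of the bivariate comparisons described in the abstract:
# a t-test for a continuous measure and Pearson's chi-square for a
# categorical one. Values are illustrative assumptions, not the study data.
import numpy as np
from scipy import stats

ed_ages = np.array([34, 29, 41, 38, 30, 36])        # hypothetical ages, ED-initiated
inpt_ages = np.array([40, 37, 44, 33, 39, 42])      # hypothetical ages, inpatient-initiated
t_stat, t_p = stats.ttest_ind(ed_ages, inpt_ages)

# Contingency table: rows = ED vs. inpatient, columns = attended follow-up (yes/no);
# counts are approximate, implied by the 56% vs. 84% follow-up figures above.
table = np.array([[9, 7],
                  [16, 3]])
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

print(f"t-test p = {t_p:.3f}, chi-square p = {chi_p:.3f}")
```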
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. All together, these observing programs have recorded 20 000 h of data, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The sternocleidomastoid can be used as a pedicled flap in head and neck reconstruction. It has previously been associated with high complication rates, likely due in part to the variable nature of its blood supply.
To provide clinicians with an up-to-date review of clinical outcomes of sternocleidomastoid flap surgery in head and neck reconstruction, integrated with a review of vascular anatomical studies of the sternocleidomastoid.
A literature search of the Medline and Web of Science databases was conducted. Complications were analysed for each study, and the trend in success rates was examined by study date.
Reported complication rates have improved over time. The preservation of two vascular pedicles rather than one may have contributed to improved outcomes.
The sternocleidomastoid flap is a versatile option for patients in whom prolonged free flap surgery is inappropriate. Modern vascular imaging techniques could optimise pre-operative planning.
India has the second largest number of people with type 2 diabetes (T2D) globally. Epidemiological evidence indicates that consumption of white rice is positively associated with T2D risk, while intake of brown rice is inversely associated. Thus, we explored the effect of substituting brown rice for white rice on T2D risk factors among adults in urban South India. A total of 166 overweight (BMI ≥ 23 kg/m2) adults aged 25–65 years were enrolled in a randomised cross-over trial in Chennai, India. Interventions were a parboiled brown rice or white rice regimen providing two ad libitum meals/d, 6 d/week for 3 months with a 2-week washout period. Primary outcomes were blood glucose, insulin, glycosylated Hb (HbA1c), insulin resistance (homeostasis model assessment of insulin resistance) and lipids. High-sensitivity C-reactive protein (hs-CRP) was a secondary outcome. We did not observe significant between-group differences for primary outcomes among all participants. However, a significant reduction in HbA1c was observed in the brown rice group among participants with the metabolic syndrome (−0·18 (se 0·08) %) relative to those without the metabolic syndrome (0·05 (se 0·05) %) (P-for-heterogeneity = 0·02). Improvements in HbA1c, total and LDL-cholesterol were observed in the brown rice group among participants with a BMI ≥ 25 kg/m2 compared with those with a BMI < 25 kg/m2 (P-for-heterogeneity < 0·05). We observed a smaller increase in hs-CRP in the brown (0·03 (sd 2·12) mg/l) compared with white rice group (0·63 (sd 2·35) mg/l) (P = 0·04). In conclusion, substituting brown rice for white rice showed a potential benefit on HbA1c among participants with the metabolic syndrome and an elevated BMI. A small benefit on inflammation was also observed.
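The insulin-resistance outcome above (HOMA-IR) is conventionally computed from fasting insulin and fasting glucose; a minimal sketch, with illustrative values rather than trial data, is:

```python
# Conventional homeostasis model assessment of insulin resistance (HOMA-IR).
# The example inputs are illustrative, not values from the trial above.
def homa_ir(fasting_insulin_uU_per_ml: float, fasting_glucose_mmol_per_l: float) -> float:
    """HOMA-IR = (fasting insulin [uU/mL] x fasting glucose [mmol/L]) / 22.5."""
    return fasting_insulin_uU_per_ml * fasting_glucose_mmol_per_l / 22.5

print(homa_ir(12.0, 5.5))  # ~2.9, above the ~2.5 cut-off often used to flag insulin resistance
```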
Young people with 22q11.2 deletion syndrome (22q11.2DS) are at high risk for neurodevelopmental disorders. Sleep problems may play a role in this risk but their prevalence, nature and links to psychopathology and cognitive function remain undescribed in this population.
Sleep problems, psychopathology, developmental coordination and cognitive function were assessed in 140 young people with 22q11.2DS (mean age = 10.1, s.d. = 2.46) and 65 unaffected sibling controls (mean age = 10.8, s.d. = 2.26). Primary carers completed questionnaires screening for the children's developmental coordination and autism spectrum disorder.
Sleep problems were identified in 60% of young people with 22q11.2DS compared to 23% of sibling controls (OR 5.00, p < 0.001). Two patterns best described sleep problems in 22q11.2DS: restless sleep and insomnia. Restless sleep was linked to increased ADHD symptoms (OR 1.16, p < 0.001) and impaired executive function (OR 0.975, p = 0.013). Both patterns were associated with elevated symptoms of anxiety disorder (restless sleep: OR 1.10, p = 0.006; insomnia: OR 1.07, p = 0.045) and developmental coordination disorder (OR 0.968, p = 0.0023, and OR 0.955, p = 0.009, respectively). The insomnia pattern was also linked to elevated conduct disorder symptoms (OR 1.53, p = 0.020).
Clinicians and carers should be aware that sleep problems are common in 22q11.2DS and index psychiatric risk, cognitive deficits and motor coordination problems. Future studies should explore the physiology of sleep and its links with neurodevelopment in these young people.
The longstanding association between the major histocompatibility complex (MHC) locus and schizophrenia (SZ) risk has recently been partially accounted for by structural variation at the complement component 4 (C4) gene. This structural variation generates varying levels of C4 RNA expression, and genetic information from the MHC region can now be used to predict C4 RNA expression in the brain. Increased predicted C4A RNA expression is associated with the risk of SZ, and C4 is reported to influence synaptic pruning in animal models.
Based on our previous studies associating MHC SZ risk variants with poorer memory performance, we tested whether increased predicted C4A RNA expression was associated with reduced memory function in a large (n = 1238) dataset of psychosis cases and healthy participants, and with altered task-dependent cortical activation in a subset of these samples.
We observed that increased predicted C4A RNA expression was associated with poorer performance on measures of memory recall (p = 0.016, corrected). Furthermore, in healthy participants, we found that increased predicted C4A RNA expression was associated with a pattern of reduced cortical activity in middle temporal cortex during a measure of visual processing (p < 0.05, corrected).
These data suggest that the effects of C4 on cognition are observable at both the cortical and the behavioural level, and may represent one mechanism by which illness risk is mediated. As such, deficits in learning and memory may represent a therapeutic target for new molecular developments aimed at altering C4’s developmental role.
A range of endophenotypes characterise psychosis; however, there has been limited work on whether and how they are interrelated.
This multi-centre study includes 8754 participants: 2212 people with a psychotic disorder, 1487 unaffected relatives of probands, and 5055 healthy controls. We investigated cognition [digit span (N = 3127), block design (N = 5491), and the Rey Auditory Verbal Learning Test (N = 3543)], electrophysiology [P300 amplitude and latency (N = 1102)], and neuroanatomy [lateral ventricular volume (N = 1721)]. We used linear regression to assess the interrelationships between endophenotypes.
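As an illustration of the endophenotype regressions described above, the sketch below regresses one standardised endophenotype on another while adjusting for participant group; the column names, simulated data and adjustment set are assumptions, not the study's analysis code.

```python
# Sketch of a pairwise endophenotype regression: one standardised measure
# regressed on another, adjusting for participant group. Data are simulated;
# variable names and the strength of the simulated association are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
block_design = rng.normal(size=n)
group = rng.choice(["case", "relative", "control"], size=n)
# Simulate a weak positive association, roughly of the size reported below.
p300_amplitude = 0.2 * block_design + rng.normal(size=n)

df = pd.DataFrame({"p300_amplitude": p300_amplitude,
                   "block_design": block_design,
                   "group": group})
fit = smf.ols("p300_amplitude ~ block_design + C(group)", data=df).fit()
print(fit.params["block_design"])  # regression coefficient for the endophenotype pair
```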
The P300 amplitude and latency were not associated (regression coef. −0.06, 95% CI −0.12 to 0.01, p = 0.060), and P300 amplitude was positively associated with block design (coef. 0.19, 95% CI 0.10–0.28, p < 0.001). There was no evidence of associations between lateral ventricular volume and the other measures (all p > 0.38). All the cognitive endophenotypes were associated with each other in the expected directions (all p < 0.001). Lastly, the relationships between pairs of endophenotypes were consistent in all three participant groups, differing for some of the cognitive pairings only in the strengths of the relationships.
The P300 amplitude and latency are independent endophenotypes; the former indexes spatial visualisation and working memory, while the latter is hypothesised to index basic processing speed. Individuals with psychotic illnesses, their unaffected relatives, and healthy controls all show similar patterns of associations between endophenotypes, endorsing the theory of a continuum of psychosis liability across the population.
Burn patients are particularly vulnerable to infection, and an estimated half of all burn deaths are due to infections. This study explored risk factors for healthcare-associated infections (HAIs) in adult burn patients.
Retrospective cohort study.
Tertiary-care burn center.
Adults (≥18 years old) admitted with burn injury for at least 2 days between 2004 and 2013.
HAIs were determined in real time by infection preventionists using Centers for Disease Control and Prevention criteria. Multivariable Cox proportional hazards regression was used to estimate the direct effect of each risk factor on time to HAI, with inverse probability of censoring weights to address potentially informative censoring. Effect measure modification by burn size was also assessed.
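A weighted Cox regression of this kind can be sketched with the lifelines package as follows; the column names and simulated data are assumptions, and the inverse-probability-of-censoring weights are taken as already computed rather than estimated from a censoring model as in the study.

```python
# Sketch of a weighted Cox proportional hazards fit, loosely mirroring the
# analysis described above. All data are simulated; column names, effect
# sizes and the stand-in weights are illustrative assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
tbsa_over_20 = rng.integers(0, 2, n)
inhalational = rng.integers(0, 2, n)
# Simulated time to HAI, with higher hazard for larger burns / inhalational injury.
time = rng.exponential(scale=60 / np.exp(1.5 * tbsa_over_20 + 0.5 * inhalational))
event = (time <= 60).astype(int)          # HAI observed within 60 days of admission

df = pd.DataFrame({
    "days": np.minimum(time, 60),
    "hai": event,
    "tbsa_over_20": tbsa_over_20,
    "inhalational_injury": inhalational,
    "ipc_weight": rng.uniform(0.8, 1.25, n),  # stand-in weights; the study estimated these from a censoring model
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="hai",
        weights_col="ipc_weight", robust=True)  # robust SEs advised with weights
cph.print_summary()
```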
Overall, 4,426 patients met inclusion criteria, and 349 (7.9%) patients had at least 1 HAI within 60 days of admission. Compared to <5% total body surface area (TBSA), patients with 5%–10% TBSA were almost 3 times as likely to acquire an HAI (hazard ratio [HR], 2.92; 95% CI, 1.63–5.23); patients with 10%–20% TBSA were >6 times as likely to acquire an HAI (HR, 6.38; 95% CI, 3.64–11.17); and patients with >20% TBSA were >10 times as likely to acquire an HAI (HR, 10.33; 95% CI, 5.74–18.60). Patients with inhalational injury were 1.5 times as likely to acquire an HAI (HR, 1.61; 95% CI, 1.17–2.22). The effect of inhalational injury (P=.09) appeared to be larger among patients with ≤20% TBSA.
Larger burns and inhalational injury were associated with increased incidence of HAIs. Future research should use these risk factors to identify potential interventions.
Over the past 30 years, the number of US doctoral anthropology graduates has increased by about 70%, but there has not been a corresponding increase in the availability of new faculty positions. Consequently, doctoral degree-holding archaeologists face more competition than ever before when applying for faculty positions. Here we examine where US and Canadian anthropological archaeology faculty originate and where they ultimately end up teaching. Using data derived from the 2014–2015 AnthroGuide, we rank doctoral programs whose graduates in archaeology have been most successful in the academic job market; identify long-term and ongoing trends in doctoral programs; and discuss gender division in academic archaeology in the US and Canada. We conclude that success in obtaining a faculty position upon graduation is predicated in large part on where one attends graduate school.
We studied neuroinflammation in individuals with late-life depression, as a risk factor for dementia, using [11C]PK11195 positron emission tomography (PET). Five older participants with major depression and 13 controls underwent PET and multimodal 3T magnetic resonance imaging (MRI), with blood taken to measure C-reactive protein (CRP). We found significantly higher CRP levels in those with late-life depression and raised [11C]PK11195 binding compared with controls in brain regions associated with depression, including the subgenual anterior cingulate cortex, and significant hippocampal subfield atrophy in cornu ammonis 1 and subiculum. Our findings suggest neuroinflammation requires further investigation in late-life depression, both as a possible aetiological factor and as a potential therapeutic target.
Universal screening for postpartum depression is recommended in many countries. Knowledge of whether the disclosure of depressive symptoms in the postpartum period differs across cultures could improve detection and provide new insights into the pathogenesis. Moreover, it is a necessary step to evaluate the universal use of screening instruments in research and clinical practice. In the current study we sought to assess whether the Edinburgh Postnatal Depression Scale (EPDS), the most widely used screening tool for postpartum depression, measures the same underlying construct across cultural groups in a large international dataset.
Ordinal regression and measurement invariance analyses were used to explore the association between culture, operationalized as education, ethnicity/race and continent, and endorsement of depressive symptoms on the EPDS in 8209 new mothers from Europe and the USA.
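The measurement-invariance criterion reported below compares the comparative fit index (CFI) of increasingly constrained models, with a drop greater than 0.01 taken as evidence of non-invariance; a minimal sketch of that decision rule, with illustrative CFI values, is:

```python
# Sketch of the Delta-CFI decision rule for measurement invariance.
# The CFI values are illustrative, not results from the study above.
def invariance_holds(cfi_unconstrained: float, cfi_constrained: float,
                     threshold: float = 0.01) -> bool:
    """Invariance is retained if constraining parameters worsens CFI by <= threshold."""
    return (cfi_unconstrained - cfi_constrained) <= threshold

print(invariance_holds(0.962, 0.958))  # True: Delta CFI = 0.004 < 0.01
print(invariance_holds(0.962, 0.945))  # False: Delta CFI = 0.017 > 0.01
```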
Education, but not ethnicity/race, influenced the reporting of postpartum depression [difference between robust comparative fit indexes (∆*CFI) < 0.01]. The structure of EPDS responses significantly differed between Europe and the USA (∆*CFI > 0.01), but not between European countries (∆*CFI < 0.01).
Investigators and clinicians should be aware of potential differences in the expression of the postpartum depression phenotype among women of different educational backgrounds. The increasing cultural heterogeneity of societies, together with the tendency towards globalization, requires a culturally sensitive approach to patients, research and policies that takes into account, beyond rhetoric, the context of a person's experiences and the context in which the research is conducted.
The anticipated release of Enlist™ cotton, corn, and soybean cultivars likely will increase the use of 2,4-D, raising concerns over potential injury to susceptible cotton. An experiment was conducted at 12 locations over 2013 and 2014 to determine the impact of 2,4-D at rates simulating drift (2 g ae ha−1) and tank contamination (40 g ae ha−1) on cotton during six different growth stages. Growth stages at application included four leaf (4-lf), nine leaf (9-lf), first bloom (FB), FB + 2 wk, FB + 4 wk, and FB + 6 wk. Locations were grouped according to percent yield loss compared to the nontreated check (NTC), with group I having the least yield loss and group III having the most. Epinasty from 2,4-D was more pronounced with applications during vegetative growth stages. Importantly, yield loss did not correlate with visual symptomology, but more closely followed effects on boll number. The contamination rate at 9-lf, FB, or FB + 2 wk had the greatest effect across locations, reducing the number of bolls per plant when compared to the NTC, with no effect when applied at FB + 4 wk or later. A reduction of boll number was not detectable with the drift rate except in group III when applied at the FB stage. Yield was influenced by 2,4-D rate and stage of cotton growth. Over all locations, loss in yield of greater than 20% occurred at 5 of 12 locations when the drift rate was applied between 4-lf and FB + 2 wk (highest impact at FB). For the contamination rate, yield loss was observed at all 12 locations; averaged over these locations, yield loss ranged from 7 to 66% across all growth stages. Results suggest the greatest yield impact from 2,4-D occurs between 9-lf and FB + 2 wk, and the level of impact is influenced by 2,4-D rate, crop growth stage, and environmental conditions.
Major depressive disorder (MDD) is a common and disabling condition with well-established heritability and environmental risk factors. Gene–environment interaction studies in MDD have typically investigated candidate genes, though the disorder is known to be highly polygenic. This study aims to test for interaction between polygenic risk and stressful life events (SLEs) or childhood trauma (CT) in the aetiology of MDD.
The RADIANT UK sample consists of 1605 MDD cases and 1064 controls with SLE data, and a subset of 240 cases and 272 controls with CT data. Polygenic risk scores (PRS) were constructed using results from a mega-analysis on MDD by the Psychiatric Genomics Consortium. PRS and environmental factors were tested for association with case/control status and for interaction between them.
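A gene–environment interaction test of the kind described above can be sketched as a logistic regression with a PRS × trauma product term; the simulated data and variable names below are assumptions, not the RADIANT data.

```python
# Sketch of a polygenic risk score (PRS) x childhood trauma (CT) interaction
# test: logistic regression of case/control status on PRS, CT and their
# product. Data are simulated; coefficients and names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
prs = rng.normal(size=n)                       # standardised polygenic risk score
ct = rng.integers(0, 2, n)                     # childhood trauma exposure (0/1)
logit_p = -0.5 + 0.3 * prs + 1.0 * ct - 0.2 * prs * ct
case = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"case": case, "prs": prs, "ct": ct})
fit = smf.logit("case ~ prs * ct", data=df).fit(disp=False)
print(fit.params[["prs", "ct", "prs:ct"]])     # 'prs:ct' is the interaction term
```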
PRS significantly predicted depression, explaining 1.1% of variance in phenotype (p = 1.9 × 10−6). SLEs and CT were also associated with MDD status (p = 2.19 × 10−4 and p = 5.12 × 10−20, respectively). No interactions were found between PRS and SLEs. Significant PRS × CT interactions were found (p = 0.002), but showed an inverse association with MDD status, as cases who experienced more severe CT tended to have a lower PRS than other cases or controls. This relationship between PRS and CT was not observed in independent replication samples.
CT is a strong risk factor for MDD but may have a greater effect in individuals with lower genetic liability for the disorder. Including environmental risk along with genetics is important in studying the aetiology of MDD, and PRS provide a useful approach to investigating gene–environment interactions in complex traits.
Age-related cognitive decline is common and well documented. Cognitive speed of processing training (SOPT) has been shown to improve trained abilities (Useful Field of View; UFOV), but evidence of transfer to individual non-trained cognitive outcomes or neuropsychological composites is sparse. We examine the effects of SOPT on a composite of six equally weighted tests: UFOV, Trail-making A and B, Symbol Digit Modality, Controlled Oral Word Association, Stroop Color and Word, and Digit Vigilance.
A total of 681 patients were randomized separately within two age bands (50–64, ≥ 65) to three SOPT groups (10 initial hours on-site; 10 initial hours on-site plus 4 hours of boosters; or 10 initial hours at home) or an attention-control group (10 initial hours of on-site crossword puzzles). At one year, 587 patients (86.2%) had complete data. A repeated-measures linear mixed model was used.
Factor analysis revealed a simple unidimensional structure with Cronbach's α of 0.82. The time effect was statistically significant (p < 0.001; ηp² = 0.246), but the time by treatment group (p = 0.331), time by age-band (p = 0.463), and time by treatment group by age-band (p = 0.564) effects were not.
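The composite reliability reported above is Cronbach's α; a minimal computation from a participants-by-tests score matrix is sketched below, using simulated scores in place of the study data.

```python
# Cronbach's alpha for a composite of several test scores.
# The six columns stand in for the six tests; the scores are simulated,
# not the trial data described above.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = participants, columns = items/tests."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
common = rng.normal(size=(500, 1))                 # shared ability factor
items = common + 0.8 * rng.normal(size=(500, 6))   # six correlated test scores
print(f"alpha = {cronbach_alpha(items):.2f}")
```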
Compared to the attention-control group who played a computerized crossword puzzle game, assignment to 10–14 hours of SOPT did not significantly improve a composite measure of cognitive abilities.
Paranoia is one of the commonest symptoms of psychosis but has rarely been studied in populations at risk of developing psychosis. Based on existing theoretical models, including the proposed distinction between ‘poor me’ and ‘bad me’ paranoia, we aimed to test specific predictions about the associations of negative cognition, metacognitive beliefs and negative emotions with paranoid ideation and with the belief that persecution is deserved (deservedness).
We used data from 117 participants from the Early Detection and Intervention Evaluation for people at risk of psychosis (EDIE-2) trial of cognitive–behaviour therapy, comparing them with samples of psychiatric in-patients and healthy students from a previous study. Multi-level modelling was utilized to examine predictors of both paranoia and deservedness, with post-hoc planned comparisons conducted to test whether person-level predictor variables were associated differentially with paranoia or with deservedness.
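A multi-level model of the kind described above can be sketched with random intercepts for participants; the variable names and simulated data below are assumptions, not the EDIE-2 data.

```python
# Sketch of a random-intercept multi-level model: repeated paranoia ratings
# nested within participants, with a person-level predictor. All values and
# variable names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_people, n_obs = 100, 10
person = np.repeat(np.arange(n_people), n_obs)
neg_self = np.repeat(rng.normal(size=n_people), n_obs)               # person-level negative self-beliefs
person_intercept = np.repeat(rng.normal(scale=0.5, size=n_people), n_obs)
paranoia = 0.4 * neg_self + person_intercept + rng.normal(size=n_people * n_obs)

df = pd.DataFrame({"paranoia": paranoia, "neg_self": neg_self, "person": person})
fit = smf.mixedlm("paranoia ~ neg_self", df, groups=df["person"]).fit()
print(fit.summary())
```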
Our sample of at-risk mental state participants was less paranoid than the psychiatric in-patients but reported higher levels of ‘bad-me’ deservedness. We found several predictors of paranoia and deservedness. Negative beliefs about self were related to deservedness but not paranoia, whereas negative beliefs about others were positively related to paranoia but negatively related to deservedness. Both depression and negative metacognitive beliefs about paranoid thinking were specifically related to paranoia but not deservedness.
This study provides evidence for the role of negative cognition, metacognition and negative affect in the development of paranoid beliefs, which has implications for psychological interventions and our understanding of psychosis.
This paper, a report by the Clinical Governance and Audit Committee of the Scottish Otolaryngological Society, presents a consensus view of the minimal requirements for ENT clinics in National Health Service hospitals.
Results and conclusion: The provision of adequate equipment and staff has gained increasing importance, as the vast majority of ENT procedures can be safely performed in the out-patient or office setting.