Antibiotic prescribing practices across the VA shifted significantly during the coronavirus disease 2019 (COVID-19) pandemic. From 2015 to 2019, antibiotic use between January and May decreased from 638 to 602 DOT/1000 DP, whereas in the corresponding months of 2020 antibiotic utilization rose to 628 DOT/1000 DP.
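The DOT/1000 DP metric quoted above is a simple normalisation of antibiotic days of therapy by patient-days present. A minimal sketch, with hypothetical counts chosen only to land on the reported scale (these are not VA data):

```python
# Toy illustration (not VA data): days of therapy (DOT) per 1,000 days
# present (DP), the antibiotic-use metric quoted in the abstract.
def dot_per_1000_dp(days_of_therapy: int, days_present: int) -> float:
    """Antibiotic days of therapy normalised per 1,000 patient-days present."""
    return days_of_therapy / days_present * 1000

# Hypothetical counts chosen to reproduce the scale of the reported rates.
rate = dot_per_1000_dp(days_of_therapy=31_900, days_present=50_000)
print(round(rate, 1))  # 638.0
```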
The EAT-Lancet Commission on Food, Planet, Health promulgated a universal reference diet. Subsequently, researchers constructed an EAT-Lancet diet score (0–14 points), with lower-bound intake values for various dietary components set at 0 g/d, and reported inverse associations with risks of major health outcomes in a high-income population. We assessed associations between EAT-Lancet diet scores, without or with (>0 g/d) minimum intake values, and the Mean Probability of Micronutrient Adequacy (MPA) in food- and nutrition-insecure women of reproductive age (WRA) from low- and middle-income countries (LMICs). We analysed single 24-h diet recall data (n = 1,950) from studies in rural Democratic Republic of the Congo, Ecuador, Kenya, Sri Lanka, and Vietnam. Associations between EAT-Lancet diet scores and MPA were assessed by fitting linear mixed-effects models with random intercepts and slopes. EAT-Lancet diet scores (mean ± SD) were 8.8 ± 1.3 and 1.9 ± 1.1 without and with minimum intake values, respectively. Furthermore, pooled MPA was 0.58 ± 0.22 and total energy intake was 2521 ± 1100 kcal/d. A one-point increase in the EAT-Lancet diet score, without minimum intake values, was associated with a 2.6 ± 0.7 percentage-point decrease in MPA (P < 0.001). In contrast, the EAT-Lancet diet score, with minimum intake values, was associated with a 2.4 ± 1.3 percentage-point increase in MPA (P = 0.07). Further analysis indicated positive associations between EAT-Lancet diet scores and MPA adjusted for total energy intake (P < 0.05). Our findings indicate that the EAT-Lancet diet score requires minimum intake values for nutrient-dense dietary components to avoid positively scoring non-consumption of food groups and subsequently predicting lower MPA of diets, when applied to rural WRA in LMICs.
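The scoring artefact described in the conclusion can be illustrated with a toy component scorer. The cut-offs below are hypothetical, not the published EAT-Lancet bounds; the point is only that with a lower bound of 0 g/d, eating none of a nutrient-dense food group still earns the point, whereas requiring a minimum (>0 g/d) removes that artefact:

```python
# Illustrative sketch (hypothetical targets, not the published cut-offs):
# a component scores 1 point when intake falls inside its target range.
def score_component(intake_g: float, upper_g: float, minimum_g: float = 0.0) -> int:
    """1 point when intake lies within [minimum_g, upper_g], else 0."""
    return int(minimum_g <= intake_g <= upper_g)

# Zero vegetable intake against a hypothetical 300 g/d upper bound:
print(score_component(0.0, upper_g=300.0))                 # 1: scored without a minimum
print(score_component(0.0, upper_g=300.0, minimum_g=1.0))  # 0: not scored with a >0 g/d minimum
```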
This paper discusses the evidence for periodic human activity in the Cairngorm Mountains of Scotland from the late 9th millennium to the early 4th millennium cal bc. While contemporary paradigms for Mesolithic Europe acknowledge the significance of upland environments, the archaeological record for these areas is not yet as robust as that for the lowland zone. Results of excavation at Chest of Dee, along the headwaters of the River Dee, are set into a wider context with previously published excavations in the area. A variety of site types evidences a sophisticated relationship between people and a dynamic landscape through a period of changing climate. Archaeological benefits of the project include the ability to examine novel aspects of the archaeology leading to a more comprehensive understanding of Mesolithic lifeways. It also offers important lessons in site survival, archaeological investigation, and the management of the upland zone.
A new high time resolution observing mode for the Murchison Widefield Array (MWA) is described, enabling full polarimetric observations with up to MHz of bandwidth and a time resolution of s. This mode makes use of a polyphase synthesis filter to ‘undo’ the polyphase analysis filter stage of the standard MWA’s Voltage Capture System observing mode. Sources of potential error in the reconstruction of the high time resolution data are identified and quantified, with the loss induced by the back-to-back system not exceeding dB for typical noise-dominated samples. The system is further verified by observing three pulsars with known structure on microsecond timescales.
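The analysis/synthesis round trip described above can be sketched with a toy critically sampled block-DFT filterbank. This is a deliberate simplification: the real Voltage Capture System uses a polyphase filterbank with a prototype filter and overlapping taps, which is why its inversion incurs a small dB-level loss rather than the exact reconstruction seen here. All signal values below are synthetic.

```python
import numpy as np

# Toy analogue of 'undoing' an analysis filterbank with a matching
# synthesis stage. A critically sampled block-DFT stands in for the
# real polyphase filterbank, so reconstruction is exact up to
# floating-point error.
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)                   # stand-in for a voltage time series

block = 128
channels = x.reshape(-1, block)
spectra = np.fft.fft(channels, axis=1)          # 'analysis': coarse channelisation
y = np.fft.ifft(spectra, axis=1).real.ravel()   # 'synthesis': undo it

print(np.max(np.abs(x - y)) < 1e-12)  # True
```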
On coronavirus disease 2019 (COVID-19) wards, severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) nucleic acid was frequently detected on high-touch surfaces, floors, and socks inside patient rooms. Contamination of floors and shoes was common outside patient rooms on the COVID-19 wards but decreased after improvements in floor cleaning and disinfection were implemented.
An increasing number of patients are being prescribed direct oral anticoagulants (DOACs), while the patients who remain on warfarin are becoming more complex. There is currently no standardised anticoagulation review for patients in primary care, resulting in potentially preventable harm events. Our aim was to implement a new service in which a standardised review is carried out by a specialist multidisciplinary secondary care anticoagulation team. Overall, the implementation of a standardised review resulted in better optimisation of anticoagulation management for patients taking either a DOAC or warfarin. Of the 172 eligible patients prescribed warfarin, 47 (27%) chose to switch to a DOAC. The average time in therapeutic range for patients on warfarin increased from 73.5% before the pilot to 75% after it. Of 482 patients taking a DOAC, 35 (7%) were found to be on an incorrect dose. In 32 (91%) of these 35 patients, the dose was amended after notifying the patient's general practitioner. We also found a significant number of patients inappropriately prescribed concomitant medication such as antiplatelet or non-steroidal anti-inflammatory drugs, potentially putting them at an elevated risk of bleeding. While further research is needed, we believe the results of this pilot can be used to help build a case to influence the commissioning of anticoagulation services. Secondary care anticoagulation teams, like our own, may be well placed to provide or support such services by working across the primary care and secondary care interface to support our primary care colleagues.
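Time in therapeutic range (TTR) for warfarin is conventionally estimated by Rosendaal linear interpolation between INR measurements. A minimal sketch with made-up readings (not pilot data):

```python
# Rosendaal method sketch: linearly interpolate INR between measurement
# days and report the fraction of time spent inside the target range.
# The readings below are invented for illustration.
def ttr(readings, low=2.0, high=3.0):
    """Fraction of interpolated time with INR inside [low, high]."""
    in_range = total = 0.0
    for (d0, i0), (d1, i1) in zip(readings, readings[1:]):
        span = d1 - d0
        total += span
        if i0 == i1:
            in_range += span if low <= i0 <= high else 0.0
            continue
        lo, hi = sorted((i0, i1))
        # Portion of the linear INR trajectory overlapping the target range.
        overlap = max(0.0, min(hi, high) - max(lo, low))
        in_range += span * overlap / (hi - lo)
    return in_range / total

readings = [(0, 1.8), (10, 2.4), (24, 3.2), (38, 2.6)]  # (day, INR) pairs
print(round(ttr(readings), 2))  # 0.7
```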
Introduction: Emergency department (ED) buprenorphine/naloxone inductions for opioid use disorder are an effective and safe way to initiate addictions care in the ED. Kelowna General Hospital's ED buprenorphine/naloxone (KEDSS) program was implemented in September 2018 in order to respond to a community need for accessible and evidence-based addictions care. The objective of our program evaluation study was to examine the implementation of the first five months of the KEDSS program through evaluating patient characteristics and service outcomes. Methods: The KEDSS treatment pathway consists of a standardized protocol (pre-printed order set) to facilitate buprenorphine/naloxone induction and stabilization in the acute care setting (ED and inpatient wards) at Kelowna General Hospital, a community academic hospital. All patients referred to the outpatient addictions clinic via the order set during September 2018-January 2019 (the first 5 months) were included in the study population. A retrospective descriptive chart review was completed. Outcome measures included population characteristics (sociodemographic information, clinical characteristics) and service outcomes (number of patients initiated, patient follow-up). Descriptive statistics and bivariate analyses using t-tests or Pearson's χ2 statistic, as appropriate, were conducted to compare the ED-initiated group with the inpatient-initiated group. Results: During the first five months of the KEDSS program, a total of 35 patients (26% female, mean age 36.6 years, 54% homeless) were started on the treatment pathway, 16 (46%) in the ED. Compared to the inpatient-initiated group, the ED-initiated group were less likely to have psychiatric comorbidities (ED 1.0 vs. inpatient 1.5, p = 0.002), require methadone or sustained-release oral morphine (ED 13% vs. inpatient 37%, p = 0.048), and have attended follow-up (ED 56% vs. inpatient 84%, p = 0.004). 
Conclusion: This study provides a preliminary look at a new opioid agonist therapy (OAT) treatment pathway (KEDSS) at Kelowna General Hospital, and provides insight into the population that is accessing the program. We found that the majority of patients who are started on buprenorphine/naloxone in the ED are seen in follow-up at the addictions clinic. Future work will examine ongoing follow-up and OAT adherence rates in the study population to quantify the program's impact on improving access to addictions treatment within this community hospital setting.
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. All together programs recorded 20 000 h, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The sternocleidomastoid can be used as a pedicled flap in head and neck reconstruction. It has previously been associated with high complication rates, likely due in part to the variable nature of its blood supply.
To provide clinicians with an up-to-date review of clinical outcomes of sternocleidomastoid flap surgery in head and neck reconstruction, integrated with a review of vascular anatomical studies of the sternocleidomastoid.
A literature search of the Medline and Web of Science databases was conducted. Complications were analysed for each study. The trend in success rates was analysed by date of the study.
Reported complication rates have improved over time. The preservation of two vascular pedicles rather than one may have contributed to improved outcomes.
The sternocleidomastoid flap is a versatile option for patients where prolonged free flap surgery is inappropriate. Modern vascular imaging techniques could optimise pre-operative planning.
India has the second largest number of people with type 2 diabetes (T2D) globally. Epidemiological evidence indicates that consumption of white rice is positively associated with T2D risk, while intake of brown rice is inversely associated. Thus, we explored the effect of substituting brown rice for white rice on T2D risk factors among adults in urban South India. A total of 166 overweight (BMI ≥ 23 kg/m2) adults aged 25–65 years were enrolled in a randomised cross-over trial in Chennai, India. Interventions were a parboiled brown rice or white rice regimen providing two ad libitum meals/d, 6 d/week for 3 months with a 2-week washout period. Primary outcomes were blood glucose, insulin, glycosylated Hb (HbA1c), insulin resistance (homeostasis model assessment of insulin resistance) and lipids. High-sensitivity C-reactive protein (hs-CRP) was a secondary outcome. We did not observe significant between-group differences for primary outcomes among all participants. However, a significant reduction in HbA1c was observed in the brown rice group among participants with the metabolic syndrome (−0·18 (se 0·08) %) relative to those without the metabolic syndrome (0·05 (se 0·05) %) (P-for-heterogeneity = 0·02). Improvements in HbA1c, total and LDL-cholesterol were observed in the brown rice group among participants with a BMI ≥ 25 kg/m2 compared with those with a BMI < 25 kg/m2 (P-for-heterogeneity < 0·05). We observed a smaller increase in hs-CRP in the brown (0·03 (sd 2·12) mg/l) compared with white rice group (0·63 (sd 2·35) mg/l) (P = 0·04). In conclusion, substituting brown rice for white rice showed a potential benefit on HbA1c among participants with the metabolic syndrome and an elevated BMI. A small benefit on inflammation was also observed.
Young people with 22q11.2 deletion syndrome (22q11.2DS) are at high risk for neurodevelopmental disorders. Sleep problems may play a role in this risk but their prevalence, nature and links to psychopathology and cognitive function remain undescribed in this population.
Sleep problems, psychopathology, developmental coordination and cognitive function were assessed in 140 young people with 22q11.2DS (mean age = 10.1, s.d. = 2.46) and 65 unaffected sibling controls (mean age = 10.8, s.d. = 2.26). Primary carers completed questionnaires screening for the children's developmental coordination and autism spectrum disorder.
Sleep problems were identified in 60% of young people with 22q11.2DS compared to 23% of sibling controls (OR 5.00, p < 0.001). Two patterns best-described sleep problems in 22q11.2DS: restless sleep and insomnia. Restless sleep was linked to increased ADHD symptoms (OR 1.16, p < 0.001) and impaired executive function (OR 0.975, p = 0.013). Both patterns were associated with elevated symptoms of anxiety disorder (restless sleep: OR 1.10, p = 0.006 and insomnia: OR 1.07, p = 0.045) and developmental coordination disorder (OR 0.968, p = 0.0023, and OR 0.955, p = 0.009). The insomnia pattern was also linked to elevated conduct disorder symptoms (OR 1.53, p = 0.020).
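As a back-of-envelope check, the two quoted prevalences (60% vs 23%) alone reproduce the reported odds ratio of about 5:

```python
# Odds ratio recomputed from the prevalences quoted in the abstract:
# 60% sleep problems in 22q11.2DS vs 23% in sibling controls.
def odds(p: float) -> float:
    return p / (1 - p)

or_22q = odds(0.60) / odds(0.23)
print(round(or_22q, 2))  # 5.02
```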
Clinicians and carers should be aware that sleep problems are common in 22q11.2DS and index psychiatric risk, cognitive deficits and motor coordination problems. Future studies should explore the physiology of sleep and its links with neurodevelopment in these young people.
The longstanding association between the major histocompatibility complex (MHC) locus and schizophrenia (SZ) risk has recently been accounted for, partially, by structural variation at the complement component 4 (C4) gene. This structural variation generates varying levels of C4 RNA expression, and genetic information from the MHC region can now be used to predict C4 RNA expression in the brain. Increased predicted C4A RNA expression is associated with the risk of SZ, and C4 is reported to influence synaptic pruning in animal models.
Based on our previous studies associating MHC SZ risk variants with poorer memory performance, we tested whether increased predicted C4A RNA expression was associated with reduced memory function in a large (n = 1238) dataset of psychosis cases and healthy participants, and with altered task-dependent cortical activation in a subset of these samples.
We observed that increased predicted C4A RNA expression predicted poorer performance on measures of memory recall (p = 0.016, corrected). Furthermore, in healthy participants, we found that increased predicted C4A RNA expression was associated with a pattern of reduced cortical activity in middle temporal cortex during a measure of visual processing (p < 0.05, corrected).
These data suggest that the effects of C4 on cognition were observable at both a cortical and behavioural level, and may represent one mechanism by which illness risk is mediated. As such, deficits in learning and memory may represent a therapeutic target for new molecular developments aimed at altering C4’s developmental role.
A range of endophenotypes characterise psychosis; however, there has been limited work on whether and how they are interrelated.
This multi-centre study includes 8754 participants: 2212 people with a psychotic disorder, 1487 unaffected relatives of probands, and 5055 healthy controls. We investigated cognition [digit span (N = 3127), block design (N = 5491), and the Rey Auditory Verbal Learning Test (N = 3543)], electrophysiology [P300 amplitude and latency (N = 1102)], and neuroanatomy [lateral ventricular volume (N = 1721)]. We used linear regression to assess the interrelationships between endophenotypes.
The P300 amplitude and latency were not associated (regression coef. −0.06, 95% CI −0.12 to 0.01, p = 0.060), and P300 amplitude was positively associated with block design (coef. 0.19, 95% CI 0.10–0.28, p < 0.001). There was no evidence of associations between lateral ventricular volume and the other measures (all p > 0.38). All the cognitive endophenotypes were associated with each other in the expected directions (all p < 0.001). Lastly, the relationships between pairs of endophenotypes were consistent in all three participant groups, differing for some of the cognitive pairings only in the strengths of the relationships.
The P300 amplitude and latency are independent endophenotypes; the former indexes spatial visualisation and working memory, while the latter is hypothesised to index basic processing speed. Individuals with psychotic illnesses, their unaffected relatives, and healthy controls all show similar patterns of associations between endophenotypes, endorsing the theory of a continuum of psychosis liability across the population.
Burn patients are particularly vulnerable to infection, and an estimated half of all burn deaths are due to infections. This study explored risk factors for healthcare-associated infections (HAIs) in adult burn patients.
Retrospective cohort study.
Tertiary-care burn center.
Adults (≥18 years old) admitted with burn injury for at least 2 days between 2004 and 2013.
HAIs were determined in real-time by infection preventionists using Centers for Disease Control and Prevention criteria. Multivariable Cox proportional hazards regression was used to estimate the direct effect of each risk factor on time to HAI, with inverse probability of censor weights to address potentially informative censoring. Effect measure modification by burn size was also assessed.
Overall, 4,426 patients met inclusion criteria, and 349 (7.9%) patients had at least 1 HAI within 60 days of admission. Compared to <5% total body surface area (TBSA), patients with 5%–10% TBSA were almost 3 times as likely to acquire an HAI (hazard ratio [HR], 2.92; 95% CI, 1.63–5.23); patients with 10%–20% TBSA were >6 times as likely to acquire an HAI (HR, 6.38; 95% CI, 3.64–11.17); and patients with >20% TBSA were >10 times as likely to acquire an HAI (HR, 10.33; 95% CI, 5.74–18.60). Patients with inhalational injury were 1.5 times as likely to acquire an HAI (HR, 1.61; 95% CI, 1.17–2.22). The effect of inhalational injury (P=.09) appeared to be larger among patients with ≤20% TBSA.
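The hazard ratios above come from a weighted Cox model. As a much cruder, hypothetical analogue, an unadjusted incidence-rate ratio from made-up event counts and follow-up time shows the same style of comparison (a Cox model additionally handles covariates and censoring, and the inverse-probability weighting addresses informative censoring):

```python
# Crude rate-ratio sketch with invented counts (not study data): events
# per person-day of follow-up in each burn-size group, then the ratio
# against the reference group.
def rate(events: int, person_days: float) -> float:
    return events / person_days

small = rate(events=20, person_days=40_000)  # <5% TBSA (reference)
large = rate(events=55, person_days=11_000)  # >20% TBSA
print(round(large / small, 1))  # 10.0
```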
Larger burns and inhalational injury were associated with increased incidence of HAIs. Future research should use these risk factors to identify potential interventions.
Over the past 30 years, the number of US doctoral anthropology graduates has increased by about 70%, but there has not been a corresponding increase in the availability of new faculty positions. Consequently, doctoral degree-holding archaeologists face more competition than ever before when applying for faculty positions. Here we examine where US and Canadian anthropological archaeology faculty originate and where they ultimately end up teaching. Using data derived from the 2014–2015 AnthroGuide, we rank doctoral programs whose graduates in archaeology have been most successful in the academic job market; identify long-term and ongoing trends in doctoral programs; and discuss gender division in academic archaeology in the US and Canada. We conclude that success in obtaining a faculty position upon graduation is predicated in large part on where one attends graduate school.
We studied neuroinflammation in individuals with late-life depression, as a risk factor for dementia, using [11C]PK11195 positron emission tomography (PET). Five older participants with major depression and 13 controls underwent PET and multimodal 3T magnetic resonance imaging (MRI), with blood taken to measure C-reactive protein (CRP). We found significantly higher CRP levels in those with late-life depression and raised [11C]PK11195 binding compared with controls in brain regions associated with depression, including the subgenual anterior cingulate cortex, and significant hippocampal subfield atrophy in cornu ammonis 1 and subiculum. Our findings suggest neuroinflammation requires further investigation in late-life depression, both as a possible aetiological factor and a potential therapeutic target.
Universal screening for postpartum depression is recommended in many countries. Knowledge of whether the disclosure of depressive symptoms in the postpartum period differs across cultures could improve detection and provide new insights into the pathogenesis. Moreover, it is a necessary step to evaluate the universal use of screening instruments in research and clinical practice. In the current study we sought to assess whether the Edinburgh Postnatal Depression Scale (EPDS), the most widely used screening tool for postpartum depression, measures the same underlying construct across cultural groups in a large international dataset.
Ordinal regression and measurement invariance were used to explore the association between culture, operationalized as education, ethnicity/race and continent, and endorsement of depressive symptoms using the EPDS on 8209 new mothers from Europe and the USA.
Education, but not ethnicity/race, influenced the reporting of postpartum depression [difference between robust comparative fit indexes (∆*CFI) < 0.01]. The structure of EPDS responses significantly differed between Europe and the USA (∆*CFI > 0.01), but not between European countries (∆*CFI < 0.01).
Investigators and clinicians should be aware of the potential differences in expression of phenotype of postpartum depression that women of different educational backgrounds may manifest. The increasing cultural heterogeneity of societies together with the tendency towards globalization requires a culturally sensitive approach to patients, research and policies, that takes into account, beyond rhetoric, the context of a person's experiences and the context in which the research is conducted.
The anticipated release of Enlist™ cotton, corn, and soybean cultivars will likely increase the use of 2,4-D, raising concerns over potential injury to susceptible cotton. An experiment was conducted at 12 locations over 2013 and 2014 to determine the impact of 2,4-D at rates simulating drift (2 g ae ha−1) and tank contamination (40 g ae ha−1) on cotton during six different growth stages. Growth stages at application included four leaf (4-lf), nine leaf (9-lf), first bloom (FB), FB + 2 wk, FB + 4 wk, and FB + 6 wk. Locations were grouped according to percent yield loss compared to the nontreated check (NTC), with group I having the least yield loss and group III the most. Epinasty from 2,4-D was more pronounced with applications during vegetative growth stages. Importantly, yield loss did not correlate with visual symptomology but more closely followed effects on boll number. The contamination rate at 9-lf, FB, or FB + 2 wk had the greatest effect across locations, reducing the number of bolls per plant when compared to the NTC, with no effect when applied at FB + 4 wk or later. A reduction in boll number was not detectable with the drift rate except in group III when applied at the FB stage. Yield was influenced by 2,4-D rate and stage of cotton growth. Over all locations, a yield loss of greater than 20% occurred at 5 of 12 locations when the drift rate was applied between 4-lf and FB + 2 wk (highest impact at FB). For the contamination rate, yield loss was observed at all 12 locations; averaged over these locations, yield loss ranged from 7% to 66% across all growth stages. Results suggest the greatest yield impact from 2,4-D occurs between 9-lf and FB + 2 wk, and the level of impact is influenced by 2,4-D rate, crop growth stage, and environmental conditions.