Subsidised or cost-offset community-supported agriculture (CO-CSA) connects farms directly to low-income households and can improve fruit and vegetable intake. This analysis identifies factors associated with participation in CO-CSA.
Design:
Farm Fresh Foods for Healthy Kids (F3HK) provided a half-price, summer CO-CSA plus healthy eating classes to low-income households with children. Community characteristics (population, socio-demographics and health statistics) and CO-CSA operational practices (share sizes, pick up sites, payment options and produce selection) are described and associations with participation levels are examined.
Setting:
Ten communities in New York (NY), North Carolina (NC), Vermont and Washington states in the USA.
Participants:
Caregiver–child dyads enrolled in spring 2016 or 2017.
Results:
Residents of micropolitan communities had more education and less poverty than residents of small towns. The one rural location (NC2) had the fewest college graduates (10 %), the most poverty (23 %) and the poorest health statistics. Most F3HK participants were white, except in NC, where 45·2 % were African American. CO-CSA participation varied significantly across communities, from 33 % (NC2) to 89 % (NY1) of weeks picked up. Most CO-CSA farms offered multiple share sizes (69·2 %), and participation was higher where this was offered (76·8 % v. 57·7 % of weeks); 53·8 % offered a community pick-up location, and participation in these communities was lower than elsewhere (64·7 % v. 78·2 % of weeks).
Conclusion:
CO-CSA programmes should consider offering a choice of share sizes and innovate to address potential barriers such as rural location and limited education and income among residents. Future research is needed to better understand barriers to participation, particularly among participants utilising community pick up locations.
The remarkable archaeological record of Neolithic Orkney has ensured that these islands play a prominent role in narratives of European late prehistory, yet knowledge of the subsequent Bronze Age is comparatively poor. The Bronze Age settlement and cemetery at the Links of Noltland, on the island of Westray, offers new evidence, including aDNA, that points to a substantial population replacement between the Late Neolithic and Bronze Age. Focusing on funerary practice, the authors argue for interconnecting identities centred on household and community, patrilocality and inheritance. The findings prompt a reconsideration of the Orcadian Bronze Age, with wider implications for population movement and the uptake of cultural innovations more widely across prehistoric north-western Europe.
This study investigated how bilingual experience alters neural mechanisms supporting novel word learning. We hypothesised that novel words elicit increased semantic activation in the larger bilingual lexicon, potentially stimulating stronger memory integration than in monolinguals. English monolinguals and Spanish–English bilinguals were trained on two sets of written Swahili–English word pairs, one set on each of two consecutive days, and performed a recognition task in the MRI scanner. Lexical integration was measured through visual primed lexical decision. Surprisingly, no group difference emerged in explicit word memory, and priming occurred only in the monolingual group. This difference in lexical integration may indicate an increased need for slow neocortical interleaving of old and new information in the denser bilingual lexicon. The fMRI data were consistent with increased use of cognitive control networks in monolinguals and of articulatory motor processes in bilinguals, providing further evidence for experience-induced neural changes: monolinguals and bilinguals reached largely comparable behavioural performance levels in novel word learning, but did so by recruiting partially overlapping but non-identical neural systems to acquire novel words.
We present continuous estimates of snow and firn density, layer depth and accumulation from a multi-channel, multi-offset, ground-penetrating radar traverse. Our method uses the electromagnetic velocity, estimated from waveform travel-times measured at common-midpoints between sources and receivers. Previously, common-midpoint radar experiments on ice sheets have been limited to point observations. We completed radar velocity analysis in the upper ~2 m to estimate the surface and average snow density of the Greenland Ice Sheet. We parameterized the Herron and Langway (1980) firn density and age model using the radar-derived snow density, radar-derived surface mass balance (2015–2017) and reanalysis-derived temperature data. We applied structure-oriented filtering to the radar image along constant age horizons and increased the depth at which horizons could be reliably interpreted. We reconstructed the historical instantaneous surface mass balance, which we averaged into annual and multidecadal products along a 78 km traverse for the period 1984–2017. We found good agreement between our physically constrained parameterization and a firn core collected from the dry snow accumulation zone, and gained insights into the spatial correlation of surface snow density.
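The velocity-to-density step described above can be sketched briefly. The abstract does not state which petrophysical relation was used, so as an illustrative assumption this sketch adopts the widely used Kovacs et al. (1995) dry-snow formula, eps = (1 + 0.845·rho)², with rho in g/cm³, together with v = c/√eps for the radar wave speed:

```python
# Minimal sketch of converting a radar-derived EM wave speed in dry
# snow/firn to density. Assumption: the Kovacs et al. (1995)
# permittivity-density relation, not necessarily the one used in the
# study described above.

C = 0.2998  # speed of light in vacuum, m/ns


def density_from_velocity(v_m_per_ns):
    """Dry snow/firn density (kg/m^3) from EM wave speed (m/ns)."""
    eps = (C / v_m_per_ns) ** 2              # real relative permittivity
    rho_g_cm3 = (eps ** 0.5 - 1.0) / 0.845   # invert the Kovacs relation
    return rho_g_cm3 * 1000.0                # g/cm^3 -> kg/m^3


# A typical near-surface snow velocity of ~0.23 m/ns maps to roughly
# 350-360 kg/m^3, a plausible surface snow density.
surface_density = density_from_velocity(0.23)
```

Slower wave speeds map to denser firn, which is what allows common-midpoint travel-time analysis to constrain the density profile.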
New guidelines for peanut allergy prevention in high-risk infants recommend introducing peanut during infancy but do not address breastfeeding or maternal peanut consumption. We assessed the independent and combined association of these factors with peanut sensitization in the general population CHILD birth cohort (N = 2759 mother–child dyads). Mothers reported peanut consumption during pregnancy, timing of first infant peanut consumption, and breastfeeding duration. Child peanut sensitization was determined by skin prick testing at 1, 3, and 5 years. Overall, 69% of mothers regularly consumed peanuts and 36% of infants were fed peanut in the first year (20% while breastfeeding and 16% after breastfeeding cessation). Infants who were introduced to peanut early (before 1 year) after breastfeeding cessation had a 66% reduced risk of sensitization at 5 years compared to those who were not (1.9% vs. 5.8% sensitization; aOR 0.34, 95% CI 0.14–0.68). This risk was further reduced if mothers introduced peanut early while breastfeeding and regularly consumed peanut themselves (0.3% sensitization; aOR 0.07, 0.01–0.25). In longitudinal analyses, these associations were driven by a higher odds of outgrowing early sensitization and a lower odds of late-onset sensitization. There was no apparent benefit (or harm) from maternal peanut consumption without breastfeeding. Taken together, these results suggest the combination of maternal peanut consumption and breastfeeding at the time of peanut introduction during infancy may help to decrease the risk of peanut sensitization. Mechanistic and clinical intervention studies are needed to confirm and understand this “triple exposure” hypothesis.
Treatment resistance causes significant burden in psychosis. Clozapine is the only evidence-based pharmacologic intervention available for people with treatment-resistant schizophrenia; current guidelines recommend commencement after two unsuccessful trials of standard antipsychotics.
Aims
This paper aims to explore the prevalence of treatment resistance and pathways to commencement of clozapine in UK early intervention in psychosis (EIP) services.
Method
Data were taken from the National Evaluation of the Development and Impact of Early Intervention Services study (N = 1027) and included demographics, medication history and psychosis symptoms measured by the Positive and Negative Syndrome Scale (PANSS) at baseline, 6 months and 12 months. Prescribing patterns and pathways to clozapine were examined. We adopted a strict criterion for treatment resistance, defined as persistent elevated positive symptoms (a PANSS positive score ≥16, equating to at least two items of at least moderate severity), across three time points.
Results
A total of 143 (18.1%) participants met the definition of treatment resistance, with continuous positive symptoms over 12 months despite treatment in EIP services. Sixty-one (7.7%) participants were treatment resistant and eligible for clozapine, having had two trials of standard antipsychotics; however, only 25 (2.4%) were prescribed clozapine over the 12-month study period. Treatment-resistant participants were more likely to be prescribed additional antipsychotic medication and polypharmacy, instead of clozapine.
Conclusions
Prevalent treatment resistance was observed in UK EIP services, but prescription of polypharmacy was much more common than clozapine. Significant delays in the commencement of clozapine may reflect a missed opportunity to promote recovery in this critical period.
Hurricane Sandy made landfall in New Jersey on October 29, 2012, resulting in widespread power outages and gasoline shortages. These events led to potentially toxic exposures and the need for information related to poisons/toxins in the environment. This report characterizes the New Jersey Poison Information and Education System (NJPIES) call patterns in the days immediately preceding, during, and after Hurricane Sandy to identify areas in need of public health education and prevention.
Methods:
We examined NJPIES case data from October through December 2012. Most Sandy-related calls had been coded as such by NJPIES staff. Additional Sandy-related cases were identified by performing a case narrative review. Descriptive analyses were performed for timing, case frequencies, exposure substances, gender, caller site, type of information requests, and other data.
Results:
The most frequent Sandy-related exposures were gasoline and carbon monoxide (CO). Gasoline exposure cases were predominantly male, and CO exposure cases predominantly female (P < 0.0001). Other leading reasons for Sandy-related calls were poison information, food poisoning/spoilage information, and water contamination.
Conclusions:
This analysis identified the need for enhanced public health education and intervention to improve the handling of gasoline and encourage the proper use of gasoline-powered generators and cleaning and cooking equipment, thus reducing toxic exposures.
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
Methods:
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Results:
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
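The trade-off the sensitivity analyses explore can be illustrated with a toy probabilistic sketch: sample uncertain study characteristics, compare manual-abstraction cost against a registry-linkage cost, and estimate how often the registry design saves money. This is not the authors' Markov model; every parameter range below is an illustrative assumption.

```python
import random

# Toy probabilistic sensitivity sketch (hypothetical parameters, not
# the study's actual cost model): compare per-study data costs under
# manual chart abstraction vs. registry linkage.


def simulate_once(rng):
    n_patients = rng.randint(200, 5000)
    n_elements = rng.randint(20, 200)        # data elements per patient
    sec_per_field = rng.uniform(3.0, 30.0)   # manual abstraction speed
    wage_per_hour = rng.uniform(25.0, 60.0)  # coordinator wage

    manual_cost = n_patients * n_elements * sec_per_field / 3600 * wage_per_hour
    registry_cost = 50_000 + 2.0 * n_patients  # fixed linkage + per-record cleaning
    return manual_cost - registry_cost         # > 0 means the registry saves money


def savings_probability(n_sims=10_000, seed=1):
    """Fraction of simulated studies in which the registry design is cheaper."""
    rng = random.Random(seed)
    wins = sum(simulate_once(rng) > 0 for _ in range(n_sims))
    return wins / n_sims
```

Varying one input at a time over its range while holding the others fixed would reproduce the one-way threshold analyses the abstract describes.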
Conclusions:
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential costs savings.
Heat stress is a global issue constraining pig productivity, and it is likely to intensify under future climate change. Technological advances in earth observation have made tools available that enable the identification and mapping of livestock species at risk of exposure to heat stress due to climate change. Here, we present a methodology to map the current and likely future heat stress risk in pigs using R software by combining the effects of temperature and relative humidity. We applied the method to growing-finishing pigs in Uganda. We mapped monthly heat stress risk and quantified the number of pigs exposed to heat stress using 18 global circulation models and projected impacts in the 2050s. Results show that more than 800 000 pigs in Uganda will be affected by heat stress in the future. The results can feed into evidence-based policy, planning and targeted resource allocation in the livestock sector.
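Combining temperature and relative humidity into a single heat-stress measure is typically done with a temperature–humidity index (THI). The abstract does not name the exact index used, so the classic NRC-style formula below is an assumption, and the category thresholds are purely illustrative (published pig thresholds vary with growth stage and body weight):

```python
def thi(temp_c, rh_percent):
    """Temperature-Humidity Index (classic NRC-style formula; whether
    this is the exact index used in the study is an assumption)."""
    t_f = 1.8 * temp_c + 32.0                            # convert to Fahrenheit
    return t_f - (0.55 - 0.0055 * rh_percent) * (t_f - 58.0)


def heat_stress_class(temp_c, rh_percent):
    """Map THI to a stress category (thresholds illustrative only)."""
    v = thi(temp_c, rh_percent)
    if v < 75:
        return "normal"
    if v < 79:
        return "alert"
    if v < 84:
        return "danger"
    return "emergency"
```

Applying such a function cell by cell to gridded temperature and humidity projections, as the study does with 18 global circulation models, yields monthly heat-stress risk maps.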
Major depression is a significant problem for people with a traumatic brain injury (TBI) and its treatment remains difficult. A promising approach to treating depression is mindfulness-based cognitive therapy (MBCT), a relatively new therapeutic approach rooted in mindfulness-based stress reduction (MBSR) and cognitive behavioral therapy (CBT). We conducted this study to examine the effectiveness of MBCT in reducing depression symptoms among people who have a TBI.
Methods:
Twenty individuals diagnosed with major depression were recruited from a rehabilitation clinic and completed the 8-week MBCT intervention. Instruments used to measure depression symptoms included: BDI-II, PHQ-9, HADS, SF-36 (Mental Health subscale), and SCL-90 (Depression subscale). They were completed at baseline and post-intervention.
Results:
All instruments indicated a statistically significant reduction in depression symptoms post-intervention (p < .05). For example, the total mean score on the BDI-II decreased from 25.2 (9.8) at baseline to 18.2 (11.7) post-intervention (p = .001). Using a PHQ threshold of 10, the proportion of participants with a diagnosis of major depression was reduced by 59% at follow-up (p = .012).
Conclusions:
Most participants reported reductions in depression symptoms after the intervention such that many would not meet the criteria for a diagnosis of major depression. This intervention may provide an opportunity to address a debilitating aspect of TBI and could be implemented concurrently with more traditional forms of treatment, possibly enhancing their success. The next step will involve the execution of multi-site, randomized controlled trials to fully demonstrate the value of the intervention.
Schizophrenia is a devastating mental disorder with diverse dimensions of symptoms such as delusions, hallucinations, affective symptoms and alterations in cognition. Declarative memory deficits are among the most important factors leading to poor functional outcomes in this disorder. It has recently been suggested that sleep disturbances in patients with schizophrenia might contribute to these memory impairments (Manoach et al. 2009; Ferrarelli et al. 2010; Lu and Göder 2012). In young healthy subjects it was shown that declarative memory consolidation was enhanced by inducing slow oscillation-like potential fields during sleep (Marshall et al. 2006). In the present study, slow oscillatory transcranial direct current stimulation (so-tDCS) was applied to 14 patients with schizophrenia on stable medication with a mean age of 33 years. The main effects of so-tDCS in comparison to sham stimulation were an enhancement in declarative memory retention and an increase in mood after sleep. In conclusion, so-tDCS offers an interesting approach for studying the relationship of sleep and memory in psychiatric disorders and could possibly improve disturbed memory processing in patients with schizophrenia.
Traumatic brain injuries (TBI) may lead to persistent depression symptoms. We conducted several pilot studies to examine the efficacy of mindfulness-based interventions to deal with this issue; all showed strong effect sizes. The logical next step was to conduct a randomized controlled trial (RCT).
Objective
We sought to determine the efficacy of mindfulness-based cognitive therapy for people with depression symptoms post-TBI (MBCT-TBI).
Methods
Using a multi-site RCT design, participants (mean age = 47) were randomized to intervention or control arms. Treatment participants received a group-based, 10-week intervention; control participants waited. Outcome measures, administered pre- and post-intervention, and after three months, included: Beck Depression Inventory-II (BDI-II), Patient Health Questionnaire-9 (PHQ-9), and Symptom Checklist-90-Revised (SCL-90-R). The Philadelphia Mindfulness Scale (PHLMS) captured present moment awareness and acceptance.
Results
BDI-II scores decreased from 25.47 to 18.84 in the treatment group, while they remained relatively stable in the control group (27.13 to 25.00, respectively; p = .029). We did not find statistically significant differences on the PHQ-9 or SCL-90-R post-treatment. However, after three months, all scores were statistically significantly lower than at baseline (ps < .01). Increases in mindfulness were associated with decreases in BDI-II scores (r[29] = -.401, p = .025).
Conclusions
MBCT-TBI may alleviate depression symptoms up to three months post-intervention. Greater mindfulness may have contributed to the reduction in depression symptoms although the association does not confirm causality. More work is required to replicate these findings, identify subgroups that may better respond to the intervention, and refine the intervention to maximize its effectiveness.
Donkeys facilitated trade and transport in much of the ancient world, but were seldom used in elite or leisure activities. While Tang Dynasty (AD 618–907) texts indicate that noble women played polo riding donkeys, this has never been documented archaeologically. Here, the authors present the first archaeological evidence of the significance of donkeys for elite Tang women through analyses of donkey remains recovered from the tomb of a Tang noblewoman in Xi'an, China. These findings broaden our understanding of the donkey's historic roles beyond simple load bearing.
Healthcare personnel (HCP) were recruited to provide serum samples, which were tested for antibodies against Ebola or Lassa virus to evaluate for asymptomatic seroconversion.
Setting:
From 2014 to 2016, 4 patients with Ebola virus disease (EVD) and 1 patient with Lassa fever (LF) were treated in the Serious Communicable Diseases Unit (SCDU) at Emory University Hospital. Strict infection control and clinical biosafety practices were implemented to prevent nosocomial transmission of EVD or LF to HCP.
Participants:
All personnel who entered the SCDU, who were required to measure their temperature and complete a symptom questionnaire twice daily, were eligible.
Results:
No employee developed symptomatic EVD or LF. EVD and LF antibody studies were performed on serum samples from 42 HCP. The 6 participants who had received investigational vaccination with a chimpanzee adenovirus type 3 vectored Ebola glycoprotein vaccine had high antibody titers to Ebola glycoprotein, but none had a response to Ebola nucleoprotein or VP40, or a response to LF antigens.
Conclusions:
Patients infected with filoviruses and arenaviruses can be managed successfully without causing occupation-related symptomatic or asymptomatic infections. Meticulous attention to infection control and clinical biosafety practices by highly motivated, trained staff is critical to the safe care of patients with an infection from a special pathogen.
We sought to address prior limitations of symptom checker accuracy by analysing the diagnostic and triage performance of online symptom checkers using a consecutive series of real-life emergency department (ED) patient encounters, and by addressing a complex patient population: those with hepatitis C or HIV. We aimed to study the diagnostic and triage accuracy of these symptom checkers against an emergency physician-determined diagnosis. A retrospective analysis was performed on 8363 consecutive adult ED patients. Eligible patients included 90 with HIV, 67 with hepatitis C, and 11 with both HIV and hepatitis C. Five online symptom checkers were used for diagnosis (Mayo Clinic, WebMD, Symptomate, Symcat, Isabel), three of which had triage capabilities. Symptom checker output was compared with the ED physician-determined diagnosis with regard to diagnostic accuracy and differential diagnosis listing, along with triage advice. All symptom checkers, whether for combined HIV and hepatitis C, HIV alone or hepatitis C alone, had poor diagnostic accuracy: Top1 (<20%), Top3 (<35%), Top10 (<40%), Listed at All (<45%). Significant variation existed across individual symptom checkers: some were more accurate at listing the diagnosis at the top of the differential, whereas others were more apt to list the diagnosis at all. With regard to ED triage data, a significantly higher percentage of hepatitis C patients (59.7%; 40/67) had an initial diagnosis meeting emergent criteria than HIV patients (35.6%; 32/90). Symptom checker diagnostic capabilities are markedly inferior to physician diagnostic capabilities. Complex patients such as those with HIV or hepatitis C may carry a more specific differential diagnosis, warranting symptom checkers with diagnostic algorithms that account for such complexity.
Symptom checkers carry the potential for real-time epidemiologic monitoring of patient symptoms, as symptom entries and subsequent symptom checker diagnosis could allow health officials a means to track illnesses in specific patient populations and geographic regions. In order to do this, accurate and reliable symptom checkers are warranted.
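The Top1/Top3/Top10 metrics above have a simple definition: the fraction of cases whose physician-determined diagnosis appears within the first k entries of the checker's ranked differential. A minimal sketch, with made-up example data purely to illustrate the metric:

```python
def top_k_accuracy(cases, k):
    """Fraction of cases whose true diagnosis appears in the top-k of the
    ranked differential. `cases` is a list of
    (true_diagnosis, ranked_differential) pairs."""
    if not cases:
        return 0.0
    hits = sum(1 for truth, ranked in cases if truth in ranked[:k])
    return hits / len(cases)


# Hypothetical illustrative cases, not data from the study above.
cases = [
    ("pneumonia", ["influenza", "pneumonia", "bronchitis"]),
    ("cellulitis", ["gout", "DVT", "sprain"]),           # missed entirely
    ("migraine", ["migraine", "tension headache"]),
]
top1 = top_k_accuracy(cases, 1)   # only the migraine case hits at k=1
top3 = top_k_accuracy(cases, 3)   # pneumonia and migraine hit at k=3
```

"Listed at All" is the same computation with k set to the full length of the differential.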
Chronic spontaneous urticaria (CSU) has been associated with depression and can have an impact on quality of life. Therefore, researchers have suggested the potential utility of psychological interventions for targeting depression among CSU patients. Psychological interventions that may hold the most promise are those that are brief and easily transportable, such as brief behavioural activation treatment for depression. We report results of a preliminary investigation of an uncontrolled open trial of a one-session behavioural activation treatment for depression designed for patients with CSU (BATD-CSU) at a university-based allergy and immunology clinic. Participants were 11 females with chronic, poorly controlled urticaria and symptoms of depression. Following the completion of pretreatment questionnaires, participants were administered BATD-CSU primarily by non-mental health professionals trained and supervised in its delivery. One month post-BATD-CSU, participants completed follow-up questionnaires. Participants exhibited significant reductions in depression severity, avoidance/rumination, and work/school impairment. BATD-CSU was also associated with improvements in urticaria control one month post-treatment. Moreover, five of nine patients reported reliable and clinically significant improvement on at least one outcome. Results demonstrate that BATD-CSU may have benefits for CSU patients even when consisting of one session and delivered by professionals with limited background in psychological interventions, thus speaking to its feasibility and transportability.
Dengue is the fastest spreading mosquito-transmitted disease in the world. In China, Guangzhou City is believed to be the most important epicenter of dengue outbreaks, although the transmission patterns are still poorly understood. We developed an autoregressive integrated moving average model incorporating external regressors to examine the association between the monthly number of locally acquired dengue infections and imported cases, mosquito densities, temperature and precipitation in Guangzhou. In multivariate analysis, imported cases and minimum temperature (both at lag 0) were both associated with the number of locally acquired infections (P < 0.05). This multivariate model performed best, with the lowest fitting root mean squared error (RMSE) (0.7520), AIC (393.7854) and test RMSE (0.6445), as well as the best outbreak-detection performance in model validation, with a sensitivity of 1.0000, a specificity of 0.7368 and a consistency rate of 0.7917. Our findings suggest that imported cases and minimum temperature are two key determinants of local dengue transmission in Guangzhou. The modelling method can be used to predict dengue transmission in non-endemic countries and to inform dengue prevention and control strategies.
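The core of the association analysis, regressing locally acquired cases on imported cases and minimum temperature, can be sketched in simplified form. The study fit an ARIMA model with these variables as external regressors; plain ordinary least squares, shown here with the RMSE metric used for model comparison, is a deliberately simplified stand-in, and the demo data are synthetic:

```python
def solve(a, b):
    """Solve a small dense linear system a x = b
    (Gaussian elimination with partial pivoting)."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x


def fit_regression(imported, min_temp, local):
    """OLS of local cases on imported cases and minimum temperature,
    with intercept (a simplified stand-in for the ARIMA-with-regressors
    model in the study). Returns [intercept, beta_imported, beta_temp]."""
    X = [[1.0, im, t] for im, t in zip(imported, min_temp)]
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    xty = [sum(X[k][i] * local[k] for k in range(len(X))) for i in range(3)]
    return solve(xtx, xty)


def rmse(y_true, y_pred):
    return (sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)) ** 0.5


# Synthetic demo: local = 2 + 3*imported + 0.5*min_temp, exactly.
imported = [1, 2, 3, 4, 5, 6]
min_temp = [10, 12, 9, 15, 14, 11]
local = [2 + 3 * i + 0.5 * t for i, t in zip(imported, min_temp)]
beta = fit_regression(imported, min_temp, local)
```

The full time-series model additionally handles autocorrelation in the monthly case counts, which is what the ARIMA terms contribute on top of the regression on external covariates.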