Introduction: The opioid crisis has reached epidemic levels in Canada, driven in large part by prescription drug use. Emergency physicians are frequent prescribers of opioids; therefore, the emergency department (ED) represents an important setting for potential intervention to encourage rational and safe prescribing. The objective of this study was to systematically review the literature on interventions aimed at influencing opioid prescribing in the ED. Methods: Electronic searches of Medline and Cochrane were conducted and reference lists were hand-searched. All quantitative studies published in English from 2009 to 2019 were eligible for inclusion. Two reviewers independently screened the search output to identify potentially eligible studies, the full texts of which were retrieved and assessed for inclusion. Outcomes of interest included opioid prescribing rate (proportion of ED visits resulting in an opioid prescription at discharge), morphine milligram equivalents per prescription and variability among prescribers. Results: The search strategy yielded 797 potentially relevant citations. After eliminating duplicate citations and studies that did not meet eligibility criteria, 34 potentially relevant studies were retrieved in full text. Of these, 28 studies were included in the review. The majority (26, 92.9%) of studies were based in the United States and two (7.1%) were from Australia. Four (14.3%) were randomized controlled trials. The interventions were classified into six categories: prescribing guidelines (n = 10), regulation/rescheduling of opioids (n = 6), prescribing data transparency (n = 4), education (n = 4), care coordination (n = 3), and electronic medical record changes (n = 1). The majority of interventions reduced the opioid prescribing rate from the ED (21/28, 75.0%), although regulation/rescheduling of opioids had mixed effectiveness, with 3/6 (50%) studies reporting a small increase in the opioid prescribing rate post-intervention.
Education had small yet consistent effects on reducing the opioid prescribing rate. Conclusion: A variety of interventions have attempted to improve opioid prescribing from the ED. These interventions include prescribing guidelines, regulation/rescheduling, data transparency, education, care coordination, and electronic medical record changes. The majority of interventions reduced the opioid prescribing rate; however, regulation/rescheduling of opioids demonstrated mixed effectiveness.
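One outcome measure in the review, morphine milligram equivalents (MME) per prescription, is simple arithmetic: dose strength × quantity × an opioid-specific conversion factor. A minimal sketch follows, using the widely cited CDC conversion factors; the function name and the example prescription are invented for illustration, not drawn from the review:

```python
# MME arithmetic. Conversion factors follow the commonly used CDC
# table (morphine 1.0, oxycodone 1.5, hydrocodone 1.0, codeine 0.15).
MME_FACTOR = {"morphine": 1.0, "oxycodone": 1.5,
              "hydrocodone": 1.0, "codeine": 0.15}

def prescription_mme(drug, mg_per_dose, doses):
    """Total morphine milligram equivalents for one prescription."""
    return mg_per_dose * doses * MME_FACTOR[drug]

# Hypothetical ED discharge prescription: oxycodone 5 mg, 12 tablets.
print(prescription_mme("oxycodone", 5, 12))  # 90.0 MME total
```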
Australian conservation cropping systems are practiced on very large farms (approximately 3,000 ha) where herbicides are relied on for effective and timely weed control. In many fields, though, weed densities are low (e.g., <1.0 plant 10 m−2) and whole-field herbicide treatments are wasteful. For fallow weed control, commercially available weed detection systems provide the opportunity for site-specific herbicide treatments, removing the need for whole-field treatment of fallow fields with low weed densities. Concerns about the sustainability of herbicide-reliant weed management systems remain, however, and little attention has been paid to pairing weed detection systems with alternative weed control technologies, such as targeted tillage. In this paper, we discuss the use of a targeted tillage technique for site-specific weed control in large-scale crop production systems. Three small-scale prototypes were used for engineering and weed control efficacy testing across a range of species and growth stages. With confidence established in the design approach and a demonstrated 100% weed-control potential, a 6-m-wide pre-commercial prototype, the “Weed Chipper,” was built incorporating commercially available weed-detection cameras for practical field-scale evaluation. This testing confirmed very high (90%) weed control efficacies and associated low levels (1.8%) of soil disturbance where the weed density was fewer than 1.0 plant 10 m−2 in a commercial fallow. These data established the suitability of this mechanical approach to weed control for conservation cropping systems. The development of targeted tillage for fallow weed control represents the introduction of site-specific, nonchemical weed control for conservation cropping systems.
Little is known about who would benefit from Internet-based personalised nutrition (PN) interventions. This study aimed to evaluate the characteristics of participants who achieved greatest improvements (i.e. benefit) in diet, adiposity and biomarkers following an Internet-based PN intervention. Adults (n 1607) from seven European countries were recruited into a 6-month, randomised controlled trial (Food4Me) and randomised to receive conventional dietary advice (control) or PN advice. Information on dietary intake, adiposity, physical activity (PA), blood biomarkers and participant characteristics was collected at baseline and month 6. Benefit from the intervention was defined as ≥5 % change in the primary outcome (Healthy Eating Index) and secondary outcomes (waist circumference and BMI, PA, sedentary time and plasma concentrations of cholesterol, carotenoids and omega-3 index) at month 6. For our primary outcome, benefit from the intervention was greater in older participants, women and participants with lower HEI scores at baseline. Benefit was greater for individuals reporting greater self-efficacy for ‘sticking to healthful foods’ and who ‘felt weird if [they] didn’t eat healthily’. Participants benefited more if they reported wanting to improve their health and well-being. The characteristics of individuals benefiting did not differ by other demographic, health-related, anthropometric or genotypic characteristics. Findings were similar for secondary outcomes. These findings have implications for the design of more effective future PN intervention studies and for tailored nutritional advice in public health and clinical settings.
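The study's benefit criterion for the primary outcome (a change of ≥5 % in the Healthy Eating Index at month 6) can be sketched as a simple threshold rule. The function name and participant values below are invented for illustration:

```python
# "Benefit" as defined in the study: >= 5% improvement in the Healthy
# Eating Index (HEI) between baseline and month 6.
def benefited(hei_baseline, hei_month6, threshold=0.05):
    """True if HEI improved by at least `threshold` (default 5%)."""
    return (hei_month6 - hei_baseline) / hei_baseline >= threshold

# Hypothetical participants: (baseline HEI, month-6 HEI).
participants = [(50.0, 54.0), (60.0, 61.0), (45.0, 47.5)]
flags = [benefited(b, m6) for b, m6 in participants]
print(flags)  # ~8%, ~1.7%, and ~5.6% improvements respectively
```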
Introduction: Alcohol use disorder (AUD) is a chronic relapsing and highly comorbid disease. Patients suffering from AUD are frequently seen in the emergency department (ED) presenting intoxicated or in withdrawal. Brief interactions in the ED are often the only portal of entry to the healthcare system for many of these patients. Oral naltrexone and long-acting injectable naltrexone are effective treatment options for AUD, associated with decreased cravings, shorter length of hospital stay, and lower cost of healthcare utilization. This study's objective was to perform a systematic review of the literature evaluating initiation of naltrexone in the ED. Methods: Electronic searches of Medline, EMBASE, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews and CINAHL were conducted and reference lists were hand-searched. Randomized controlled trials (RCTs) comparing initiation of naltrexone in patients (≥18 years) to standard care in the ED were included. Two reviewers independently screened titles and abstracts, reviewed full text articles for inclusion, assessed quality of the studies, and extracted data. Results: The search strategy yielded 183 potentially relevant citations. After eliminating duplicate citations and studies that did not meet eligibility criteria, 10 articles were retrieved for full text review. There were no published RCTs that examined naltrexone initiation in the ED. There is one ongoing study being conducted in New York, which aims to assess naltrexone initiation in the ED and measure health outcomes and quality of life of study participants, as well as potential healthcare cost savings. Conclusion: The lack of published research in this area demonstrates a significant gap in knowledge. It is clear that well-designed RCTs are needed to evaluate the effectiveness of initiating naltrexone for those with AUD at the ED visit.
Harvest weed seed control (HWSC) is an Australian innovation, developed to target the high proportions of weed seed retained at crop maturity by many major weed species. There is the potential, however, that a reduction in the average height of retained seed is an adaptation to the long-term use of HWSC practices. With the aim of examining the distribution of rigid ryegrass (Lolium rigidum Gaudin) seed through crop canopies, a survey of Australian wheat (Triticum aestivum L.) fields was conducted at crop maturity. Nine sites with medium- to long-term HWSC use were specifically included to examine the influence of HWSC use on seed retention height. During the 2013 wheat harvest, L. rigidum and wheat plant samples were collected at five heights downward through the crop canopy (40, 30, 20, 10, and 0 cm above ground level) in 71 wheat fields. Increased crop competition resulted in higher proportions of L. rigidum seed in the upper crop canopy (>40 cm). The increase in plant height is likely a shade-intolerance response of L. rigidum plants attempting to capture more light. This plant attribute creates the opportunity to use crop competition to improve HWSC efficacy by increasing the average height of seed retention. Crop competition can, therefore, have a double impact by reducing overall L. rigidum seed production and increasing seed retention height. Examining the distribution of wheat biomass and L. rigidum seed through the crop canopy, we determined that reducing harvest height for HWSC considerably increased the collection of L. rigidum seed (25%) but to a lesser extent wheat crop biomass (14%). Comparison of fields with and without HWSC use at nine locations found no evidence of adaptation to this form of weed control following 5 to 10 yr of use. Although the potential for resistance to HWSC remains, these results indicate that this will not readily occur in the field.
Harvest weed seed control (HWSC) techniques have been implemented in Australian cropping systems to target and reduce the number of weed seeds entering the seedbank, and thereby reduce the number of problematic weeds emerging to infest crops in subsequent years. However, the influence of HWSC on mitigating herbicide-resistance (HR) evolution has not been investigated. This research used integrated spatial modeling to examine how the frequency and efficacy of HWSC affected the evolution of resistance to initially effective herbicides. Herbicides were, in all cases, better protected from future resistance evolution when their use was combined with annual HWSC. Outbreaks of multiple HR were very unlikely to occur and were nearly always eliminated by adding annual, efficient HWSC. The efficacy of the HWSC was important, with greater reductions in the number of resistance genes achieved with higher-efficacy HWSC. Annual HWSC was necessary to protect sequences of lower-efficacy herbicides, but HWSC could still protect herbicides if it was used less often than once per year, when the HWSC and the herbicides were highly effective. Our results highlight the potential benefits of combining HWSC with effective herbicides for controlling weed populations and reducing the future evolution of HR.
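The qualitative result — that annual, high-efficacy HWSC slows resistance build-up by cutting seedbank inputs — can be illustrated with a toy, non-spatial seedbank recursion. All rates below are assumed for demonstration only and are far simpler than the study's integrated spatial model:

```python
# Toy seedbank model: each season the herbicide kills a fraction of
# plants, survivors set seed, and HWSC removes a fraction of that
# seed before it enters the seedbank. All parameters are illustrative.
def seedbank_after_years(seedbank, years, herbicide_kill=0.95,
                         seeds_per_survivor=100, hwsc_efficacy=0.0):
    for _ in range(years):
        survivors = seedbank * (1 - herbicide_kill)
        seed_set = survivors * seeds_per_survivor
        seedbank = seed_set * (1 - hwsc_efficacy)   # HWSC intercepts seed
    return seedbank

no_hwsc = seedbank_after_years(1000, years=5)
with_hwsc = seedbank_after_years(1000, years=5, hwsc_efficacy=0.9)
print(no_hwsc, with_hwsc)  # annual HWSC sharply curbs seedbank growth
```

Under these assumed rates the seedbank grows fivefold per year without HWSC but halves per year with 90 %-efficacy HWSC, mirroring the modeled protective effect.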
B. Sicardy, Observatoire de Paris and University Pierre et Marie Curie Paris, FRANCE,
M. El Moutamid, Cornell University Ithaca, New York, USA,
A. C. Quillen, University of Rochester Rochester, New York, USA,
P. M. Schenk, Lunar and Planetary Institute Houston, Texas, USA,
M. R. Showalter, SETI Institute Mountain View, California, USA,
K. Walsh, Southwest Research Institute Boulder, Colorado, USA
Habits are behavioral routines that are automatic and frequent, relatively independent of any desired outcome, and have potent antecedent cues. Among individuals with anorexia nervosa (AN), behaviors that promote the starved state appear habitual, and this is the foundation of a recent neurobiological model of AN. In this proof-of-concept study, we tested the habit model of AN by examining the impact of an intervention focused on antecedent cues for eating disorder routines.
The primary intervention target was habit strength; we also measured clinical impact via eating disorder psychopathology and actual eating. Twenty-two hospitalized patients with AN were randomly assigned to 12 sessions of either Supportive Psychotherapy or a behavioral intervention aimed at cues for maladaptive behavioral routines, Regulating Emotions and Changing Habits (REaCH).
Covarying for baseline, REaCH was associated with a significantly lower Self-Report Habit Index (SRHI) score and a significantly lower Eating Disorder Examination-Questionnaire (EDE-Q) global score at end of treatment. The end-of-treatment effect size for SRHI was d = 1.28, for EDE-Q d = 0.81, and for caloric intake d = 1.16.
REaCH changed habit strength of maladaptive routines more than an active control therapy, and targeting habit strength yielded improvement in clinically meaningful measures. These findings support a habit-based model of AN, and suggest habit strength as a mechanism-based target for intervention.
In Australia, widespread evolution of multi-resistant weed populations has driven the development and adoption of harvest weed seed control (HWSC). However, due to incompatibility of commonly used HWSC systems with highly productive conservation cropping systems, better HWSC systems are in demand. This study aimed to evaluate the efficacy of the integrated Harrington Seed Destructor (iHSD) mill on the seeds of Australia’s major crop weeds during wheat chaff processing. Also examined were the impacts of chaff type and moisture content on weed seed destruction efficacy. Initially, the iHSD mill speed of 3,000 rpm was identified as the most effective at destroying rigid ryegrass seeds present in wheat chaff. Subsequent testing determined that the iHSD mill was highly effective (>95% seed kill) on all Australian crop weeds examined. Rigid ryegrass seed kill was found to be highest for lupin chaff and lowest in barley, with wheat and canola chaff intermediate. Similarly, wheat chaff moisture reduced rigid ryegrass seed kill when moisture level exceeded 12%. The broad potential of the iHSD mill was evident, in that the reductions in efficacy due to wide-ranging differences in chaff type and moisture content were relatively small (≤10%). The results from these studies confirm the high efficacy and widespread suitability of the iHSD for use in Australian crop production systems. Additionally, as this system allows the conservation of all harvest residues, it is the best HWSC technique for conservation cropping systems.
A range of endophenotypes characterise psychosis; however, there has been limited work on understanding whether and how they are interrelated.
This multi-centre study includes 8754 participants: 2212 people with a psychotic disorder, 1487 unaffected relatives of probands, and 5055 healthy controls. We investigated cognition [digit span (N = 3127), block design (N = 5491), and the Rey Auditory Verbal Learning Test (N = 3543)], electrophysiology [P300 amplitude and latency (N = 1102)], and neuroanatomy [lateral ventricular volume (N = 1721)]. We used linear regression to assess the interrelationships between endophenotypes.
The P300 amplitude and latency were not associated (regression coef. −0.06, 95% CI −0.12 to 0.01, p = 0.060), and P300 amplitude was positively associated with block design (coef. 0.19, 95% CI 0.10–0.28, p < 0.001). There was no evidence of associations between lateral ventricular volume and the other measures (all p > 0.38). All the cognitive endophenotypes were associated with each other in the expected directions (all p < 0.001). Lastly, the relationships between pairs of endophenotypes were consistent in all three participant groups, differing for some of the cognitive pairings only in the strengths of the relationships.
The P300 amplitude and latency are independent endophenotypes; the former indexes spatial visualisation and working memory, whereas the latter is hypothesised to index basic processing speed. Individuals with psychotic illnesses, their unaffected relatives, and healthy controls all show similar patterns of associations between endophenotypes, endorsing the theory of a continuum of psychosis liability across the population.
Traditionally, personalised nutrition was delivered at an individual level. However, the concept of delivering tailored dietary advice at a group level through the identification of metabotypes or groups of metabolically similar individuals has emerged. Although this approach to personalised nutrition looks promising, further work is needed to examine this concept across a wider population group. Therefore, the objectives of this study are to: (1) identify metabotypes in a European population and (2) develop targeted dietary advice solutions for these metabotypes. Using data from the Food4Me study (n 1607), k-means cluster analysis revealed the presence of three metabolically distinct clusters based on twenty-seven metabolic markers including cholesterol, individual fatty acids and carotenoids. Cluster 2 was identified as a metabolically healthy metabotype as these individuals had the highest Omega-3 Index (6·56 (sd 1·29) %), carotenoids (2·15 (sd 0·71) µm) and lowest total saturated fat levels. On the basis of its fatty acid profile, cluster 1 was characterised as a metabolically unhealthy cluster. Targeted dietary advice solutions were developed per cluster using a decision tree approach. Testing of the approach was performed by comparison with the personalised dietary advice, delivered by nutritionists to Food4Me study participants (n 180). Excellent agreement was observed between the targeted and individualised approaches with an average match of 82 % at the level of delivery of the same dietary message. Future work should ascertain whether this proposed method could be utilised in a healthcare setting, for the rapid and efficient delivery of tailored dietary advice solutions.
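The metabotyping step rests on k-means clustering. A minimal sketch follows, with two invented markers and k = 2 standing in for the study's twenty-seven metabolic markers and three clusters:

```python
# Minimal k-means for grouping individuals by metabolic markers.
def kmeans(points, k, iters=10):
    # Deterministic init: spread initial centers across the input order.
    centers = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        # Recompute centers; keep the old center if a cluster empties.
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl
                   else centers[i] for i, cl in enumerate(clusters)]
    return centers, clusters

# Hypothetical (omega-3 index, saturated-fat) pairs: two loose groups.
points = [(6.5, 0.2), (6.7, 0.25), (6.4, 0.22),
          (3.1, 0.5), (3.3, 0.55), (2.9, 0.52)]
centers, clusters = kmeans(points, k=2)
print(sorted(len(c) for c in clusters))  # two metabotypes of three each
```

In practice the markers would be standardised first, as differing scales otherwise dominate the squared-distance computation.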
The Recorrido Arqueológico de Coixtlahuaca (RAC) presents period-by-period settlement pattern maps for the valley of Coixtlahuaca in the northern Mixteca Alta. The RAC project made improvements in full-coverage survey methods. We identify limitations and suggest that similar projects in the future need to resolve several management and budget problems. The survey revealed two periods of heavy occupation, 700–300 BC and AD 1200–1520, separated by a long period of lower population. Archaeological and historical data indicate that during the AD 1200–1520 period, and probably earlier, small landholders organized in strong communities managed an intensive agroecosystem, investing in landesque capital. Urbanization was impressive, yet cities were aggregations of communities and barrios. Today local citizens pose questions about how the large prehispanic population could have organized and sustained itself; these questions coincide with anthropological interest in collective agency, property, landesque capital, and collapse.
Harvest weed seed control (HWSC) systems have been developed to exploit the high proportions of seed retained at maturity by the annual weeds rigid ryegrass, wild radish, bromegrass, and wild oats. To evaluate the efficacy of HWSC systems on rigid ryegrass populations, three systems, the Harrington Seed Destructor (HSD), chaff carts, and narrow-windrow burning, were compared at 24 sites across the western and southern wheat production regions of Australia. HWSC treatments were established at harvest (Nov.–Dec.) in wheat crops with low to moderate rigid ryegrass densities (1 to 26 plants m−2). Rigid ryegrass counts at the commencement of the next growing season (Apr.–May) determined that HWSC treatments were similarly effective in reducing emergence. Chaff carts, narrow-windrow burning, or HSD systems act similarly on rigid ryegrass seed collected during harvest to deliver substantial reductions in subsequent rigid ryegrass populations by restricting seedbank inputs. On average, population densities were reduced by 60%, but there was considerable variation between sites (37 to 90%) as influenced by seed production and the residual seedbank. Given the observed high rigid ryegrass seed production levels at crop maturity, it is clear that HWSC has a vital role in preventing seedbank inputs in Australian conservation cropping systems.
Open-water swimming is increasingly popular, often in water not considered safe for bathing. Limited evidence exists on the associated health risks. We investigated gastrointestinal illness in 1100 swimmers in a River Thames event in London, UK, to describe the outbreak and identify risk factors. We conducted a retrospective cohort study. Our case definition was swimmers with any of: diarrhoea, vomiting, abdominal cramps lasting ≥48 h, or nausea lasting ≥48 h, with onset within 9 days after the event. We used an online survey to collect information on symptoms, demographics, pre- and post-swim behaviours and open-water experience. We tested associations using robust Poisson regression. We followed up case microbiological results. Survey response was 61%, and the attack rate was 53% (338 cases). Median incubation period was 34 h and median symptom duration 4 days. Five cases had confirmed microbiological diagnoses (four Giardia, one Cryptosporidium). Wearing a wetsuit [adjusted relative risk (aRR) 6·96, 95% confidence interval (CI) 1·04–46·72] and swallowing water (aRR 1·42, 95% CI 1·03–1·97) were risk factors. Recent river-swimming (aRR 0·78, 95% CI 0·67–0·92) and age >40 years (aRR 0·83, 95% CI 0·70–0·98) were protective. Action to reduce risk of illness in future events is recommended, including clarification of oversight arrangements for future swims to ensure appropriate risk assessment and advice are provided.
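The adjusted relative risks in the study come from robust Poisson regression; the unadjusted version of the same quantity is simply a ratio of attack rates in exposed and unexposed swimmers. A minimal sketch, with an invented 2 × 2 table (not the study's data):

```python
# Unadjusted relative risk for a cohort study: ratio of the attack
# rate among the exposed to that among the unexposed.
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    risk_exposed = cases_exposed / n_exposed
    risk_unexposed = cases_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical: 300 of 500 wetsuit wearers ill vs. 38 of 170 others.
rr = relative_risk(300, 500, 38, 170)
print(round(rr, 2))  # 2.68
```

Regression-based estimates (as used in the paper) additionally adjust for confounders such as age and swimming experience, which a raw ratio cannot do.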
It is postulated that knowledge of genotype may be more powerful than other types of personalised information in terms of motivating behaviour change. However, there is also a danger that disclosure of genetic risk may promote a fatalistic attitude and demotivate individuals. The original concept of personalised nutrition (PN) focused on genotype-based tailored dietary advice; however, PN can also be delivered based on assessment of dietary intake and phenotypic measures. Whilst dietitians currently provide PN advice based on diet and phenotype, genotype-based PN advice is not so readily available. The aim of this review is to examine the evidence for genotype-based personalised information on motivating behaviour change, and factors which may affect the impact of genotype-based personalised advice. Recent findings in PN will also be discussed, with respect to a large European study, Food4Me, which investigated the impact of varying levels of PN advice on motivating behaviour change. The researchers reported that PN advice resulted in greater dietary changes compared with general healthy eating advice, but no additional benefit was observed for PN advice based on phenotype and genotype information. Within Food4Me, work from our group revealed that knowledge of MTHFR genotype did not significantly improve intakes of dietary folate. In general, evidence is weak with regard to genotype-based PN advice. For future work, studies should test the impact of PN advice developed on a strong nutrigenetic evidence base, ensure an appropriate study design for the research question asked, and incorporate behaviour change techniques into the intervention.
Individual response to dietary interventions can be highly variable. The phenotypic characteristics of those who will respond positively to personalised dietary advice are largely unknown. The objective of this study was to compare the phenotypic profiles of differential responders to personalised dietary intervention, with a focus on total circulating cholesterol. Subjects from the Food4Me multi-centre study were classified as responders or non-responders to dietary advice on the basis of the change in cholesterol level from baseline to month 6, with lower and upper quartiles defined as responder and non-responder groups, respectively. There were no significant differences between demographic and anthropometric profiles of the groups. Furthermore, with the exception of alcohol, there was no significant difference in reported dietary intake, at baseline. However, there were marked differences in baseline fatty acid profiles. The responder group had significantly higher levels of stearic acid (18 : 0, P=0·034) and lower levels of palmitic acid (16 : 0, P=0·009). Total MUFA (P=0·016) and total PUFA (P=0·008) also differed between the groups. In a step-wise logistic regression model, age, baseline total cholesterol, glucose, five fatty acids and alcohol intakes were selected as factors that successfully discriminated responders from non-responders, with sensitivity of 82 % and specificity of 83 %. The successful delivery of personalised dietary advice may depend on our ability to identify phenotypes that are responsive. The results demonstrate the potential use of metabolic profiles in identifying response to an intervention and could play an important role in the development of precision nutrition.
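The reported classifier performance (sensitivity 82 %, specificity 83 %) follows from a standard confusion-matrix calculation. A minimal sketch with invented responder labels:

```python
# Sensitivity = true positives / all actual positives;
# specificity = true negatives / all actual negatives.
def sensitivity_specificity(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical labels: 1 = responder, 0 = non-responder.
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
y_pred = [1, 1, 1, 0, 0, 0, 0, 1, 0, 1]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(sens, spec)  # 0.8 0.8
```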