Australian conservation cropping systems are practiced on very large farms (~3000 ha), where herbicides are relied on for effective and timely weed control. In many fields, though, weed densities are low (e.g. <1.0 plant 10 m-2) and whole-field herbicide treatments are wasteful. For fallow weed control, commercially available weed detection systems provide the opportunity for site-specific herbicide treatments, removing the need for whole-field treatment of fallow fields with low weed densities. Concerns about the sustainability of herbicide-reliant weed management systems remain, and there is now interest in the use of weed detection systems with alternative weed control technologies, such as targeted tillage. This paper presents the use of a targeted tillage technique for site-specific weed control in large-scale crop production systems. Three small-scale prototypes were used for engineering and weed control efficacy testing across a range of species and growth stages. With confidence established in the design approach and a demonstrated 100% weed-control potential, a 6 m wide pre-commercial prototype, the “Weed Chipper”, was built incorporating commercially available weed detection cameras for practical field-scale evaluation. This testing confirmed very high (90%) weed control efficacies and associated low levels (1.8%) of soil disturbance where the weed density was <1.0 plant 10 m-2 in a commercial fallow. These data established the suitability of this mechanical approach to weed control for conservation cropping systems. The development of targeted tillage for fallow weed control represents the introduction of site-specific, non-chemical weed control for conservation cropping systems.
Although apps are increasingly being used to support the diagnosis, treatment and management of mental illness, there is no single means through which costs associated with mental health apps are reimbursed. Furthermore, different apps are amenable to different means of reimbursement, as not all apps generate value in the same way.
To provide insights into how apps are currently generating value and being reimbursed across the world, with a particular focus on the situation in the USA.
An international team performed secondary research on how apps are being used and on common pathways to remuneration.
The uses of apps today and in the future are reviewed, the nature of the value delivered by apps is summarised and an overview of app reimbursement in the USA and other countries is provided. Recommendations regarding how payments might be made for apps in the future are discussed.
Currently, apps are being reimbursed through channels originally designed for other purposes. There may be a need to develop an app-specific reimbursement channel analogous to the channels used for devices, drugs and laboratory tests.
Studies suggest that alcohol consumption and alcohol use disorders have distinct genetic backgrounds.
We examined whether polygenic risk scores (PRS) for consumption and problem subscales of the Alcohol Use Disorders Identification Test (AUDIT-C, AUDIT-P) in the UK Biobank (UKB; N = 121 630) correlate with alcohol outcomes in four independent samples: an ascertained cohort, the Collaborative Study on the Genetics of Alcoholism (COGA; N = 6850), and population-based cohorts: Avon Longitudinal Study of Parents and Children (ALSPAC; N = 5911), Generation Scotland (GS; N = 17 461), and an independent subset of UKB (N = 245 947). Regression models and survival analyses tested whether the PRS were associated with the alcohol-related outcomes.
In COGA, AUDIT-P PRS was associated with alcohol dependence, AUD symptom count, maximum drinks (R2 = 0.47–0.68%, p = 2.0 × 10−8–1.0 × 10−10), and increased likelihood of onset of alcohol dependence (hazard ratio = 1.15, p = 4.7 × 10−8); AUDIT-C PRS was not an independent predictor of any phenotype. In ALSPAC, the AUDIT-C PRS was associated with alcohol dependence (R2 = 0.96%, p = 4.8 × 10−6). In GS, AUDIT-C PRS was a better predictor of weekly alcohol use (R2 = 0.27%, p = 5.5 × 10−11), while AUDIT-P PRS was more associated with problem drinking (R2 = 0.40%, p = 9.0 × 10−7). Lastly, AUDIT-P PRS was associated with ICD-based alcohol-related disorders in the UKB subset (R2 = 0.18%, p < 2.0 × 10−16).
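A minimal sketch of this kind of PRS association and onset analysis, assuming an analysis-ready data frame in which each individual has a PRS, covariates and alcohol outcomes (all file names, column names and the covariate set are hypothetical illustrations, not the study's exact pipeline):

```python
# Hedged sketch: PRS association testing and onset analysis (hypothetical column names).
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

df = pd.read_csv("cohort_with_prs.csv")  # assumed merged phenotype + PRS file

# Linear regression of a quantitative phenotype (e.g. AUD symptom count) on the PRS,
# adjusting for age, sex, and ancestry principal components.
full = smf.ols("symptom_count ~ audit_p_prs + age + sex + pc1 + pc2 + pc3", data=df).fit()

# Incremental R^2 attributable to the PRS: compare against a covariates-only model.
base = smf.ols("symptom_count ~ age + sex + pc1 + pc2 + pc3", data=df).fit()
print("PRS incremental R2:", full.rsquared - base.rsquared)

# Cox proportional hazards model for age of onset of alcohol dependence
# (hazard ratio per unit of standardised PRS).
cph = CoxPHFitter()
cph.fit(df[["onset_age", "dependence_event", "audit_p_prs", "sex"]],
        duration_col="onset_age", event_col="dependence_event")
cph.print_summary()
```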
AUDIT-P PRS was associated with a range of alcohol-related phenotypes across population-based and ascertained cohorts, while AUDIT-C PRS showed less utility in the ascertained cohort. We show that AUDIT-P is genetically correlated with both use and misuse and demonstrate the influence of ascertainment schemes on PRS analyses.
Questions of consciousness pervade the social sciences. Yet, despite persistent tendencies to anthropomorphize states, most International Relations scholarship implicitly adopts the position that humans are conscious and states are not. Recognizing that scholarly disagreement over fundamental issues prevents answering definitively whether states are truly conscious, I instead demonstrate how scholars of multiple dispositions can incorporate a pragmatic notion of state consciousness into their theorizing. Drawing on recent work from Eric Schwitzgebel and original supplementary arguments, I demonstrate that states are not only complex informationally integrated systems with emergent properties, but they also exhibit seemingly genuine responses to qualia that are irreducible to individuals within them. Though knowing whether states possess an emergent ‘stream’ of consciousness indiscernible to their inhabitants may not yet be possible, I argue that a pragmatic notion of state consciousness can contribute to a more complete understanding of state personhood, as well as a revised model of the international system useful to multiple important theoretical debates. In the article's final section, I apply this model to the debate over the levels of analysis at which scholarship applies ontological security theory. I suggest the possibility of emergent state-level ontological insecurity that need not be understood via problematic reduction to individuals.
Cadmium telluride (CdTe) is one of the leading photovoltaic technologies, with a market share of around 5%. However, challenges remain in fabricating a rear contact for efficient transport of photogenerated holes. Here, the etching effects of various iodine compounds, including elemental iodine (I2), ammonium iodide (NH4I), a mixture of elemental iodine and NH4I (I−/I3− etching), and formamidinium iodide, were investigated. The treated CdTe surfaces were characterized using Raman spectroscopy, X-ray diffraction (XRD), scanning electron microscopy, and energy-dispersive X-ray spectroscopy. The CdTe devices were completed with or without treatments and tested under the simulated AM1.5G solar spectrum to determine photoconversion efficiency (PCE). Raman spectra, XRD patterns, and surface morphology showed that treatment with iodine compounds produced a Te-rich surface on the CdTe films, and temperature-dependent current–voltage characteristics showed reduced back-barrier heights, which are essential for forming an ohmic contact and reducing contact resistance. Based on current–voltage characteristics, the treatment enhanced open-circuit voltage (VOC) up to 841 mV, fill factor (FF) up to 78.2%, and PCE up to 14.0% compared with standard untreated CdTe devices (VOC ∼ 814 mV, FF ∼ 74%, and PCE ∼ 12.7%) with copper/gold back contact.
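Back-barrier heights of the kind reported above are commonly extracted from temperature-dependent current–voltage data via the thermionic-emission relation J0 = A*T2 exp(−qφB/kT), so that ln(J0/T2) plotted against 1/T is linear with slope −φB/kB (φB in eV). A minimal sketch of that extraction, using illustrative saturation-current values rather than data from this study:

```python
# Hedged sketch: effective back-barrier height from temperature-dependent saturation
# currents, using J0 = A* T^2 exp(-phi_B / (k_B * T)) with phi_B in eV.
# The J0 values below are illustrative placeholders, not measurements from the study.
import numpy as np

k_B = 8.617e-5                                    # Boltzmann constant, eV/K
T = np.array([280.0, 300.0, 320.0, 340.0])        # measurement temperatures (K)
J0 = np.array([2.0e-7, 6.9e-7, 2.1e-6, 5.5e-6])   # saturation current densities (A/cm^2)

# ln(J0 / T^2) = ln(A*) - phi_B / (k_B * T): slope of ln(J0/T^2) vs 1/T gives -phi_B/k_B.
y = np.log(J0 / T**2)
x = 1.0 / T
slope, intercept = np.polyfit(x, y, 1)
phi_B = -slope * k_B
print(f"Effective back-barrier height: {phi_B:.2f} eV")
```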
This chapter provides an overview of the concerns centered on neurotoxicity from anesthetics in children following the FDA warnings of 2016 and 2017. The authors provide background on both the clinical and basic science evidence on the effects of commonly used anesthetics on neural development. After reviewing the landmark studies in both animals and humans, the chapter provides suggestions for reducing exposure as well as for addressing parental concerns.
The prevalence and impact of motor coordination difficulties in children with copy number variants associated with neurodevelopmental disorders (ND-CNVs) remain unknown. This study aimed to advance understanding of motor coordination difficulties in children with ND-CNVs and to establish their relationship with intelligence quotient (IQ) and psychopathology.
169 children with an ND-CNV (67% male, median age = 8.88 years, range 6.02–14.81) and 72 closest-in-age unaffected siblings (controls; 55% male, median age = 10.41 years, s.d. = 3.04, range 5.89–14.75) were assessed with the Developmental Coordination Disorder Questionnaire, alongside psychiatric interviews and standardised assessments of IQ.
The children with ND-CNVs had poorer coordination ability (b = 28.98, p < 0.001) and 91% of children with an ND-CNV screened positive for suspected developmental coordination disorder, compared to 19% of controls (OR = 42.53, p < 0.001). There was no difference in coordination ability between ND-CNV genotypes (F = 1.47, p = 0.184). Poorer coordination in children with ND-CNVs was associated with more attention deficit hyperactivity disorder (ADHD) (β = −0.18, p = 0.021) and autism spectrum disorder trait (β = −0.46, p < 0.001) symptoms, along with lower full-scale (β = 0.21, p = 0.011), performance (β = −0.20, p = 0.015) and verbal IQ (β = 0.17, p = 0.036). Mediation analysis indicated that coordination ability was a full mediator of anxiety symptoms (69% mediated, p = 0.012), and a partial mediator of ADHD (51%, p = 0.001) and autism spectrum disorder trait symptoms (66%, p < 0.001) as well as full-scale IQ (40%, p = 0.002), performance IQ (40%, p = 0.005) and verbal IQ (38%, p = 0.006) scores.
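A minimal sketch of a product-of-coefficients mediation analysis of the kind summarised above, assuming ND-CNV carrier status as the exposure, coordination score as the mediator and a symptom measure as the outcome (file and column names are hypothetical; in practice the indirect effect would be tested with bootstrapped confidence intervals):

```python
# Hedged sketch: simple product-of-coefficients mediation (hypothetical column names).
# Exposure: ND-CNV carrier status; mediator: coordination score; outcome: symptom count.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ndcnv_coordination.csv")  # assumed input data

# Path a: exposure -> mediator
a = smf.ols("coordination ~ cnv_carrier + age + sex", data=df).fit().params["cnv_carrier"]

# Paths b and c': mediator -> outcome, controlling for exposure
out = smf.ols("anxiety_symptoms ~ cnv_carrier + coordination + age + sex", data=df).fit()
b = out.params["coordination"]
c_prime = out.params["cnv_carrier"]          # direct effect

# Total effect c and proportion mediated (a*b / c)
c = smf.ols("anxiety_symptoms ~ cnv_carrier + age + sex", data=df).fit().params["cnv_carrier"]
indirect = a * b
print(f"indirect = {indirect:.3f}, direct = {c_prime:.3f}, "
      f"proportion mediated = {indirect / c:.1%}")
```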
The findings indicate that poor motor coordination is highly prevalent and closely linked to risk of mental health disorder and lower intellectual function in children with ND-CNVs. Future research should explore whether early interventions for poor coordination ability could ameliorate neurodevelopmental risk.
Harvest weed seed control (HWSC) technology, such as impact mills that destroy weed seeds in seed-bearing chaff material during grain crop harvest, has been highly effective in Australian cropping systems. However, the impact mill has never been tested in soybeans [Glycine max (L.) Merr.] and weeds common to soybean production systems in the midwestern and mid-Atlantic United States. We conducted stationary testing of the Harrington Seed Destructor (HSD) impact mill and winter burial studies during 2015 to 2016 and 2017 to 2018 to determine (1) the efficacy of the impact mill in destroying weed seeds of seven common weed species in the midwestern and five in the mid-Atlantic United States, and (2) the fate of impact mill–processed weed seeds after winter burial. The impact mill was highly effective in destroying seeds of all the species tested, with 93.5% to 99.8% weed seed destruction in 2015 and 85.6% to 100% in 2017. The weak relationships (positive or negative) between seed size and seed destruction by the impact mill and the high percentage of weed seed destruction by the impact mill across all seed sizes indicate that the biological or practical effect of seed size is limited. The impact mill–processed weed seeds that retained at least 50% of their original size, labeled as potentially viable seed (PVS), were buried for 90 d overwinter to determine the fate of weed seeds after winter burial. At 90 d after burial, the impact mill–processed PVS were significantly less viable than unprocessed control seeds, indicating that impact mill processing physically damaged the PVS and promoted seed mortality overwinter. A very small fraction (<0.4%) of the total weed seed processed by the impact mill remained viable after winter burial. The results presented here demonstrate that the impact mill is highly effective in increasing seed mortality and could potentially be used as an HWSC tactic for weed management in this region.
Retrospective voting is vital for democracy. But are the objective performance metrics widely thought to be relevant for retrospection—such as the performance of the economy, criminal justice system, and schools, to name a few—valid criteria for evaluating government performance? That is, do political coalitions actually have the power to influence the performance metrics used for retrospection on the timeline introduced by elections? Using difference-in-differences and regression discontinuity techniques, we find that US states governed by Democrats and those governed by Republicans perform equally well on economic, education, crime, family, social, environmental, and health outcomes on the timeline introduced by elections (2–4 years downstream). Our results suggest that voters may struggle to truly hold government coalitions accountable, as objective performance metrics appear to be largely out of the immediate control of political coalitions.
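A minimal sketch of the difference-in-differences component, assuming a state-year panel with an outcome measure and an indicator for Democratic control (file and variable names are hypothetical); the estimate of interest is the coefficient on party control with state and year fixed effects and state-clustered standard errors:

```python
# Hedged sketch: two-way fixed-effects difference-in-differences (hypothetical columns).
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("state_year_outcomes.csv")  # assumed state-year panel

# 'dem_control' = 1 in years a Democratic coalition holds office, 0 otherwise;
# state and year fixed effects absorb time-invariant state traits and common shocks.
did = smf.ols("outcome ~ dem_control + C(state) + C(year)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["state"]})
print("DiD estimate:", did.params["dem_control"], "SE:", did.bse["dem_control"])
```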
Non-tuberculous mycobacterium encephalitis is rare. Since 2013, a global outbreak of Mycobacterium chimaera infection has been attributed to point-source contamination of heater–cooler units used in cardiac surgery. Disseminated M. chimaera infection has presented many unique challenges, including non-specific clinical presentations with delays in diagnosis, and a high mortality rate among predominantly immunocompetent adults. Here, we describe three patients with fatal disseminated Mycobacterium chimaera infection who showed initially non-specific, progressively worsening neurocognitive decline, including confusion, delirium, depression and apathy. Autopsy revealed widespread granulomatous encephalitis of the cerebrum, brain stem and spinal cord, along with granulomatous chorioretinitis. Cerebral involvement, and the differentiation between mycobacterial granulomas and microangiopathic changes, are best assessed on contrast-enhanced MRI. The prognosis of M. chimaera encephalitis appears to be very poor, but might be improved by increased awareness of this new syndrome and timely antimicrobial treatment.
This presentation will enable the learner to:
1. Describe the clinical, radiological and neuropathological findings of Mycobacterium chimaera encephalitis
2. Be aware of this rare form of encephalitis, and explain its diagnosis, prognosis and management
The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway, plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and more recent anthropogenic climate change. Inter- and intra-specific interactions along sharp local environmental gradients shape distributions and community structure and hence ecosystem functioning. Shifts from domination by fucoids in shelter to barnacles/mussels in exposure are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or becoming restricted to estuarine refuges, owing to greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, hence affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
The contribution of milk and dairy products to daily iodine intake is high but variable in many industrialised countries. Factors that affect iodine concentrations in milk and dairy products are only poorly understood. Our aims were to: (1) assess the effect of feed iodine concentration on milk iodine by supplementing five groups of five cows each with one of five dosages ranging from 0 to 2 mg iodine/kg DM; (2) quantify iodine losses during manufacturing of cheese and yogurt from milk with varying iodine concentrations and assess the effect of cellar-ripening; and (3) systematically measure iodine partitioning during heat treatment and skimming of milk. Milk iodine reached a near-steady state after 3 weeks of feeding. Median milk iodine (17–302 μg/l for 0–2 mg iodine/kg DM) increased linearly with feed iodine (R2 0·96; P < 0·001). At curd separation, 75–84 % of iodine was lost in whey. Dairy iodine increased linearly with milk iodine (semi-hard cheese: R2 0·95; P < 0·001; fresh cheese and yogurt: R2 1·00; P < 0·001), and cellar-ripening had no effect. Heat treatment had no significant effect, whereas skimming increased (P < 0·001) milk iodine concentration by only 1–2 μg/l. Mean daily intake of dairy products by Swiss adults is estimated at 213 g, which would contribute 13–52 % of the adults’ RDA for iodine if cow feed is supplemented with 0·5–2 mg iodine/kg DM. Thus, modulation of feed iodine levels can help achieve desirable iodine concentrations in milk and dairy products, and thereby optimise their contribution to human iodine nutrition to avoid both deficiency and excess.
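To make the intake arithmetic explicit, the dairy contribution to the iodine RDA is simply daily dairy intake × iodine concentration ÷ RDA. A worked example, assuming the adult RDA of 150 µg/d; the dairy iodine concentrations below are illustrative values back-calculated to reproduce the 13–52 % range, not figures reported in the study:

```python
# Hedged worked example: dairy contribution to the adult iodine RDA (150 ug/day).
# The concentrations below are illustrative, not the study's measured dairy values.
DAILY_DAIRY_G = 213          # mean daily dairy intake of Swiss adults (g/day), from the study
RDA_UG = 150                 # adult iodine RDA (ug/day), assumed reference value

for label, conc_ug_per_100g in [("low feed supplementation (0.5 mg/kg DM)", 9),
                                ("high feed supplementation (2 mg/kg DM)", 37)]:
    intake_ug = DAILY_DAIRY_G / 100 * conc_ug_per_100g
    print(f"{label}: {intake_ug:.0f} ug/day = {intake_ug / RDA_UG:.0%} of the RDA")
```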
Item 9 of the Patient Health Questionnaire-9 (PHQ-9) asks about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits item 9, is thus increasingly used in research. We assessed the equivalency of the PHQ-8 and PHQ-9 with respect to total score correlations and diagnostic accuracy for detecting major depression.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
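As an illustration of the cutoff comparison, a minimal sketch computing sensitivity and specificity of PHQ-8 and PHQ-9 total scores at the standard cutoff of 10 against a reference-standard diagnosis (the file and column names are hypothetical):

```python
# Hedged sketch: sensitivity/specificity of PHQ-8 vs PHQ-9 at cutoff >= 10
# (hypothetical item columns phq1..phq9 and a reference diagnosis column).
import pandas as pd

df = pd.read_csv("phq_items_with_diagnosis.csv")  # assumed item-level data
items = [f"phq{i}" for i in range(1, 10)]
df["phq9_total"] = df[items].sum(axis=1)
df["phq8_total"] = df[items[:8]].sum(axis=1)      # PHQ-8 simply omits item 9

def sens_spec(score, diagnosis, cutoff=10):
    positive = score >= cutoff
    sens = (positive & diagnosis).sum() / diagnosis.sum()
    spec = (~positive & ~diagnosis).sum() / (~diagnosis).sum()
    return sens, spec

dx = df["major_depression"].astype(bool)
for col in ("phq9_total", "phq8_total"):
    s, sp = sens_spec(df[col], dx)
    print(f"{col}: sensitivity={s:.2f}, specificity={sp:.2f}")
```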
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
Elevated left ventricular end diastolic pressure is a risk factor for ventricular arrhythmias in patients with tetralogy of Fallot. The objective of this retrospective study was to identify echocardiographic measures associated with left ventricular end diastolic pressure >12 mmHg in this population. Patients with repaired tetralogy of Fallot aged ≥13 years who underwent a left heart catheterisation within 7 days of an echocardiogram were evaluated. Univariate comparisons of echocardiographic and clinical variables were made between patients with left ventricular end diastolic pressure >12 versus ≤12 mmHg. Ninety-four patients (54% male) with a median age of 24.6 years were included. Thirty-four (36%) had left ventricular end diastolic pressure >12 mmHg. Patients with left ventricular end diastolic pressure >12 mmHg were older (median 32.9 versus 24.0 years, p = 0.02), more likely to have a history of an aortopulmonary shunt (62% versus 38%, p = 0.03), and more likely to have a diagnosis of hypertension (24% versus 7%, p = 0.03) compared to those with left ventricular end diastolic pressure ≤12 mmHg. There were no significant differences in mitral valve E/A ratio, annular e’ velocity, or E/e’ ratio between patients with left ventricular end diastolic pressure >12 versus ≤12 mmHg. Patients with left ventricular end diastolic pressure >12 mmHg had a larger left atrial area (mean 17.7 versus 14.0 cm2, p = 0.03) and a larger left atrial anterior–posterior diameter (mean 36.0 versus 30.6 mm, p = 0.004). In conclusion, typical echocardiographic measures of left ventricular diastolic dysfunction may not be reliable in tetralogy of Fallot patients. Prospective studies with the use of novel echocardiographic measures are needed.
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness, including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potentially lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13), suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
Quality Improvement and Patient Safety (QIPS) plays an important role in addressing shortcomings in optimal healthcare delivery. However, there is little published guidance available for emergency department (ED) teams with respect to developing their own QIPS programs. We sought to create recommendations for established and aspiring ED leaders to use as a pathway to better patient care through programmatic QIPS activities, starting internally and working towards interdepartmental collaboration.
An expert panel comprising ten ED clinicians with QIPS and leadership expertise was established. A scoping review was conducted to identify published literature on establishing QIPS programs and frameworks in healthcare. Stakeholder consultations were conducted among Canadian healthcare leaders, and recommendations were drafted by the expert panel based on all the accumulated information. These were reviewed and refined at the 2018 CAEP Academic Symposium in Calgary using in-person and technologically supported feedback.
Recommendations include: creating a sense of urgency for improvement; engaging relevant stakeholders and leaders; creating a formal local QIPS Committee; securing funding and resources; obtaining local data to guide the work; supporting QIPS training for team members; encouraging interprofessional, cross-departmental, and patient collaborations; using an established QIPS framework to guide the work; developing reward mechanisms and incentive structures; and considering starting small by focusing on a project rather than a program.
A list of 10 recommendations is presented as guiding principles for the establishment and sustainable deployment of QIPS activities in EDs throughout Canada and abroad. ED leaders are encouraged to implement our recommendations in an effort to improve patient care.
Micronutrient supplementation is recommended in Ebola virus disease (EVD). However, there are limited data on its therapeutic impact. This study evaluated the association between vitamin A supplementation and mortality outcomes in EVD patients.
This retrospective cohort study accrued patients with EVD admitted to five International Medical Corps-run Ebola Treatment Units (ETUs) in two countries from 2014 to 2015. Protocolized treatments with antimicrobials and micronutrients were used at all ETUs. However, due to resource limitations and variations in care, only a subset of patients received vitamin A. Standardized data on demographics, clinical characteristics, malaria status, and Ebola virus RT-PCR cycle threshold (CT) values were collected. The outcome of interest was mortality, compared between cases treated with 200,000 International Units of vitamin A on care days one and two and those not treated. Propensity scores (PS) based on the first 48 hours of care were derived using the covariates of age, duration of ETU function, malaria status, CT values, and symptoms of confusion, hemorrhage, diarrhea, dysphagia, and dyspnea. Treated and non-treated cases were matched 1:1 based on nearest neighbors with replacement. Covariate balance met predefined thresholds. Mortality proportions between cases treated and untreated with vitamin A were compared using generalized estimating equations to calculate relative risks (RR) with associated 95% confidence intervals (CI).
A total of 424 cases were analyzed, of which 330 (77.8%) were treated with vitamin A. The mean age was 30.5 years and 57.0% were female. The most common symptoms were diarrhea (86%), anorexia (81%), and vomiting (77%). Mortality proportions among cases untreated and treated with vitamin A were 71.9% and 55.0%, respectively. In a propensity-matched analysis, mortality was significantly lower among cases receiving vitamin A (RR = 0.77; 95% CI: 0.59–0.99; p = 0.041).
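A minimal sketch of the analysis pipeline described above: logistic-regression propensity scores, 1:1 nearest-neighbour matching with replacement, and a Poisson GEE with pair-level clustering to estimate the relative risk. File and column names are hypothetical, and this is an illustration of the general approach rather than the study's exact code:

```python
# Hedged sketch: propensity-score matching (1:1, nearest neighbour, with replacement)
# and a GEE-based relative risk on the matched pairs. Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("evd_cases.csv")  # assumed patient-level data
covars = ["age", "etu_days_open", "malaria_positive", "ct_value",
          "confusion", "hemorrhage", "diarrhea", "dysphagia", "dyspnea"]

# Propensity score: probability of receiving vitamin A given first-48-hour covariates.
ps_model = smf.logit("vitamin_a ~ " + " + ".join(covars), data=df).fit(disp=0)
df["ps"] = ps_model.predict(df)

treated = df[df["vitamin_a"] == 1].reset_index(drop=True)
control = df[df["vitamin_a"] == 0].reset_index(drop=True)

# 1:1 nearest-neighbour matching with replacement on the propensity score.
matched_rows = []
for pair_id, (_, t) in enumerate(treated.iterrows()):
    j = (control["ps"] - t["ps"]).abs().idxmin()
    for row in (t, control.loc[j]):
        r = row.copy()
        r["pair_id"] = pair_id
        matched_rows.append(r)
matched = pd.DataFrame(matched_rows)

# Relative risk of death via Poisson GEE with robust SEs, clustering on matched pair.
gee = smf.gee("died ~ vitamin_a", groups="pair_id", data=matched,
              family=sm.families.Poisson(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print("RR =", np.exp(gee.params["vitamin_a"]))
```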
Early vitamin A supplementation was associated with reduced mortality in EVD patients and should be provided routinely during future epidemics.