Most oviposition by Helicoverpa zea (Boddie) occurs near the top of the canopy in soybean, Glycine max (L.) Merr., and larval abundance is influenced by the growth habit of plants. However, the vertical distribution of larvae within the canopy is not as well known. We evaluated the vertical distribution of H. zea larvae in determinate and indeterminate varieties, hypothesizing that larval distribution in the canopy would vary between these two growth habits and over time. We tested this hypothesis in a naturally infested replicated field experiment and two experimentally manipulated cage experiments. In the field experiment, flowering time was synchronized between the varieties by manipulating planting date, while infestation timing was manipulated in the cage experiments. Larvae were recovered using destructive sampling of individual soybean plants, and their vertical distribution by instar was recorded from three sampling points over time in each experiment. While larval population growth and development varied between the determinate and indeterminate varieties within and among experiments, we found little evidence that larvae have a preference for different vertical locations in the canopy. This study lends support to the hypothesis that larval movement and location within soybean canopies do not result entirely from oviposition location and nutritional requirements.
Little is known about the determinants of community integration (i.e. recovery) for individuals with a history of homelessness, yet such information is essential to develop targeted interventions.
We recruited homeless Veterans with a history of psychotic disorders and evaluated four domains of correlates of community integration: perception, non-social cognition, social cognition, and motivation. Assessments occurred at baseline (after participants were engaged in supported housing services but before they received housing) and again after 12 months. Ninety-five homeless Veterans with a history of psychosis were assessed at baseline and 53 returned after 12 months. We examined both cross-sectional and longitudinal relationships with 12-month community integration.
The strongest longitudinal association was between a baseline motivational measure and social integration at 12 months. We also observed cross-sectional associations at baseline between motivational measures and community integration, including social, work, and independent living. Cross-lagged panel analyses did not suggest causal associations for the motivational measures. Correlations with perception and non-social cognition were weak. One social cognition measure showed a significant longitudinal correlation with independent living at 12 months that remained significant in the cross-lagged analysis, consistent with a causal relationship and a potential treatment target.
The relatively selective associations for motivational measures differ from what is typically seen in psychosis, in which all domains are associated with community integration. These findings are presented alongside a partner paper (Study 2), which examines an independent sample without a history of psychotic disorders, to evaluate the consistency of findings regarding community integration across projects.
In an initial study (Study 1), we found that motivation predicted community integration (i.e. functional recovery) 12 months after receiving housing in formerly homeless Veterans with a psychotic disorder. The current study examined whether the same pattern would be found in a broader, more clinically diverse, homeless Veteran sample without psychosis.
We examined four categories of variables as potential predictors of community integration in non-psychotic Veterans: perception, non-social cognition, social cognition, and motivation at baseline (after participants were engaged in a permanent supported housing program but before receiving housing) and a 12-month follow-up. A total of 82 Veterans had a baseline assessment and 41 returned for testing after 12 months.
The strongest longitudinal association was between an interview-based measure of motivation (the motivation and pleasure subscale from the Clinical Assessment Interview for Negative Symptoms) at baseline and measures of social integration at 12 months. In addition, cross-lagged panel analyses were consistent with a causal influence of general psychiatric symptoms at baseline driving social integration at 12 months, and reduced expressiveness at baseline driving independent living at 12 months, but there were no significant causal associations with measures of motivation.
The findings from this study complement and reinforce those in Veterans with psychosis. Across these two studies, our findings suggest that motivational factors are associated with community integration at baseline and at 12 months and are particularly important for understanding and improving community integration in recently housed Veterans across psychiatric diagnoses.
Infant feeding guidelines worldwide recommend first foods to be Fe rich with no added sugars and that nutrient-poor discretionary foods are to be avoided. Feeding guidelines also recommend exposing infants to a variety of foods and flavours with increasingly complex textures. Here, we compare nutritional and textural properties of commercial infant and toddler foods available in Australia with established infant feeding guidelines. Nutrition information and ingredient lists were obtained from food labels, manufacturer and/or retailer websites. In total, 414 foods were identified, comprising mostly mixed main dishes, fruit and vegetable first foods and snacks. Most products were poor sources of Fe, and 80 % of first foods were fruit-based. Half of all products were purées in squeeze pouches, and one-third of all products were discretionary foods. The nutritional content of many products was inconsistent with guidelines, being low in Fe, sweet, smooth in consistency or classified as discretionary. Reformulation of products is warranted to improve Fe content (particularly in mixed main dishes), to expand the range of vegetable-only foods and to increase textural variety. Greater regulatory oversight may be needed to better inform parents and caregivers. Frequent consumption of commercial baby foods low in Fe may increase the risk of Fe deficiency. Excessive consumption of purées via squeeze pouches may also have implications for overweight and obesity risk.
Levamisole is an increasingly common cutting agent used with cocaine. Both cocaine and levamisole can have local and systemic effects on patients.
A retrospective case series was conducted of patients with a cocaine-induced midline destructive lesion or levamisole-induced vasculitis, who presented to a Dundee hospital or the practice of a single surgeon in Paisley, from April 2016 to April 2019. A literature review on the topic was also carried out.
Nine patients from the two centres were identified. One patient appeared to have levamisole-induced vasculitis, with raised proteinase 3, perinuclear antineutrophil cytoplasmic antibody positivity and arthralgia which improved on systemic steroids. The other eight patients had features of a cocaine-induced midline destructive lesion.
As the use of cocaine increases, ENT surgeons will see more of the complications associated with it. This paper highlights some of the diagnostic issues and proposes a management strategy as a guide to this complex patient group. Often, multidisciplinary management is needed.
Evidence from previous small trials has suggested the effectiveness of early social communication interventions for autism.
The Preschool Autism Communication Trial (PACT) investigated the efficacy of such an intervention in the largest psychosocial autism trial to date.
To provide a stringent test of a pre-school communication intervention for autism.
152 children with core autism, aged 2 years to 4 years 11 months, took part in a three-site, two-arm, single-blind (assessor) randomised controlled trial of the parent-mediated, communication-focused intervention added to treatment as usual (TAU) against TAU alone. Primary outcome: severity of autism symptoms (modified social communication algorithm from the Autism Diagnostic Observation Schedule-Generic, ADOS-G). Secondary outcomes: blinded measures of parent-child interaction, child language, and adaptation in school.
At the 13-month endpoint, the treatment resulted in strong improvement in parental synchronous response to child (adjusted between-group effect size 1.22, 95% CI 0.85–1.59) and in child initiations with parent (ES 0.41, 95% CI 0.08–0.74), but only a small, non-significant effect on autism symptomatology (ADOS-G, ES −0.24, 95% CI −0.59 to 0.11). Parents (not blind to allocation) reported strong treatment effects on child language and social adaptation, but effects on blinded, researcher-assessed language and school adaptation were small.
Addition of the PACT intervention showed clear benefit in improving parent-child dyadic social communication but no substantive benefit over TAU in modifying objectively rated autism symptoms. This attenuation on generalisation from ‘proximal’ intervention effects to wider symptom change in other contexts remains a significant challenge for autism treatment and measurement methodology.
Narrow-windrow burning has been a successful form of harvest weed seed control in Australian cropping systems, but little is known about the efficacy of narrow-windrow burning on weed seeds infesting U.S. cropping systems. An experiment was conducted using a high-fire kiln that exposed various grass and broadleaf weed seeds to temperatures of 200, 300, 400, 500, and 600 C for 20, 40, 60, and 80 s to determine the temperature and time needed to kill weed seeds. Weeds evaluated included Italian ryegrass, barnyardgrass, johnsongrass, sicklepod, Palmer amaranth, prickly sida, velvetleaf, pitted morningglory, and hemp sesbania. Two field experiments were also conducted over consecutive growing seasons, with the first experiment aimed at determining the amount of heat produced during burning of narrow windrows of soybean harvest residues (chaff and straw) and the effect of this heat on weed seed mortality. The second field experiment aimed to determine the effect of wind speed on the duration and intensity of burning narrow windrows of soybean harvest residues. Following exposure to the highest temperature and longest duration in the kiln, only sicklepod showed any survival (<1% average); however, in most cases, the seeds were completely destroyed (ash). A heat index of only 22,600 was needed to kill all seeds of Palmer amaranth, barnyardgrass, and Italian ryegrass. In the field, all seeds of the evaluated weed species were completely destroyed by narrow-windrow burning of 1.08 to 1.95 kg m−2 of soybean residues. The burn duration of the soybean harvest residues declined as wind speed increased. Findings from the kiln and field experiments show that complete kill is likely for weed seeds concentrated into narrow windrows of burned soybean residues. Given the low cost of implementation of narrow-windrow burning and the seed kill efficacy on various weed species, this strategy may be an attractive option for destroying weed seed.
Observational studies have shown a relationship between maternal mental health (MMH) and child development, but few studies have evaluated whether MMH interventions improve child-related outcomes, particularly in low- and middle-income countries. The objective of this review is to synthesise findings on the effectiveness of MMH interventions to improve child-related outcomes in low- and middle-income countries (LMICs).
We searched for randomised controlled trials conducted in LMICs evaluating interventions with an MMH component and reporting children's outcomes. Meta-analysis was performed on outcomes included in at least two trials.
We identified 21 trials with 28 284 mother–child dyads. Most trials were conducted in middle-income countries, evaluating home visiting interventions delivered by general health workers, starting in the third trimester of pregnancy. Only ten trials described acceptable methods for blinding outcome assessors. Four trials showed high risk of bias in at least two of the seven domains assessed in this review. Narrative synthesis showed promising but inconclusive findings for child-related outcomes. Meta-analysis identified a sizeable impact of interventions on exclusive breastfeeding (risk ratio = 1.39, 95% confidence interval (CI): 1.13–1.71, ten trials, N = 4749 mother–child dyads, I2 = 61%) and a small effect on child height-for-age at 6-months (std. mean difference = 0.13, 95% CI: 0.02–0.24, three trials, N = 1388, I2 = 0%). Meta-analyses did not identify intervention benefits for child cognitive and other growth outcomes; however, few trials measured these outcomes.
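The pooled risk ratio and I² statistic reported above come from standard inverse-variance meta-analysis. A minimal sketch of that computation in Python, using hypothetical per-trial estimates rather than the trials in this review:

```python
import math

def pool_risk_ratios(rrs, cis):
    """Fixed-effect inverse-variance pooling of risk ratios.

    rrs: list of per-trial risk ratios; cis: list of (lower, upper)
    95% CI tuples. Returns (pooled RR, I^2 heterogeneity in %).
    """
    logs = [math.log(rr) for rr in rrs]
    # Back out standard errors from the 95% CI width on the log scale.
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    weights = [1 / se ** 2 for se in ses]
    pooled = sum(w * t for w, t in zip(weights, logs)) / sum(weights)
    # Cochran's Q and the I^2 statistic.
    q = sum(w * (t - pooled) ** 2 for w, t in zip(weights, logs))
    df = len(rrs) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return math.exp(pooled), i2

# Hypothetical per-trial estimates (not the trials in the review):
rr, i2 = pool_risk_ratios([1.2, 1.6, 1.4],
                          [(0.9, 1.6), (1.1, 2.3), (1.0, 2.0)])
```

A random-effects model (as often used when I² is substantial) would additionally estimate a between-trial variance and inflate the weights accordingly.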
These findings support the importance of MMH to improve child-related outcomes in LMICs, particularly exclusive breastfeeding. Given the small number of trials and methodological limitations, more rigorous trials should be conducted.
Despite increased awareness that non-suicidal self-injury (NSSI) poses a significant public health concern on college campuses worldwide, few studies have prospectively investigated the incidence of NSSI in college and considered targeting college entrants at high risk for onset of NSSI.
Using data from the Leuven College Surveys (n = 4,565; 56.8% female; mean age = 18.3 years, s.d. = 1.1), students provided data on NSSI, sociodemographics, traumatic experiences, stressful events, perceived social support, and mental disorders. A total of 2,163 baseline responders provided data at annual follow-up assessments over two years (63.2% conditional response rate).
One-year incidence of first onset NSSI was 10.3% in year 1 and 6.0% in year 2, with a total of 8.6% reporting sporadic NSSI (1–4 times per year) and 7.0% reporting repetitive NSSI (≥ 5 times per year) during the first two years of college. Many hypothesized proximal and distal risk factors were associated with the subsequent onset of NSSI (ORs = 1.5–18.2). Dating violence prior to age 17 and severe role impairment in daily life were the strongest predictors. Multivariate prediction suggests that an intervention focused on the 10% at highest risk would reach 23.9% of students who report sporadic, and 36.1% of students who report repetitive NSSI during college (cross-validated AUCs = 0.70–0.75).
The college period carries high risk for the onset of NSSI. Individualized web-based screening may be a promising approach for detecting young adults at high risk for self-injury and offering timely intervention.
Clostridioides difficile infection (CDI) can be prevented through infection prevention practices and antibiotic stewardship. Diagnostic stewardship (ie, strategies to improve use of microbiological testing) can also improve antibiotic use. However, little is known about the use of such practices in US hospitals, especially after multidisciplinary stewardship programs became a requirement for US hospital accreditation in 2017. Thus, we surveyed US hospitals to assess antibiotic stewardship program composition, practices related to CDI, and diagnostic stewardship.
Surveys were mailed to infection preventionists at 900 randomly sampled US hospitals between May and October 2017. Hospitals were surveyed on antibiotic stewardship programs; CDI prevention, treatment, and testing practices; and diagnostic stewardship strategies. Responses were compared by hospital bed size using weighted logistic regression.
Overall, 528 surveys were completed (59% response rate). Almost all (95%) responding hospitals had an antibiotic stewardship program. Smaller hospitals were less likely to have stewardship team members with infectious diseases (ID) training, and only 41% of hospitals met The Joint Commission accreditation standards for multidisciplinary teams. Guideline-recommended CDI prevention practices were common. Smaller hospitals were less likely to use high-tech disinfection devices, fecal microbiota transplantation, or diagnostic stewardship strategies.
Following changes in accreditation standards, nearly all US hospitals now have an antibiotic stewardship program. However, many hospitals, especially smaller hospitals, appear to struggle with access to ID expertise and with deploying diagnostic stewardship strategies. CDI prevention could be enhanced through diagnostic stewardship and by emphasizing the role of non–ID-trained pharmacists and clinicians in antibiotic stewardship.
The physiology of mesophotic Scleractinia varies with depth in response to environmental change. Previous research has documented trends in heterotrophy and photosynthesis with depth, but has not addressed between-site variation for a single species. Environmental differences between sites at a local scale and heterogeneous microhabitats, because of irradiance and food availability, are likely important factors when explaining the occurrence and physiology of Scleractinia. Here, 108 colonies of Agaricia lamarcki were sampled from two locations off the coast of Utila, Honduras, distributed evenly down the observed 50 m depth range of the species. We found that depth alone was not sufficient to fully explain physiological variation. Pulse Amplitude-Modulation fluorometry and stable isotope analyses revealed that trends in photochemical and heterotrophic activity with depth varied markedly between sites. Our isotope analyses do not support an obligate link between photosynthetic activity and heterotrophic subsidy with increasing depth. We found that A. lamarcki colonies at the bottom of the species depth range can be physiologically similar to those nearer the surface. As a potential explanation, we hypothesize sites with high topographical complexity, and therefore varied microhabitats, may provide more physiological niches distributed across a larger depth range. Varied microhabitats with depth may reduce the dominance of depth as a physiological determinant. Thus, A. lamarcki may ‘avoid’ changes in environment with depth, by instead existing in a subset of favourable niches. Our observations correlate with site-specific depth ranges, advocating for linking physiology and abiotic profiles when defining the distribution of mesophotic taxa.
There is increasing evidence to support integration of simulation into medical training; however, no national emergency medicine (EM) simulation curriculum exists. Using Delphi methodology, we aimed to identify and establish content validity for adult EM curricular content best suited for simulation-based training, to inform national postgraduate EM training.
A national panel of experts in EM simulation iteratively rated potential curricular topics, on a 4-point scale, to determine those best suited for simulation-based training. After each round, responses were analyzed. Topics scoring <2/4 were removed and remaining topics were resent to the panel for further ratings until consensus was achieved, defined as Cronbach α ≥ 0.95. At conclusion of the Delphi process, topics rated ≥ 3.5/4 were considered “core” curricular topics, while those rated 3.0-3.5 were considered “extended” curricular topics.
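The Cronbach's α stopping rule can be computed directly from the panel's score matrix. A minimal sketch, assuming one common convention (raters as "items", topics as observations) and hypothetical ratings:

```python
def cronbach_alpha(matrix):
    """Cronbach's alpha for a topic-by-rater score matrix.

    matrix: rows = observations (topics), columns = items (raters).
    alpha = (k / (k - 1)) * (1 - sum(item variances) / total variance).
    """
    k = len(matrix[0])  # number of raters (items)

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Variance of each rater's scores across topics.
    item_vars = [var([row[j] for row in matrix]) for j in range(k)]
    # Variance of per-topic total scores.
    total_var = var([sum(row) for row in matrix])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 4-point ratings: 4 topics scored by 3 raters.
alpha = cronbach_alpha([[4, 4, 4], [3, 3, 3], [2, 2, 2], [1, 1, 2]])
```

With near-perfect agreement as in this toy matrix, α approaches 1; the Delphi process above stopped iterating once α reached 0.95.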
Forty-five experts from 13 Canadian centres participated. Two hundred eighty potential curricular topics, in 29 domains, were generated from a systematic literature review, relevant educational documents and Delphi panellists. Three rounds of surveys were completed before consensus was achieved, with response rates ranging from 93% to 100%. Twenty-eight topics, in eight domains, reached consensus as “core” curricular topics. Thirty-five additional topics, in 14 domains, reached consensus as “extended” curricular topics.
Delphi methodology allowed for achievement of expert consensus and content validation of EM curricular content best suited for simulation-based training. These results provide a foundation for improved integration of simulation into postgraduate EM training and can be used to inform a national simulation curriculum to supplement clinical training and optimize learning.
Influenza and respiratory syncytial virus (RSV) are common causes of respiratory tract infections and place a burden on health services each winter. Systems to describe the timing and intensity of such activity will improve the public health response and deployment of interventions to these pressures. Here we develop early warning and activity intensity thresholds for monitoring influenza and RSV using two novel data sources: general practitioner out-of-hours consultations (GP OOH) and telehealth calls (NHS 111). Moving Epidemic Method (MEM) thresholds were developed for winter 2017–2018. The NHS 111 cold/flu threshold was breached several weeks in advance of other systems. The NHS 111 RSV epidemic threshold was breached in week 41, in advance of RSV laboratory reporting. Combining the use of MEM thresholds with daily monitoring of NHS 111 and GP OOH syndromic surveillance systems provides the potential to alert to threshold breaches in real-time. An advantage of using thresholds across different health systems is the ability to capture a range of healthcare-seeking behaviour, which may reflect differences in disease severity. This study also provides a quantifiable measure of seasonal RSV activity, which contributes to our understanding of RSV activity in advance of the potential introduction of new RSV vaccines.
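The full Moving Epidemic Method involves season-by-season modelling beyond the scope of this summary, but the core idea of its pre-epidemic threshold (an upper confidence limit on the mean of the highest pre-epidemic values from past seasons) can be sketched in a much-simplified form, using hypothetical weekly rates:

```python
import math
import statistics

def epidemic_threshold(pre_epidemic_peaks, z=1.96):
    """Much-simplified MEM-style threshold.

    pre_epidemic_peaks: the highest weekly rates drawn from the
    pre-epidemic period of each historical season. Returns the upper
    confidence limit of their mean (normal approximation; the full
    MEM uses a more elaborate, season-modelled procedure).
    """
    n = len(pre_epidemic_peaks)
    mean = statistics.mean(pre_epidemic_peaks)
    sd = statistics.stdev(pre_epidemic_peaks)
    return mean + z * sd / math.sqrt(n)

# Hypothetical peak pre-epidemic consultation rates from 5 past seasons:
threshold = epidemic_threshold([12.0, 14.5, 11.2, 13.8, 12.9])
# A current week "breaches" the threshold when its rate exceeds it,
# triggering the kind of early-warning signal described above.
```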
We sought to address the prior limitations of symptom checker accuracy by analysing the diagnostic and triage feasibility of online symptom checkers using a consecutive series of real-life emergency department (ED) patient encounters, focusing on a complex patient population: those with hepatitis C or HIV. We aimed to study the diagnostic and triage accuracy of these symptom checkers against an emergency room physician-determined diagnosis. A retrospective analysis was performed on 8363 consecutive adult ED patients. Eligible patients included 90 with HIV, 67 with hepatitis C, and 11 with both HIV and hepatitis C. Five online symptom checkers were used for diagnosis (Mayo Clinic, WebMD, Symptomate, Symcat, Isabel), three of which had triage capabilities. Symptom checker output was compared with the ED physician-determined diagnosis with regard to diagnostic accuracy and differential diagnosis listing, along with triage advice. All symptom checkers, whether for combined HIV and hepatitis C, HIV alone or hepatitis C alone, had poor diagnostic accuracy: Top1 (<20%), Top3 (<35%), Top10 (<40%), Listed at All (<45%). Significant variation existed for each individual symptom checker: some were more accurate at listing the diagnosis at the top of the differential, whereas others were more apt to list the diagnosis at all. With regard to ED triage data, a significantly higher percentage of hepatitis C patients (59.7%; 40/67) were found to have an initial diagnosis with emergent criteria than HIV patients (35.6%; 32/90). Symptom checker diagnostic capabilities are markedly inferior to physician diagnostic capabilities. Complex patients such as those with HIV or hepatitis C may carry a more specific differential diagnosis, warranting symptom checkers to have diagnostic algorithms that account for such complexity.
Symptom checkers carry the potential for real-time epidemiologic monitoring of patient symptoms, as symptom entries and subsequent symptom checker diagnosis could allow health officials a means to track illnesses in specific patient populations and geographic regions. In order to do this, accurate and reliable symptom checkers are warranted.
Mismatch negativity (MMN) is an event-related potential (ERP) component reflecting auditory predictive coding. Repeated standard tones evoke increasing positivity (‘repetition positivity’; RP), reflecting strengthening of the standard's memory trace and the prediction that it will recur. Likewise, deviant tones preceded by more standard repetitions evoke greater negativity (‘deviant negativity’; DN), reflecting stronger prediction error signaling. These memory trace effects are also evident in the MMN difference wave. Here, we assess group differences and test-retest reliability of these indices in schizophrenia patients (SZ) and healthy controls (HC).
Electroencephalography was recorded twice, 2 weeks apart, from 43 SZ and 30 HC, during a roving standard paradigm. We examined ERPs to the third, eighth, and 33rd standards (RP), immediately subsequent deviants (DN), and the corresponding MMN. Memory trace effects were assessed by comparing amplitudes associated with the three standard repetition trains.
Compared with controls, SZ showed reduced MMNs and DNs, but normal RPs. Both groups showed memory trace effects for RP, MMN, and DN, with a trend for attenuated DNs in SZ. Intraclass correlations obtained via this paradigm indicated good-to-moderate reliabilities for overall MMN, DN and RP, but moderate-to-poor reliabilities for components associated with short, intermediate, and long standard trains, and poor reliability of their memory trace effects.
MMN deficits in SZ reflected attenuated prediction error signaling (DN), with relatively intact predictive code formation (RP) and memory trace effects. This roving standard MMN paradigm requires additional development/validation to obtain suitable levels of reliability for use in clinical trials.
High body mass index (BMI) has been associated with lower risks of suicidal behaviour, and being underweight with increased risks. However, evidence is inconsistent and sparse, particularly for women. We aimed to study this relationship in a large cohort of UK women.
In total 1.2 million women, mean age 56 (s.d. 5) years, without prior suicide attempts or other major illness, recruited in 1996–2001 were followed by record linkage to national hospital admission and death databases. Cox regression yielded relative risks (RRs) and 95% confidence intervals (CIs) for attempted suicide and suicide by BMI, adjusted for baseline lifestyle factors and self-reported treatment for depression or anxiety.
After 16 (s.d. 3) years of follow-up, 4930 women attempted suicide and 642 died by suicide. The small proportion (4%) with BMI <20 kg/m2 were at clearly greater risk of attempted suicide (RR = 1.38, 95% CI 1.23–1.56) and suicide (RR = 2.10, 1.59–2.78) than women of BMI 20–24.9 kg/m2; p < 0.0001 for both comparisons. Small body size at 10 and 20 years old was also associated with increased risks. Half the cohort had BMIs >25 kg/m2 and, while risks were somewhat lower than for BMI 20–24.9 kg/m2 (attempted suicide RR = 0.91, 0.86–0.96; p = 0.001; suicide RR = 0.79, 0.67–0.93; p = 0.006), the reductions in risk were not strongly related to level of BMI.
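The relative risks above come from Cox models adjusted for covariates; the crude calculation behind an unadjusted relative risk and its 95% CI (Katz log method) can be sketched as follows, with hypothetical counts:

```python
import math

def relative_risk(a, n1, b, n2):
    """Crude relative risk with a 95% CI (Katz log method).

    a events among n1 exposed; b events among n2 unexposed.
    """
    rr = (a / n1) / (b / n2)
    # Standard error of log(RR).
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical: 20/100 events in the exposed group vs 10/100 unexposed.
rr, lo, hi = relative_risk(20, 100, 10, 100)
```

Note that a CI spanning 1.0, as in this toy example, would indicate no statistically significant association, unlike the underweight-group estimates reported above.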
Being underweight is associated with a definite increase in the risk of suicidal behaviour, particularly death by suicide. Residual confounding cannot be excluded for the small and inconsistent decreased risk of suicidal behaviour associated with being overweight or obese.
Legionnaires’ disease (LD) incidence in the USA has quadrupled since 2000. Health departments must detect LD outbreaks quickly to identify and remediate sources. We tested the performance of a system to prospectively detect simulated LD outbreaks in Allegheny County, Pennsylvania, USA. We generated three simulated LD outbreaks based on published outbreaks. After verifying no significant clusters existed in surveillance data during 2014–2016, we embedded simulated outbreak-associated cases into 2016, assigning simulated residences and report dates. We mimicked daily analyses in 2016 using the prospective space-time permutation scan statistic to detect clusters of ⩽30 and ⩽180 days using 365-day and 730-day baseline periods, respectively. We used recurrence interval (RI) thresholds of ⩾20, ⩾100 and ⩾365 days to define significant signals. We calculated sensitivity, specificity and positive and negative predictive values for daily analyses, separately for each embedded outbreak. Two large, simulated cooling tower-associated outbreaks were detected. As the RI threshold was increased, sensitivity and negative predictive value decreased, while positive predictive value and specificity increased. A small, simulated potable water-associated outbreak was not detected. Use of a RI threshold of ⩾100 days minimised time to detection while maximising positive predictive value. Health departments should consider using this system to detect community-acquired LD outbreaks.
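The daily sensitivity, specificity, and predictive-value calculations described above reduce to a confusion matrix over analysis days. A minimal sketch, with hypothetical days and signals:

```python
def detection_metrics(signals, outbreak_days):
    """Confusion-matrix summary of daily outbreak-detection signals.

    signals: dict mapping day -> True if the scan statistic signalled
    (e.g. recurrence interval above threshold) on that day.
    outbreak_days: set of days on which simulated cases were present.
    """
    tp = sum(1 for d, s in signals.items() if s and d in outbreak_days)
    fp = sum(1 for d, s in signals.items() if s and d not in outbreak_days)
    fn = sum(1 for d, s in signals.items() if not s and d in outbreak_days)
    tn = sum(1 for d, s in signals.items() if not s and d not in outbreak_days)
    sens = tp / (tp + fn) if tp + fn else None
    spec = tn / (tn + fp) if tn + fp else None
    ppv = tp / (tp + fp) if tp + fp else None
    npv = tn / (tn + fn) if tn + fn else None
    return sens, spec, ppv, npv

# Hypothetical 10-day run: outbreak on days 3-5, signals on days 4-6.
signals = {day: day in {4, 5, 6} for day in range(1, 11)}
sens, spec, ppv, npv = detection_metrics(signals, {3, 4, 5})
```

Raising the signal threshold removes borderline signal days, which is how increasing the RI threshold trades sensitivity (and NPV) for specificity (and PPV) in the study above.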
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.