Similar to adults with posttraumatic stress disorder, children with early life adversity show a memory bias for negative emotional stimuli. However, it is not well understood how childhood adversity affects the mechanisms underlying emotional memory. Fifty-six children (8–14 years, 48% female) reported on adverse experiences, including potentially traumatic events, and underwent fMRI while attending to emotionally pleasant, neutral, or negative images. Post-scan, participants completed a cued recall test to assess memory for these images. Emotional difference-in-memory (DM) scores were computed by subtracting negative or positive from neutral recall performance. All children showed enhancing effects of emotion on recall, with no effect of trauma load. However, children with less trauma showed a larger emotional DM for both positive and negative stimuli when amygdala or anterior hippocampal activity was higher. In contrast, highly trauma-exposed children demonstrated a lower emotional DM with greater amygdala or hippocampal activity. This suggests that alternative neural mechanisms may support emotional enhancement of encoding in children with greater trauma load. Whole-brain analyses revealed that right fusiform activity during encoding correlated positively with both trauma load and successful later recall of positive images. Therefore, highly trauma-exposed children may use alternative, potentially adaptive neural pathways via the ventral visual stream to encode positive emotional events.
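As a worked illustration of the difference-in-memory contrast described above: the sketch below uses hypothetical per-condition cued-recall scores (not the study's data) and computes emotional minus neutral recall so that positive values index emotional enhancement; the paper's stated convention (subtracting emotional from neutral) may flip the sign.

import numpy as np

# Hypothetical cued-recall accuracy (proportion recalled) per child and condition.
recall_negative = np.array([0.70, 0.55, 0.80])
recall_positive = np.array([0.65, 0.60, 0.75])
recall_neutral = np.array([0.50, 0.45, 0.70])

# Emotional difference-in-memory (DM) scores, here emotional minus neutral,
# so positive values indicate emotional enhancement of recall.
dm_negative = recall_negative - recall_neutral
dm_positive = recall_positive - recall_neutral
print(dm_negative, dm_positive)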
COVID-19 misinformation proliferating online has led to adverse health and societal consequences. Older adults are a particularly vulnerable population due to their increased risk of COVID-19-related complications and their susceptibility to, as well as sharing of, misinformation on social networking sites. The present study aimed to: 1) investigate differences between older and younger adults in COVID-19 headline accuracy discernment and online sharing of COVID-19 misinformation; and 2) examine how individual differences in global cognition, health literacy, and verbal IQ relate to online sharing of COVID-19 misinformation.
Participants and Methods:
Fifty-two younger adults (ages 18 to 35 years) and 50 older adults (ages 50 and older) completed a telephone neurocognitive battery, health literacy and numeracy measures, and self-report questionnaires. Participants also completed a social media headline-sharing experiment (Pennycook et al., 2020) in which they were presented with true and false COVID-19 headlines and asked to indicate: 1) the likelihood that they would share the story on social media; and 2) the factual accuracy of the story.
Results:
A repeated-measures multivariate analysis of variance controlling for gender and race/ethnicity showed no effect of age (p = .099) but a significant interaction between actual COVID-19 headline accuracy and likelihood of sharing (p < .001), such that accuracy was more strongly related to the likelihood of sharing for false headlines (r = −.64) than for true headlines (r = −.43). Moreover, a higher likelihood of sharing false COVID-19 headlines was associated with lower verbal IQ and numeracy skills in older adults (rs = −.51 to −.40; ps < .01) and with lower verbal IQ, numeracy, and global cognition in younger adults (rs = −.66 to −.60; ps < .01).
Conclusions:
Findings indicate that headline accuracy judgements are an important predictor of sharing COVID-19 misinformation in both older and younger adults. Further, individual differences in cognition, IQ, and numeracy may predict the likelihood of misinformation sharing in younger adults, while IQ and numeracy skills may act as important antecedents of misinformation sharing in older adults. Future work might leverage modern, neuropsychologically based psychoeducation approaches to improving health and science literacy related to COVID-19.
To evaluate the efficacy of a new continuously active disinfectant (CAD) in decreasing bioburden on high-touch environmental surfaces, compared with a standard disinfectant, in the intensive care unit.
Design:
A single-blind randomized controlled trial with 1:1 allocation.
Setting:
Medical intensive care unit (MICU) at an urban tertiary-care hospital.
Participants:
Adult patients admitted to the MICU and on contact precautions.
Intervention:
A new CAD wipe used for daily cleaning.
Methods:
Samples were collected from 5 high-touch environmental surfaces before cleaning and at 1, 4, and 24 hours after cleaning. The primary outcome was the mean bioburden 24 hours after cleaning. The secondary outcome was the detection of any epidemiologically important pathogen (EIP) 24 hours after cleaning.
Results:
In total, 843 environmental samples were collected from 43 unique patient rooms. At 24 hours, the mean bioburden recovered from patient rooms cleaned with the new CAD wipe (intervention) was 52 CFU/mL, compared with 92 CFU/mL in rooms cleaned with the standard disinfectant (control). After log transformation for multivariable analysis, the mean difference in bioburden between the intervention and control arms was −0.59 (95% CI, −1.45 to 0.27). The odds of EIP detection were 14% lower in the rooms cleaned with the CAD wipe (OR, 0.86; 95% CI, 0.31–2.32).
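As a rough, unadjusted illustration of the computations described above (the trial's multivariable analysis would additionally adjust for covariates); all CFU/mL values and EIP counts below are made up, not the study's data:

import numpy as np
from scipy import stats

# Hypothetical 24-hour bioburden samples (CFU/mL) in each arm.
cad = np.array([40, 60, 35, 80, 20, 55], dtype=float)
standard = np.array([90, 120, 70, 85, 110, 95], dtype=float)

# Log-transform (adding 1 to handle zero counts), then compare means.
log_cad, log_std = np.log10(cad + 1), np.log10(standard + 1)
diff = log_cad.mean() - log_std.mean()
se = np.sqrt(log_cad.var(ddof=1) / len(log_cad) + log_std.var(ddof=1) / len(log_std))
tcrit = stats.t.ppf(0.975, len(log_cad) + len(log_std) - 2)  # pooled-df approximation
print(f"mean log10 difference = {diff:.2f} "
      f"(95% CI {diff - tcrit*se:.2f} to {diff + tcrit*se:.2f})")

# Unadjusted odds ratio for EIP detection from a hypothetical 2x2 table.
a, b = 8, 14   # EIP detected / not detected, CAD arm
c, d = 10, 11  # EIP detected / not detected, control arm
log_or = np.log((a * d) / (b * c))
se_or = np.sqrt(1/a + 1/b + 1/c + 1/d)
print(f"OR = {np.exp(log_or):.2f} "
      f"(95% CI {np.exp(log_or - 1.96*se_or):.2f}-{np.exp(log_or + 1.96*se_or):.2f})")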
Conclusions:
The bacterial bioburden and odds of detection of EIPs were not statistically different in rooms cleaned with the CAD compared to the standard disinfectant after 24 hours. Although CAD technology appears promising in vitro, larger studies may be warranted to evaluate efficacy in clinical settings.
Pompe disease results from lysosomal acid α-glucosidase deficiency, which leads to cardiomyopathy in all infantile-onset patients and in occasional late-onset patients. Cardiac assessment is important for its diagnosis and management. This article presents unpublished cardiac findings, concomitant medications, and cardiac efficacy and safety outcomes from the ADVANCE study; trajectories of patients with abnormal left ventricular mass z scores at enrolment; and post hoc analyses of on-treatment left ventricular mass and systolic blood pressure z scores by disease phenotype, GAA genotype, and “fraction of life” (defined as the fraction of life on pre-study 160 L production-scale alglucosidase alfa). ADVANCE evaluated 52 weeks’ treatment with 4000 L production-scale alglucosidase alfa in patients with Pompe disease aged ≥1 year in the United States of America who had previously received 160 L production-scale alglucosidase alfa. M-mode echocardiography and 12-lead electrocardiography were performed at enrolment and Week 52. Sixty-seven patients had complete left ventricular mass z scores, which decreased by Week 52 (infantile-onset patients: change −0.8 ± 1.83, 95% confidence interval −1.3 to −0.2; all patients: change −0.5 ± 1.71, 95% confidence interval −1.0 to −0.1). Patients with a “fraction of life” <0.79 had decreasing left ventricular mass z scores (enrolment: +0.1 ± 3.0; Week 52: −1.1 ± 2.0); those with a “fraction of life” ≥0.79 remained stable (enrolment: −0.9 ± 1.5; Week 52: −0.9 ± 1.4). Systolic blood pressure z scores were stable from enrolment to Week 52, and no cohort developed systemic hypertension. Eight patients had Wolff–Parkinson–White syndrome. Cardiac hypertrophy and dysrhythmia in ADVANCE patients at or before enrolment were typical of Pompe disease. Treatment with 4000 L alglucosidase alfa maintained fractional shortening and left ventricular posterior and septal end-diastolic thicknesses, and improved left ventricular mass z scores.
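A minimal sketch of the “fraction of life” covariate quoted above, under the assumption that it equals the time spent on pre-study 160 L production-scale alglucosidase alfa divided by age at enrolment; the function name and example values are hypothetical:

def fraction_of_life(years_on_160L_product: float, age_at_enrolment_years: float) -> float:
    """Fraction of the patient's life spent on pre-study 160 L alglucosidase alfa (assumed definition)."""
    if age_at_enrolment_years <= 0:
        raise ValueError("age must be positive")
    return years_on_160L_product / age_at_enrolment_years

# Hypothetical example: 5 years of prior 160 L treatment, enrolled at age 6 years.
print(fraction_of_life(5.0, 6.0))  # 0.83 -> would fall in the ">=0.79" cohort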
Social Media Statement: Post hoc analyses of the ADVANCE study cohort of 113 children support ongoing cardiac monitoring and concomitant management of children with Pompe disease on long-term alglucosidase alfa to functionally improve cardiomyopathy and/or dysrhythmia.
To determine the utility of the Sofia SARS rapid antigen fluorescent immunoassay (FIA) to guide hospital-bed placement of patients being admitted through the emergency department (ED).
Design:
Cross-sectional analysis of a clinical quality improvement study.
Setting:
This study was conducted in 2 community hospitals in Maryland from September 21, 2020, to December 3, 2020. In total, 2,887 patients simultaneously received the Sofia SARS rapid antigen FIA and SARS-CoV-2 RT-PCR assays on admission through the ED.
Methods:
Rapid antigen results and symptom assessment guided initial patient placement while confirmatory RT-PCR was pending. The sensitivity, specificity, positive predictive values, and negative predictive values of the rapid antigen assay were calculated relative to RT-PCR, overall and separately for symptomatic and asymptomatic patients. Assay sensitivity was compared to RT-PCR cycle threshold (Ct) values. Assay turnaround times were compared. Clinical characteristics of RT-PCR–positive patients and potential exposures from false-negative antigen assays were evaluated.
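For reference, a minimal sketch of the test-performance measures named above, computed from a 2×2 table of antigen results against the RT-PCR reference; the counts shown are hypothetical, not the study's data:

def test_performance(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV, and NPV of an index test versus a reference standard."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts: antigen result classified against RT-PCR as the reference.
print(test_performance(tp=180, fp=8, fn=55, tn=2644))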
Results:
For all patients, overall agreement was 97.9%; sensitivity was 76.6% (95% confidence interval [CI], 71%–82%), and specificity was 99.7% (95% CI, 99%–100%). We detected no differences in performance between asymptomatic and symptomatic individuals. As RT-PCR Ct increased, the sensitivity of the antigen assay decreased. The mean turnaround time for the antigen assay was 1.2 hours (95% CI, 1.0–1.3) and for RT-PCR it was 20.1 hours (95% CI, 18.9–40.3) (P < .001). No transmission from antigen-negative/RT-PCR–positive patients was identified.
Conclusions:
Although not a replacement for RT-PCR for detection of all SARS-CoV-2 infections, the Sofia SARS antigen FIA has clinical utility for timely initial patient placement.
The escalating evolution of weed species resistant to acetolactate synthase (ALS)-inhibitor herbicides makes alternative weed control strategies necessary for field crops that depend on this herbicide group. A fully integrated strategy that combined increased crop seeding rates (2X or 4X recommended), mechanical weed control with a minimum-tillage rotary hoe, and reduced-rate non–ALS-inhibitor herbicides was compared with herbicides, the rotary hoe, and increased seeding rates alone for control of ALS inhibitor–tolerant Indian mustard, used as a model weed. The full-rate herbicide treatment had the lowest weed biomass (98% reduction) and the highest yield of all treatments in 3 of 4 site-years, regardless of seeding rate. The fully integrated treatment at the 4X seeding rate provided weed suppression equal to that of the full herbicide treatment at the recommended seeding rate. The fully integrated and reduced-rate herbicide treatments at the 4X seeding rate reduced weed biomass by 89% and 83%, respectively, compared with the control at the recommended seeding rate. The rotary hoe treatment alone resulted in poor weed control (≤38%), even at the highest seeding rate. The fully integrated and reduced-rate herbicide treatments at the 2X and 4X seeding rates had yields equal to those of the full herbicide treatment at the recommended seeding rate. Partially or fully integrated weed control strategies that combine increased crop seeding rates and reduced-rate non–ALS-inhibitor herbicides, with or without a rotary hoe, can control weeds resistant to ALS-inhibitor herbicides while maintaining crop yields similar to those achieved with full-rate herbicides. However, combining increased seeding rates, reduced-rate herbicides, and mechanical rotary hoeing into a fully integrated strategy maximized weed control while reducing reliance on, and the selection pressure exerted by, any single weed control tactic.
Concern over the development of herbicide-resistant weeds has led to interest in integrated weed management systems that reduce selection pressure by using mechanical and cultural weed control practices in addition to herbicides. Increasing the crop seeding rate increases crop competitive ability and thus could enhance herbicide efficacy; however, how seeding rate interacts with herbicide dose has not been well characterized. The objective of this study was to examine the interaction between increasing seeding rate and herbicide dose for weed control. To meet this objective, the herbicide fluthiacet-methyl was applied to field-grown lentil, with Indian mustard, a proxy for wild mustard, used as a model weed. The experiment was a factorial design with four lentil seeding rates and seven herbicide rates. Overall, the herbicide dose response was altered by changing the lentil seeding rate. Increasing the lentil seeding rate decreased weed biomass production when herbicides were not applied. In two of the four site-years, increasing the lentil seeding rate lowered the herbicide ED50, the dose required to reduce weed biomass by 50%. Increasing the crop seeding rate thus shifted the dose response to provide greater weed control at lower herbicide rates than normal crop seeding rates. Increased seeding rates also resulted in higher and more stable crop seed yields across a wider range of herbicide dosages. These results suggest that dose–response models can be used to evaluate the efficacy of other weed management practices that interact with herbicide performance.
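The abstract does not state the exact dose–response model used; a common choice for this type of analysis is a four-parameter log-logistic curve fitted to weed biomass against herbicide dose, with the ED50 estimated directly as a model parameter. A minimal sketch under that assumption, using made-up data:

import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, b, c, d, ed50):
    """Four-parameter log-logistic: d = upper limit, c = lower limit,
    ed50 = dose giving a response halfway between d and c, b = slope."""
    return c + (d - c) / (1.0 + (dose / ed50) ** b)

# Hypothetical weed biomass (g m^-2) at increasing fractions of the label dose.
dose = np.array([0, 0.125, 0.25, 0.5, 1.0, 2.0, 4.0])
biomass = np.array([120, 95, 70, 40, 18, 8, 5], dtype=float)

params, _ = curve_fit(
    log_logistic, dose, biomass,
    p0=[1.0, 5.0, 120.0, 0.5],
    bounds=([0.1, 0.0, 0.0, 0.01], [10.0, 200.0, 300.0, 10.0]),
)
b, c, d, ed50 = params
print(f"estimated ED50 = {ed50:.2f} x label dose")

Refitting this curve at each seeding rate, as the study's design allows, would show whether a higher seeding rate shifts the ED50 downward.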
Field studies were conducted at 35 sites throughout the north-central United States in 1998 and 1999 to determine the effect of postemergence glyphosate application timing on weed control and grain yield in glyphosate-resistant corn. Glyphosate was applied at various timings based on the height of the most dominant weed species. Weed control and corn grain yields were considerably more variable when glyphosate was applied only once. The most effective and consistent season-long annual grass and broadleaf weed control occurred when a single glyphosate application was delayed until weeds were 15 cm or taller. Two glyphosate applications provided more consistent weed control when weeds were 10 cm tall or less and higher corn grain yields when weeds were 5 cm tall or less, compared with a single application. Weed control averaged at least 94 and 97% across all sites in 1998 and 1999, respectively, with two glyphosate applications but was occasionally less than 70% because of late emergence of annual grass and Amaranthus spp. or reduced control of Ipomoea spp. With a single application of glyphosate, corn grain yield was most often reduced when the application was delayed until weeds were 23 cm or taller. Averaged across all sites in 1998 and 1999, corn grain yields from a single glyphosate application at the 5-, 10-, 15-, 23-, and 30-cm timings were 93, 94, 93, 91, and 79% of the weed-free control, respectively. There was a significant effect of herbicide treatment on corn grain yield in 23 of the 35 sites when weed reinfestation was prevented with a second glyphosate application. When weed reinfestation was prevented, corn grain yield at the 5-, 10-, and 15-cm application timings was 101, 97, and 93% of the weed-free control, respectively, averaged across all sites. Results of this study suggested that the optimum timing for initial glyphosate application to avoid corn grain yield loss was when weeds were less than 10 cm in height, no more than 23 d after corn planting, and when corn growth was not more advanced than the V4 stage.
The recently developed three-dimensional electron microscopy (EM) method of serial block-face scanning electron microscopy (SBEM) has rapidly established itself as a powerful imaging approach. Volume EM imaging with this scanning electron microscopy (SEM) method requires intense staining of biological specimens with heavy metals to provide sufficient backscattered electron signal and to render specimens sufficiently conductive to control charging artifacts. These more extreme heavy-metal staining protocols render specimens opaque to light and make it much more difficult to track and identify regions of interest (ROIs) for SBEM imaging than for a typical correlative light and electron microscopy study using thin-section transmission electron microscopy. We present a strategy employing X-ray microscopy (XRM) both for tracking ROIs and for increasing the efficiency of the workflow used for typical SBEM projects. XRM was found to reveal an impressive level of detail in tissue heavily stained for SBEM imaging, allowing the identification of tissue landmarks that can subsequently be used to guide data collection in the SEM. Furthermore, specific labeling of individual cells using diaminobenzidine is detectable in XRM volumes. We demonstrate that tungsten carbide particles or upconverting nanophosphor particles can be used as fiducial markers to further increase the precision and efficiency of SBEM imaging.
This study investigates lifetime prevalence rates, demographic characteristics, childhood conduct disorder and adult antisocial features, suicide attempts, and cognitive impairment in individuals with obsessive-compulsive disorder (OCD) uncomplicated by or comorbid with any other psychiatric disorder. The data are from the NIMH Epidemiologic Catchment Area (ECA) study, and the current analyses compared subjects with uncomplicated OCD (no history of any other lifetime psychiatric disorder), comorbid OCD (with any other lifetime disorder), other lifetime psychiatric disorders, and no lifetime psychiatric disorder across these variables. OCD in both its uncomplicated and comorbid forms was associated with significantly higher rates of childhood conduct symptoms, adult antisocial personality problems, and suicide attempts than were other disorders or no disorder. Comorbid OCD subjects had higher rates of mild cognitive impairment on the Mini-Mental State Examination than did subjects with other disorders. These findings suggest that a subgroup of OCD patients may have impulsive features, including childhood conduct disorder symptoms and an increased rate of suicide attempts; wider clinical attention to these outcomes is needed.
To validate the utility of a previously published scoring model (Italian) to identify patients infected with community-onset extended-spectrum β-lactamase-producing Enterobacteriaceae (ESBL-EKP) and develop a new model (Duke) based on local epidemiology.
Methods.
This case-control study included patients aged 18 years or older admitted to Duke University Hospital between January 1, 2008, and December 31, 2010, with culture-confirmed infection due to an ESBL-EKP (cases). Uninfected controls were matched to cases (3:1). The Italian model was applied to our patient population for validation. The Duke model was developed using logistic regression–based prediction scores calculated from variables independently associated with ESBL-EKP isolation. Sensitivities and specificities at various point cutoffs were determined, and the area under the receiver operating characteristic curve (ROC AUC) was calculated.
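As a sketch of the general approach described (fit a logistic regression to candidate risk factors, convert the coefficients into integer point values, and assess discrimination with the ROC AUC); the data below are simulated, and rounding coefficients to integer points is one common convention, not necessarily the rule used in the study:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Simulated binary risk factors (rows = patients) and ESBL-EKP case status.
n = 500
X = rng.integers(0, 2, size=(n, 5))  # e.g., hospitalization, transfer, urinary catheter, ...
logit = -2.5 + X @ np.array([1.2, 1.5, 1.8, 1.3, 0.9])
y = rng.random(n) < 1 / (1 + np.exp(-logit))  # case (True) vs. control (False)

model = LogisticRegression().fit(X, y)

# One common scoring convention: round each coefficient to the nearest integer point.
points = np.rint(model.coef_[0]).astype(int)
score = X @ points

print("points per factor:", points)
print("ROC AUC of the point score:", round(roc_auc_score(y, score), 2))

Sensitivity and specificity at each candidate cutoff of the point score can then be tabulated to choose decision thresholds, as the abstract describes.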
Results.
A total of 123 cases and 375 controls were identified. Adjusted odds ratios and 95% confidence intervals for variables previously identified in the Italian model were as follows: hospitalization (3.20 [1.62–6.55]), transfer (4.31 [2.15–8.78]), urinary catheterization (5.92 [3.09–11.60]), β-lactam and/or fluoroquinolone therapy (3.76 [2.06–6.95]), age 70 years or more (1.55 [0.79–3.01]), and Charlson Comorbidity Score of 4 or above (1.06 [0.55–2.01]). Sensitivity and specificity were ≥95% and ≤47%, respectively, for scores of 3 or below, and ≤50% and ≥96%, respectively, for scores of 8 or above. The ROC AUC was 0.88. Variables identified in the Duke model were as follows: hospitalization (2.63 [1.32–5.41]), transfer (5.30 [2.67–10.71]), urinary catheterization (6.89 [3.62–13.38]), β-lactam and/or fluoroquinolone therapy (3.47 [1.91–6.41]), and immunosuppression (2.34 [1.14–4.80]). Sensitivity and specificity were ≥94% and ≤65%, respectively, for scores of 3 or below, and ≤58% and ≥95%, respectively, for scores of 8 or above. The ROC AUC was 0.89.
Conclusion.
While the previously reported model was an excellent predictor of community-onset ESBL-EKP infection, models utilizing factors based on local epidemiology may be associated with improved performance.