To evaluate the efficacy of a new continuously active disinfectant (CAD) to decrease bioburden on high-touch environmental surfaces compared to a standard disinfectant in the intensive care unit.
A single-blind randomized controlled trial with 1:1 allocation.
Medical intensive care unit (MICU) at an urban tertiary-care hospital.
Adult patients admitted to the MICU and on contact precautions.
A new CAD wipe used for daily cleaning.
Samples were collected from 5 high-touch environmental surfaces before cleaning and at 1, 4, and 24 hours after cleaning. The primary outcome was the mean bioburden 24 hours after cleaning. The secondary outcome was the detection of any epidemiologically important pathogen (EIP) 24 hours after cleaning.
In total, 843 environmental samples were collected from 43 unique patient rooms. At 24 hours, the mean bioburden recovered from the patient rooms cleaned with the new CAD wipe (intervention) was 52 CFU/mL, and the mean bioburden was 92 CFU/mL in the rooms cleaned with the standard disinfectant (control). After log transformation for multivariable analysis, the mean difference in bioburden between the intervention and control arms was −0.59 (95% CI, −1.45 to 0.27). The odds of EIP detection were 14% lower in the rooms cleaned with the CAD wipe (OR, 0.86; 95% CI, 0.31–2.32).
The bacterial bioburden and odds of detection of EIPs were not statistically different in rooms cleaned with the CAD compared to the standard disinfectant after 24 hours. Although CAD technology appears promising in vitro, larger studies may be warranted to evaluate efficacy in clinical settings.
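The comparison above rests on log-transforming colony counts before computing a mean difference between arms. A minimal sketch of that calculation, using hypothetical CFU/mL values (not the study data) and a normal-approximation confidence interval:

```python
import math
import statistics

def log10_mean_diff(intervention_cfu, control_cfu):
    """Mean difference (intervention - control) in log10-transformed bioburden,
    with a normal-approximation 95% CI. Inputs are hypothetical, not study data."""
    a = [math.log10(x) for x in intervention_cfu]
    b = [math.log10(x) for x in control_cfu]
    diff = statistics.mean(a) - statistics.mean(b)
    # Welch-style standard error of the difference in means
    se = math.sqrt(statistics.variance(a) / len(a) + statistics.variance(b) / len(b))
    return diff, (diff - 1.96 * se, diff + 1.96 * se)

# Hypothetical CFU/mL counts from intervention and control rooms
diff, ci = log10_mean_diff([40, 55, 60, 48], [85, 95, 100, 88])
```

A confidence interval that spans zero, as in the trial, is what "not statistically different" refers to here.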
OBJECTIVES/GOALS: Social distancing practices during COVID-19 may impact experiences of stress, substance use, and violence exposure. This study aims to describe the effect of the COVID-19 stay-at-home orders on stress, substance use, and teen dating violence (TDV) among young women living in Baltimore City. METHODS/STUDY POPULATION: Study participants were recruited from an observational study examining TDV before the COVID-19 pandemic, through snowball sampling, pediatric and adolescent primary care clinics, the pediatric emergency department, and a registry for patients interested in participating in COVID-19 research. Participants were between the ages of 16 and 22, identified as female, and lived in Baltimore, Maryland. They were asked to complete a baseline survey. The COVID-19 pandemic period was defined as March 16, 2020 (the date of the Maryland governor’s stay-at-home order) through June 2022. The survey assessed stress experiences, including isolation, finances, job loss, transportation, and school stress, as well as substance use, experiences of violence, and adherence to COVID-19 safety measures. We conducted descriptive and bivariate analyses. RESULTS/ANTICIPATED RESULTS: Participants (n=105) had a mean age of 19.4 years (SD 1.73). Preliminary analyses demonstrate that stress associated with isolation, finances, transportation, and school increased during the pandemic compared to pre-pandemic. In addition, the majority of participants who used marijuana, e-cigarettes, and alcohol used about the same amount or more of each substance during the pandemic. For the next steps, we will examine experiences of TDV for young women during the pandemic and examine whether experiences of TDV differ for young women who reported greater adherence to COVID-19 safety measures compared to participants who adhered less. DISCUSSION/SIGNIFICANCE: Assessing the impact of COVID-19 safety measures on stress, substance use, and TDV is critical to informing and designing future public health interventions.
In addition, the information obtained from this study may be used to address the unique challenges faced by disenfranchised populations while curbing the spread of infectious diseases.
Long-term sequelae of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection may include increased incidence of diabetes. Here we describe the temporal relationship between new type 2 diabetes and SARS-CoV-2 infection in a nationwide database. We found that while the proportion of newly diagnosed type 2 diabetes increased during the acute period of SARS-CoV-2 infection, the mean proportion of new diabetes cases in the 6 months post-infection was about 83% lower than in the 6 months preinfection. These results underscore the need for further investigation to understand the timing of new diabetes after COVID-19, its etiology, and screening and treatment strategies.
Despite increasing evidence for the effectiveness of individual psychological interventions for bipolar disorder, research on older adults is lacking. We report the first randomised controlled trial of psychological therapy designed specifically for older adults with bipolar disorder.
To evaluate the feasibility and acceptability of recovery-focused therapy, designed in collaboration with older people living with bipolar disorder.
A parallel, two-armed, randomised controlled trial comparing treatment as usual with up to 14 sessions of recovery-focused therapy plus treatment as usual, for older adults with bipolar disorder.
Thirty-nine participants (67% female, mean age 67 years) were recruited over a 17-month period. Feasibility and acceptability of recruitment, retention (>80% observer-rated outcomes at both 24 and 48 weeks) and intervention processes were demonstrated. The majority of participants started therapy when offered, adhered to the intervention (68% attended all sessions and 89% attended six or more sessions) and reported positive benefits. Clinical assessment measures provide evidence of a signal for effectiveness on a range of outcomes including mood symptoms, time to relapse and functioning. No trial-related serious adverse events were identified.
Recovery-focused therapy is feasible, acceptable and has the potential to improve a range of outcomes for people living with bipolar disorder in later life. A large-scale trial is warranted to provide a reliable estimate of its clinical and cost-effectiveness.
Cesario claims that all bias research tells us is that people “end up using the information they have come to learn as being probabilistically accurate in their daily lives” (sect. 5, para. 4). We expose Cesario's flawed assumptions about the relationship between accuracy and bias. Through statistical simulations and empirical work, we show that even probabilistically accurate responses are regularly accompanied by bias.
Airway injuries are the second leading cause of potentially survivable battlefield death and often require airway management strategies. Airway suction, the act of using negative pressure in a patient’s upper airway, removes debris that can prevent respiration, decreases possible aspiration risks, and allows clearer viewing of the airway for intubation. The most important characteristics for a portable airway suction device for prehospital combat care are portability, strong suction, and ease of use.
This market review searched academic papers, military publications, Google, and Amazon to identify devices. The search included specific characteristics that would increase the likelihood that the devices would be suitable for battlefield use, including weight, size, battery life, noise emission, canister size, tubing, and suction power.
Sixty portable airway suction devices were identified, 31 of which met inclusion criteria – 11 manually powered devices and 20 battery-operated devices. One type of manual suction pump was a bag-like design with a squeezable suction pump that was extremely lightweight but had limited suction capabilities (vacuum pressure of 100 mmHg). Another type of manual suction pump had a trigger-like design that is pulled back to create suction, with a firm collection canister; it had increased suction capabilities (vacuum pressures of 188–600 mmHg), though still less than those of the battery-operated devices, and was slightly heavier (0.23–0.458 kg). Battery-operated devices had increased suction capabilities and were easier to use, but they were larger and weighed more (1.18–11.0 kg).
Future research should work to lighten and debulk battery-operated suction devices with high suction performance.
We report on compact and robust supercontinuum generation and post-compression using transmission of light through multiple thin solid plates at the SwissFEL X-ray free-electron laser facility. A single stage consisting of three thin plates followed by a chirped mirror compressor achieves compression of initially 30-fs pulses with 800-nm center wavelength to sub-10-fs duration. We also demonstrate a two-stage implementation to compress the pulses further to sub-5-fs duration. With the two-stage setup, the generated supercontinuum includes wavelengths ranging from 500 to 1100 nm. The multi-plate setup is compact, robust, and stable, which makes it ideal for applications at free-electron laser facilities such as pump-probe experiments and laser-arrival timing tools.
Pompe disease results from lysosomal acid α-glucosidase deficiency, which leads to cardiomyopathy in all infantile-onset and occasional late-onset patients. Cardiac assessment is important for its diagnosis and management. This article presents unpublished cardiac findings, concomitant medications, and cardiac efficacy and safety outcomes from the ADVANCE study; trajectories of patients with abnormal left ventricular mass z score at enrolment; and post hoc analyses of on-treatment left ventricular mass and systolic blood pressure z scores by disease phenotype, GAA genotype, and “fraction of life” (defined as the fraction of life on pre-study 160 L production-scale alglucosidase alfa). ADVANCE evaluated 52 weeks’ treatment with 4000 L production-scale alglucosidase alfa in patients with Pompe disease aged ≥1 year in the United States of America who were previously receiving 160 L production-scale alglucosidase alfa. M-mode echocardiography and 12-lead electrocardiography were performed at enrolment and Week 52. Sixty-seven patients had complete left ventricular mass z scores, which decreased at Week 52 (infantile-onset patients, change −0.8 ± 1.83; 95% confidence interval −1.3 to −0.2; all patients, change −0.5 ± 1.71; 95% confidence interval −1.0 to −0.1). Patients with “fraction of life” <0.79 had decreasing left ventricular mass z scores (enrolment: +0.1 ± 3.0; Week 52: −1.1 ± 2.0); those with “fraction of life” ≥0.79 remained stable (enrolment: −0.9 ± 1.5; Week 52: −0.9 ± 1.4). Systolic blood pressure z scores were stable from enrolment to Week 52, and no cohort developed systemic hypertension. Eight patients had Wolff–Parkinson–White syndrome. Cardiac hypertrophy and dysrhythmia in ADVANCE patients at or before enrolment were typical of Pompe disease. Four-thousand L alglucosidase alfa therapy maintained fractional shortening and left ventricular posterior and septal end-diastolic thicknesses, and improved left ventricular mass z score.
Social Media Statement: Post hoc analyses of the ADVANCE study cohort of 113 children support ongoing cardiac monitoring and concomitant management of children with Pompe disease on long-term alglucosidase alfa to functionally improve cardiomyopathy and/or dysrhythmia.
To determine the utility of the Sofia SARS rapid antigen fluorescent immunoassay (FIA) to guide hospital-bed placement of patients being admitted through the emergency department (ED).
Cross-sectional analysis of a clinical quality improvement study.
This study was conducted in 2 community hospitals in Maryland from September 21, 2020, to December 3, 2020. In total, 2,887 patients simultaneously received the Sofia SARS rapid antigen FIA and SARS-CoV-2 RT-PCR assays on admission through the ED.
Rapid antigen results and symptom assessment guided initial patient placement while confirmatory RT-PCR was pending. The sensitivity, specificity, positive predictive values, and negative predictive values of the rapid antigen assay were calculated relative to RT-PCR, overall and separately for symptomatic and asymptomatic patients. Assay sensitivity was compared to RT-PCR cycle threshold (Ct) values. Assay turnaround times were compared. Clinical characteristics of RT-PCR–positive patients and potential exposures from false-negative antigen assays were evaluated.
For all patients, overall agreement was 97.9%; sensitivity was 76.6% (95% confidence interval [CI], 71%–82%), and specificity was 99.7% (95% CI, 99%–100%). We detected no differences in performance between asymptomatic and symptomatic individuals. As RT-PCR Ct increased, the sensitivity of the antigen assay decreased. The mean turnaround time for the antigen assay was 1.2 hours (95% CI, 1.0–1.3) and for RT-PCR it was 20.1 hours (95% CI, 18.9–40.3) (P < .001). No transmission from antigen-negative/RT-PCR–positive patients was identified.
Although not a replacement for RT-PCR for detection of all SARS-CoV-2 infections, the Sofia SARS antigen FIA has clinical utility for potential initial timely patient placement.
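The performance measures reported for the antigen assay (sensitivity, specificity, predictive values, overall agreement) all derive from a 2×2 table of antigen results against the RT-PCR reference. A minimal sketch of those formulas, using hypothetical counts rather than the study's actual table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Test-performance metrics from a 2x2 table, treating RT-PCR as the reference.
    tp/fp/fn/tn: true positives, false positives, false negatives, true negatives."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),   # fraction of reference-positives detected
        "specificity": tn / (tn + fp),   # fraction of reference-negatives cleared
        "ppv": tp / (tp + fp),           # probability a positive result is correct
        "npv": tn / (tn + fn),           # probability a negative result is correct
        "agreement": (tp + tn) / total,  # overall percent agreement
    }

# Hypothetical counts for illustration only (not the study's data)
m = diagnostic_metrics(tp=90, fp=5, fn=20, tn=885)
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on prevalence in the tested population, which is why placement decisions also weighed symptom assessment.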
Influenza can be introduced and propagated in healthcare settings by healthcare workers (HCWs) working while ill with influenza. However, reasons driving this behavior are unclear. In this study, we examined barriers to and facilitators of absenteeism during the influenza season.
Cross-sectional mixed methods study.
Ambulatory and inpatient settings in a large, tertiary-care healthcare system.
An anonymous electronic survey was sent to HCWs between June 11 and July 13, 2018, asking participants to self-report influenza-like illness (ie, ILI symptoms of fever, chills, cough, or sore throat) during the 2017–2018 influenza season. We conducted a logistic regression analysis to identify factors associated with absenteeism.
Of 14,250 HCWs, 17% responded to the survey. Although 1,180 respondents (51%) reported symptoms of ILI, 575 (43%) did not stay home while ill. The most commonly perceived barriers to ILI absenteeism included being understaffed (odds ratio [OR], 1.78; P = .04), being unable to find a replacement for work (OR, 2.26; P = .03), desiring not to use time off (OR, 2.25; P = .003), and being paid by the hour or unable to afford being absent (OR, 2.05; P = .02). Common perceived facilitators of absenteeism included support from coworkers and management, clearer policy, better availability of sick days, and lower perceived threat of disciplinary action.
Reporting to work with ILI symptoms is common among HCWs. Most barriers and facilitators are related to systems. Addressing system factors, such as policies regarding sick days and sick leave and ensuring adequate backup staffing, is likely to facilitate absenteeism among ill HCWs.
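Each odds ratio above compares the odds of working while ill between HCWs who did and did not report a given barrier. A sketch of the underlying 2×2 computation (with a Woolf log-OR confidence interval), using hypothetical placeholder counts rather than the survey data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table with a Woolf (log-OR) 95% confidence interval.
    a: exposed with outcome, b: exposed without, c: unexposed with, d: unexposed without.
    Counts below are hypothetical placeholders, not the survey's data."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# e.g. respondents reporting understaffing who worked while ill vs those who did not
or_, ci = odds_ratio(10, 90, 5, 95)
```

The study's reported ORs come from a multivariable logistic regression rather than raw 2×2 tables, so they additionally adjust for the other factors in the model.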
Background: Healthcare-associated infections caused by antibiotic-resistant organisms (AROs) are a major cause of morbidity and mortality. To create and optimize infection prevention strategies, it is crucial to delineate the role of the environment in clinical infections. Methods: Over a 14-month period, we collected environmental samples, patient feces, and patient bloodstream infection (BSI) isolates in a newly built bone marrow transplant (BMT) intensive care unit (ICU). Samples were collected from 13 high-touch areas in the patient room and 4 communal areas. Samples were collected from the old BMT ICU, in the new BMT ICU before patients moved in, and for 1 year after patients moved in. Selective microbiologic culture was used to isolate AROs, and whole-genome sequencing (WGS) was used to determine clonality. Antibiotic susceptibility testing was performed using Kirby-Bauer disk diffusion assays. Using linear mixed modeling, we compared ARO recovery across time and sample area. Results: AROs were collected and cultured from environmental samples, patient feces, and BSI isolates (Fig. 1a). AROs were found both before and after a patient entered the ICU (Fig. 1b). Sink drains had significantly more AROs recovered per sample than any other surface area (P < .001) (Fig. 1c). The most common ARO isolates were Pseudomonas aeruginosa and Stenotrophomonas maltophilia (Fig. 1d). The new BMT ICU had fewer AROs recovered per sample than the old BMT ICU (P < .001) and no increase in AROs recovered over the first year of opening (P > .05). Furthermore, there was no difference before versus after patients moved into the hospital (P > .05). Antibiotic susceptibility testing revealed that P. aeruginosa isolates recovered from the old ICU were resistant to more antibiotics than isolates recovered from the new ICU (Fig. 2a). Average nucleotide identity (ANI) and clonal analyses of P. aeruginosa revealed a large cluster of clonal isolates (34 of 76) (Fig. 2b).
This clonal group included isolates found before patients moved into the BMT ICU and patient blood isolates. Furthermore, this clonal group was initially found in only 1 room in the BMT ICU, and over 26 weeks, it was found in sink drains in all 6 rooms sampled (Fig. 2b). Conclusions: AROs are present before patients move into a new BMT ICU, and sink drains act as a reservoir for AROs over time. Furthermore, sink-drain P. aeruginosa isolates are clonally related to isolates found in patient BSIs. Overall, these results provide insight into ARO transmission dynamics in the hospital environment.
Funding: Research reported in this publication was supported by the Washington University Institute of Clinical and Translational Sciences grant UL1TR002345 from the National Center for Advancing Translational Sciences (NCATS) of the National Institutes of Health (NIH). The content is solely the responsibility of the authors and does not necessarily represent the official view of the NIH.
The last three decades have seen the biotherapeutic drug market evolve from promising concept to market dominance in a range of clinical indications. This growth has been spurred by the success of established drug classes like monoclonal antibodies, but also by the introduction of biosimilars, and more recently, multiple novel cell and gene therapies. Biotherapeutic drug development presents many unique challenges, but unintended immune responses are among the most common reasons for program attrition. Anti-drug antibodies can impact the safety and efficacy of drug products, and related immune responses, like the cytokine release syndrome that occurred in the infamous TGN-1412 clinical trial, can be challenging to predict with nonclinical models. For this reason, it is important that development programs proceed with a scientifically grounded and measured approach to these responses. This process begins at the discovery stage with the application of “quality by design,” continues into the clinic with the development of quality assays and management strategies, and culminates in the effective presentation of this information in regulatory documents. This review provides an overview of some of the key strategic and regulatory considerations for biotherapeutics as they pertain to immunogenicity and related responses.
Seismic-reflection surveys of the Isle Royale sub-basin, central Lake Superior, reveal two large end moraines and associated glacial sediments deposited during the last cycle of the Laurentide Ice Sheet in the basin. The Isle Royale moraines directly overlie bedrock and are cored with dense, acoustically massive till intercalated down-ice with acoustically stratified outwash. Till and outwash are overlain by glacial varves, a lower red unit and an upper gray unit.
The maximum extent of late Younger Dryas-age readvance into the western Lake Superior basin is uncertain, but it was probably controlled by both ice dynamics and climate. Our data indicate that during retreat from the maximum, the ice paused just long enough to construct the outer of the two moraines, >100 m high, and then retreated to the inner moraine, during which time most of the lower glacial-lacustrine sequence (red varves) was deposited. Retreat from the inner moraine coincided with a marked flux of icebergs at the calving margin and a change to gray varves. Rapid retreat may be related to both an influx of meltwater from Glacial Lake Agassiz about 10,500 cal yr BP and retreat of the calving margin down an adverse slope into the Isle Royale sub-basin.
Increasing fluorination of organosilyl nitrile solvents improves the ionic conductivities of lithium salt electrolytes, resulting from higher values of salt dissociation. Ionic conductivities at 298 K range from 1.5 to 3.2 mS/cm for LiPF6 salt concentrations of 0.6 or 0.7 M. The authors also report on solvent blend electrolytes where the fluoroorganosilyl (FOS) nitrile solvent is mixed with ethylene carbonate and diethyl carbonate. Ionic conductivities of the FOS solvent/carbonate blend electrolytes increase, reaching 5.5–6.3 mS/cm at 298 K with salt dissociation values ranging from 0.42 to 0.45. Salt dissociation generally decreases with increasing temperature.
The escalating evolution of weed species resistant to acetolactate synthase (ALS)-inhibitor herbicides makes alternative weed control strategies necessary for field crops that are dependent on this herbicide group. A fully integrated strategy that combined increased crop seeding rates (2X or 4X recommended), mechanical weed control with a minimum-tillage rotary hoe, and reduced-rate non–ALS inhibitor herbicides was compared with herbicides, rotary hoe, and seeding rates alone as a method of controlling ALS inhibitor–tolerant Indian mustard as a model weed. The full-rate herbicide treatment had the lowest weed biomass (98% reduction) and the highest yield of all treatments in 3 of 4 site-years, regardless of seeding rate. The fully integrated treatment at the 4X seeding rate had weed suppression rates equal to the full herbicide treatment at the recommended seeding rate. The fully integrated and reduced-rate herbicide treatments at the 4X seeding rate reduced weed biomass by 89% and 83%, respectively, compared with the control at the recommended seeding rate. The rotary hoe treatment alone resulted in poor weed control (≤38%), even at the highest seeding rate. Fully integrated and reduced-rate herbicide treatments at 2X and 4X seeding rates had yields equal to those of the full herbicide treatment at the recommended seeding rate. Partially or fully integrated weed control strategies that combine increased crop seeding rates and reduced-rate non–ALS inhibitor herbicides, with or without the use of a rotary hoe, can control weeds resistant to ALS-inhibitor herbicides, while maintaining crop yields similar to those achieved with full-rate herbicides. However, combining increased seeding rate, reduced-rate herbicides, and mechanical rotary hoe treatment into a fully integrated strategy maximized weed control, while reducing reliance on, and selection pressure from, any single weed control tactic.
Studies were conducted in 1989 and 1990 to determine the phytotoxicity of chlorimuron and tank mixtures on ‘Florunner’ peanut. Chlorimuron plus a petroleum oil adjuvant or 2,4-DB was more phytotoxic (P = 0.05) than chlorimuron plus a nonionic surfactant, based on stunting and chlorosis. Chlorimuron mixed with chlorothalonil, chlorothalonil plus sulfur, or esfenvalerate was no more phytotoxic than the standard. Adding sulfur or nonionic surfactant to chlorimuron plus chlorothalonil did not affect phytotoxicity. Sequential applications of chlorimuron and 2,4-DB did not completely negate the phytotoxicity of the tank mixture. Despite differences in phytotoxicity, yields were not reduced.
Concern over the development of herbicide-resistant weeds has led to interest in integrated weed management systems that reduce selection pressure by utilizing mechanical and cultural weed control practices in addition to herbicides. Increasing crop seeding rate increases crop competitive ability and thus can enhance herbicide efficacy. However, how increasing the seeding rate interacts with herbicide dose is unknown. The objective of this study was to examine the interaction between increasing seeding rate and herbicide dose to control weeds. To meet this objective, the herbicide fluthiacet-methyl was applied to field-grown lentil, with Indian mustard, a proxy for wild mustard, used as a model weed. The experiment was a factorial design with four lentil seeding rates and seven herbicide rates. Overall, the herbicide dose response was altered by changing lentil seeding rate. Increasing lentil seeding rate decreased weed biomass production when herbicides were not applied. In two of the four site-years, increasing lentil seeding rate lowered the herbicide ED50, the dose required to produce a 50% reduction in weed biomass. Increasing the crop seeding rate altered the dose response to provide greater weed control at lower herbicide rates compared with normal crop seeding rates. Increased seeding rates also resulted in higher and more stable crop seed yields across a wider range of herbicide dosages. These results suggest that dose–response models can be used to evaluate the efficacy of other weed management practices that can interact with herbicide performance.
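The ED50 shift described above can be illustrated with the three-parameter log-logistic curve commonly used for herbicide dose-response work. The sketch below uses hypothetical parameter values, not the fitted estimates from this study:

```python
def log_logistic(dose, ed50, slope, upper):
    """Three-parameter log-logistic dose-response curve for weed biomass.
    Biomass falls from `upper` (untreated) toward 0 as herbicide dose rises;
    at dose == ed50 the predicted biomass is exactly half of `upper`.
    All parameter values here are hypothetical, not fitted study estimates."""
    if dose == 0:
        return upper
    return upper / (1.0 + (dose / ed50) ** slope)

doses = (0.0, 10.0, 20.0, 40.0)
# A higher seeding rate plausibly lowers both untreated biomass (upper) and
# the ED50, so the same level of control is reached at a smaller dose.
normal_rate = [log_logistic(d, ed50=20.0, slope=2.0, upper=100.0) for d in doses]
high_rate = [log_logistic(d, ed50=8.0, slope=2.0, upper=60.0) for d in doses]
```

In practice such curves are fitted to observed biomass at each seeding rate, and the ED50 estimates are then compared across rates.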