OBJECTIVES/GOALS: Identification of COVID-19 patients at risk for deterioration following discharge from the emergency department (ED) remains a clinical challenge. Our objective was to develop a prediction model that identifies COVID-19 patients at risk for return and hospital admission within 30 days of ED discharge. METHODS/STUDY POPULATION: We performed a retrospective cohort study of discharged adult ED patients (n = 7,529) with SARS-CoV-2 infection from 116 unique hospitals contributing to the national REgistry of suspected COVID-19 in EmeRgency care (RECOVER). The primary outcome was return hospital admission within 30 days. Models were developed using Classification and Regression Tree (CART), Gradient Boosted Machine (GBM), Random Forest (RF), and least absolute shrinkage and selection operator (LASSO) approaches. RESULTS/ANTICIPATED RESULTS: Among COVID-19 patients discharged from the ED on their index encounter, 571 (7.6%) returned for hospital admission within 30 days. The machine learning (ML) models (GBM, RF, and LASSO) performed similarly. The RF model yielded a test AUC of 0.74 (95% confidence interval [CI] 0.71–0.78) with a sensitivity of 0.46 (0.39–0.54) and specificity of 0.84 (0.82–0.85). Predictive variables, including lowest oxygen saturation, temperature, and history of hypertension, diabetes, hyperlipidemia, or obesity, were common to all ML models. DISCUSSION/SIGNIFICANCE: A predictive model identifying adult ED patients with COVID-19 at risk for return hospital admission within 30 days is feasible. Ensemble/bootstrapped classification methods outperformed the single-tree CART method. Future efforts may focus on the application of ML models in the hospital setting to optimize allocation of follow-up resources.
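The ensemble approach described above can be illustrated with a minimal sketch: fitting a random forest to simulated data with the same kinds of predictors (lowest oxygen saturation, temperature, comorbidity history) and reporting a test AUC. All data and parameter values here are synthetic assumptions, not the RECOVER registry or the study's actual model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Synthetic predictors loosely mirroring those named in the abstract
spo2 = rng.normal(95, 3, n)                     # lowest oxygen saturation (%)
temp = rng.normal(37.0, 0.6, n)                 # temperature (C)
comorbid = rng.integers(0, 2, size=(n, 4))      # HTN, diabetes, hyperlipidemia, obesity
X = np.column_stack([spo2, temp, comorbid])

# Simulated outcome: lower SpO2 and more comorbidities raise return-admission risk
logit = -3 + 0.4 * (95 - spo2) + 0.5 * comorbid.sum(axis=1)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
```

On real registry data the same pattern (train/test split, probability predictions, AUC with bootstrap confidence intervals) would apply to each of the GBM, RF, and LASSO models.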
We explored the acceptability of a personalised proteomic risk intervention for patients at increased risk of type 2 diabetes and their healthcare providers, as well as their experience of participating in the delivery of proteomic-based risk feedback in UK primary care.
Advances in proteomics now allow the provision of personalised proteomic risk reports, with the intention of achieving positive behaviour change. This technology has the potential to encourage behaviour change in people at risk of developing type 2 diabetes.
A semi-structured interview study was carried out with patients at risk of type 2 diabetes and their healthcare providers in primary care in the North of England. Patients (n = 17) and healthcare providers (n = 4) were interviewed either face to face or via telephone. Data were analysed using thematic analysis. This qualitative study was nested within a single-arm pilot trial and undertaken in primary care.
The personalised proteomic risk intervention was generally acceptable and the experience was positive. The personalised nature of the report was welcomed, especially the way it provided a holistic approach to risks of organ damage and lifestyle factors. Insights were provided as to how this may change behaviour. Some participants reported difficulties in understanding the format of the presentation of risk and expressed surprise at receiving risk estimates for conditions other than type 2 diabetes. Personalised proteomic risk interventions have the potential to provide holistic and comprehensive assessments of risk factors and lifestyle factors which may lead to positive behaviour change.
Monoclonal antibody therapeutics to treat coronavirus disease (COVID-19) have been authorized by the US Food and Drug Administration under Emergency Use Authorization (EUA). Many barriers exist when deploying a novel therapeutic during an ongoing pandemic, and it is critical to assess the needs of incorporating monoclonal antibody infusions into pandemic response activities. We examined the monoclonal antibody infusion site process during the COVID-19 pandemic and conducted a descriptive analysis using data from 3 sites at medical centers in the United States supported by the National Disaster Medical System. Monoclonal antibody implementation success factors included engagement with local medical providers, therapy batch preparation, placing the infusion center in proximity to emergency services, and creating procedures resilient to EUA changes. Infusion process challenges included confirming patient severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) positivity, strained staff, scheduling, and pharmacy coordination. Infusion sites are effective when integrated into pre-existing pandemic response ecosystems and can be implemented with limited staff and physical resources.
This chapter introduces readers to the use of time-varying effect modeling (TVEM), a statistical tool for capturing dynamic changes over time, as applied to the study of substance use disorder recovery processes. The chapter presents an empirical demonstration of using TVEM to examine the effect of an intervention, Recovery Management Checkups (RMCs), on substance use and key features of the ongoing process of recovery (life satisfaction, cognitive avoidance, self-efficacy) as a continuous function of time. The example application data come from the Early Re-Intervention experiment of 446 adults from a large addiction treatment agency who were randomly assigned to receive RMCs or an assessment control. Given the time-varying nature of the effect of the RMC on recovery outcomes and the differential patterns observed by type of outcome, TVEM may be a viable option in lieu of or in addition to using common metrics of “treatment success.” SAS syntax is provided.
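The core idea of TVEM, a regression coefficient that varies smoothly as a function of time rather than being a single constant, can be sketched with a basis expansion. The chapter's actual analysis uses SAS macros; the simulation below is a conceptual Python illustration with invented data, using a simple quadratic basis in place of the splines a full TVEM would use.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3000
t = rng.uniform(0, 1, n)        # time since baseline (rescaled to [0, 1])
trt = rng.integers(0, 2, n)     # intervention vs. assessment control (simulated)

# True treatment effect fades linearly over time
true_effect = 1.0 - t
y = 2.0 + true_effect * trt + rng.normal(0, 0.5, n)

# Design matrix: intercept varying in time, plus a treatment effect varying in time
B = np.column_stack([np.ones(n), t, t**2])     # quadratic time basis
X = np.hstack([B, B * trt[:, None]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def effect_at(time):
    """Estimated treatment effect as a continuous function of time."""
    return coef[3] + coef[4] * time + coef[5] * time**2
```

Plotting `effect_at` over the follow-up window recovers the time-varying effect curve, which is the quantity TVEM reports instead of a single "treatment success" estimate.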
Yarkoni's analysis clearly articulates a number of concerns limiting the generalizability and explanatory power of psychological findings, many of which are compounded in infancy research. ManyBabies addresses these concerns via a radically collaborative, large-scale and open approach to research that is grounded in theory-building, committed to diversification, and focused on understanding sources of variation.
Storytelling is increasingly recognized as a culturally relevant, human-centered strategy and has been linked to improvements in health knowledge, behavior, and outcomes. The Community Engagement Program of the Johns Hopkins Institute for Clinical and Translational Research designed and implemented a storytelling training program as a potentially versatile approach to promote stakeholder engagement. Data collected from multiple sources, including participant ratings, responses to open-ended questions, and field notes, consistently pointed to high-level satisfaction and acceptability of the program. As a next step, the storytelling training process and its impact need to be further investigated.
Background: Urine cultures are the most common microbiological tests in the outpatient setting and heavily influence treatment of suspected urinary tract infections (UTIs). Antibiotics for UTI are usually prescribed on an empiric basis in primary care before the urine culture results are available. However, culture results may be needed to confirm a UTI diagnosis and to verify that the correct antibiotic was prescribed. Although urine cultures are considered the gold standard for diagnosis of UTI, cultures can easily become contaminated during collection. We determined the prevalence, predictors, and antibiotic use associated with contaminated urine cultures in 2 adult safety net primary care clinics. Methods: We conducted a retrospective chart review of visits with provider-suspected UTI in which a urine culture was ordered (November 2018–March 2020). Patient demographics, culture results, and prescription orders were captured for each visit. Culture results were defined as no culture growth, contaminated (ie, mixed flora, non-uropathogens, or ≥3 bacteria isolated on culture), low-count positive (growth between 100 and 100,000 CFU/mL), and high-count positive (>100,000 CFU/mL). A multivariable multinomial logistic regression model was used to identify factors associated with contaminated culture results. Results: There were 1,265 visits with urine cultures: 264 (20.9%) had no growth, 694 (54.9%) were contaminated, 159 (12.6%) were low counts, and 148 (11.7%) were high counts. Encounter-level factors are presented in Table 1. Female gender (adjusted odds ratio [aOR], 15.8; 95% confidence interval [CI], 10.21–23.46; P < .001), pregnancy (aOR, 13.98; 95% CI, 7.93–4.67; P < .001), and obesity (aOR, 1.9; 95% CI 1.31–2.77; P < .001) were independently associated with contaminated cultures. Of 264 patients whose urine cultures showed no growth, 36 (14%) were prescribed an antibiotic.
Of 694 patients with contaminated cultures, 153 (22%) were prescribed an antibiotic (Figure 1). Conclusions: More than half of urine cultures were contaminated, and 1 in 5 patients were treated with antibiotics. Reduction of contamination should improve patient care by providing a more accurate record of the organism in the urine (if any) and its susceptibilities, which are relevant to managing future episodes of UTI in that patient. Optimizing urine collection represents a diagnostic stewardship opportunity in primary care.
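The multinomial regression used above models the four culture-result categories jointly. A minimal sketch of that analysis follows, on simulated data with invented effect sizes (only the category labels and predictor names come from the abstract; the numbers do not).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1200
female = rng.integers(0, 2, n)
pregnant = rng.integers(0, 2, n) * female   # pregnancy only among female patients
obese = rng.integers(0, 2, n)
X = np.column_stack([female, pregnant, obese])

# Outcome categories: 0 = no growth, 1 = contaminated, 2 = low-count, 3 = high-count
base = np.array([0.35, 0.25, 0.20, 0.20])
y = np.empty(n, dtype=int)
for i in range(n):
    p = base.copy()
    # Simulated association: contamination more likely with these predictors
    p[1] += 0.4 * female[i] + 0.2 * pregnant[i] + 0.1 * obese[i]
    p /= p.sum()
    y[i] = rng.choice(4, p=p)

model = LogisticRegression(max_iter=1000).fit(X, y)
# Exponentiated coefficients approximate per-category odds ratios
odds_ratios = np.exp(model.coef_)
```

In the study itself the analogous coefficients (fit with a multivariable multinomial model) yield the adjusted odds ratios reported for female gender, pregnancy, and obesity.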
Funding: This study was supported by the National Institute of Allergy and Infectious Diseases of the National Institutes of Health (grant no. UM1AI104681). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Magnetic reconnection is explored on the Terrestrial Reconnection Experiment (TREX) for asymmetric inflow conditions and in a configuration where the absolute rate of reconnection is set by an external drive. Magnetic pileup enhances the upstream magnetic field of the high-density inflow, leading to an increased upstream Alfvén speed and helping to lower the normalized reconnection rate to values expected from theoretical consideration. In addition, a shock interface between the far upstream supersonic plasma inflow and the region of magnetic flux pileup is observed, important to the overall force balance of the system, thereby demonstrating the role of shock formation for configurations including a supersonically driven inflow. Despite the specialized geometry where a strong reconnection drive is applied from only one side of the reconnection layer, previous numerical and theoretical results remain robust and are shown to accurately predict the normalized rate of reconnection for the range of system sizes considered. This experimental rate of reconnection is dependent on system size, reaching values as high as 0.8 at the smallest normalized system size applied.
Dynamic changes in microRNAs in oocyte and cumulus cells before and after maturation may explain the spatiotemporal post-transcriptional gene regulation within bovine follicular cells during the oocyte maturation process. miR-20a has been previously shown to regulate proliferation and differentiation as well as progesterone levels in cultured bovine granulosa cells. In the present study, we aimed to demonstrate the function of miR-20a during the bovine oocyte maturation process. Maturation of cumulus–oocyte complexes (COCs) was performed at 39°C in a humidified atmosphere with 5% CO2 in air. The expression of miR-20a was investigated in the cumulus cells and oocytes at 22 h post culture. The functional role of miR-20a was examined by modulating the expression of miR-20a in COCs during in vitro maturation (IVM). We found that the miR-20a expression was increased in cumulus cells but decreased in oocytes after IVM. Overexpression of miR-20a increased the oocyte maturation rate. Even though not statistically significant, miR-20a overexpression during IVM increased progesterone levels in the spent medium. This was further supported by the expression of STAR and CYP11A1 genes in cumulus cells. The phenotypes observed due to overexpression of miR-20a were validated by BMP15 supplementation during IVM and subsequent transfection of BMP15-treated COCs using miR-20a mimic or BMPR2 siRNA. We found that miR-20a mimic or BMPR2 siRNA transfection rescued BMP15-reduced oocyte maturation and progesterone levels. We concluded that miR-20a regulates oocyte maturation by increasing cumulus cell progesterone synthesis through simultaneous suppression of BMPR2 expression.
Genetic susceptibility to late maturity alpha-amylase (LMA) in wheat (Triticum aestivum L.) results in increased alpha-amylase activity in mature grain when cool conditions occur during late grain maturation. Farmers are forced to sell wheat grain with elevated alpha-amylase at a discount because it has an increased risk of poor end-product quality. This problem can result from either LMA or preharvest sprouting, grain germination on the mother plant when rain occurs before harvest. Whereas preharvest sprouting is a well-understood problem, little is known about the risk LMA poses to North American wheat crops. To examine this, LMA susceptibility was characterized in a panel of 251 North American hard spring wheat lines, representing ten geographical areas. It appears that there is substantial LMA susceptibility in North American wheat since only 27% of the lines showed reproducible LMA resistance following cold-induction experiments. A preliminary genome-wide association study detected six significant marker-trait associations. LMA in North American wheat may result from genetic mechanisms similar to those previously observed in Australian and International Maize and Wheat Improvement Center (CIMMYT) germplasm since two of the detected QTLs, QLMA.wsu.7B and QLMA.wsu.6B, co-localized with previously reported loci. The Reduced height (Rht) loci also influenced LMA. Elevated alpha-amylase levels were significantly associated with the presence of both wild-type and tall height, rht-B1a and rht-D1a, loci in both cold-treated and untreated samples.
The transplantation of endocrine organs can be regarded as the oldest form of transplantation in modern medical history. By the end of the nineteenth and beginning of the twentieth centuries, a large research focus was set on endocrine transplantation. Before complex endocrine secretion and function was even understood, researchers attempted to cure endocrine diseases and infertility through transplantation of the endocrine glands and gonads. Hence, most endocrine organs were transplanted in that period, including the thyroid, the adrenal gland, the testis, and the ovary. Even though the principles of transplant rejection were not yet understood at that time, researchers already noticed successful transplantation almost exclusively in experiments with autografts. The first published allogeneic ovarian transplantations in animals were performed by Paul Bert in the 1860s.
Subglacial hydrological systems require innovative technological solutions to access and observe. Wireless sensor platforms can be used to collect and return data, but their performance in deep and fast-moving ice requires quantification. We report experimental results from Cryoegg: a spherical probe that can be deployed into a borehole or moulin and transit through the subglacial hydrological system. The probe measures temperature, pressure and electrical conductivity in situ and returns all data wirelessly via a radio link. We demonstrate Cryoegg's utility in studying englacial channels and moulins, including in situ salt dilution gauging. Cryoegg uses VHF radio to transmit data to a surface receiving array. We demonstrate transmission through up to 1.3 km of cold ice – a significant improvement on the previous design. The wireless transmission uses Wireless M-Bus on 169 MHz; we present a simple radio link budget model for its performance in cold ice and experimentally confirm its validity. Cryoegg has also been tested successfully in temperate ice. The battery capacity should allow measurements to be made every 2 h for more than a year. Future iterations of the radio system will enable Cryoegg to transmit data through up to 2.5 km of ice.
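The "simple radio link budget model" mentioned above can be sketched as a standard free-space-plus-attenuation budget. The function below is an illustrative assumption, not Cryoegg's published model: all power, gain, attenuation, and sensitivity values are hypothetical placeholders.

```python
import math

def link_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                   freq_hz, ice_depth_m, ice_atten_db_per_km,
                   rx_sensitivity_dbm):
    """Link margin (dB): transmit power + antenna gains, minus free-space
    path loss and bulk ice attenuation, relative to receiver sensitivity."""
    c = 299_792_458.0
    fspl_db = 20 * math.log10(4 * math.pi * ice_depth_m * freq_hz / c)
    ice_loss_db = ice_atten_db_per_km * (ice_depth_m / 1000.0)
    rx_power_dbm = tx_power_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db - ice_loss_db
    return rx_power_dbm - rx_sensitivity_dbm

# Example: 169 MHz through 1.3 km of cold ice (all values hypothetical)
margin = link_margin_db(tx_power_dbm=27, tx_gain_dbi=0, rx_gain_dbi=3,
                        freq_hz=169e6, ice_depth_m=1300,
                        ice_atten_db_per_km=10, rx_sensitivity_dbm=-120)
```

A positive margin indicates the link should close; raising the depth to the 2.5 km target shows how quickly bulk attenuation, rather than geometric spreading, dominates the budget in cold ice.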
In drafting the first Australian class actions regime under Part IVA of the Federal Court of Australia Act 1976 (Cth) (FCAA), the Commonwealth legislature had the difficult task of creating a procedure that was appropriate for the Australian jurisdiction, including being in keeping with its litigation culture, while also learning from the procedures already employed in the United States and Canada. After twenty-seven years of federal class actions, it can be said that Australia has fashioned in Part IVA an effective and efficient framework for resolving mass litigation, accompanied by a robust body of jurisprudence. Equally, class action practice in Australia has evolved in ways that would have been beyond the reasonable comprehension of those who initially drafted Part IVA, third-party litigation funding being a key development. This chapter tells the story of Part IVA’s creation and maturation, providing an overview of the jurisprudence that has characterised its evolution, as well as an account of contentious issues at the forefront of modern class action practice.
Years of sport participation (YoP) is conventionally used to estimate cumulative repetitive head impacts (RHI) experienced by contact sport athletes. The relationship of this measure to other estimates of head impact exposure and the potential associations of these measures with neurobehavioral functioning are unknown. We investigated the association between YoP and the Head Impact Exposure Estimate (HIEE), and whether associations between the two estimates of exposure and neurobehavioral functioning varied.
Former American football players (N = 58; age = 37.9 ± 1.5 years) completed in-person evaluations approximately 15 years following sport discontinuation. Assessments consisted of neuropsychological assessment and structured interviews of head impact history (i.e., HIEE). General linear models were fit to test the association between YoP and the HIEE, and their associations with neurobehavioral outcomes.
YoP was weakly correlated with the HIEE, p = .005, R2 = .13. Higher YoP was associated with worse performance on the Symbol Digit Modalities Test, p = .004, R2 = .14, and Trail Making Test-B, p = .001, R2 = .18. The HIEE was associated with worse performance on the Delayed Recall trial of the Hopkins Verbal Learning Test-Revised, p = .020, R2 = .09, self-reported cognitive difficulties (Neuro-QoL Cognitive Function), p = .011, R2 = .10, psychological distress (Brief Symptom Inventory-18), p = .018, R2 = .10, and behavioral regulation (Behavior Rating Inventory of Executive Function for Adults), p = .017, R2 = .10.
YoP was marginally associated with the HIEE, a comprehensive estimate of head impacts sustained over a career. Associations between each exposure estimate and neurobehavioral functioning outcomes differed. Findings have meaningful implications for efforts to accurately quantify the risk of adverse long-term neurobehavioral outcomes potentially associated with RHI.
Due to shortages of N95 respirators during the coronavirus disease 2019 (COVID-19) pandemic, it is necessary to estimate the number of N95s required for healthcare workers (HCWs) to inform manufacturing targets and resource allocation.
We developed a model to determine the number of N95 respirators needed for HCWs both in a single acute-care hospital and the United States.
For an acute-care hospital with 400 all-cause monthly admissions, the number of N95 respirators needed to manage COVID-19 patients admitted during a month ranges from 113 (95% interpercentile range [IPR], 50–229) if 0.5% of admissions are COVID-19 patients to 22,101 (95% IPR, 5,904–25,881) if 100% of admissions are COVID-19 patients (assuming single use per respirator, and 10 encounters between HCWs and each COVID-19 patient per day). The number of N95s needed decreases to a range of 22 (95% IPR, 10–43) to 4,445 (95% IPR, 1,975–8,684) if each N95 is used for 5 patient encounters. Varying monthly all-cause admissions to 2,000 requires 6,645–13,404 respirators with a 60% COVID-19 admission prevalence, 10 HCW–patient encounters, and reusing N95s 5–10 times. Nationally, the number of N95 respirators needed over the course of the pandemic ranges from 86 million (95% IPR, 37.1–200.6 million) to 1.6 billion (95% IPR, 0.7–3.6 billion) as 5%–90% of the population is exposed (single-use). This number ranges from 17.4 million (95% IPR, 7.3–41 million) to 312.3 million (95% IPR, 131.5–737.3 million) using each respirator for 5 encounters.
We quantified the number of N95 respirators needed for a given acute-care hospital and nationally during the COVID-19 pandemic under varying conditions.
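The core arithmetic behind these estimates is a simple product of admissions, COVID-19 prevalence, encounters per day, length of stay, and respirator reuse. The sketch below reproduces that back-of-the-envelope logic with an assumed average stay; it is an illustration of the scaling, not the study's stochastic simulation (which produces the interpercentile ranges above).

```python
def n95_needed(monthly_admissions, covid_fraction, encounters_per_day,
               avg_stay_days, encounters_per_respirator=1):
    """Point estimate of N95 respirators needed per month for COVID-19 care."""
    covid_patients = monthly_admissions * covid_fraction
    total_encounters = covid_patients * encounters_per_day * avg_stay_days
    return total_encounters / encounters_per_respirator

# 400 monthly admissions, 100% COVID-19, 10 HCW-patient encounters/day,
# single use per respirator; the ~5.5-day average stay is an assumption
single_use = n95_needed(400, 1.0, 10, 5.5)
reuse_5 = n95_needed(400, 1.0, 10, 5.5, encounters_per_respirator=5)
```

With these assumed inputs the point estimates (22,000 single-use; 4,400 with 5-encounter reuse) land close to the study's reported central values of 22,101 and 4,445, showing that the interpercentile ranges come from varying these same inputs.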
Background: Measles is a highly contagious virus that reemerged in 2019 with the highest number of reported cases in the United States since 1992. Beginning in March 2019, The Johns Hopkins Hospital (JHH) responded to an influx of patients with concern for measles as a result of outbreaks in Maryland and the surrounding states. We report the JHH Department of Infection Control and Hospital Epidemiology (HEIC) response to this measles outbreak using a multidisciplinary measles incident command system (ICS). Methods: The JHH HEIC and the Johns Hopkins Office of Emergency Management established the HEIC Clinical Incident Command Center and coordinated a multipronged response to the measles outbreak with partners from occupational health services, microbiology, the adult and pediatric emergency departments, marketing and communication, and local and state public health departments. The multidisciplinary structure rapidly developed, approved, and disseminated tools to improve the ability of frontline providers to quickly identify, isolate, and determine testing needs for patients suspected to have measles infection and reduce the risk of secondary transmission. The tools included a triage algorithm, visitor signage, staff and patient vaccination guidance and clinics, and standard operating procedures for measles evaluation and testing. The triage algorithms were developed for phone or in-person use and assessed measles exposure history, immune status, and symptoms, and provided guidance regarding isolation and the need for testing. The algorithms were distributed to frontline providers in clinics and emergency rooms across the Johns Hopkins Health System. The incident command team also distributed resources to community providers to reduce patient influx to JHH and staged an outdoor measles evaluation and testing site in the event of a case influx that would exceed emergency department resources.
Results: From March 2019 through June 2019, 37 patients presented with symptoms or concern for measles. Using the ICS tools and algorithms, JHH rapidly identified, isolated, and tested 11 patients with high suspicion for measles, 4 of whom were confirmed positive. Of the other 26 patients not tested, none developed measles infection. Exposures were minimized, and there were no secondary measles transmissions among patients. Conclusions: Using the ICS and development of tools and resources to prevent measles transmission, including a patient triage algorithm, the JHH team successfully identified, isolated, and evaluated patients with high suspicion for measles while minimizing exposures and secondary transmission. These strategies may be useful to other institutions and locales in the event of an emerging or reemerging infectious disease outbreak.
Disclosures: Aaron Milstone reports consulting for Becton Dickinson.