One of six nursing home residents and staff members with positive SARS-CoV-2 tests ≥90 days after initial infection had specimen cycle threshold (Ct) values <30. Individuals with specimen Ct <30 were more likely to report symptoms but did not differ from individuals with high-Ct specimens on other clinical and testing data.
METHODS/STUDY POPULATION: Cell culture and protein identification: human T cells were purified from healthy donor blood, then activated and cultured for 5 days. CAR-T cells were collected from the infusion bags of cancer patients undergoing CAR-T therapy. Silver staining of naive and activated healthy T-cell lysates was compared; βII-spectrin was identified as differentially expressed and confirmed by Western blot. Migration assays: naive and activated T cells were imaged during migration on plates coated with ICAM-1 or ICAM-1 plus CXCL12. T cells were transfected with βII-spectrin cDNA, and the chemokine dependence of their migration was compared with that of controls. In vivo studies: in a mouse melanoma model, βII-spectrin-transfected or control T cells were injected, and tumors were followed with serial imaging. Human patient records were examined to correlate endogenous βII-spectrin levels with CAR-T response.
RESULTS/ANTICIPATED RESULTS: Activated T cells downregulate the cytoskeletal protein βII-spectrin compared with naive cells, leading to chemokine-independent migration in in vitro assays and off-target trafficking when CAR-T cells are given in vivo. Restoring βII-spectrin levels via transfection restores the chemokine dependence of activated T cells. In the mouse melanoma model, control mice injected with standard activated T cells showed fewer cells at the tumor site and more cells in off-target organs (spleen, lungs) compared with mice injected with βII-spectrin-transfected cells. Furthermore, among 3 human patients undergoing CAR-T therapy, those with higher endogenous βII-spectrin levels experienced fewer side-effects, as measured by neurotoxicity and cytokine release syndrome grades.
DISCUSSION/SIGNIFICANCE: A major hurdle to widespread CAR-T therapy for cancer is its significant, often fatal side-effects. Our work shows that βII-spectrin is downregulated during CAR-T production, and that restoring βII-spectrin levels decreases side-effects while increasing tumor clearance, hopefully translating to better CAR-T regimens in the future.
Understanding the relative longevity of different seed lots, perhaps of different species or genotypes, but also following production under different environments, using different cultivation methods, or following different post-harvest treatments, is relevant to anyone concerned with the retention of seed lot viability and vigour during storage. However, different scientists over the years have used different conditions to assess seed lot longevity, as well as different variables as the measure of ‘longevity’. Here, we give some of the background to how two standard protocols, one using an open and one a closed storage system, were derived, and explain why we consider p50, defined as the time in storage at which seed lot viability, as measured by a germination test, has declined to 50%, to be a suitable longevity trait parameter.
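To make the p50 definition concrete, here is a minimal sketch of estimating p50 from repeated germination tests on a single stored seed lot. The logistic decline curve, the sampling intervals and the germination values are illustrative assumptions, not the standard protocols discussed above.

```python
# Sketch: estimate p50 (storage time at which viability falls to 50%)
# from germination-test data. Data and the logistic form are assumptions.
import numpy as np
from scipy.optimize import curve_fit

days = np.array([0, 10, 20, 30, 40, 50, 60], dtype=float)      # storage time
germ_pct = np.array([98, 96, 90, 71, 48, 22, 9], dtype=float)  # % germinated

def viability(t, k, p50):
    """Logistic decline from ~100% viability; p50 is the time at 50%."""
    return 100.0 / (1.0 + np.exp(k * (t - p50)))

(k_hat, p50_hat), _ = curve_fit(viability, days, germ_pct, p0=(0.1, 40.0))
print(f"Estimated p50: {p50_hat:.1f} days of storage")
```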
Using fluctuating hydrodynamics, we investigate the effect of thermal fluctuations in the dissipation range of homogeneous isotropic turbulence. Simulations confirm theoretical predictions that the energy spectrum is dominated by these fluctuations at length scales comparable to the Kolmogorov length. We also find that the extreme intermittency in the far-dissipation range predicted by Kraichnan is replaced by Gaussian thermal equipartition.
A hedonic model was employed to examine factors that influence the resale price of row crop planters on the used machinery market. Planter sale data from 2016 to 2018 were utilized to conduct the analysis. Results suggested that the primary factors impacting planter resale prices were make, age, condition, planter configuration, row number, and row spacing. As a function of age (depreciation), planter values were generally determined to decrease at a decreasing rate. Finally, it was determined that there was a significant interaction between the variables make and age, suggesting that different planter makes depreciate differently.
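As a rough illustration of the hedonic approach described above, the sketch below fits a log-linear price regression with an age-squared term (depreciation at a decreasing rate) and a make-by-age interaction (make-specific depreciation). The data, column names and functional form are illustrative assumptions, not the paper's specification.

```python
# Sketch of a hedonic resale-price model with a make x age interaction.
# All data below are synthetic and for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

sales = pd.DataFrame({
    "price":     [98000, 80000, 66000, 56000, 48000,
                  90000, 70000, 55000, 45000, 37000],
    "make":      ["A"] * 5 + ["B"] * 5,
    "age":       [1, 3, 5, 7, 9] * 2,
    "condition": [5, 4, 4, 3, 3, 5, 4, 3, 3, 2],
    "rows":      [16, 12, 16, 12, 8, 16, 12, 16, 12, 8],
})

# log(price) on age and age^2 captures depreciation at a decreasing rate;
# C(make) * age lets each make depreciate at its own rate.
model = smf.ols(
    "np.log(price) ~ C(make) * age + I(age**2) + condition + rows",
    data=sales,
).fit()
print(model.params)
```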
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. The predictive combination of these variables might help clinicians better predict which patients will respond to lithium treatment.
Aims
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
Method
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
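As an illustration of the validation scheme described above, the sketch below runs leave-site-out cross-validation with ridge regression. The feature matrix, response scale and site labels are synthetic assumptions; only the resampling logic mirrors the description.

```python
# Sketch of leave-site-out cross-validation: each study site is held out
# in turn while a ridge model is trained on the remaining sites.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n, p = 692, 20
X = rng.normal(size=(n, p))              # clinical + PRS features (synthetic)
y = 0.3 * X[:, 0] + rng.normal(size=n)   # continuous lithium-response score
site = rng.integers(0, 10, size=n)       # ten study sites

scores = cross_val_score(Ridge(alpha=1.0), X, y, groups=site,
                         cv=LeaveOneGroupOut(), scoring="r2")
print("Per-held-out-site R^2:", np.round(scores, 3))
```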
Results
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Conclusions
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future, this approach may help inform which patients are most likely to respond to lithium treatment.
Nonspecific respiratory symptoms overlap with coronavirus disease 2019 (COVID-19). Prompt diagnosis of COVID-19 in hospital employees is crucial to prevent nosocomial transmission. Rapid molecular SARS-CoV-2 testing was performed for 115 symptomatic employees. The case positivity rate was 2.6%. Employees with negative tests returned to work after 80 (±28) minutes.
In response to the 2014-2016 West Africa Ebola virus disease (EVD) epidemic, the Centers for Disease Control and Prevention (CDC) designated 56 US hospitals as Ebola treatment centers (ETCs) with high-level isolation capabilities. We aimed to determine ongoing sustainability of ETCs and identify how ETC capabilities have impacted hospital, local, and regional COVID-19 readiness and response.
Design:
An electronic survey included both qualitative and quantitative questions and was structured into two sections: operational sustainability and role in the COVID-19 response.
Setting and Participants:
The survey was distributed to site representatives from the 56 originally designated ETCs; 37 (66%) responded.
Methods:
Data were coded and analyzed using descriptive statistics.
Results:
Of the 37 responding ETCs, 33 (89%) reported that they were still operating, while 4 had been decommissioned. ETCs that maintained high-level isolation capabilities incurred a mean of $234,367 in expenses per year. All but one ETC reported that capabilities existing before COVID-19 (e.g., trained staff, infrastructure) positively affected their hospital, local, and regional COVID-19 readiness and response (e.g., ETCs trained staff, donated supplies, and shared developed protocols).
Conclusions:
Existing high-level isolation capabilities and expertise developed following the 2014-2016 EVD epidemic were leveraged by ETCs to assist hospital-wide readiness for COVID-19 and to support the response of other local and regional hospitals. However, ETCs face continued challenges in sustaining these capabilities for high-consequence infectious diseases.
Evidence suggests that cognitive subtypes exist in schizophrenia that may reflect different neurobiological trajectories. We aimed to identify whether IQ-derived cognitive subtypes are present in early-phase schizophrenia-spectrum disorder and examine their relationship with brain structure and markers of neuroinflammation.
Method
161 patients with recent-onset schizophrenia spectrum disorder (<5 years) were recruited. Estimated premorbid and current IQ were calculated using the Wechsler Test of Adult Reading and a four-subtest WAIS-III, respectively. Cognitive subtypes were identified with k-means clustering. FreeSurfer was used to analyse 3.0 T MRI scans. Blood samples were analysed for hs-CRP, IL-1RA, IL-6 and TNF-α.
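To illustrate the subtyping step, the sketch below applies k-means with k = 3 to estimated premorbid and current IQ. The IQ values are synthetic; the three generating groups merely mimic the preserved/deteriorated/compromised pattern reported in the Results.

```python
# Sketch: k-means (k = 3) on premorbid vs. current IQ to recover subtypes.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
premorbid = np.concatenate([rng.normal(105, 8, 60),   # preserved
                            rng.normal(103, 8, 60),   # deteriorated
                            rng.normal(82, 7, 41)])   # compromised
current = np.concatenate([rng.normal(104, 8, 60),
                          rng.normal(86, 8, 60),
                          rng.normal(80, 7, 41)])
X = np.column_stack([premorbid, current])             # n = 161 patients

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for c in range(3):
    print(f"cluster {c}: premorbid {X[labels == c, 0].mean():.0f}, "
          f"current {X[labels == c, 1].mean():.0f}")
```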
Results
Three subtypes were identified indicating preserved (PIQ), deteriorated (DIQ) and compromised (CIQ) IQ. Absolute total brain volume was significantly smaller in CIQ compared to PIQ and DIQ, and intracranial volume was smaller in CIQ than PIQ (F(2, 124) = 6.407, p = 0.002) indicative of premorbid smaller brain size in the CIQ group. CIQ had higher levels of hs-CRP than PIQ (F(2, 131) = 5.01, p = 0.008). PIQ showed differentially impaired processing speed and verbal learning compared to IQ-matched healthy controls.
Conclusions
The findings add validity to a neurodevelopmental subtype of schizophrenia, identified by comparing estimated premorbid and current IQ and characterised by smaller premorbid brain volume and higher measures of low-grade inflammation (CRP).
Emerging evidence has suggested that mushrooms, which are a rich source of the potent antioxidants ergothioneine and glutathione as well as vitamin D, may have neuroprotective properties. This study investigated the association between mushroom consumption and cognitive performance in a nationally representative sample of US older adults. We analysed data from older adults aged ≥ 60 years from the 2011–2014 National Health and Nutrition Examination Survey. Mushroom intake was measured using up to two 24-h dietary recalls and was categorised into three groups (lowest, middle and highest). Cognitive function tests included the Animal Fluency (AF) Test; the Consortium to Establish a Registry for Alzheimer’s Disease Delayed Recall (CERAD-DR) and Word Learning (CERAD-WL); and the Digit Symbol Substitution Test (DSST). Multivariable linear regression models were developed, adjusting for socio-demographics, major lifestyle factors, self-reported chronic diseases and dietary factors, including the Healthy Eating Index-2015 score and total energy. The study included 2840 participants. Compared with the lowest category of mushroom intake, participants in the highest category (median intake = 13·4 g/4184 kJ (1000 kcal)/d) had higher scores for the DSST (β = 3·87; 95 % CI 0·30, 7·45; P for trend = 0·03) and the CERAD-WL (β = 1·05; 95 % CI 0·0003, 2·10; P for trend = 0·04). A similar but non-significant trend was observed for the AF Test (β = 0·24; 95 % CI −2·26, 2·73; P for trend = 0·92), and no association was observed for the CERAD-DR. Greater mushroom intake was associated with better performance on certain cognitive tests, suggesting that regular mushroom consumption may reduce the risk of cognitive decline.
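For readers who want the shape of the adjusted model, the sketch below codes intake into tertiles and fits a covariate-adjusted linear regression for one cognitive outcome. All variable names and data are illustrative assumptions, and the NHANES survey-design weights that the published analysis would account for are omitted for brevity.

```python
# Sketch: tertiles of energy-adjusted mushroom intake regressed against a
# cognitive score with covariate adjustment. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "mushroom_g_per_1000kcal": rng.exponential(5.0, n),
    "dsst": rng.normal(50, 15, n),          # Digit Symbol Substitution Test
    "age": rng.integers(60, 80, n),
    "sex": rng.choice(["M", "F"], n),
    "hei2015": rng.normal(55, 10, n),       # Healthy Eating Index-2015
    "energy_kcal": rng.normal(1800, 400, n),
})
df["tertile"] = pd.qcut(df["mushroom_g_per_1000kcal"], 3,
                        labels=["lowest", "middle", "highest"])

model = smf.ols("dsst ~ C(tertile, Treatment('lowest')) + age + C(sex) "
                "+ hei2015 + energy_kcal", data=df).fit()
print(model.params.filter(like="tertile"))
```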
The effect of sample preparation on a pre-aged Al–Mg–Si–Cu alloy has been evaluated using atom probe tomography. Three methods of preparation were investigated: electropolishing (control), Ga⁺ focused ion beam (FIB) milling, and Xe⁺ plasma FIB (PFIB) milling. Ga⁺-based FIB preparation was shown to introduce a significant amount of Ga contamination throughout the reconstructed sample (≈1.3 at%), while no Xe contamination was detected in the PFIB-prepared sample. Nevertheless, a significantly higher cluster density was observed in the Xe⁺ PFIB-prepared sample (≈25.0 × 10²³ m⁻³) compared with the traditionally prepared electropolished sample (≈3.2 × 10²³ m⁻³) and the Ga⁺ FIB sample (≈5.6 × 10²³ m⁻³). Hence, the absence of the ion-milling species does not necessarily mean an absence of specimen-preparation defects. Specifically, the FIB- and PFIB-prepared samples had more Si-rich clusters than the electropolished samples, which is indicative of vacancy stabilization via solute clustering.
COVID-19 has caused tremendous death and suffering since it first emerged in 2019. Soon after its emergence, models were developed to help predict the course of various disease metrics, and these models have been relied upon to help guide public health policy.
Methods:
Here we present a method called COVIDNearTerm to “forecast” hospitalizations in the short term, two to four weeks from the time of prediction. COVIDNearTerm is based on an autoregressive model and utilizes a parametric bootstrap approach to make predictions. It is easy to use as it requires only previous hospitalization data, and there is an open-source R package that implements the algorithm. We evaluated COVIDNearTerm on San Francisco Bay Area hospitalizations and compared it to models from the California COVID Assessment Tool (CalCAT).
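The following is a minimal sketch, in Python rather than R, of the two ingredients named above: an autoregressive fit and a parametric bootstrap over future trajectories. The lag order, log transform, Gaussian error model and synthetic data are assumptions for illustration, not the COVIDNearTerm implementation.

```python
# Sketch: AR(7) fit on log hospitalizations, then a parametric bootstrap
# that simulates 14-day-ahead trajectories. All values are synthetic.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(3)
hosp = np.cumsum(rng.normal(2, 5, 120)).clip(min=1)   # synthetic daily census

fit = AutoReg(np.log(hosp), lags=7).fit()
sigma = fit.resid.std()

horizon, n_sims = 14, 1000
sims = np.empty((n_sims, horizon))
for s in range(n_sims):
    history = list(np.log(hosp))
    for _ in range(horizon):
        lagged = history[-7:][::-1]                   # lag 1 first
        mean = fit.params[0] + np.dot(fit.params[1:], lagged)
        history.append(mean + rng.normal(0, sigma))   # parametric noise
    sims[s] = history[-horizon:]

point = np.exp(np.median(sims, axis=0))               # median trajectory
lo, hi = np.exp(np.percentile(sims, [2.5, 97.5], axis=0))
print(f"14-day forecast: {point[-1]:.1f} (95% PI {lo[-1]:.1f}-{hi[-1]:.1f})")
```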
Results:
We found that COVIDNearTerm predictions were more accurate than the CalCAT ensemble predictions for all comparisons and any CalCAT component for a majority of comparisons. For instance, at the county level our 14-day hospitalization median absolute percentage errors ranged from 16 to 36%. For those same comparisons, the CalCAT ensemble errors were between 30 and 59%.
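For clarity, the error metric quoted above is the median, over forecast/observation pairs, of the absolute percentage error. A tiny worked example with made-up numbers:

```python
# Median absolute percentage error (APE) on illustrative values.
import numpy as np

observed = np.array([120.0, 95.0, 210.0, 60.0])
forecast = np.array([100.0, 110.0, 180.0, 75.0])

ape = 100 * np.abs(forecast - observed) / observed
print(f"median APE: {np.median(ape):.1f}%")   # 16.2% for these numbers
```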
Conclusion:
COVIDNearTerm is a simple and useful tool for predicting near-term COVID-19 hospitalizations.
This paper presents a compilation of atmospheric radiocarbon for the period 1950–2019, derived from atmospheric CO₂ sampling and tree rings from clean-air sites. Following the approach taken by Hua et al. (2013), our revised and extended compilation consists of zonal, hemispheric and global radiocarbon (¹⁴C) data sets, with monthly data sets for 5 zones (Northern Hemisphere zones 1, 2, and 3, and Southern Hemisphere zones 3 and 1–2). Our new compilation includes smooth curves for the zonal data sets that are more suitable for dating applications than the previous approach based on simple averaging. Our new radiocarbon dataset is intended to facilitate the use of atmospheric bomb ¹⁴C in carbon cycle studies and to accommodate the increasing demand for accurate dating of recent (post-1950) terrestrial samples.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated the seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, greater individual weed plant biomass was associated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
The COVID-19 health crisis triggered changes in the workplace. This paper explores the insights from scholarly work published in the Journal of Management and Organization (JMO) and systematizes this body of knowledge to build a scientific overview that looks at how the COVID-19 health crisis and its repercussions may be managed by organizations. We conducted a bibliometric investigation of JMO's most influential papers published from 1995 to June 2020 that offers insights into the management of the COVID-19 crisis. Our bibliometric investigation reveals six clusters: (1) conservation of resources theory, entrepreneurs, gender and work–family conflict; (2) corporate governance, corporate social responsibility and stakeholder salience; (3) family firms, innovation and research methods; (4) creativity, leadership and organizational change; (5) job satisfaction and psychological empowerment; and (6) team performance. We discuss the theoretical and practical implications of our findings.