This chapter introduces readers to the use of time-varying effect modeling (TVEM), a statistical tool for capturing dynamic changes over time, as applied to the study of substance use disorder recovery processes. The chapter presents an empirical demonstration of using TVEM to examine the effect of an intervention, Recovery Management Checkups (RMCs), on substance use and key features of the ongoing process of recovery (life satisfaction, cognitive avoidance, self-efficacy) as a continuous function of time. The example application data come from the Early Re-Intervention experiment of 446 adults from a large addiction treatment agency who were randomly assigned to receive RMCs or an assessment control. Given the time-varying nature of the effect of the RMC on recovery outcomes and the differential patterns observed by type of outcome, TVEM may be a viable option in lieu of or in addition to using common metrics of “treatment success.” SAS syntax is provided.
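The chapter's syntax is in SAS. As a rough cross-language illustration of the core idea (expanding the treatment coefficient on a spline basis of time so that the estimated effect becomes a smooth function of time), a minimal Python sketch, with hypothetical column names, might look like this:

```python
# Minimal sketch of a time-varying effect model (TVEM), assuming a long-format
# dataset with hypothetical columns: outcome, time, rmc (0/1 treatment).
# The treatment coefficient is expanded on a B-spline basis of time, so the
# estimated RMC effect becomes a smooth function beta(t) rather than a scalar.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("recovery_long.csv")  # hypothetical repeated-measures data

# bs(time, df=5) is patsy's B-spline basis; its interaction with the
# treatment indicator yields a time-varying treatment effect.
fit = smf.ols("outcome ~ bs(time, df=5) + rmc + bs(time, df=5):rmc",
              data=df).fit()
print(fit.summary())
# Note: a full TVEM would also account for within-person correlation
# (e.g., via mixed models or sandwich standard errors).
```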
To describe the cumulative seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, which was below estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct care and was 3.4% among employees who did not perform direct patient care. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with statistically significant increased risk for seropositivity.
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational protection measures to maximize protection of employees within the workplace during future COVID-19 waves or other epidemics.
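The post hoc analysis above is a Cox proportional hazards model for time to seroconversion. A minimal sketch using the lifelines package, with hypothetical column names, might look like this:

```python
# Sketch of a Cox proportional hazards model for time to seroconversion,
# using the lifelines package. All column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("employee_cohort.csv")  # hypothetical cohort data
cols = ["followup_days", "seroconverted", "direct_care",
        "black_race", "nonhealthcare_exposure"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="followup_days", event_col="seroconverted")
cph.print_summary()  # reports exp(coef), i.e., hazard ratios with 95% CIs
```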
To estimate prior severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among skilled nursing facility (SNF) staff in the state of Georgia and to identify risk factors for seropositivity as of fall 2020.
Baseline survey and seroprevalence data from the ongoing longitudinal Coronavirus Disease 2019 (COVID-19) Prevention in Nursing Homes study.
The study included 14 SNFs in the state of Georgia.
In total, 792 SNF staff employed or contracted with participating SNFs were included in this study. The analysis included 749 participants with SARS-CoV-2 serostatus results who provided age, sex, and complete survey information.
We estimated unadjusted odds ratios (ORs) and 95% confidence intervals (95% CIs) for potential risk factors and SARS-CoV-2 serostatus. We estimated adjusted ORs using a logistic regression model including age, sex, community case rate, SNF resident infection rate, working at other facilities, and job role.
Staff working in high-infection SNFs were twice as likely to be seropositive as those in low-infection SNFs (unadjusted OR, 2.08; 95% CI, 1.45–3.00). Certified nursing assistants (unadjusted OR, 2.93; 95% CI, 1.58–5.78) and nurses (unadjusted OR, 3.08; 95% CI, 1.66–6.07) were roughly 3 times as likely to be seropositive as administrative, pharmacy, or nonresident care staff. Logistic regression yielded similar adjusted ORs.
Working at high-infection SNFs was a risk factor for SARS-CoV-2 seropositivity. Even after accounting for resident infections, certified nursing assistants and nurses had a 3-fold higher risk of SARS-CoV-2 seropositivity than nonclinical staff. This knowledge can guide prioritized implementation of safer ways for caregivers to provide necessary care to SNF residents.
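As a rough sketch of how the adjusted odds ratios above could be obtained (all column names hypothetical):

```python
# Sketch of a logistic regression for SARS-CoV-2 serostatus; exponentiated
# coefficients give adjusted odds ratios (ORs) with 95% CIs.
# All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("snf_staff.csv")  # hypothetical staff-level data
fit = smf.logit("seropositive ~ age + sex + community_rate + resident_rate"
                " + other_facility + C(job_role)", data=df).fit()

ors = np.exp(fit.params)      # adjusted ORs
ci = np.exp(fit.conf_int())   # 95% CIs on the OR scale
print(pd.concat([ors, ci], axis=1))
```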
We analyzed the efficacy, cost, and cost-effectiveness of predictive decision-support systems based on surveillance interventions to reduce the spread of carbapenem-resistant Enterobacteriaceae (CRE).
We developed a computational model that included patient movement between acute-care hospitals (ACHs), long-term care facilities (LTCFs), and communities to simulate the transmission and epidemiology of CRE. A comparative cost-effectiveness analysis was conducted on several surveillance strategies to detect asymptomatic CRE colonization, which included screening in ICUs at select or all hospitals, a statewide registry, or a combination of hospital screening and a statewide registry.
We investigated 51 ACHs; 222 LTCFs and skilled nursing facilities; and 464 ZIP codes in the state of Maryland.
Patients or participants:
The model was informed using 2013–2016 patient-mix data from the Maryland Health Services Cost Review Commission. The model included all patients who were admitted to an ACH.
On average, the implementation of a statewide CRE registry reduced annual CRE infections by 6.3% (18.8 cases). Policies of screening in select or all ICUs without a statewide registry had no significant impact on the incidence of CRE infections. Predictive algorithms, which identified any high-risk patient, reduced colonization incidence by an average of 1.2% (3.7 cases) without a registry and 7.0% (20.9 cases) with a registry. Implementation of the registry was estimated to save $572,000 statewide in averted infections per year.
Although hospital-level surveillance provided minimal reductions in CRE infections, regional coordination with a statewide registry of CRE patients reduced infections and was cost-effective.
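As a back-of-envelope consistency check on the figures above (18.8 averted infections and $572,000 saved per year), the implied cost per averted CRE infection is roughly $30,000:

```python
# Back-of-envelope consistency check using only figures quoted above.
averted_infections = 18.8   # annual CRE infections averted by the registry
annual_savings = 572_000    # estimated statewide savings per year, USD

print(f"Implied cost per averted infection: "
      f"${annual_savings / averted_infections:,.0f}")
# -> roughly $30,000 per CRE infection averted
```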
To determine the impact of an inpatient stewardship intervention targeting fluoroquinolone use on inpatient and postdischarge Clostridioides difficile infection (CDI).
We used an interrupted time series study design to evaluate the rate of hospital-onset CDI (HO-CDI), postdischarge CDI (PD-CDI) within 12 weeks, and inpatient fluoroquinolone use from 2 years prior to 1 year after a stewardship intervention.
An academic healthcare system with 4 hospitals.
All inpatients hospitalized between January 2017 and September 2020, excluding those discharged from locations caring for oncology, bone marrow transplant, or solid-organ transplant patients.
Introduction of electronic order sets designed to reduce inpatient fluoroquinolone prescribing.
Among 163,117 admissions, there were 683 cases of HO-CDI and 1,104 cases of PD-CDI. In the context of a 2% month-to-month decline starting in the preintervention period (P < .01), we observed a 21% reduction in fluoroquinolone days of therapy per 1,000 patient days after the intervention (level change, P < .05). HO-CDI rates were stable throughout the study period. In contrast, we detected a change in the trend of PD-CDI rates, from a stable monthly rate in the preintervention period to a monthly decrease of 2.5% in the postintervention period (P < .01).
Our systemwide intervention reduced inpatient fluoroquinolone use immediately, but not HO-CDI. However, a downward trend in PD-CDI occurred. Relying on outcome measures limited to the inpatient setting may not reflect the full impact of inpatient stewardship efforts.
BASF Corporation has developed p-hydroxyphenylpyruvate dioxygenase (HPPD) inhibitor-resistant cotton and soybean that will allow growers to use isoxaflutole in future weed management programs. In 2019 and 2020, a multistate research project was conducted in noncrop areas to examine weed control following isoxaflutole applied preemergence alone and with a number of tank-mix partners at high and low labeled rates. At 28 d after treatment (DAT), Palmer amaranth was controlled ≥95% at 6 of 7 locations with isoxaflutole plus the high rate of diuron or fluridone. These same combinations provided the greatest control 42 DAT at 4 of 7 locations. Where large crabgrass was present, isoxaflutole plus the high rate of diuron, fluridone, pendimethalin, or S-metolachlor, or isoxaflutole plus the low rate of fluometuron, controlled large crabgrass ≥95% at 2 of 3 locations 28 DAT. At 2 of 3 locations, isoxaflutole plus the high rate of pendimethalin or S-metolachlor improved large crabgrass control 42 DAT compared with isoxaflutole alone. At 21 DAT, morningglory was controlled ≥95% at all locations with isoxaflutole plus the high rate of diuron and at 3 of 4 locations with isoxaflutole plus the high rate of fluometuron. At 42 DAT at all locations, isoxaflutole plus diuron or fluridone and isoxaflutole plus the high rate of fluometuron improved morningglory control compared with isoxaflutole alone. These results suggest that isoxaflutole applied preemergence alone or in tank mixture is efficacious on a broad spectrum of annual weeds in cotton, and that extended weed control may be achieved when isoxaflutole is tank-mixed with a number of soil-residual herbicides.
In the wake of the COVID-19 pandemic, those planning and conducting research involving older adults have faced many challenges, in part because of the public health measures in place. This article details the early steps and corresponding strategies implemented by our multidisciplinary team to pivot our large-scale aging and mobility study. Based on the premise that all current and emerging research in aging has been impacted by the pandemic, we propose a continuum approach whereby the research question, analysis, and interpretation are situated in accordance with the stage of the pandemic. Using examples from our own study, we outline potential ways to partner with older adults and other stakeholders as well as to encourage collaboration beyond disciplinary silos even under the current circumstances. Finally, we suggest the formation of a Canadian-led consortium that leverages cross-disciplinary expertise to address the complexities of our aging population in the COVID-19 era and beyond.
In this paper, the kinetic energy cascade in stably stratified open-channel flows is investigated. A mathematical framework to incorporate vertical scales into the conventional kinetic energy spectrum and its budget is introduced. This framework defines kinetic energy density in horizontal spectral and vertical scale space. The energy cascade is studied by analysing the evolution of kinetic energy density. It is shown that energetic streamwise scales ($\lambda _x$) become larger with increasing vertical scale. For the strongest stratification, for which the turbulence becomes intermittent, the energetic streamwise scales are suppressed, and energy density resides in $\lambda _x$ of the size of the domain. It is shown that, in an unstratified case, vertical scales comparable to the height of the logarithmic layer connect viscous regions to the outer layer. By contrast, in stratified cases, such a connection is not observed. Moreover, it is shown that nonlinear transfer for streamwise scales is dominated by in-plane triad interactions, and that inter-plane transfer is most active in transferring energy density among small vertical scales comparable to the height of the viscous sublayer. The vertical scales comparable to the heights of the viscous sublayer and the buffer layer are the most active scales in the viscous term and the production term of the energy density budget, respectively.
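As a hedged sketch of the conventional ingredients behind such a framework (not the authors' exact definitions): the streamwise spectral kinetic energy density at height $z$ is $e(k_x, z) = \tfrac{1}{2}\langle \hat{u}_i(k_x, z)\,\hat{u}_i^{*}(k_x, z)\rangle$, with wavelength $\lambda _x = 2\pi /k_x$; incorporating vertical scales then amounts to further decomposing the two-point co-spectrum $\langle \hat{u}_i(k_x, z)\,\hat{u}_i^{*}(k_x, z')\rangle$ with respect to the vertical separation $z' - z$, yielding an energy density in horizontal spectral and vertical scale space.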
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together, and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
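A minimal sketch of the PGS association step reported above (e.g., β = −0.39 years per s.d. of schizophrenia PGS), with hypothetical column names:

```python
# Sketch of a polygenic score (PGS) association: regress age at onset (AAO)
# on a standardized PGS plus covariates, so beta is in years of AAO per s.d.
# of PGS. All column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bd_cohorts.csv")  # hypothetical patient-level data
df["pgs_scz_z"] = (df["pgs_scz"] - df["pgs_scz"].mean()) / df["pgs_scz"].std()

fit = smf.ols("aao ~ pgs_scz_z + sex + C(cohort)", data=df).fit()
print(f"beta = {fit.params['pgs_scz_z']:.2f} years, "
      f"s.e. = {fit.bse['pgs_scz_z']:.2f}")
```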
Of 100 patients discharged from short-stay units (SSUs) with antibiotics, 47 had a skin and soft-tissue infection, 22 had pneumonia, and 21 had a urinary tract infection. Among all discharge antibiotic prescriptions, 78% involved antibiotic overuse, most commonly excess duration (54 of 100) and guideline-discordant selection (44 of 100).
Epidemiological and intervention studies have reported negative health effects of sucrose intake, but many of these studies were not representative of typical dietary habits. In this pilot study, we aimed to test the effect of increasing sucrose intakes for 1 week on body composition and blood pressure and to explore the feasibility of consuming high intakes of sucrose in addition to a habitual diet. In a randomised crossover design study, twelve healthy participants (50 % female, age 28⋅4 ± 10 years, BMI 25 ± 3 kg/m2) consumed either 40, 80 or 120 g sucrose in 500 ml water in addition to their habitual diet every day for 1 week, with a 1-week washout between treatment periods. Body composition (assessed using bioelectrical impedance) and blood pressure measurements were taken before and after each intervention phase. All participants reported no issues with consuming the sucrose dose for the intervention period. There was a significant increase in systolic blood pressure following 120 g sucrose intake (P = 0⋅006); however, there were no significant changes in blood pressure, body weight, BMI, or percentage protein, fat or water (P > 0⋅05) when comparing change-from-baseline values. There was also no effect of sucrose intakes on energy or macronutrient intakes during the intervention (P > 0⋅05). We show here that varying doses of sucrose over a 1-week period had no effect on body composition or blood pressure. The amounts of sucrose used were an acceptable addition to the habitual diet and demonstrate the feasibility of larger-scale studies of chronic sucrose supplementation.
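For a three-treatment crossover like the one above, the within-subject comparison of change-from-baseline values could be sketched as follows (hypothetical column names; statsmodels' AnovaRM handles the repeated-measures structure):

```python
# Sketch of a repeated-measures ANOVA for the crossover design: each subject
# contributes one change-from-baseline value per sucrose dose.
# Column names are hypothetical; data must be in long format and balanced.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

df = pd.read_csv("sucrose_crossover.csv")  # one row per participant x dose
res = AnovaRM(df, depvar="sbp_change", subject="participant",
              within=["dose"]).fit()
print(res)
```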
UK recommendations state that free sugar intake should be no more than 5 % of total energy, owing to the increased health risks associated with overconsumption. It was therefore of interest to examine free sugar intakes and their associations with health parameters in the UK population. The UK National Diet and Nutrition Survey rolling programme (2008–2017) was used for this study. Dietary intake, anthropometric measurements and clinical biomarker data collated from 5121 adult respondents aged 19–64 years were statistically analysed. Compared with the average total carbohydrate intake (48 % of energy), free sugars comprised 12·5 %, with sucrose 9 % and fructose 3·5 %. Intakes of these sugars, apart from fructose, differed significantly over collection year (P < 0·001) and were significantly higher in males (P < 0·001). Comparing those consuming above or below the UK recommendation for free sugars (5 % energy), significant differences were found for BMI (P < 0·001), TAG (P < 0·001), HDL (P = 0·006) and homocysteine concentrations (P = 0·028), and significant sex differences were observed (e.g. lower blood pressure in females). Regression analysis demonstrated that free sugar intake could predict plasma TAG, HDL and homocysteine concentrations (P < 0·0001), consistent with the link between these parameters and CVD. We also found selected unhealthy food choices (using the UK Eatwell Guide) to be significantly more common among those who consumed above the recommendation (P < 0·0001), and these choices were predictors of free sugar intakes (P < 0·0001). We have shown that adult free sugar intakes in the UK population are associated with certain negative health parameters, supporting the recommended reduction in free sugar intakes for the UK population.
Background: Effective inpatient stewardship initiatives can improve antibiotic prescribing, but their impact on outcomes such as Clostridioides difficile infection (CDI) is less apparent. However, the effect of inpatient stewardship efforts may extend to the postdischarge setting. We evaluated whether an intervention targeting inpatient fluoroquinolone (FQ) use in a large healthcare system reduced the incidence of postdischarge CDI.

Methods: In August 2019, 4 acute-care hospitals in a large healthcare system replaced standalone FQ orders with order sets containing decision support. Order sets redirected prescribers to syndrome order sets that prioritize alternative antibiotics. Monthly patient days (PDs) and antibiotic days of therapy (DOT) administered for FQs and NHSN-defined broad-spectrum hospital-onset (BS-HO) antibiotics were calculated using patient encounter data for the 23 months before and 13 months after the intervention; COVID-19 admissions occurred during the final 7 months. We evaluated hospital-onset CDI (HO-CDI) per 1,000 PD (defined as any positive test after hospital day 3) and 12-week postdischarge CDI (PDC-CDI) per 100 discharges (any positive test within the healthcare system <12 weeks after discharge). Interrupted time-series analysis using generalized estimating equation models with a negative binomial link function was conducted; a sensitivity analysis with Medicare case-mix index (CMI) adjustment was also performed to control for differences after the start of the COVID-19 pandemic.

Results: Among 163,117 admissions, there were 683 HO-CDIs and 1,009 PDC-CDIs. Overall, FQ DOT per 1,000 PD decreased by 21% immediately after the intervention (level change; P < .05) and decreased at a consistent rate throughout the entire study period (−2% per month; P < .01) (Fig. 1). There was a nonsignificant 5% increase in BS-HO antibiotic use immediately after the intervention and a continued increase in use thereafter (0.3% per month; P = .37). HO-CDI rates were stable throughout the study period, with a nonsignificant 10% level-change decrease after the intervention. In contrast, there was a reversal in the trend of PDC-CDI rates, from a 0.4% per month increase in the preintervention period to a 3% per month decrease in the postintervention period (P < .01). Sensitivity analysis with adjustment for facility-specific CMI produced similar results but with wider confidence intervals, as did an analysis with a distinct COVID-19 time point.

Conclusion: Our systemwide intervention using order sets with decision support reduced inpatient FQ use by 21%. The intervention did not significantly reduce HO-CDI but significantly decreased the incidence of CDI within 12 weeks after discharge. Relying on outcome measures limited to the inpatient setting may not reflect the full impact of inpatient stewardship efforts, and incorporating postdischarge outcomes, such as CDI, should increasingly be considered.
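A hedged sketch of the interrupted time-series model described above (segmented negative binomial GEE with a patient-day offset; all column names hypothetical):

```python
# Sketch of the segmented (interrupted time-series) model: negative binomial
# GEE clustered by hospital, with level and slope changes at the intervention
# and log(patient days) as offset. All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("monthly_fq_cdi.csv")  # hypothetical monthly counts
INTERVENTION = 23                       # first post-intervention month
df["post"] = (df["month"] >= INTERVENTION).astype(int)
df["months_post"] = np.maximum(df["month"] - INTERVENTION, 0)

fit = smf.gee("cdi_count ~ month + post + months_post",
              groups="hospital", data=df,
              family=sm.families.NegativeBinomial(),
              offset=np.log(df["patient_days"])).fit()
# rate ratios: baseline slope, level change, slope change
print(np.exp(fit.params))
```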
In total, 13 facilities changed C. difficile testing to reflexive testing, in which an enzyme immunoassay (EIA) is performed only after a positive nucleic acid amplification test (NAAT); the standardized infection ratio (SIR) decreased by 46% (range, −12% to −71% per hospital). Changing testing practice greatly influenced a performance metric without changing C. difficile infection prevention practice.
Recent cannabis exposure has been associated with lower rates of neurocognitive impairment in people with HIV (PWH). Cannabis’s anti-inflammatory properties may underlie this relationship by reducing chronic neuroinflammation in PWH. This study examined relations between cannabis use and inflammatory biomarkers in cerebrospinal fluid (CSF) and plasma, and cognitive correlates of these biomarkers within a community-based sample of PWH.
In total, 263 individuals were categorized into four groups: HIV− non-cannabis users (n = 65), HIV+ non-cannabis users (n = 105), HIV+ moderate cannabis users (n = 62), and HIV+ daily cannabis users (n = 31). Differences in pro-inflammatory biomarkers (IL-6, MCP-1/CCL2, IP-10/CXCL10, sCD14, sTNFR-II, TNF-α) by study group were determined by Kruskal–Wallis tests. Multivariable linear regressions examined relationships between biomarkers and seven cognitive domains, adjusting for age, sex/gender, race, education, and current CD4 count.
HIV+ daily cannabis users showed lower MCP-1 and IP-10 levels in CSF compared to HIV+ non-cannabis users (p = .015; p = .039) and were similar to HIV− non-cannabis users. Plasma biomarkers showed no differences by cannabis use. Among PWH, lower CSF MCP-1 and lower CSF IP-10 were associated with better learning performance (all ps < .05).
Current daily cannabis use was associated with lower levels of pro-inflammatory chemokines implicated in HIV pathogenesis, and these chemokines were linked to the cognitive domain of learning, which is commonly impaired in PWH. Cannabinoid-related reductions of MCP-1 and IP-10, if confirmed, suggest a role for medicinal cannabis in mitigating the persistent inflammation and cognitive impacts of HIV.
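A minimal sketch of the group comparison described in the methods above (Kruskal–Wallis across the four study groups; hypothetical column names):

```python
# Sketch of the Kruskal-Wallis comparison of a CSF biomarker across the four
# study groups. Column and group names are hypothetical.
import pandas as pd
from scipy.stats import kruskal

df = pd.read_csv("csf_biomarkers.csv")  # hypothetical participant-level data
groups = [g["csf_mcp1"].dropna() for _, g in df.groupby("study_group")]

stat, p = kruskal(*groups)
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.3f}")
```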
This study investigated death anxiety in patients with primary brain tumor (PBT). We examined the psychometric properties of two validated death anxiety measures and determined the prevalence and possible determinants of death anxiety in this often-overlooked population.
Two cross-sectional studies in neuro-oncology were conducted. In Study 1, 81 patients with PBT completed psychological questionnaires, including the Templer Death Anxiety Scale (DAS). In Study 2, 109 patients with PBT completed similar questionnaires, including the Death and Dying Distress Scale (DADDS). Medical and disease-specific variables were collected across participants in both studies. Psychometric properties, including construct validity, internal consistency, and concurrent validity, were investigated. Levels of distress were analyzed using frequencies, and determinants of death anxiety were identified using logistic regression.
The DADDS was more psychometrically sound than the DAS in patients with PBT. Overall, 66% of patients with PBT endorsed at least one symptom of distress about death and dying, with 48% experiencing moderate to severe death anxiety. Generalized anxiety symptoms and fear of recurrence significantly predicted death anxiety.
Significance of results
The DADDS is a more appropriate instrument than the DAS to assess death anxiety in neuro-oncology. The proportion of patients with PBT who experience death anxiety appears to be higher than in other advanced cancer populations. Death anxiety is a highly distressing symptom, especially when coupled with generalized anxiety and fears of disease progression, which appears to be the case in patients with PBT. Our findings call for routine monitoring and the treatment of death anxiety in neuro-oncology.
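Internal consistency, one of the psychometric properties examined in these studies, is conventionally summarized with Cronbach's alpha; a minimal, self-contained sketch with simulated data:

```python
# Sketch: Cronbach's alpha for a multi-item scale (e.g., DADDS items).
# `items` is a respondents x items matrix; the demo data are simulated.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
demo = rng.normal(size=(100, 15))  # hypothetical 15-item questionnaire
print(f"alpha = {cronbach_alpha(demo):.2f}")
```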
In the field of transmission electron microscopy, data interpretation often lags behind acquisition methods, as image processing must often be manually tailored to individual datasets. Machine learning offers a promising approach for fast, accurate analysis of electron microscopy data. Here, we demonstrate a flexible two-step pipeline for the analysis of high-resolution transmission electron microscopy data, which uses a U-Net for segmentation followed by a random forest for the detection of stacking faults. Our trained U-Net is able to segment nanoparticle regions from the amorphous background with a Dice coefficient of 0.8 and significantly outperforms traditional image segmentation methods. Using these segmented regions, we are then able to classify whether nanoparticles contain a visible stacking fault with 86% accuracy. We provide this adaptable pipeline as an open-source tool for the community. The combined output of the segmentation network and the classifier offers a way to determine statistical distributions of features of interest, such as size, shape, and defect presence, enabling the detection of correlations between these features.
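The Dice coefficient quoted above measures overlap between predicted and ground-truth segmentation masks; a minimal sketch of the metric, plus the downstream random-forest step, under hypothetical inputs:

```python
# Sketch: the Dice coefficient for binary segmentation masks, plus the second
# pipeline stage (a random forest on per-particle features). Data hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2|A & B| / (|A| + |B|) for binary masks A and B."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

# Stage 2: classify stacking-fault presence from features of each segmented
# nanoparticle region (feature choice here is purely illustrative).
X = np.random.rand(200, 8)          # hypothetical per-particle feature vectors
y = np.random.randint(0, 2, 200)    # 1 = visible stacking fault
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.score(X, y))
```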
Over 50% of adoptions are transracial, involving primarily White parents and children of color from different ethnic or racial backgrounds. Transracial adoptive (TRA) parents are tasked with providing ethnic–racial socialization processes (ERS) to support TRA adoptees’ ethnic–racial identity development and prepare them to cope with ethnic–racial discrimination. However, unlike nonadoptive families of color, TRA parents lack shared cultural history with adoptees and have limited experience navigating racial discrimination. Knowledge of ERS among TRA families has centered on unidirectional processes between parenting constructs, ERS processes, and children's functioning. However, ERS processes in this population have complexities and nuances that warrant more sensitive and robust conceptualization. This paper proposes a process-oriented dynamic ecological model of the system of ERS, situating transacting processes in and across multiple family levels (parent, adoptee, family) and incorporating developmental and contextual considerations. With its framing of the complexities in ERS among TRA families, the model offers three contributions: a conceptual organization of parenting constructs related to ERS, a more robust understanding of ERS processes that inform how parents provide ERS, and framing of transacting processes within and between parenting constructs, ERS processes, and children's functioning. Implications for research, policy, and practice are discussed.
Concerns persist regarding possible false-negative results that may compromise COVID-19 containment. Although the true false-negative rate cannot be obtained directly, real-life observations suggest a possible false-negative rate of approximately 2.3%. Use of a sensitive, amplified RNA platform should reassure healthcare systems.
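As an illustration of how such an observational estimate can be framed (the counts below are hypothetical, chosen only to reproduce a rate near 2.3%; they are not the study's data):

```python
# Sketch: framing a false-negative rate as a binomial proportion with an
# exact (Clopper-Pearson) 95% CI. Counts are hypothetical illustrations only.
from statsmodels.stats.proportion import proportion_confint

false_negatives = 7        # hypothetical: initially negative, later confirmed
presumed_infections = 300  # hypothetical denominator

rate = false_negatives / presumed_infections
lo, hi = proportion_confint(false_negatives, presumed_infections, method="beta")
print(f"FNR ~ {rate:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```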