Healthcare personnel with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection were interviewed to describe activities and practices in and outside the workplace. Among 2,625 healthcare personnel, workplace-related factors that may increase infection risk were more common among nursing-home personnel than hospital personnel, whereas selected factors outside the workplace were more common among hospital personnel.
This essay examines the narrative and representational tactics of Matthew Desmond's Evicted: Poverty and Profit in the American City (2016). Rather than read this book solely in terms of its findings, this essay argues that Desmond attempts to stylistically embody the relationship between market culture, eviction, and the political delegitimation of the poor. Evicted also reworks the sociological “community study” by refashioning literary templates from writers such as Jacob Riis, Charles Dickens, Jane Jacobs, and Hannah Arendt. By fusing such debts together, Evicted powerfully connects its account of eviction's toll to the broader but too often overlooked relationship between poverty and citizenship.
Chronic psychotic disorders (CPDs) occur worldwide and cause significant burden. Poor medication adherence is pervasive, but has not been well studied in sub-Saharan Africa.
This cross-sectional survey of 100 poorly adherent Tanzanian patients with CPD characterised clinical features associated with poor adherence.
Descriptive statistics characterised demographic and clinical variables, including barriers to adherence, adherence behaviours and attitudes, and psychiatric symptoms. Measures included the Tablets Routine Questionnaire, Drug Attitudes Inventory, the Brief Psychiatric Rating Scale, the Clinical Global Impressions scale, the Alcohol Use Disorders Identification Test and Alcohol, Smoking and Substance Involvement Screening Test. The relationship between adherence and other clinical variables was evaluated.
Mean age was 35.7 years (s.d. 8.8), 61% were male and 80% had schizophrenia, with a mean age at onset of 22.4 (s.d. 7.6) years. Mean proportion of missed CPD medication was 64%. One in ten had alcohol dependence. Most individuals had multiple adherence barriers. Most clinical variables were not significantly associated with the Tablets Routine Questionnaire; however, in-patients with CPD were more likely to have worse adherence (P ≤ 0.01), as were individuals with worse medication attitudes (Drug Attitudes Inventory, P < 0.01), higher CPD symptom severity levels (Brief Psychiatric Rating Scale, P < 0.001) and higher-risk use of alcohol (Alcohol Use Disorders Identification Test, P < 0.001).
Poorly adherent patients had multiple barriers to adherence, including poor attitudes toward medication and treatment, high illness acuity and substance use comorbidity. Treatments need to address adherence barriers, and consider family supports and challenges from an intergenerational perspective.
This is an epidemiological study of carbapenem-resistant Enterobacteriaceae (CRE) in Veterans Affairs medical centers (VAMCs). In 2017, almost 75% of VAMCs had at least 1 CRE case. We observed substantial geographic variability, with more cases in urban, complex facilities. This supports the benefit of tailoring infection control strategies to facility characteristics.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are gram-negative bacteria resistant to at least 1 carbapenem and are associated with high mortality (50%). Carbapenemase-producing CRE (CP-CRE) are particularly serious because they are more likely to transmit carbapenem resistance genes to other gram-negative bacteria and they are resistant to all carbapenem antibiotics. Few studies have evaluated risk factors associated with CP-CRE colonization. The goal of this study was to determine the risk factors associated with CP-CRE colonization in a cohort of US veterans. Methods: We conducted a retrospective cohort study of patients seen at VA medical centers between 2013 and 2018 who had positive cultures for CRE from any site, defined by resistance to at least 1 of the following carbapenems: imipenem, meropenem, doripenem, or ertapenem. CP-CRE was defined via antibiotic sensitivity data that coded the culture as being ‘carbapenemase producing,’ being ‘Hodge test positive,’ or ‘KPC producing.’ Only the first positive culture for CRE was included. Patient demographics (year of culture, age, sex, race, major comorbidities, infectious organism, culture site, inpatient status, and CP-CRE status) and facility demographics (rurality, geographic region, and facility complexity) were collected. Bivariate analysis and multiple logistic regression were performed to determine variables associated with CP-CRE versus non–CP-CRE. Results: In total, 3,322 patients were identified with a positive CRE culture: 546 (16.4%) with CP-CRE and 2,776 (83.6%) with non–CP-CRE. Most patients were men (95%), older (mean age, 71 years; SD, 12.5), and diagnosed at a high-complexity VA medical center (65%). Most cultures were urine (63%), followed by sputum (13%) and blood (7%). Cultures came most often from inpatients (46%), followed by outpatients (42%) and long-term care facilities (12%).
Multivariable analysis showed the following variables to be associated with CP-CRE-positive cultures: congestive heart failure (P = .0136), African American race (P = .0760), Klebsiella spp (P < .0001), GI cancers (P = .0087), culture collected in 2017 (P = .0004), and culture collected in 2018 (P < .0001). There were also significant differences in CP-CRE frequencies by geographic region (P < .001). Discussion: CP-CRE diagnoses are relatively rare; however, the serious complications associated with them make these infections important to investigate. In our analysis, we found that congestive heart failure and gastric cancer were comorbidities strongly associated with CP-CRE. In 2017, the VA formalized its CP-CRE definition, which led to more accurate reporting. Conclusions: After the guideline was implemented, CP-CRE detection dramatically increased in noncontinental US facilities. More work should be done in the future to determine the different risk factors between non–CP-CRE and CP-CRE infections.
Dialysis patients may not have access to conventional renal replacement therapy (RRT) following disasters. We hypothesized that improvised renal replacement therapy (ImpRRT) would be comparable to continuous renal replacement therapy (CRRT) in a porcine acute kidney injury model.
Following bilateral nephrectomies and 2 hours of caudal aortic occlusion, 12 pigs were randomized to 4 hours of ImpRRT or CRRT. In the ImpRRT group, blood was circulated through a dialysis filter using a rapid infuser to collect the ultrafiltrate. Improvised replacement fluid, made with stock solutions, was infused pre-pump. In the CRRT group, commercial replacement fluid was used. During RRT, animals received isotonic crystalloids and norepinephrine.
There were no differences in serum creatinine, calcium, magnesium, or phosphorus concentrations. While there was a difference between groups in serum potassium concentration over time (P < 0.001), significance was lost in pairwise comparison at specific time points. Replacement fluids or ultrafiltrate flows did not differ between groups. There were no differences in lactate concentration, isotonic crystalloid requirement, or norepinephrine doses. No difference was found in electrolyte concentrations between the commercial and improvised replacement solutions.
The ImpRRT system achieved similar performance to CRRT and may represent a potential option for temporary RRT following disasters.
OBJECTIVES/GOALS: African-Americans have a 3-fold higher risk of end-stage kidney disease (ESKD) compared to Whites, due in part to APOL1 risk alleles. Whether resistant hypertension (RH) magnifies the risk of ESKD among African-Americans beyond APOL1 is not known. We examined the interaction between RH and race on ESKD risk and the independent effect of RH beyond APOL1. METHODS/STUDY POPULATION: We designed a retrospective cohort of 240,038 veterans with HTN, enrolled in the Million Veteran Program, with an estimated glomerular filtration rate (eGFR) >30 ml/min/1.73m2. The primary exposure was incident RH (time-varying). The primary outcome was incident ESKD during a 13.5-year follow-up (2004–2017). Secondary outcomes were myocardial infarction (MI), stroke, and death. Incident RH was defined as failure to achieve outpatient blood pressure (BP) <140/90 mmHg with 3 antihypertensive drugs, including a thiazide, or use of 4 or more drugs. Poisson models were used to estimate incidence rates and test additive interaction with race and APOL1 genotype. Multivariable Cox models (with Fine-Gray competing-risks models as sensitivity analyses) were used to examine independent effects. RESULTS/ANTICIPATED RESULTS: The cohort comprised 235,046 veterans; median age was 60 years; 21% were African-American and 6% were women, with 23,010 incident RH cases observed over a median follow-up of 10.2 years [interquartile range, 5.6–12.6]. Patients with RH had higher incidence rates [per 1,000 person-years] of ESKD (4.5 vs. 1.3), myocardial infarction (6.5 vs. 3.0), stroke (16.4 vs. 7.6), and death (12.0 vs. 6.9) than patients with non-resistant hypertension (NRH). African-Americans with RH had a 2.6-fold higher risk of ESKD compared to African-Americans with NRH, 3-fold the risk of Whites with RH, and 9.6-fold the risk of Whites with NRH [p-interaction < .001].
Among African-Americans, RH was associated with a 2.2-fold (95% CI, 1.86–2.58) higher risk of incident ESKD in models adjusted for APOL1 genotype; in the subset of African-Americans with no APOL1 risk alleles, RH was associated with an adjusted 2.75-fold (95% CI, 2.00–3.50) higher risk of incident ESKD. DISCUSSION/SIGNIFICANCE OF IMPACT: RH was independently associated with a higher risk of ESKD and cardiovascular outcomes, especially among African-Americans. This elevated risk is independent of APOL1 genotype. Interventions that achieve BP targets among patients with RH could curtail the incidence of ESKD and cardiovascular outcomes in this high-risk population. CONFLICT OF INTEREST DESCRIPTION: None.
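The incidence rates quoted above are simple person-time calculations. A minimal sketch of the arithmetic, using hypothetical event counts and person-years chosen only to reproduce the ESKD figures (not the study's actual denominators):

```python
# Incidence rate per 1,000 person-years: events divided by person-time, scaled.
def rate_per_1000_py(events, person_years):
    return events / person_years * 1000.0

# Hypothetical counts that yield the ESKD rates reported in the abstract.
print(rate_per_1000_py(45, 10_000))   # RH group  → 4.5
print(rate_per_1000_py(13, 10_000))   # NRH group → 1.3
```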
We systematically reviewed implementation research targeting depression interventions in low- and middle-income countries (LMICs) to assess gaps in methodological coverage.
PubMed, CINAHL, PsycINFO, and EMBASE were searched for evaluations of depression interventions in LMICs reporting at least one implementation outcome published through March 2019.
A total of 8714 studies were screened, 759 were assessed for eligibility, and 79 studies met inclusion criteria. Common implementation outcomes reported were acceptability (n = 50; 63.3%), feasibility (n = 28; 35.4%), and fidelity (n = 18; 22.8%). Only four studies (5.1%) reported adoption or penetration, and three (3.8%) reported sustainability. The Sub-Saharan Africa region (n = 29; 36.7%) had the most studies. The majority of studies (n = 59; 74.7%) reported outcomes for a depression intervention implemented in pilot researcher-controlled settings. Studies commonly focused on Hybrid Type-1 effectiveness-implementation designs (n = 53; 67.1%), followed by Hybrid Type-3 (n = 16; 20.3%). Only 21 studies (26.6%) tested an implementation strategy, with the most common being revising professional roles (n = 10; 47.6%). The most common intervention modality was individual psychotherapy (n = 30; 38.0%). Common study designs were mixed methods (n = 27; 34.2%), quasi-experimental uncontrolled pre-post (n = 17; 21.5%), and individual randomized trials (n = 16; 20.3%).
Existing research has focused on early-stage implementation outcomes. Most studies have utilized Hybrid Type-1 designs, with the primary aim to test intervention effectiveness delivered in researcher-controlled settings. Future research should focus on testing and optimizing implementation strategies to promote scale-up of evidence-based depression interventions in routine care. These studies should use high-quality pragmatic designs and focus on later-stage implementation outcomes such as cost, penetration, and sustainability.
Dissipation of S-metolachlor, a soil-applied herbicide, on organic and mineral soils used for sugarcane production in Florida was evaluated in field studies from 2013 to 2016. S-metolachlor was applied PRE at 2,270 g ha−1 on organic and mineral soils with 75% and 1.6% organic matter, respectively. Dissipation of S-metolachlor was more rapid on mineral soils than on organic soils. Dissipation on organic soils followed a negative linear trend, resulting in half-lives (DT50) ranging from 50 to 126 d. S-metolachlor loss on organic soils was more rapid under high soil-moisture conditions than under corresponding low soil-moisture conditions. On mineral soils, dissipation of S-metolachlor followed an exponential decline. The DT50 of S-metolachlor on mineral soils ranged from 12 to 24 d. The short persistence of S-metolachlor on mineral soils was likely due to low organic matter content with limited adsorptive capability. The results indicate that organic matter content and soil moisture are important for persistence of S-metolachlor on organic and mineral soils used for sugarcane production in Florida.
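For the exponential (first-order) decline reported on mineral soils, the half-life follows from the fitted rate constant as DT50 = ln(2)/k. A sketch of that calculation on synthetic concentrations (not the field data; the true DT50 of 18 d is chosen to fall inside the reported 12–24 d range):

```python
import numpy as np

def dt50_first_order(days, conc):
    # Linearize C(t) = C0*exp(-k*t) as ln C = ln C0 - k*t and fit by least squares.
    slope, _ = np.polyfit(days, np.log(conc), 1)
    k = -slope                   # first-order rate constant, per day
    return np.log(2.0) / k       # half-life DT50, in days

# Synthetic concentration series generated with a true DT50 of 18 d.
t = np.array([0.0, 7.0, 14.0, 21.0, 28.0, 42.0])
c = 100.0 * np.exp(-np.log(2.0) / 18.0 * t)
print(round(dt50_first_order(t, c), 1))  # → 18.0
```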
Quaternary processes and environmental changes are often difficult to assess in remote subantarctic islands due to high surface erosion rates and overprinting of sedimentary products in locations that can be a challenge to access. We present a set of high-resolution, multichannel seismic lines and complementary multibeam bathymetry collected off the eastern (leeward) side of the subantarctic Auckland Islands, about 465 km south of New Zealand's South Island. These data constrain the erosive and depositional history of the island group, and they reveal an extensive system of sediment-filled valleys that extend offshore to depths that exceed glacial low-stand sea level. Although shallow, marine, U-shaped valleys and moraines are imaged, the rugged offshore geomorphology of the paleovalley floors and the stratigraphy of infill sediments suggest that the valley floors were shaped by subaerial fluvial erosion, and subsequently filled by lacustrine, fjord, and fluvial sedimentary processes.
Given the evidence of multi-parameter risk factors in shaping cognitive outcomes in aging, including sleep, inflammation, cardiometabolism, and mood disorders, multidimensional investigations of their impact on cognition are warranted. We sought to determine the extent to which self-reported sleep disturbances, metabolic syndrome (MetS) factors, cellular inflammation, depressive symptomatology, and diminished physical mobility were associated with cognitive impairment and poorer cognitive performance.
This is a cross-sectional study.
Participants with elevated, well-controlled blood pressure were recruited from the local community for a Tai Chi and healthy-aging intervention study.
One hundred forty-five older adults (72.7 ± 7.9 years old; 66% female), 54 (37%) with evidence of cognitive impairment (CI) based on Montreal Cognitive Assessment (MoCA) score ≤24, underwent medical, psychological, and mood assessments.
CI and cognitive domain performance were assessed using the MoCA. Univariate correlations were computed to determine relationships between risk factors and cognitive outcomes. Bootstrapped logistic regression was used to determine significant predictors of CI risk and linear regression to explore cognitive domains affected by risk factors.
The CI group was slower on the mobility task, satisfied more MetS criteria, and reported poorer sleep than normocognitive individuals (all p < 0.05). Multivariate logistic regression indicated that sleep disturbances, but no other risk factors, predicted increased risk of CI (OR = 2.00, 95% CI: 1.26–4.87, 99% CI: 1.08–7.48). Further examination of MoCA cognitive subdomains revealed that sleep disturbances predicted poorer executive function (β = –0.26, 95% CI: –0.51 to –0.06, 99% CI: –0.61 to –0.02), with lesser effects on visuospatial performance (β = –0.20, 95% CI: –0.35 to –0.02, 99% CI: –0.39 to 0.03) and memory (β = –0.29, 95% CI: –0.66 to –0.01, 99% CI: –0.76 to 0.08).
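The percentile-bootstrap idea behind confidence intervals like those above is to resample subjects with replacement and recompute the effect estimate on each resample; the 2.5th and 97.5th percentiles of the resampled estimates give a 95% CI. A minimal sketch using a hypothetical binary sleep-disturbance flag and a plain 2x2 odds ratio in place of the study's full regression model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 145 subjects (matching the sample size above), with a
# higher simulated probability of cognitive impairment when sleep is disturbed.
n = 145
sleep = rng.integers(0, 2, n)
p = np.where(sleep == 1, 0.45, 0.29)
ci_flag = (rng.random(n) < p).astype(int)

def odds_ratio(x, y):
    # 2x2-table odds ratio with a Haldane 0.5 correction to avoid division by zero.
    a = np.sum((x == 1) & (y == 1)) + 0.5
    b = np.sum((x == 1) & (y == 0)) + 0.5
    c = np.sum((x == 0) & (y == 1)) + 0.5
    d = np.sum((x == 0) & (y == 0)) + 0.5
    return (a * d) / (b * c)

# Percentile bootstrap: resample subjects with replacement, recompute the OR.
boots = [odds_ratio(sleep[idx], ci_flag[idx])
         for idx in (rng.integers(0, n, n) for _ in range(2000))]
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"OR = {odds_ratio(sleep, ci_flag):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```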
Our results indicate that the deleterious impact of self-reported sleep disturbances on cognitive performance was prominent over other risk factors and illustrate the importance of clinician evaluation of sleep in patients with or at risk of diminished cognitive performance. Future, longitudinal studies implementing a comprehensive neuropsychological battery and objective sleep measurement are warranted to further explore these associations.
Starting in 2016, we initiated a pilot tele-antibiotic stewardship program at 2 rural Veterans Affairs medical centers (VAMCs). Antibiotic days of therapy decreased significantly (P < .05) in the acute and long-term care units at both intervention sites, suggesting that tele-stewardship can effectively support antibiotic stewardship practices in rural VAMCs.
This contribution discusses results obtained from 3-D neutron diffraction and 2-D fabric analyser in situ deformation experiments on laboratory-prepared polycrystalline deuterated ice and ice containing a second phase. The two-phase samples used in the experiments are composed of an ice matrix with (1) air bubbles, (2) rigid, rhombohedral-shaped calcite and (3) rheologically soft, platy graphite. Samples were tested at 10°C below the melting point of deuterated ice, at ambient pressure, and at two strain rates of 1 × 10−5 s−1 (fast) and 2.5 × 10−6 s−1 (medium). The nature and distribution of the second phase controlled the rheological behaviour of the ice by pinning grain boundary migration. Peak stresses increased with the presence of second-phase particles and during fast strain rate cycles. Ice-only samples exhibit well-developed crystallographic preferred orientations (CPOs) and dynamically recrystallized microstructures, typifying deformation via dislocation creep, where the CPO intensity is influenced in part by the strain rate. CPOs are accompanied by a concentration of [c]-axes in cones about the compression axis, coinciding with increasing prismatic-<a> slip activity. Ice with second phases deformed in a relatively slower strain rate regime exhibits greater grain boundary migration and stronger CPO intensities than samples deformed at higher strain rates or strain rate cycles.
Sugarcane growers in Florida have been reporting reduced control of fall panicum with asulam, the main herbicide used for POST grass control. Therefore, outdoor container experiments were conducted to determine the response of four fall panicum populations from Florida to asulam applied alone and to evaluate whether a tank-mix combination with trifloxysulfuron enhances control. Asulam was applied at 230 to 7,400 g ai ha−1, corresponding to 1/16 to 2X the maximum labeled rate for a single application in sugarcane, with or without trifloxysulfuron at 16 g ai ha−1. Three fall panicum populations were collected from fields in which reduced control had been reported, while one population was from a field not used for sugarcane production but adjacent to a sugarcane field. The potency of asulam based on ED50 values (the rate required to cause 50% dry weight reduction at 28 d after treatment) ranged from 2,249 to 5,412 g ha−1 for tolerant populations with reported reduced fall panicum control, compared with 1,808 g ha−1 for the susceptible population from the field not used for sugarcane production, showing that the latter was most sensitive to asulam. Addition of trifloxysulfuron to asulam increased potency on fall panicum by 5- to 15-fold, indicating that the tank mix enhanced dry weight reduction for all populations. The probability of fall panicum survival (regrowth after aboveground biomass harvesting) at the labeled rate of asulam ranged from 2% to 47%, compared with 0% to 6% when trifloxysulfuron was added to the tank mix. Our results show a differential response of fall panicum populations in Florida to asulam that can be overcome by tank mixing with trifloxysulfuron, even for populations that are difficult to control in sugarcane, but they do not indicate evolved resistance to asulam.
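ED50 values like those above are typically obtained by fitting a log-logistic dose-response curve. A sketch using one common two-parameter form with the upper limit fixed at 100% of untreated dry weight (an assumption; the authors' exact model is not stated in the abstract), fit to synthetic data:

```python
import numpy as np

# Log-logistic dose-response: y = d / (1 + (dose/ED50)**b), upper limit d fixed.
# Linearizing gives ln(d/y - 1) = b*ln(dose) - b*ln(ED50), so a straight-line
# least-squares fit recovers both the slope b and the ED50.
def fit_ed50(dose, y, d=100.0):
    b, intercept = np.polyfit(np.log(dose), np.log(d / y - 1.0), 1)
    return np.exp(-intercept / b)

doses = np.array([230.0, 460.0, 925.0, 1850.0, 3700.0, 7400.0])  # g ai ha-1
y = 100.0 / (1.0 + (doses / 2000.0) ** 2.0)  # synthetic data, true ED50 = 2000
print(round(fit_ed50(doses, y)))  # → 2000
```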
Fomesafen is a protoporphyrinogen oxidase–inhibitor herbicide with an alternative mode of action that provides PRE weed control in strawberry [Fragaria×ananassa (Weston) Duchesne ex Rozier (pro sp.) [chiloensis×virginiana]] produced in a plasticulture setting in Florida. Plasticulture mulch could decrease fomesafen dissipation and increase crop injury in rotational crops. Field experiments were conducted in Balm, FL, to investigate fomesafen persistence and movement in soil in Florida strawberry systems during the 2014/2015 and 2015/2016 production cycles. Treatments included fomesafen applied preplant at 0, 0.42, and 0.84 kg ai ha−1. Soil samples were taken under the plastic from plots treated with fomesafen at 0.42 kg ha−1 throughout the production cycle. Fomesafen did not injure strawberry or decrease yield. Fomesafen concentration data for the 0.0- to 0.1-m soil depth were described using a three-parameter logistic function. The fomesafen 50% dissipation times (DT50) were 37 and 47 d for the 2014/2015 and 2015/2016 production cycles, respectively. Fomesafen was last detected in the 0.0- to 0.1-m soil depth at 167 and 194 d after treatment in the 2014/2015 and 2015/2016 production cycles, respectively. Fomesafen concentration was less than 25 ppb at the 0.1- to 0.2-m and 0.2- to 0.3-m depths on all sampling dates. Fomesafen concentration decreased significantly after strawberry was transplanted and likely leached during the overhead and drip irrigation used during crop establishment.
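The abstract names a three-parameter logistic dissipation function without giving its form. One plausible parameterization (an assumption, not the authors' stated model), with the 50% dissipation time read back off a synthetic curve:

```python
import numpy as np

# Assumed three-parameter logistic dissipation curve:
#   C(t) = c0 / (1 + exp(b * (t - t50)))
# where t50 is the day the concentration falls to c0/2, i.e., the DT50.
def logistic_conc(t, c0, b, t50):
    return c0 / (1.0 + np.exp(b * (t - t50)))

t = np.linspace(0.0, 200.0, 401)            # sampling days on a 0.5-d grid
c = logistic_conc(t, 420.0, 0.08, 37.0)     # hypothetical curve with DT50 = 37 d

# Recover DT50 from the sampled curve: first day the concentration is <= c0/2.
dt50 = t[np.argmax(c <= 210.0)]
print(dt50)  # → 37.0
```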
To test the feasibility of using telehealth to support antimicrobial stewardship at Veterans Affairs medical centers (VAMCs) that have limited access to infectious disease-trained specialists.
A prospective quasi-experimental pilot study.
Two rural VAMCs with acute-care and long-term care units.
At each intervention site, medical providers, pharmacists, infection preventionists, staff nurses, and off-site infectious disease physicians formed a videoconference antimicrobial stewardship team (VAST) that met weekly to discuss cases and antimicrobial stewardship-related education.
Descriptive measures included fidelity of implementation, number of cases discussed, infectious syndromes, types of recommendations, and acceptance rate of recommendations made by the VAST. Qualitative results stemmed from semi-structured interviews with VAST participants at the intervention sites.
Each site adapted the VAST to suit their local needs. On average, sites A and B discussed 3.5 and 3.1 cases per session, respectively. At site A, 98 of 140 cases (70%) were from the acute-care units; at site B, 59 of 119 cases (50%) were from the acute-care units. The most common clinical syndrome discussed was pneumonia or respiratory syndrome (41% and 35% for sites A and B, respectively). Providers implemented most VAST recommendations, with an acceptance rate of 73% (186 of 256 recommendations) and 65% (99 of 153 recommendations) at sites A and B, respectively. Qualitative results based on 24 interviews revealed that participants valued the multidisciplinary aspects of the VAST sessions and felt that it improved their antimicrobial stewardship efforts and patient care.
This pilot study has successfully demonstrated the feasibility of using telehealth to support antimicrobial stewardship at rural VAMCs with limited access to local infectious disease expertise.
Direct ink writing of silicone elastomers enables printing with precise control of porosity and mechanical properties of ordered cellular solids, suitable for shock absorption and stress mitigation applications. With the ability to manipulate structure and feedstock stiffness, the design space becomes challenging to parse to obtain a solution producing a desired mechanical response. Here, we derive an analytical design approach for a specific architecture. Results from finite element simulations and quasi-static mechanical tests of two different parallel strand architectures were analyzed to understand the structure-property relationships under uniaxial compression. Combining effective stiffness-density scaling with least squares optimization of the stress responses yielded general response curves parameterized by resin modulus and strand spacing. An analytical expression of these curves serves as a reduced order model, which, when optimized, provides a rapid design capability for filament-based 3D printed structures. As a demonstration, the optimal design of a face-centered tetragonal architecture is computed that satisfies prescribed minimum and maximum load constraints.
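The effective stiffness-density scaling step can be illustrated with a Gibson-Ashby-style power law fit in log-log space by least squares; the exponent, prefactor, and resin modulus below are illustrative assumptions, not values from the study:

```python
import numpy as np

def fit_scaling(rho_rel, e_eff, e_s):
    # Fit log(E_eff/E_s) = log(C) + n*log(rho_rel) by linear least squares.
    n, log_c = np.polyfit(np.log(rho_rel), np.log(e_eff / e_s), 1)
    return n, np.exp(log_c)

rho = np.array([0.2, 0.3, 0.4, 0.5])     # relative densities of printed lattices
e_s = 1.2e6                              # hypothetical resin (strand) modulus, Pa
e_eff = 0.5 * e_s * rho ** 2.0           # synthetic data generated with n = 2, C = 0.5

n, C = fit_scaling(rho, e_eff, e_s)
print(round(n, 3), round(C, 3))  # → 2.0 0.5
```

Once the exponent and prefactor are fitted, the scaling law can be inverted to pick the strand spacing (via relative density) that meets a target stiffness, which is the rapid-design use the abstract describes.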