This research examined a potential nuisance aspect of the use of the volatility-reducing agent (VRA) potassium carbonate when combined with glyphosate in spray-tank mixtures. A VRA is now required to be added to dicamba applications to reduce off-target movement from volatility. When no VRA potassium carbonate was added to the spray mixture, there was no pressure buildup. The addition of VRA potassium carbonate plus glyphosate (which lowers the pH) resulted in an observed pressure buildup. Although the gas produced was not identified, it would be expected to be carbon dioxide formed by the reaction of the carbonate anion from the VRA with the acidic spray solution. Source water pH, over a range from 3.2 to 8.2, had no effect on pressure buildup. Pressure buildup was directly related to water temperature, with a linear response to temperature when the VRA was added last; in contrast, a less direct relationship between temperature and pressure buildup existed at temperatures >30 C when the VRA potassium carbonate was added first. There was no effect on the pressure increase from adding a defoamer or a drift control agent.
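The expectation that the gas is carbon dioxide is consistent with the textbook carbonate–acid reaction (the equations below are standard chemistry, not measurements from the study):

```latex
% K2CO3 dissolves, and the carbonate anion consumes acid (e.g., from the
% glyphosate-acidified mixture), evolving CO2 gas:
\mathrm{K_2CO_3 \longrightarrow 2\,K^{+} + CO_3^{2-}}
\qquad
\mathrm{CO_3^{2-} + 2\,H^{+} \longrightarrow H_2CO_3 \longrightarrow H_2O + CO_2\uparrow}
```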
From 2014 to 2020, we compiled radiocarbon ages from the lower 48 states, creating a database of more than 100,000 archaeological, geological, and paleontological ages that will be freely available to researchers through the Canadian Archaeological Radiocarbon Database. Here, we discuss the process used to compile ages, general characteristics of the database, and lessons learned from this exercise in “big data” compilation.
HIV-associated neurocognitive disorders (HANDs) are prevalent in older people living with HIV (PLWH) worldwide. HAND prevalence and incidence studies of the newly emergent population of combination antiretroviral therapy (cART)-treated older PLWH in sub-Saharan Africa are currently lacking. We aimed to estimate HAND prevalence and incidence using robust measures in stable, cART-treated older adults under long-term follow-up in Tanzania and report cognitive comorbidities.
A systematic sample of consenting HIV-positive adults aged ≥50 years attending routine clinical care at an HIV Care and Treatment Centre during March–May 2016 and followed up March–May 2017.
HAND by consensus panel Frascati criteria based on detailed locally normed low-literacy neuropsychological battery, structured neuropsychiatric clinical assessment, and collateral history. Demographic and etiological factors by self-report and clinical records.
In this cohort (n = 253, 72.3% female, median age 57), HAND prevalence was 47.0% (95% CI 40.9–53.2, n = 119) despite well-managed HIV disease (median CD4 count 516 (range 98–1719), 95.5% on cART). Of these, 64 (25.3%) had asymptomatic neurocognitive impairment, 46 (18.2%) mild neurocognitive disorder, and 9 (3.6%) HIV-associated dementia. One-year incidence was high (37.2%, 95% CI 25.9–51.8), but some reversibility (17.6%, 95% CI 10.0–28.6, n = 16) was observed.
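The reported prevalence interval can be reproduced closely with a standard Wilson score interval; this is a sketch, since the abstract does not state which interval method the authors used:

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# HAND prevalence: 119 cases among 253 participants (figures from the abstract)
lo, hi = wilson_ci(119, 253)
print(f"{119/253:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # → 47.0% (95% CI 41.0% to 53.2%)
```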
HAND appear highly prevalent in older PLWH in this setting, where the demographic profile differs markedly from that of high-income cohorts and comorbidities are frequent. Incidence and reversibility also appear high. Future studies should focus on etiologies and potentially reversible factors in this setting.
To develop a pediatric research agenda focused on pediatric healthcare-associated infections and antimicrobial stewardship topics that will yield the highest impact on child health.
The study included 26 geographically diverse adult and pediatric infectious diseases clinicians with expertise in healthcare-associated infection prevention and/or antimicrobial stewardship (topic identification and ranking of priorities), as well as members of the Division of Healthcare Quality and Promotion at the Centers for Disease Control and Prevention (topic identification).
Using a modified Delphi approach, expert recommendations were generated through an iterative process for identifying pediatric research priorities in healthcare-associated infection prevention and antimicrobial stewardship. The multistep, 7-month process included a literature review, interactive teleconferences, web-based surveys, and 2 in-person meetings.
A final list of 12 high-priority research topics was generated across the 2 domains. High-priority healthcare-associated infection topics included judicious testing for Clostridioides difficile infection, chlorhexidine (CHG) bathing, measurement and prevention of hospital-onset bloodstream infections, surgical site infection prevention, and surveillance and prevention of multidrug-resistant gram-negative rod infections. Antimicrobial stewardship topics included β-lactam allergy de-labeling, judicious use of perioperative antibiotics, intravenous-to-oral conversion of antimicrobial therapy, development of a patient-level “harm index” for antibiotic exposure, and benchmarking and/or peer comparison of antibiotic use for common inpatient conditions.
We identified 6 healthcare-associated infection topics and 6 antimicrobial stewardship topics as potentially high-impact targets for pediatric research.
Proximal environments could facilitate smoking cessation among low-income smokers by making cessation appealing to strive for and tenable.
We sought to examine how home smoking rules and proximal environmental factors such as other household members' and peers' smoking behaviors and attitudes related to low-income smokers' past quit attempts, readiness, and self-efficacy to quit.
This analysis used data from the baseline survey of the Offering Proactive Treatment Intervention (OPT-IN) study, a randomized controlled trial of proactive tobacco cessation outreach, which was completed by 2,406 participants in 2011–2012. We tested the associations between predictors (home smoking rules and proximal environmental factors) and outcomes (past-year quit attempts, readiness to quit, and quitting self-efficacy).
Smokers who lived in homes with more restrictive household smoking rules, and/or reported having ‘important others’ who would be supportive of their quitting, were more likely to report having made a quit attempt in the past year and had greater readiness to quit and greater self-efficacy related to quitting.
Adjustments to proximal environments, including strengthening household smoking rules, might encourage cessation even if other household members are smokers.
The South China Sea (SCS) is a biodiversity hotspot; however, most biodiversity surveys in the region are confined to shallow-water reefs (SWRs). Here, we studied the benthic habitat and fish assemblages in upper mesophotic coral ecosystems (MCEs; 30–40 m) and SWRs (8–22 m) at three geographic locations (Luzon Strait, Palawan, and the Kalayaan Group of Islands) in the eastern SCS (also called the West Philippine Sea) using diver-based survey methods. Mean coral genus and fish species richness in MCEs ranged from 17 to 25 (per 25 m²) and 11 to 17 (per 250 m²), respectively, although none were novel genera or species. Coral and fish assemblages were structured more strongly by location than by depth. Location differences were associated with variability in benthic composition, wherein locations with higher hard coral cover had higher coral genus richness and abundance. Locations with higher algae and sand cover had higher diversity and density of herbivorous and benthic invertivorous fishes. Fishing effort may also have contributed to among-location differences, as the most heavily exploited location had the lowest fish biomass. The low variation between depths may be attributed to the similar benthic composition at each location, the interconnectivity between depths due to hydrological conditions, fish motility, and the common fishing gears used in the Philippines, which can likely extend beyond SWRs. Results imply that local-scale factors and anthropogenic disturbances probably dampen across-depth structuring of coral genus and fish species assemblages.
In 2010, South Africa (SA) hosted the Fédération Internationale de Football Association (FIFA) World Cup (soccer). Emergency Medical Services (EMS) used the SA mass gathering medicine (MGM) resource model to predict resource allocation. This study analyzed data from the World Cup and compared them with the resource allocation predicted by the SA mass gathering model.
Prospectively, data were collected from patient contacts at 9 venues across the Western Cape province of South Africa. Required resources were based on the number of patients seeking basic life support (BLS), intermediate life support (ILS), and advanced life support (ALS). Overall patient presentation rates (PPRs) and transport to hospital rates (TTHRs) were also calculated.
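PPR and TTHR in mass gathering medicine are conventionally expressed per 10,000 attendees; a minimal sketch of the calculation (the total patient count of 1,631 is back-calculated from the abstract's 78.4% BLS figure, while the attendance and transport counts are hypothetical):

```python
def rate_per_10k(events: int, attendance: int) -> float:
    """Events per 10,000 attendees (used for both PPR and TTHR)."""
    return events / attendance * 10_000

# Illustrative venue: 1,631 patient contacts, 120 transports, 350,000 attendees
ppr = rate_per_10k(1631, 350_000)   # patient presentation rate
tthr = rate_per_10k(120, 350_000)   # transport-to-hospital rate
print(f"PPR = {ppr:.1f}, TTHR = {tthr:.1f} per 10,000 attendees")
# → PPR = 46.6, TTHR = 3.4 per 10,000 attendees
```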
BLS services were required for 78.4% (n = 1279) of patients and were consistently overestimated using the SA mass gathering model. ILS services were required for 14.0% (n = 228), and ALS services were required for 3.1% (n = 51) of patients. ILS and ALS services, as well as TTHR, were underestimated at smaller venues.
The MGM predictive model overestimated BLS requirements and inconsistently predicted ILS and ALS requirements. MGM resource models, which are heavily based on predicted attendance levels, have inherent limitations, which may be improved by using research-based outcomes.
Traditional ambulatory rhythm monitoring in children can have limitations, including cumbersome leads and limited monitoring duration. The Zio™ patch ambulatory monitor is a small, adhesive, single-channel rhythm monitor that can be worn for up to 2 weeks. In this study, we present a retrospective cross-sectional analysis of the Zio™ monitor’s impact in clinical practice. Patients aged 0–18 years were included in the study. A total of 373 studies were reviewed in 332 patients. In all, 28.4% had structural heart disease, and 16.9% had a prior surgical, catheterisation, or electrophysiology procedure. The most common indication for monitoring was tachypalpitations (41%); 93.5% of these patients had their symptoms captured during the study window. The median duration of monitoring was 5 days. Overall, 5.1% of Zio™ monitoring studies identified arrhythmias requiring new intervention or increased medical management, and 4.0% identified arrhythmias requiring increased clinical surveillance. The remainder had either normal-variant rhythm or minor rhythm findings requiring no change in management. Of patients with tachypalpitations and no structural heart disease, 13.2% had pathological arrhythmias, but 72.9% had normal-variant rhythm during symptoms, allowing discharge from cardiology care. Notably, among patients with findings requiring intervention or increased surveillance, 56% had findings first identified beyond 24 hours, and only 62% of findings were patient-triggered. Seven studies (1.9%) were associated with complications or patient intolerance. The Zio™ is a well-tolerated device that may improve arrhythmia detection relative to traditional Holter and event monitoring in paediatric cardiology patients. This study shows a positive clinical impact on the management of patients within a paediatric cardiology practice.
OBJECTIVES/SPECIFIC AIMS: The purpose of the present secondary data analysis was to examine the effect of moderate-severe disturbed sleep before the start of radiation therapy (RT) on subsequent RT-induced pain. METHODS/STUDY POPULATION: Analyses were performed on 676 RT-naïve breast cancer patients (mean age 58, 100% female) scheduled to receive RT from a previously completed nationwide, multicenter, phase II randomized controlled trial examining the efficacy of oral curcumin on radiation dermatitis severity. The trial was conducted at 21 community oncology practices throughout the US affiliated with the University of Rochester Cancer Center NCI’s Community Oncology Research Program (URCC NCORP) Research Base. Sleep disturbance was assessed using a single-item question from the modified MD Anderson Symptom Inventory (SI) on a 0–10 scale, with higher scores indicating greater sleep disturbance. Total subjective pain as well as the subdomains of pain (sensory, affective, and perceived) were assessed by the short-form McGill Pain Questionnaire. Pain at the treatment site (pain-Tx) was also assessed using a single-item question from the SI. These assessments were administered pre-RT (baseline) and post-RT. For the present analyses, patients were dichotomized into 2 groups: those who had moderate-severe disturbed sleep at baseline (score ≥4 on the SI; n=101) versus those who had mild or no disturbed sleep (control group; score 0–3 on the SI; n=575). RESULTS/ANTICIPATED RESULTS: Prior to the start of RT, breast cancer patients with moderate-severe disturbed sleep at baseline were younger; less likely to have had lumpectomy or partial mastectomy and more likely to have had total mastectomy and chemotherapy; more likely to be on sleep, anti-anxiety/depression, and prescription pain medications; and more likely to suffer from depression or anxiety disorder than the control group (all p’s≤0.02).
Spearman rank correlations showed that changes in sleep disturbance from baseline to post-RT were significantly correlated with concurrent changes in total pain (r=0.38; p<0.001), sensory pain (r=0.35; p<0.001), affective pain (r=0.21; p<0.001), perceived pain intensity (r=0.37; p<0.001), and pain-Tx (r=0.35; p<0.001). In total, 92% of patients with moderate-severe disturbed sleep at baseline reported post-RT total pain compared with 79% of patients in the control group (p=0.006). Generalized estimating equations, after controlling for baseline pain and other covariates (baseline fatigue and distress, age, sleep medications, anti-anxiety/depression medications, prescription pain medications, and depression or anxiety disorder), showed that patients with moderate-severe disturbed sleep at baseline had significantly higher mean values of post-RT total pain (by 39%; p=0.033), post-RT sensory pain (by 41%; p=0.046), and post-RT affective pain (by 55%; p=0.035) than the control group. Perceived pain intensity (p=0.066) and pain-Tx (p=0.086) at post-RT were not significantly different between the 2 groups. DISCUSSION/SIGNIFICANCE OF IMPACT: These findings suggest that moderate-severe disturbed sleep prior to RT is an important predictor of worsening pain post-RT in breast cancer patients. There could be several plausible reasons for this. Sleep disturbance, such as sleep loss and sleep continuity disturbance, could result in impaired sleep-related recovery and repair of tissue damage associated with cancer and its treatment, thus resulting in the amplification of pain. Sleep disturbance may also reduce the pain tolerance threshold through increased sensitization of the central nervous system. In addition, pain and sleep disturbance may share common neuroimmunological pathways. Sleep disturbance may modulate inflammation, which in turn may contribute to increased pain.
Further research is needed to confirm these findings and to determine whether interventions targeting sleep disturbance at an early stage could be a potential alternative approach to reducing pain after RT.
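The Spearman rank correlation used in this analysis is simply the Pearson correlation of the ranks, and can be computed without external libraries; a minimal sketch (the paired change scores below are invented for illustration, not study data):

```python
def _ranks(xs):
    """Average ranks (1-based), with ties sharing the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy data: change in sleep-disturbance score vs. change in total pain
d_sleep = [2, -1, 0, 3, 1, -2, 4, 0]
d_pain  = [5, -2, 1, 4, 3, -1, 6, 0]
print(round(spearman(d_sleep, d_pain), 2))  # → 0.95
```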
Insomnia and depression are highly comorbid and mutually exacerbate clinical trajectories and outcomes. Cognitive behavioral therapy for insomnia (CBT-I) effectively reduces both insomnia and depression severity and can be delivered digitally. This could substantially increase access to CBT-I and thereby reduce health disparities related to insomnia; however, the efficacy of digital CBT-I (dCBT-I) across a range of demographic groups has not yet been adequately examined. This randomized placebo-controlled trial examined the efficacy of dCBT-I in reducing both insomnia and depression across a wide range of demographic groups.
Of 1358 individuals with insomnia randomized, a final sample of 358 were retained in the dCBT-I condition and 300 in the online sleep education condition. Severity of insomnia and depression was examined as a dependent variable. Race, socioeconomic status (SES; household income and education), gender, and age were also tested as independent moderators of treatment effects.
The dCBT-I condition yielded greater reductions in both insomnia and depression severity than sleep education, with significantly higher rates of remission following treatment. Demographic variables (i.e. income, race, sex, age, education) were not significant moderators of the treatment effects, suggesting that dCBT-I is comparably efficacious across a wide range of demographic groups. Furthermore, while differences in attrition were found based on SES, attrition did not differ between white and black participants.
Results provide evidence that the wide dissemination of dCBT-I may effectively target both insomnia and comorbid depression across a wide spectrum of the population.
In September 2016, the annual meeting of the International Union for Quaternary Research’s Loess and Pedostratigraphy Focus Group, traditionally referred to as a LoessFest, met in Eau Claire, Wisconsin, USA. The 2016 LoessFest focused on “thin” loess deposits and loess transportation surfaces. This LoessFest included 75 registered participants from 10 countries. Almost half of the participants were from outside the United States, and 18 of the participants were students. This review is the introduction to the special issue for Quaternary Research that originated from presentations and discussions at the 2016 LoessFest. This introduction highlights current understanding and ongoing work on loess in various regions of the world and provides brief summaries of some of the current approaches/strategies used to study loess deposits.
Field identification of ST-elevation myocardial infarction (STEMI) and advanced hospital notification decreases first-medical-contact-to-balloon (FMC2B) time. A recent study in this system found that electrocardiogram (ECG) transmission following a STEMI alert was frequently unsuccessful.
Instituting weekly test ECG transmissions from paramedic units to the hospital would increase successful transmission of ECGs and decrease FMC2B and door-to-balloon (D2B) times.
This was a natural experiment of consecutive patients with field-identified STEMI transported to a single percutaneous coronary intervention (PCI)-capable hospital in a regional STEMI system before and after implementation of scheduled test ECG transmissions. In November 2014, paramedic units began weekly test transmissions. The mobile intensive care nurse (MICN) confirmed the transmission or, if it was not received, contacted the paramedic unit and the department’s nurse educator to identify and resolve the problem. Per system-wide protocol, paramedics transmit all ECGs with interpretation of STEMI. Receiving hospitals submit patient data to a single registry as part of ongoing system quality improvement. The frequency of successful ECG transmission and time to intervention (FMC2B and D2B times) in the 18 months following implementation was compared to the 10 months prior. Post-implementation, the time the ECG transmission was received was also collected to determine the transmission gap time (time from ECG acquisition to ECG transmission received) and the advanced notification time (time from ECG transmission received to patient arrival).
There were 388 patients with field ECG interpretations of STEMI, 131 pre-intervention and 257 post-intervention. The frequency of successful transmission post-intervention was 73% compared to 64% prior; risk difference (RD)=9%; 95% CI, 1-18%. In the post-intervention period, the median FMC2B time was 79 minutes (inter-quartile range [IQR]=68-102) versus 86 minutes (IQR=71-108) pre-intervention (P=.3) and the median D2B time was 59 minutes (IQR=44-74) versus 60 minutes (IQR=53-88) pre-intervention (P=.2). The median transmission gap was three minutes (IQR=1-8) and median advanced notification time was 16 minutes (IQR=10-25).
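The reported risk difference can be sketched with a standard Wald interval; the counts below are back-calculated approximations from the reported percentages, so the resulting interval differs slightly from the published 1–18% CI:

```python
import math

def risk_difference_ci(x1: int, n1: int, x2: int, n2: int, z: float = 1.96):
    """Wald confidence interval for a difference of two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    rd = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - z * se, rd + z * se

# Approximate counts: ~73% of 257 post-intervention, ~64% of 131 pre-intervention
rd, lo, hi = risk_difference_ci(188, 257, 84, 131)
print(f"RD = {rd:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```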
Implementation of weekly test ECG transmissions was associated with improvement in successful real-time transmissions from field to hospital, which provided a median advanced notification time of 16 minutes, but no decrease in FMC2B or D2B times.
Objectives: How functional decline co-evolves with cognitive decline in older adults has yet to be carefully characterized. Most models of neurodegenerative disease postulate that cognitive decline predates and potentially leads to declines in everyday functional abilities; however, there is mounting evidence that subtle decline in instrumental activities of daily living (IADLs) may be detectable in older individuals who are still cognitively normal. Methods: The present study examines how the relationship between change in cognition and change in IADLs is best characterized among older adults who participated in the ACTIVE trial. Neuropsychological and IADL data were analyzed for 2802 older adults who were cognitively normal at study baseline and followed for up to 10 years. Results: Findings demonstrate that subtle, self-perceived difficulties in performing IADLs preceded and predicted subsequent declines on cognitive tests of memory, reasoning, and speed of processing. Conclusions: Findings are consistent with a growing body of literature suggesting that subjective changes in everyday abilities can be associated with more precipitous decline on objective cognitive measures and the development of mild cognitive impairment and dementia. (JINS, 2018, 24, 104–112)
With improvements in early survival following congenital heart surgery, it has become increasingly important to understand longer-term outcomes; however, routine collection of these data is challenging and remains very limited. We describe the development and initial results of a collaborative programme incorporating standardised longitudinal follow-up into usual care at the Children’s Hospital of Philadelphia (CHOP) and University of Michigan (UM).
We included children undergoing benchmark operations of the Society of Thoracic Surgeons. Considerations regarding personnel, patient/parent engagement, funding, regulatory issues, and annual data collection are described, and initial follow-up rates are reported.
The present analysis included 1737 eligible patients undergoing surgery at CHOP from January 2007 to December 2014 and 887 UM patients from January 2010 to December 2014. Overall, follow-up data of any type were obtained from 90.8% of patients at CHOP (median follow-up 4.3 years, 92.2% survival) and 98.3% at UM (median follow-up 2.8 years, 92.7% survival), with similar rates across operations and institutions. Most patients lost to follow-up at CHOP had undergone surgery before 2010. Standardised questionnaires assessing burden of disease/quality of life were completed by 80.2% (CHOP) and 78.4% (UM) via phone follow-up. In subsequent pilot testing of an automated e-mail system, 53.4% of eligible patients completed the follow-up questionnaire through this system.
Standardised follow-up data can be obtained on the majority of children undergoing benchmark operations. Ongoing efforts to support automated electronic systems and integration with registry data may reduce resource needs, facilitate expansion across centres, and support multi-centre efforts to understand and improve long-term outcomes in this population.
To develop an automated method for ventilator-associated condition (VAC) surveillance and to compare its accuracy and efficiency with manual VAC surveillance.
The intensive care units (ICUs) of 4 hospitals
This study was conducted at Detroit Medical Center, a tertiary care center in metropolitan Detroit. A total of 128 ICU beds in 4 acute care hospitals were included during the study period from August to October 2013. The automated VAC algorithm was implemented and utilized for 1 month by all study hospitals. Simultaneous manual VAC surveillance was conducted by 2 infection preventionists (IPs) and 1 infection control fellow who were blinded to one another’s findings and to the automated VAC algorithm results. The VACs identified by the 2 surveillance processes were compared.
During the study period, 110 patients from all the included hospitals were mechanically ventilated and were evaluated for VAC for a total of 992 mechanical ventilation days. The automated VAC algorithm identified 39 VACs with sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of 100%. In comparison, the combined efforts of the IPs and the infection control fellow detected 58.9% of VACs, with 59% sensitivity, 99% specificity, 91% PPV, and 92% NPV. Moreover, the automated VAC algorithm was extremely efficient, requiring only 1 minute to detect VACs over a 1-month period, compared to 60.7 minutes using manual surveillance.
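The accuracy measures above derive from a standard 2×2 confusion matrix against the set of true VACs. A minimal sketch follows; only the 39 algorithm-confirmed VACs and the 58.9% manual detection rate (23 of 39) come from the abstract, while the non-VAC counts are illustrative:

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV, and NPV from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Manual surveillance vs. the 39 confirmed VACs: 23 detected, 16 missed;
# the false-positive and true-negative counts are hypothetical.
m = diagnostic_metrics(tp=23, fp=2, fn=16, tn=200)
print({k: round(v, 2) for k, v in m.items()})
# → {'sensitivity': 0.59, 'specificity': 0.99, 'ppv': 0.92, 'npv': 0.93}
```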
The automated VAC algorithm is efficient and accurate and is ready to be used routinely for VAC surveillance. Furthermore, its implementation can optimize the sensitivity and specificity of VAC identification.
Infect. Control Hosp. Epidemiol. 2015;36(9):999–1003
In North America, terrestrial records of biodiversity and climate change that span Marine Oxygen Isotope Stage (MIS) 5 are rare. Where found, they provide insight into how the coupling of the ocean–atmosphere system is manifested in biotic and environmental records and how the biosphere responds to climate change. In 2010–2011, construction at Ziegler Reservoir near Snowmass Village, Colorado (USA) revealed a nearly continuous, lacustrine/wetland sedimentary sequence that preserved evidence of past plant communities between ~140 and 55 ka, including all of MIS 5. At an elevation of 2705 m, the Ziegler Reservoir fossil site also contained thousands of well-preserved bones of late Pleistocene megafauna, including mastodons, mammoths, ground sloths, horses, camels, deer, bison, black bear, coyotes, and bighorn sheep. In addition, the site contained more than 26,000 bones from at least 30 species of small animals including salamanders, otters, muskrats, minks, rabbits, beavers, frogs, lizards, snakes, fish, and birds. The combination of macro- and micro-vertebrates, invertebrates, terrestrial and aquatic plant macrofossils, a detailed pollen record, and a robust, directly dated stratigraphic framework shows that high-elevation ecosystems in the Rocky Mountains of Colorado are climatically sensitive and varied dramatically throughout MIS 5.
Antibiograms have effectively improved antibiotic prescribing in acute-care settings; however, their effectiveness in skilled nursing facilities (SNFs) is currently unknown.
To develop SNF-specific antibiograms and identify opportunities to improve antibiotic prescribing.
Design and Setting.
Cross-sectional and pretest-posttest study among residents of 3 Maryland SNFs.
Antibiograms were created using clinical culture data from a 6-month period in each SNF. We also used admission clinical culture data from the acute care facility primarily associated with each SNF for transferred residents. We manually collected all data from medical charts, and antibiograms were created using WHONET software. We then used a pretest-posttest study to evaluate the effectiveness of an antibiogram on changing antibiotic prescribing practices in a single SNF. Appropriate empirical antibiotic therapy was defined as an empirical antibiotic choice that sufficiently covered the infecting organism, considering antibiotic susceptibilities.
We reviewed 839 patient charts from SNF and acute care facilities. During the initial assessment period, 85% of initial antibiotic use in the SNFs was empirical, and thus only 15% of initial antibiotics were based on culture results. Fluoroquinolones were the most frequently used empirical antibiotics, accounting for 54.5% of initial prescribing instances. Among patients with available culture data, only 35% of empirical antibiotic prescribing was determined to be appropriate. In the single SNF in which we evaluated antibiogram effectiveness, prevalence of appropriate antibiotic prescribing increased from 32% to 45% after antibiogram implementation; however, this was not statistically significant (P = .32).
Implementation of antibiograms may be effective in improving empirical antibiotic prescribing in SNFs.