Background:
During the COVID-19 pandemic, rates of central line-associated bloodstream infections (CLABSI) increased nationally. Pre-pandemic studies showed improved CLABSI rates with implementation of a standardized vascular access team (VAT). VAT resources and coverage varied across our 10 acute care facilities (ACF) prior to and during the pandemic. VAT scope also varied in 1) the process for line selection during initial placement, 2) the ability to place a peripherally inserted central catheter (PICC), midline, or ultrasound-guided peripheral IV in patients with difficult vascular access, 3) ownership of daily assessment of central line (CL) necessity, and 4) routine CL dressing changes. We aimed to define and implement the ideal VAT structure and evaluate the impact on CLABSI standardized infection ratios (SIR) and rates prior to and during the pandemic.
Methods:
A multidisciplinary workgroup including representatives from nursing, infection prevention, and vascular access was formed to understand the current state of VAT responsibilities across all ACFs. The group identified key responsibilities a VAT should perform to aid in CLABSI prevention. Complete VAT coverage was defined as the ability to carry out the identified responsibilities daily. We compared SIR and CLABSI rates between hospitals with complete VAT (CVAT) coverage and hospitals with incomplete VAT (IVAT) coverage. Because this work occurred during the pandemic, we further stratified our analysis into pre-pandemic (1/2015 – 12/2019) and intra-pandemic (1/2020 – 12/2022) time frames.
Results:
The multidisciplinary team identified 6 key components of complete VAT coverage: assessment for appropriate line selection prior to insertion, ability to insert PICCs and midlines, daily CL and midline care and maintenance assessments, daily assessment of CL necessity, and weekly dressing changes for CLs and midlines. A crosswalk of VAT scope (Figure 1) performed in October 2022 revealed that two facilities (A and E) met CVAT criteria. Pre-pandemic, IVAT CLABSI rates and SIR were higher than in CVAT facilities, but the difference was not statistically significant. During the pandemic, however, CLABSI rates and SIR were 40-50% higher in IVAT than in CVAT facilities (incidence rate ratio 1.5, 95% CI 1.1-2.0 and SIR relative ratio 1.4, 95% CI 1.1-1.9, respectively) (Table 1).
Conclusions:
CLABSI rates were lower in facilities with complete VAT coverage prior to and during the COVID-19 pandemic, suggesting that a highly functioning VAT can aid in preventing CLABSIs, especially when a healthcare system is stressed and resources are limited.
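For context on the statistics above: an incidence rate ratio (IRR) is the ratio of two event rates, with a confidence interval usually formed on the log scale. A minimal sketch in Python, using hypothetical counts (the abstract reports the IRR and CI but not the underlying CLABSI events or central-line days):

```python
import math

# Hypothetical counts for illustration only; the abstract does not report
# the underlying CLABSI events or central-line days for either group.
events_ivat, linedays_ivat = 150, 100_000   # incomplete-VAT facilities
events_cvat, linedays_cvat = 50, 50_000     # complete-VAT facilities

rate_ivat = events_ivat / linedays_ivat     # CLABSIs per central-line day
rate_cvat = events_cvat / linedays_cvat
irr = rate_ivat / rate_cvat

# Large-sample 95% CI on the log scale (standard Poisson approximation).
se_log = math.sqrt(1 / events_ivat + 1 / events_cvat)
lo, hi = (math.exp(math.log(irr) + z * se_log) for z in (-1.96, 1.96))
print(f"IRR = {irr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

With these illustrative counts the script prints an IRR of 1.50 with a CI near 1.1–2.1, the same form as the ratios reported in Table 1.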
Patients receiving hematopoietic stem cell transplants (HSCT) are at increased risk for Clostridioides difficile infection (CDI). The purpose of this study was to assess the effectiveness of oral vancomycin prophylaxis (OVP) for CDI in HSCT patients.
Design:
Single-center, retrospective cohort.
Setting:
Tertiary care academic medical center in New Jersey.
Patients:
Patients ≥18 years old during the HSCT admission were included. Patients who were admitted <72 hours or who had active CDI prior to HSCT day 0 were excluded.
Methods:
Medical records of patients admitted between January 2015 and August 2022 to undergo an allogeneic or autologous HSCT were reviewed. The primary end point was the incidence of in-hospital CDI. Secondary end points included the incidence of vancomycin-resistant enterococci (VRE) bloodstream infections, VRE isolated from any clinical culture, gram-negative bloodstream infections, hospital survival, and hospital length of stay. Exploratory end points, including 1-year survival, relapse, and incidence of graft-versus-host disease, were also collected.
Results:
A total of 156 HSCT patients were included. There was 1 case of CDI (1 of 81, 1.23%) in the OVP group compared to 8 cases (8 of 75, 10.67%) in the no-OVP group (P = .0147). There were no significant (P > .05) between-group differences in the incidence of gram-negative bloodstream infections, hospital survival, or length of stay. No clinical cultures were positive for VRE.
Conclusions:
In-hospital incidence of CDI in HSCT patients was significantly decreased with OVP. Randomized controlled trials are needed in this high-risk population to assess the efficacy and risks of OVP for CDI.
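As a rough reproduction of the primary end point comparison, the reported counts can be fed to a two-sided Fisher's exact test; the abstract does not name the test actually used, so this choice is an assumption:

```python
from scipy.stats import fisher_exact

# Reported counts: 1/81 CDI cases with OVP vs 8/75 without OVP.
table = [[1, 81 - 1],    # OVP:    CDI, no CDI
         [8, 75 - 8]]    # no OVP: CDI, no CDI
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.3f}, two-sided P = {p_value:.4f}")
```

A small two-sided P value here (on the order of the reported P = .0147) reflects how lopsided the 1-versus-8 split is given the similar group sizes.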
Advanced laryngeal cancers are clinically complex; there is a paucity of modern decision-making models to guide tumour-specific management. This pilot study aims to identify computed tomography-based radiomic features that may predict survival and enhance prognostication.
Methods
Pre-biopsy, contrast-enhanced computed tomography scans were assembled from a retrospective cohort (n = 72) with advanced laryngeal cancers (T3 and T4). The LIFEx software was used for radiomic feature extraction. Two features, shape compacity (irregularity of tumour volume) and grey-level zone length matrix – grey-level non-uniformity (tumour heterogeneity), were selected via least absolute shrinkage and selection operator (LASSO)-based Cox regression and explored for prognostic potential.
Results
A greater shape compacity (hazard ratio 2.89) and grey-level zone length matrix – grey-level non-uniformity (hazard ratio 1.64) were significantly associated with worse 5-year disease-specific survival (p < 0.05). Cox regression models yielded a superior C-index when incorporating radiomic features (0.759) versus clinicopathological variables alone (0.655).
Conclusions
Two radiomic features were identified as independent prognostic biomarkers. A multi-centre prospective study is necessary for further exploration. Integrated radiomic models may refine the treatment of advanced laryngeal cancers.
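A minimal sketch of the selection-and-discrimination workflow described in the Methods, using an L1-penalised Cox model from the lifelines library; the data are synthetic stand-ins, the feature names other than the two reported are placeholders, and the penalty strength is an assumed value (LIFEx feature extraction happens upstream and is not shown):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for the cohort: 72 patients, LIFEx-style radiomic
# features, follow-up time, and a disease-specific death indicator.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(72, 5)),
                  columns=["shape_compacity", "glzlm_glnu",
                           "feat_3", "feat_4", "feat_5"])
df["months"] = rng.exponential(scale=40, size=72)
df["event"] = rng.integers(0, 2, size=72)

# LASSO-type penalised Cox regression: l1_ratio=1.0 applies a pure L1
# penalty, shrinking uninformative coefficients toward zero.
cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)
cph.fit(df, duration_col="months", event_col="event")
selected = cph.params_[cph.params_.abs() > 1e-3]   # treat near-zero as dropped
print("Retained features (log hazard ratios):")
print(selected)

# Harrell's C-index, the discrimination metric the abstract compares
# (0.759 with radiomic features versus 0.655 clinicopathological alone).
print(f"C-index: {cph.concordance_index_:.3f}")
```

Hazard ratios such as the reported 2.89 and 1.64 are the exponentials of the retained coefficients.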
The Arabian leopard Panthera pardus nimr is categorized as Critically Endangered, with < 200 individuals estimated to remain in the wild. Historically the species ranged over an extensive area of western Saudi Arabia but, with no confirmed sightings since 2014, investigating potential continued presence and distribution is of critical conservation importance. We present the results of a comprehensive survey designed to detect any remaining Arabian leopard populations in Saudi Arabia. We conducted 14 surveys, deploying 586 camera-trap stations at 13 sites, totalling 82,075 trap-nights. Questionnaire surveys were conducted with 843 members of local communities across the Arabian leopard's historical range to assess the presence of leopards, other predators and prey species. Predator scats were collected ad hoc by field teams and we used mitochondrial DNA analysis to identify the originating species. We obtained 62,948 independent photographs of animals and people, but none were of Arabian leopards. Other carnivores appeared widespread and domestic animals were numerous, but wild prey were comparatively scarce. Three questionnaire respondents reported sightings of leopards within the previous year, but targeted camera-trap surveys in these areas did not yield evidence of leopards. Of the 143 scats sent for analysis, no DNA was conclusively identified as that of the leopard. From this extensive study, we conclude there are probably no surviving, sustainable populations of Arabian leopards in Saudi Arabia. Individual leopards might be present but were not confirmed. Any future Arabian leopard conservation in Saudi Arabia will probably require reintroduction of captive-bred leopards.
Prior trials suggest that intravenous racemic ketamine is highly effective for treatment-resistant depression (TRD), but phase 3 trials of racemic ketamine are needed.
Aims
To assess the acute efficacy and safety of a 4-week course of subcutaneous racemic ketamine in participants with TRD. Trial registration: ACTRN12616001096448 at www.anzctr.org.au.
Method
This phase 3, double-blind, randomised, active-controlled multicentre trial was conducted at seven mood disorders centres in Australia and New Zealand. Participants received twice-weekly subcutaneous racemic ketamine or midazolam for 4 weeks. Initially, the trial tested fixed-dose ketamine 0.5 mg/kg versus midazolam 0.025 mg/kg (cohort 1). Dosing was revised, after a Data Safety Monitoring Board recommendation, to flexible-dose ketamine 0.5–0.9 mg/kg or midazolam 0.025–0.045 mg/kg, with response-guided dosing increments (cohort 2). The primary outcome was remission (Montgomery–Åsberg Depression Rating Scale score ≤10) at the end of week 4.
Results
The final analysis (those who received at least one treatment) comprised 68 participants in cohort 1 (fixed dose) and 106 in cohort 2 (flexible dose). Ketamine was more efficacious than midazolam in cohort 2 (remission rate 19.6% v. 2.0%; OR = 12.1, 95% CI 2.1–69.2, P = 0.005) but not in cohort 1 (remission rate 6.3% v. 8.8%; OR = 1.3, 95% CI 0.2–8.2, P = 0.76). Ketamine was well tolerated. Acute adverse effects (psychotomimetic effects, blood pressure increases) resolved within 2 h.
Conclusions
Adequately dosed subcutaneous racemic ketamine was efficacious and safe in treating TRD over a 4-week treatment period. The subcutaneous route is practical and feasible.
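To make the cohort 2 odds ratio concrete, here is a worked example; the per-arm counts are back-calculated from the reported remission rates and total n, not taken from the paper, so treat them as hypothetical:

```python
# Hypothetical per-arm counts consistent with cohort 2 (n = 106) and the
# reported remission rates (19.6% ketamine vs 2.0% midazolam); the
# abstract does not give the actual arm sizes.
ket_remit, ket_n = 11, 56    # 11/56 ≈ 19.6%
mid_remit, mid_n = 1, 50     # 1/50 = 2.0%

odds_ketamine = ket_remit / (ket_n - ket_remit)      # 11/45
odds_midazolam = mid_remit / (mid_n - mid_remit)     # 1/49
print(f"OR ≈ {odds_ketamine / odds_midazolam:.1f}")  # ~12, near the reported 12.1
```

The wide reported CI (2.1–69.2) follows from the single remission event in the midazolam arm.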
Why are Nigeria’s universities launching a growing number of open access journals while simultaneously expecting their academic staff to publish ‘internationally’? And what impact do these expectations have on Nigerian journals? Drawing on interviews with editors and publishers, we describe the emergence of a hyperlocal ‘credibility economy’ within the Nigerian academy. The great majority of Nigerian scholarly journals are excluded from Scopus and Web of Science, the two main global citation indexes. Stigmatized by geography, Nigerian journals are ignored, rendered invisible, classed as poor quality or condemned as ‘predatory’. Historicizing these trends, we illustrate our argument with four case studies: two science and technology journals hosted by universities and two independent publishers, one with expertise in African studies, the other in information studies. In each case, we explore the motivations, commitments and strategies of editors and publishers. Their stories exemplify the impact of colonial histories, global discourses and bibliometric infrastructures on African research publishing cultures. The histories, logics and fragilities of this regional research ecosystem reveal how Africa’s scholars and publishers are getting by – but only just – amid the metricized judgements of the global research economy.
While understanding of the lived experience of health care workers during the coronavirus disease 2019 (COVID-19) pandemic is growing, the experiences of those utilizing emergency health care services (EHS) during the pandemic have yet to be fully appreciated.
Study Objective:
The objective of this research was to explore lived experience of EHS utilization in Victoria, Australia during the COVID-19 pandemic from March 2020 through March 2021.
Methods:
An explorative qualitative design underpinned by a phenomenological approach was applied. Data were collected through semi-structured, in-depth interviews, which were transcribed verbatim and analyzed using Colaizzi’s approach.
Results:
Qualitative data were collected from 67 participants aged 32 to 78 years (average age 52). Just over one-half of the research participants were male (54%) and three-quarters lived in metropolitan regions (75%). Four key themes emerged from data analysis: (1) concerns regarding exposure and infection delayed EHS utilization among participants with chronic health conditions; (2) participants with acute health conditions expressed concern regarding the impact of COVID-19 on their care, but continued to access services as required; (3) participants caring for people with sensory and developmental disabilities identified unique communication needs during interactions with EHS during the COVID-19 pandemic; communicating with emergency health care workers wearing personal protective equipment (PPE) was identified as a key challenge, with face masks reported as especially problematic for people who are deaf or hard-of-hearing; and (4) children and older people also experienced communication challenges associated with PPE, and connection with emergency health care workers was important for a positive lived experience during interactions with EHS throughout the pandemic.
Conclusion:
This research provides an important insight into the lived experience of EHS utilization during the COVID-19 pandemic, a perspective currently lacking in the published peer-reviewed literature.
A factorial experiment of chloroacetamide herbicide by application timing was conducted in 2017 and 2018 in Mississippi to investigate chloroacetamide use in a dicamba-based Palmer amaranth management program in cotton production. Herbicides used were S-metolachlor or acetochlor, and application timings were preemergence, preemergence followed by (fb) early postemergence, preemergence fb late postemergence, early postemergence alone, late postemergence alone, and early postemergence fb late postemergence. Dicamba was included in all preemergence applications, and dicamba plus glyphosate was included with all postemergence applications. Differences in cotton and weed response due to chloroacetamide type were minimal, and cotton injury at 14 d after late postemergence application was less than 10% for all application timings. Late-season weed control was reduced by up to 30% and 53% if chloroacetamide application occurred preemergence only or late postemergence only, respectively. Late-season weed densities were minimized if multiple applications were used instead of a single application. Cotton height was reduced by up to 23% if a single application was made late postemergence relative to other application timings. Chloroacetamide application at any timing except preemergence alone minimized late-season weed biomass. Yield was maximized by any treatment involving multiple applications or early postemergence alone, whereas preemergence or late postemergence applications alone resulted in up to 56% and 27% yield losses, respectively. Although no yield loss resulted from delaying the first of sequential applications until early postemergence, forgoing a preemergence application is not advisable given the multiple factors, such as inclement weather, that may delay timely postemergence applications.
Microstructures, including crystallographic fabric, within the margin of streaming ice can exert strong control on flow dynamics. To characterize a natural setting, we retrieved three cores, two of which reached the bed, from the flank of Jarvis Glacier, eastern Alaska Range, Alaska. The core sites lie ~1 km downstream of the source, with abundant water present in the extracted cores and at the base of the glacier. All cores exhibit dipping layers, a combination of debris bands and bubble-free domains. Grain sizes coarsen, on average, approaching the lateral margin. Crystallographic orientations are more clustered, with c-axes closer to horizontal, nearer the lateral margin. The measured fabric is sufficiently weak to induce little mechanical anisotropy, but the data suggest that, despite the challenging conditions of warm ice, abundant water and a short flow distance, many aspects of the microstructure, including measurable crystallographic fabric, evolved in systematic ways.
Wind-driven snow redistribution can increase the spatial heterogeneity of snow accumulation on ice caps and ice sheets, and may prove crucial for the initiation and survival of glaciers in areas of marginal glaciation. We present a snowdrift model (Snow_Blow), which extends and improves the model of Purves, Mackaness and Sugden (1999, Journal of Quaternary Science 14, 313–321). The model calculates spatial variations in relative snow accumulation that result from variations in topography, using a digital elevation model (DEM) and wind direction as inputs. Improvements include snow redistribution using a flux routing algorithm, DEM resolution independence and the addition of a slope curvature component. This paper tests Snow_Blow in Antarctica (a modern environment) and reveals its potential for application in palaeoenvironmental settings, where input meteorological data are unavailable or difficult to estimate. Specifically, Snow_Blow is applied to the Ellsworth Mountains in West Antarctica, where ablation is considered to be predominantly related to wind erosion processes. We find that Snow_Blow is able to replicate well the existing distribution of accumulating snow and snow erosion recorded in and around Blue Ice Areas. Lastly, a variety of model parameters are tested, including depositional distance and erosion versus wind speed, to provide the most likely input parameters for palaeoenvironmental reconstructions.
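Snow_Blow's full formulation is in the paper rather than reproduced here; the following is only a schematic Python illustration of the flux-routing idea, eroding snow from wind-exposed DEM cells and routing it downwind until it settles in sheltered cells. The function, its parameters and the single-direction exposure test are simplifying assumptions (the real model also uses a slope curvature component and is DEM-resolution independent):

```python
import numpy as np

def snow_blow_sketch(dem, snow, wind_from="W", n_passes=20, erode_frac=0.2):
    """Toy flux-routing snow redistribution over a DEM grid.

    Each pass erodes a fraction of the snow on wind-exposed cells (cells
    higher than their upwind neighbour) and routes that flux one cell
    downwind; sheltered cells are never eroded, so drifting snow collects
    in topographic lees. Repeated passes approximate longer transport
    distances. Boundaries wrap around (np.roll) for simplicity.
    """
    # This roll pulls each cell's value from its upwind neighbour.
    pull = {"W": (0, 1), "E": (0, -1), "N": (1, 0), "S": (-1, 0)}[wind_from]
    snow = snow.astype(float).copy()
    for _ in range(n_passes):
        upwind_dem = np.roll(dem, shift=pull, axis=(0, 1))
        exposed = dem > upwind_dem                      # windward faces, crests
        flux = np.where(exposed, erode_frac * snow, 0.0)
        snow -= flux                                    # erosion
        snow += np.roll(flux, shift=pull, axis=(0, 1))  # deposit one cell downwind
    return snow

# Example: a north-south ridge with wind from the west; snow should thin on
# the windward (west) slope and thicken in the lee.
x = np.linspace(-3, 3, 60)
dem = np.exp(-x**2)[None, :].repeat(40, axis=0)         # 40 x 60 elevation grid
snow = snow_blow_sketch(dem, np.ones_like(dem), wind_from="W")
print(f"windward mean: {snow[:, :30].mean():.2f}, lee mean: {snow[:, 30:].mean():.2f}")
```

Total snow is conserved up to the wrap-around boundaries, so the windward and lee means diverge symmetrically around the initial depth of 1.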
Following stage 1 palliation, delayed sternal closure may be used as a technique to enhance thoracic compliance but may also prolong the length of stay and increase the risk of infection.
Methods
We reviewed all neonates undergoing stage 1 palliation at our institution between 2010 and 2017 to describe the effects of delayed sternal closure.
Results
During the study period, 193 patients underwent stage 1 palliation, of whom 12 died before an attempt at sternal closure. Among the 25 patients who underwent primary sternal closure, 4 (16%) had sternal reopening within 24 hours. Among the 156 infants who underwent delayed sternal closure at 4 [3,6] days post-operatively, 11 (7.1%) had one or more failed attempts at sternal closure. Patients undergoing primary sternal closure had a shorter duration of mechanical ventilation and intensive care unit length of stay. Patients who failed delayed sternal closure had a longer aortic cross-clamp time (123±42 versus 99±35 minutes, p=0.029) and circulatory arrest time (39±28 versus 19±17 minutes, p=0.0009) than those who did not fail. Failure of delayed sternal closure was also closely associated with Technical Performance Score: 1.3% of patients with a score of 1 failed sternal closure compared with 18.9% of patients with a score of 3 (p=0.0028). Among the haemodynamic and ventilatory parameters studied, only superior caval vein saturation following sternal closure was different between patients who did and did not fail sternal closure (30±7 versus 42±10%, p=0.002). All patients who failed sternal closure did so within 24 hours owing to hypoxaemia, hypercarbia, or haemodynamic impairment.
Conclusion
When performed according to our current clinical practice, sternal closure causes transient and mild changes in haemodynamic and ventilatory parameters. Monitoring of superior caval vein saturation (SvO2) following sternal closure may permit early identification of patients at risk for failure.
Studies determined the effect of common lambsquarters, goosegrass, and a mixture of these on ‘Beauregard’ and ‘Jewel’ sweetpotato transplant production with or without polyethylene bed covers. Effects of herbicides on Beauregard in propagation beds were also studied. Black and infrared transmissible (IRT) plastic covers gave near-100% control of goosegrass and common lambsquarters, resulting in the greatest number and weight of Jewel transplants per plot. Common lambsquarters reduced transplant number and weight per plot with Jewel under clear plastic covers when compared with black and IRT plastic covers. Beauregard transplant number was not affected by row cover treatment. However, with data combined over all covers, Beauregard transplant weight per plot was lowest for treatments with weeds compared to weed-free plots. Of the herbicides evaluated, only DCPA caused significant (10% or greater) injury to Beauregard; no significant injury was observed with diphenamid, napropamide, chloramben, or chloramben plus fluazifop.
Field experiments conducted in 1992 and 1993 evaluated transplanted watermelon tolerance to ethalfluralin applied PPI, PRE (before transplanting), and POST (immediately after transplanting) at 1.2 or 2.4 kg ai/ha. Other treatments for comparison included the registered herbicides ethalfluralin POST-directed spray (PDS), ethalfluralin PDS followed by (fb) naptalam POST, bensulide plus naptalam PPI, and a nontreated check. All treatments controlled common lambsquarters and goosegrass 83 to 100% 2 and 6 weeks after treatment (WAT). Watermelon was injured 30 to 77% in 1992 and 14 to 83% in 1993 by ethalfluralin PPI or PRE at 1.2 or 2.4 kg/ha. Ethalfluralin POST was not injurious to watermelon. In 1992, watermelon treated with ethalfluralin POST at 1.2 and 2.4 kg/ha yielded 52 to 62% more fruit than watermelon from the nontreated check. In 1993, yield of transplanted watermelon treated with ethalfluralin POST was similar to that in the nontreated check.
Motivated by growing concern about the many threats that islands face, by calls for more extensive island nature conservation, and by recent discussion in the conservation literature of wellbeing as a useful approach to understanding how conservation affects people's lives, this paper reviews the literature to explore how islands and wellbeing relate and how conservation might affect that relationship. We apply a three-dimensional concept of social wellbeing to structure the discussion and illustrate the importance of understanding island–wellbeing interactions in the context of material, relational and subjective dimensions, using examples from the literature. We posit that islands and their shared characteristics of ‘islandness’ provide a useful setting in which to apply social wellbeing as a generalizable framework, one that is particularly adept at illuminating the relevance of social relationships and subjective perceptions in island life – aspects that are often marginalized in more economically focused conservation impact assessments. The paper then explores in more depth the influences of island nature conservation on social wellbeing and sustainability outcomes, using two case studies from the global north (UK islands) and global south (the Solomon Islands). We conclude that conservation approaches that engage with all three dimensions of wellbeing seem to be associated with success.