In a prospective, remote natural history study of 277 individuals with (60) and genetically at risk for (217) Parkinson’s disease (PD), we examined interest in the return of individual research results (IRRs) and compared characteristics of those who opted for versus against the return of IRRs. Most (n = 180, 65%) requested sharing of IRRs with either a primary care provider, neurologist, or themselves. Among individuals without PD, those who requested sharing of IRRs with a clinician reported more motor symptoms than those who did not request any sharing (mean (SD) 2.2 (4.0) versus 0.7 (1.5)). Participant interest in the return of IRRs is strong.
Patients tested for Clostridioides difficile infection (CDI) using a 2-step algorithm with a nucleic acid amplification test (NAAT) followed by toxin assay are not reported to the National Healthcare Safety Network as a laboratory-identified CDI event if they are NAAT positive (+)/toxin negative (−). We compared NAAT+/toxin− and NAAT+/toxin+ patients and identified factors associated with CDI treatment among NAAT+/toxin− patients.
Design:
Retrospective observational study.
Setting:
The study was conducted across 36 laboratories at 5 Emerging Infections Program sites.
Patients:
We defined a CDI case as a positive test detected by this 2-step algorithm during 2018–2020 in a patient aged ≥1 year with no positive test in the previous 8 weeks.
Methods:
We used multivariable logistic regression to compare CDI-related complications and recurrence between NAAT+/toxin− and NAAT+/toxin+ cases. We used a mixed-effects logistic model to identify factors associated with treatment in NAAT+/toxin− cases.
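As an illustration only, models of the kind described in this Methods section could be specified along the following lines in Python with statsmodels; the dataset, variable names (toxin_negative, recurrence, site, and so on) and the covariates shown are hypothetical placeholders, not the study's actual analysis code.

```python
# Illustrative sketch only: hypothetical file and variable names, not the study's code.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

cases = pd.read_csv("cdi_cases.csv")  # hypothetical case-level dataset

# Multivariable logistic regression comparing recurrence between
# NAAT+/toxin- and NAAT+/toxin+ cases, adjusted for example covariates.
recurrence_fit = smf.logit(
    "recurrence ~ toxin_negative + age_group + healthcare_onset",
    data=cases,
).fit()
print(recurrence_fit.summary())

# Mixed-effects logistic model for CDI treatment among NAAT+/toxin- cases,
# with a random intercept for site/laboratory (variational Bayes fit).
toxin_neg = cases[cases["toxin_negative"] == 1]
treatment_model = BinomialBayesMixedGLM.from_formula(
    "treated ~ wbc_ge_15k + unformed_stools_3plus + lab_comment_none_or_neutral",
    {"site": "0 + C(site)"},
    data=toxin_neg,
)
print(treatment_model.fit_vb().summary())
```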
Results:
Of 1,801 cases, 1,252 were NAAT+/toxin−, and 549 were NAAT+/toxin+. CDI treatment was given to 866 (71.5%) of 1,212 NAAT+/toxin− cases versus 510 (95.9%) of 532 NAAT+/toxin+ cases (P < .0001). NAAT+/toxin− status was protective for recurrence (adjusted odds ratio [aOR], 0.65; 95% CI, 0.55–0.77) but not CDI-related complications (aOR, 1.05; 95% CI, 0.87–1.28). Among NAAT+/toxin− cases, white blood cell count ≥15,000/µL (aOR, 1.87; 95% CI, 1.28–2.74), ≥3 unformed stools for ≥1 day (aOR, 1.90; 95% CI, 1.40–2.59), and diagnosis by a laboratory that provided no or neutral interpretive comments (aOR, 3.23; 95% CI, 2.23–4.68) were predictors of CDI treatment.
Conclusion:
Use of this 2-step algorithm likely results in underreporting of some NAAT+/toxin− cases with clinically relevant CDI. Disease severity and laboratory interpretive comments influence treatment decisions for NAAT+/toxin− cases.
Monitoring the prevalence and abundance of parasites over time is important for addressing their potential impact on host life histories, immunological profiles and their influence as a selective force. Only long-term ecological studies have the potential to shed light on both the temporal trends in infection prevalence and abundance and the drivers of such trends, because of their ability to dissect drivers that may be confounded over shorter time scales. Despite this, only a relatively small number of such studies exist. Here, we analysed changes in the prevalence and abundance of gastrointestinal parasites in the wild Soay sheep population of St. Kilda across 31 years. The host population density (PD) has increased across the study, and PD is known to increase parasite transmission, but we found that PD and year explained temporal variation in parasite prevalence and abundance independently. Prevalence of both strongyle nematodes and coccidian microparasites increased during the study, and this effect varied between lambs, yearlings and adults. Meanwhile, abundance of strongyles was more strongly linked to host PD than to temporal (yearly) dynamics, while abundance of coccidia showed a strong temporal trend without any influence of PD. Strikingly, coccidian abundance increased 3-fold across the course of the study in lambs, while increases in yearlings and adults were negligible. Our decades-long, intensive, individual-based study will enable the role of environmental change and selection pressures in driving these dynamics to be determined, potentially providing unparalleled insight into the drivers of temporal variation in parasite dynamics in the wild.
Enhanced recovery programmes have been widely adopted in other surgical disciplines but are not commonplace in head and neck surgery. The authors of this study created a pathway for post-operative laryngectomy patients.
Method
A multidisciplinary working group reviewed the literature and agreed standards of care. A retrospective audit was conducted to measure current practice against our new pathway; after programme implementation our performance was reaudited in two prospective cycles, with an education programme and review after the first prospective cycle.
Results
Statistically significant improvement in performance was realised in catheter and surgical drain removal, opiate analgesia use, mobilisation, and timeliness of swallow assessment. The rate of hospital-acquired pneumonia reduced from 23.1 to 9.5 per cent and length of stay reduced by a median of 5.2 days to 14.8 days (non-significant).
Conclusion
The programme improved consistency of patient care across most areas that were measured. Improving patient stoma training needs to be prioritised.
Background: Decisions to treat large-vessel occlusion with endovascular therapy (EVT) or intravenous alteplase depend on how physicians weigh benefits against risks when considering patients’ pre-stroke comorbidities. Methods: In an international survey, experts chose treatment approaches under current resources and under assumed ideal conditions for 10 of 22 randomly assigned case-scenarios. Five included comorbidities (metastatic/non-metastatic cancer, cardiac/respiratory/renal disease, non-disabling/mild cognitive impairment [MCI], physical dependence). We examined scenario/respondent characteristics associated with EVT/alteplase decisions using multivariable logistic regressions. Results: Among 607 physicians (38 countries), EVT was favoured in 1,097/1,379 (79.6%) responses for comorbidity-related scenarios under current resources versus 1,510/1,657 (91.1%; OR: 0.38, 95% CI: 0.31–0.47) for six “level-1A” scenarios (assuming ideal conditions: 82.7% vs 95.1%; OR: 0.25, 95% CI: 0.19–0.33). However, this was reversed on including all other scenarios (e.g. under current resources: 3,489/4,691 [74.4%]; OR: 1.34, 95% CI: 1.17–1.54). Responses favouring alteplase for comorbidity-related scenarios (e.g. 75.0% under current resources) were comparable to level-1A scenarios (72.2%) and higher than all others (60.4%). No comorbidity-related factor independently diminished EVT odds. MCI and dependence carried higher alteplase odds; cancer and cardiac/respiratory/renal disease had lower odds. Relevant respondent characteristics included performing more EVT cases/year (higher EVT odds, lower alteplase odds), practicing in East Asia (higher EVT odds), and practicing in interventional neuroradiology (lower alteplase odds vs neurology). Conclusions: Moderate-to-severe comorbidities did not consistently deter experts from EVT, suggesting equipoise about withholding EVT based on comorbidities. However, alteplase was often foregone when respondents chose EVT.
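As a quick arithmetic check, the crude odds ratio implied by the counts reported above (1,097/1,379 comorbidity-related responses favouring EVT versus 1,510/1,657 for level-1A scenarios) reproduces the reported OR of 0.38 with 95% CI 0.31–0.47; the snippet below is illustrative only, since the study's regressions may additionally adjust for scenario and respondent characteristics.

```python
# Reproduce the crude OR and Wald 95% CI from the counts reported in the abstract.
import math

evt_comorb, total_comorb = 1097, 1379    # comorbidity-related scenarios, current resources
evt_level1a, total_level1a = 1510, 1657  # "level-1A" scenarios, current resources

a, b = evt_comorb, total_comorb - evt_comorb      # EVT favoured / not, comorbidity scenarios
c, d = evt_level1a, total_level1a - evt_level1a   # EVT favoured / not, level-1A scenarios

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # Woolf standard error on log-OR
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lower:.2f}-{upper:.2f}")  # OR = 0.38, 95% CI 0.31-0.47
```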
Two introduced carnivores, the European red fox Vulpes vulpes and domestic cat Felis catus, have had extensive impacts on Australian biodiversity. In this study, we collate information on consumption of Australian birds by the fox, paralleling a recent study reporting on birds consumed by cats. We found records of consumption by foxes of 128 native bird species (18% of the non-vagrant bird fauna and 25% of those species within the fox’s range), a smaller tally than for cats (343 species, including 297 within the fox’s Australian range, a subset of that of the cat). Most (81%) bird species eaten by foxes are also eaten by cats, suggesting that predation impacts are compounded. As with consumption by cats, birds that nest or forage on the ground are most likely to be consumed by foxes. However, there is also some partitioning, with records of consumption by foxes but not cats for 25 bird species, indicating that impacts of the two predators may also be complementary. Bird species ≥3.4 kg were more likely to be eaten by foxes, and those <3.4 kg by cats. Our compilation provides an inventory and describes characteristics of Australian bird species known to be consumed by foxes, but we acknowledge that records of predation do not imply population-level impacts. Nonetheless, there is sufficient information from other studies to demonstrate that fox predation has significant impacts on the population viability of some Australian birds, especially larger birds, and those that nest or forage on the ground.
New Zealand has a long-running campylobacter infection (campylobacteriosis) epidemic with contaminated fresh chicken meat as the major source. This is both the highest impact zoonosis and the largest food safety problem in the country. Adding to this burden is the recent rapid emergence of antibiotic resistance in these campylobacter infections acquired from locally-produced chicken. Campylobacteriosis rates halved in 2008, as compared with the previous 5 years, following the introduction of regulatory limits on allowable contamination levels in fresh chicken meat, with large health and economic benefits resulting. In the following decade, disease rates do not appear to have declined further. The cumulative impact would equate to an estimated 539 000 cases, 5480 hospitalisations, 284 deaths and economic costs of approximately US$380 million during the last 10 years (2009–2018). Additional regulatory interventions, that build on previously successful regulations in this country, are urgently needed to control the source of this epidemic.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck and arm pain post-operatively in CSM. Methods: This ambispective study enrolled 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6 ± 2.9 to 3.8 ± 2.7 at 12 months (P < 0.001). VAS-AP improved from 5.8 ± 2.9 to 3.5 ± 3.0 at 12 months (P < 0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild or no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly higher proportion of patients with moderate/severe pain (45.8%) demonstrated an improvement to mild/no pain, whereas 27.2% of those with mild/no pain demonstrated worsening to moderate/severe pain (P < 0.001). At 12 months, 17.4% of those with mild/no pain experienced worsening of their NDI (P < 0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, with improvements reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
The ALMA twenty-six arcmin² survey of GOODS-S at one millimeter (ASAGAO) is a deep (1σ ∼ 61 μJy/beam) and wide-area (26 arcmin²) survey of a contiguous field at 1.2 mm. By combining with archival data, we obtained a deeper map of the same region (1σ ∼ 30 μJy/beam, synthesized beam size 0.59″ × 0.53″), providing the largest sample of sources (25 sources at 5σ, 45 sources at 4.5σ) among ALMA blank-field surveys. The median redshift of the 4.5σ sources is 2.4. The number counts show that 52% of the extragalactic background light at 1.2 mm is resolved into discrete sources. We create IR luminosity functions (LFs) at z = 1–3 and constrain the faintest luminosity of the LF at 2 < z < 3. The LFs are consistent with previous results based on other ALMA and SCUBA-2 observations, which suggests positive luminosity evolution and negative density evolution.
Introduction: Recently there have been many studies on the effectiveness of implementing LEAN principles to improve wait times in emergency departments (EDs), but there have been relatively few studies on applying these concepts to length of stay (LOS) in the ED. This research aims to explore the initial feasibility of applying the LEAN model to length-of-stay metrics in an ED by identifying areas of non-value-added time for patients staying in the ED. Methods: In this project we used a sample of 10,000 ED visits at the Health Science Centre in St. John’s over a 1-year period and compared patients’ LOS in the ED on four criteria: day of the week, hour of presentation, whether laboratory tests were ordered, and whether diagnostic imaging was ordered. Two sets of analyses were then performed. First, a two-sided Wilcoxon rank-sum test was used to evaluate whether ordering either lab tests or diagnostic imaging affected LOS. Second, a generalized linear model (GLM) was created using 10-fold cross-validation with a LASSO operator to analyze the effect size and significance of each of the four criteria on LOS. Additionally, a post-test analysis of the GLM was performed on a second sample of 10,000 ED visits from the same 1-year period to assess its predictive power and infer the degree to which a patient’s LOS is determined by the four criteria. Results: For the Wilcoxon rank-sum test there was no significant difference in LOS for patients who were ordered diagnostic imaging compared to those who were not (p = 0.6998), but there was a statistically significant decrease in LOS for patients who were ordered lab tests compared to those who were not (p = 2.696 × 10⁻¹⁰). When assessing the GLM there were two significant takeaways: ordering lab tests reduced LOS (95% CI = 42.953–68.173 min reduction), and arriving at the ED on a Thursday increased LOS significantly (95% CI = 6.846–52.002 min increase). Conclusion: This preliminary analysis identified several factors that increased patients’ LOS in the ED and that would be suitable for potential LEAN interventions. The increase in LOS for patients who are not ordered lab tests and for those who visit the ED on a Thursday warrants further investigation to identify causal factors. Finally, while this analysis revealed several actionable criteria for improving ED LOS, the relatively low predictive power of the final GLM in the post-test analysis (R² = 0.00363) indicates there are more criteria influencing LOS to explore in future analyses.
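A minimal sketch of the two analyses described above, assuming a visit-level table with a length-of-stay column and the four criteria; the file name and all column names are illustrative, not the study's actual data or code, and because the abstract does not specify the GLM family, an ordinary linear LASSO is used here as a stand-in.

```python
# Illustrative sketch only: hypothetical ED-visit data and column names.
import pandas as pd
from scipy.stats import ranksums
from sklearn.linear_model import LassoCV

visits = pd.read_csv("ed_visits.csv")  # hypothetical sample of 10,000 ED visits

# 1) Two-sided Wilcoxon rank-sum tests: LOS with vs. without labs / imaging ordered.
for flag in ["labs_ordered", "imaging_ordered"]:
    ordered = visits.loc[visits[flag] == 1, "los_minutes"]
    not_ordered = visits.loc[visits[flag] == 0, "los_minutes"]
    print(flag, ranksums(ordered, not_ordered))

# 2) Linear model with a LASSO penalty chosen by 10-fold cross-validation,
#    with day of week and hour of presentation one-hot encoded as categorical criteria.
X = pd.get_dummies(
    visits[["day_of_week", "hour_of_presentation", "labs_ordered", "imaging_ordered"]],
    columns=["day_of_week", "hour_of_presentation"],
    drop_first=True,
)
y = visits["los_minutes"]
lasso = LassoCV(cv=10).fit(X, y)
print(dict(zip(X.columns, lasso.coef_)))  # effect sizes; zeroed terms were dropped by LASSO
```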
Filamentary structures can form within the beam of protons accelerated during the interaction of an intense laser pulse with an ultrathin foil target. Such behaviour is shown to be dependent upon the formation time of quasi-static magnetic field structures throughout the target volume and the extent of the rear surface proton expansion over the same period. This is observed via both numerical and experimental investigations. By controlling the intensity profile of the laser drive, via the use of two temporally separated pulses, both the initial rear surface proton expansion and magnetic field formation time can be varied, resulting in modification to the degree of filamentary structure present within the laser-driven proton beam.
Non-stoichiometric, carbon-containing crandallite from Guatemala and plumbogummite from Cumbria have been characterized using electron microprobe (EMPA) and wet-chemical analyses, Rietveld analysis of powder X-ray diffraction (PXRD) patterns, and infrared (IR), Raman and cathodoluminescence (CL) spectroscopies. The samples contain 11.0 and 4.8 wt.% CO2, respectively. The IR spectra for both samples show a doublet in the range 1410–1470 cm⁻¹, corresponding to CO3 vibrations. Direct confirmation of CO3 replacing PO4 was obtained from difference Fourier maps in the Rietveld analysis. Carbonate accounts for 67% of the C in the plumbogummite and 20% of the C in the Guatemalan crandallite, the remainder being present as nano-scale organic carbon. The CO3 substitution for PO4 is manifested in a large contraction of the tetrahedral volume (14–19%) and by a contraction of the a axis, analogous to observations for carbonate-containing fluorapatites. Stoichiometric crandallite from Utah was characterized using the same methods, for comparison with the non-stoichiometric, carbon-bearing phases.
Secondary phosphate assemblages from the Hagendorf Süd granitic pegmatite, containing the new Mn-Al phosphate mineral, nordgauite, have been characterized using scanning electron microscopy and electron microprobe analysis. Nordgauite nodules enclose crystals of the jahnsite–whiteite group of minerals, showing pronounced compositional zoning, spanning the full range of Fe/Al ratios between jahnsite and whiteite. The whiteite-rich members are F-bearing, whereas the jahnsite-rich members contain no F. Associated minerals include sphalerite, apatite, parascholzite, zwieselite-triplite solid solutions and a kingsmountite-related mineral. The average compositions of whiteite and jahnsite from different zoned regions correspond to jahnsite-(CaMnMn), whiteite-(CaMnMn) and the previously undescribed whiteite-(CaMnFe) end-members. Mo-Kα CCD intensity data were collected on a twinned crystal of the (CaMnMn)-dominant whiteite and refined in P2/a to wRobs = 0.064 for 1015 observed reflections.
Background: Limbic encephalitis (LE) is a rare autoimmune syndrome affecting limbic system structures and causing a variety of manifestations including memory changes, temporal lobe epilepsy, and psychiatric symptoms. It is a rare disease in children, but it presents with a well-recognizable combination of clinical, neuroimaging and/or histological signatures. Beyond the association with anti-neuronal auto-antibodies, no clear immune system phenotype has been associated with limbic encephalitis. Our aim is to characterize the clinical and paraclinical features of non-paraneoplastic limbic encephalitis and to correlate them with potential underlying immune deficiencies. Methods: Retrospective case series of seven patients with limbic encephalitis recruited at the Montreal Children’s Hospital (MCH), with a focus on the immune- and neuro-phenotypes, including anti-neuronal antibodies, lymphocyte sub-typing, and key markers of the immunoglobulin and complement systems. A literature review identified 77 cases of non-paraneoplastic, non-NMDA limbic encephalitis. Results: Symptoms included temporal lobe epilepsy (n = 5), psychiatric symptoms such as ADHD or autistic symptoms (n = 2), and memory changes (n = 3). One patient was positive for both voltage-gated potassium channel (VGKC) antibodies and anti-thyroid peroxidase (TPO) antibodies, and two were positive only for anti-TPO antibodies. One patient showed low CD19 cell counts and low immunoglobulins. Three patients showed chronically low CD56 cell counts. Conclusions: The study is still ongoing, but at least three patients already display some traits of immune dysregulation.
Ridge-till is an integrated weed management system that involves the physical movement of soil containing weed seeds away from the row with ridge-clearing equipment on the planter. Corn, grain sorghum, and soybean are the major crops planted using the ridge-till system. Weeds can be controlled with cultivation, competitive row crops, and herbicides. Weeds have adapted to the system but have been controlled through alternative management. Through modernization of equipment and herbicides, ridge-till has become an economic crop production practice. Integrating cultivation and herbicides controls a broader spectrum of weeds than cultivation or herbicides alone.
An electrical discharge system (EDS) was evaluated in field studies conducted from 1977 through 1979 in western Nebraska for its ability to control weed escapes in sugarbeets (Beta vulgaris L. ‘Mono Hy D2’). Nine weeks after sugarbeets were planted, kochia [Kochia scoparia (L.) Schrad.] had attained a height above sugarbeets sufficient for EDS treatment. Redroot pigweed (Amaranthus retroflexus L.) and common lambsquarters (Chenopodium album L.) generally attained sufficient height above sugarbeets 11 and 13 weeks after sugarbeet planting, respectively. Sugarbeet root yields were reduced 40, 20, and 10% from competition by kochia, common lambsquarters, and redroot pigweed, respectively. Treatment of kochia, redroot pigweed, and common lambsquarters with EDS in some cases resulted in a reduction in weed height. The EDS treatments reduced the stand of all weeds 32, 39, and 47% for 1977, 1978, and 1979, respectively. Although the EDS treatments failed to kill many weeds, they did suppress the competitive ability of the three weeds to the extent that sugarbeet yields were higher in areas receiving EDS treatments than in areas receiving no EDS treatment.
To achieve their conservation goals, individuals, communities and organizations need to acquire a diversity of skills, knowledge and information (i.e. capacity). Despite current efforts to build and maintain appropriate levels of conservation capacity, it has been recognized that there will need to be a significant scaling-up of these activities in sub-Saharan Africa. This is because of the rapid increase in the number and extent of environmental problems in the region. We present a range of socio-economic contexts relevant to four key areas of African conservation capacity building: protected area management, community engagement, effective leadership, and professional e-learning. Under these core themes, 39 specific recommendations are presented. These were derived from multi-stakeholder workshop discussions at an international conference held in Nairobi, Kenya, in 2015. At the meeting, 185 delegates (practitioners, scientists, community groups and government agencies) represented 105 organizations from 24 African nations and eight non-African nations. The 39 recommendations constituted six broad types of suggested action: (1) the development of new methods, (2) the provision of capacity building resources (e.g. information or data), (3) the communication of ideas or examples of successful initiatives, (4) the implementation of new research or gap analyses, (5) the establishment of new structures within and between organizations, and (6) the development of new partnerships. A number of cross-cutting issues also emerged from the discussions: the need for a greater sense of urgency in developing capacity building activities; the need to develop novel capacity building methodologies; and the need to move away from one-size-fits-all approaches.
There is limited longitudinal research examining the longer-term incidence of depressive symptoms in women with a hysterectomy compared with women without a hysterectomy. We aimed to investigate the association between hysterectomy status and the 12-year incidence of depressive symptoms in a mid-aged cohort of Australian women, and whether these relationships were modified by the use of exogenous hormones.
Methods.
We used generalised estimating equation models for binary outcome data to assess the associations of the incidence of depressive symptoms (measured by the 10-item Centre for Epidemiologic Studies Depression Scale) across five surveys over a 12-year period, in women with a hysterectomy with ovarian conservation, or a hysterectomy with bilateral oophorectomy compared with women without a hysterectomy. We further stratified women with hysterectomy by their current use of menopausal hormone therapy (MHT). Women who reported prior treatment for depression were excluded from the analysis.
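For illustration, a repeated-measures model of this kind could be set up as a GEE in Python with statsmodels, here using a modified-Poisson specification as one common way to obtain relative risks for a binary outcome; the file, variable names, and exact covariates below are assumptions for the sketch, not the study's analysis code.

```python
# Illustrative sketch only: hypothetical long-format cohort data and variable names.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

surveys = pd.read_csv("cohort_long.csv")  # one row per woman per survey wave

# GEE with an exchangeable working correlation, clustering repeated measures on
# participant id. A Poisson family with its default log link ("modified Poisson")
# yields relative risks for a binary depressive-symptoms indicator.
gee_model = smf.gee(
    "depressive_symptoms ~ C(hysterectomy_group) + C(survey_wave) + age + mht_use",
    groups="participant_id",
    data=surveys,
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = gee_model.fit()
print(result.summary())
```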
Results.
Compared with women without a hysterectomy (n = 4002), both women with a hysterectomy with ovarian conservation (n = 884) and women with a hysterectomy and bilateral oophorectomy (n = 450) had a higher risk of depressive symptoms (relative risk (RR) 1.20; 95% confidence interval (CI) 1.06–1.36 and RR 1.44; 95% CI 1.22–1.68, respectively). There were differences in the strength of the risk for women with a hysterectomy with ovarian conservation, compared with those without, when we stratified by current MHT use. Compared with women without a hysterectomy who did not use MHT, women with a hysterectomy with ovarian conservation who were also MHT users had a higher risk of depressive symptoms (RR 1.57; 95% CI 1.31–1.88) than women with a hysterectomy with ovarian conservation who did not use MHT (RR 1.17; 95% CI 1.02–1.35). For women with a hysterectomy and bilateral oophorectomy, MHT use did not attenuate the risk. We could not rule out, however, that the higher risk seen among MHT users may be due to confounding by indication, i.e. MHT was prescribed to treat depressive symptoms, but their depressive symptoms persisted.
Conclusions.
Women with a hysterectomy (with and without bilateral oophorectomy) have a higher longer-term risk of incident depressive symptoms that is not explained by lifestyle or socio-economic factors.
Medusahead is one of the most problematic rangeland weeds in the western United States. In previous studies, prescribed burning has been used successfully to control medusahead in some situations, but burning has failed in other circumstances. In this study, trials were conducted using the same protocol at four locations in central to northern California to evaluate plant community response to two consecutive years of summer burning and to determine the conditions resulting in successful medusahead control. During 2002 through 2003 large-scale experiments were established at two low-elevation, warm-winter sites (Fresno and Yolo counties) and two higher elevation, cool-winter sites (Siskiyou and Modoc counties). Plant species cover was estimated using point-intercept transects, and biomass samples were taken in each plot. After 2 yr of burning, medusahead cover was reduced by 99, 96, and 93% for Fresno, Yolo, and Siskiyou counties, respectively, compared to unburned control plots. Other annual grasses were also reduced, but less severely, and broadleaf species increased at all three sites. In contrast, 2 yr of burning resulted in a 55% increase in medusahead at the coolest winter site in Modoc County. In the second season after the final burn, medusahead cover remained low in burned plots at Fresno and Yolo counties (1 and 12% of cover in unburned controls, respectively), but at the Siskiyou site medusahead recovered to 45% relative to untreated controls. The success of prescribed burning was correlated with biomass of annual grasses, excluding medusahead, preceding a burn treatment. It is hypothesized that greater production of combustible forage resulted in increased fire intensity and greater seed mortality in exposed inflorescences. These results demonstrate that burning can be an effective control strategy for medusahead in low elevation, warm-winter areas characterized by high annual grass biomass production, but may not be successful in semiarid cool winter areas.