Wildfires have become a regular seasonal disaster across the Western region of the United States. Wildfires require a multifaceted disaster response. In addition to fire suppression, there are public health and medical needs for responders and the general population in the path of the fire, as well as a much larger population impacted by smoke. This paper describes key aspects of the health and medical response to wildfires in California, including facility evacuation and shelter medical support, with emphasis on the organization, coordination, and management of medical teams deployed to fire incident base camps. This provides 1 model of medical support and references resources to help other jurisdictions that must respond to the rising incidence of large wildland fires.
In recent years, a variety of efforts have been made in political science to enable, encourage, or require scholars to be more open and explicit about the bases of their empirical claims and, in turn, make those claims more readily evaluable by others. While qualitative scholars have long taken an interest in making their research open, reflexive, and systematic, the recent push for overarching transparency norms and requirements has provoked serious concern within qualitative research communities and raised fundamental questions about the meaning, value, costs, and intellectual relevance of transparency for qualitative inquiry. In this Perspectives Reflection, we crystallize the central findings of a three-year deliberative process—the Qualitative Transparency Deliberations (QTD)—involving hundreds of political scientists in a broad discussion of these issues. Following an overview of the process and the key insights that emerged, we present summaries of the QTD Working Groups’ final reports. Drawing on a series of public, online conversations that unfolded at www.qualtd.net, the reports unpack transparency’s promise, practicalities, risks, and limitations in relation to different qualitative methodologies, forms of evidence, and research contexts. Taken as a whole, these reports—the full versions of which can be found in the Supplementary Materials—offer practical guidance to scholars designing and implementing qualitative research, and to editors, reviewers, and funders seeking to develop criteria of evaluation that are appropriate—as understood by relevant research communities—to the forms of inquiry being assessed. We dedicate this Reflection to the memory of our coauthor and QTD working group leader Kendra Koivu.1
Trends in detention under the Mental Health Act 1983 in two major London secondary mental healthcare providers were explored using patient-level data in a historical cohort study between 2007–2008 and 2016–2017. An increase in the number of detention episodes initiated per fiscal year was observed at both sites. The rise was accompanied by an increase in the number of active patients; the proportion of active patients detained per year remained relatively stable. Findings suggest that the rise in the number of detentions reflects the rise in the number of people receiving secondary mental healthcare.
Heart rate variability (HRV) is a proxy measure of autonomic function and can be used as an indicator of swine stress. While traditional linear measures are used to distinguish between stressed and unstressed treatments, inclusion of nonlinear HRV measures that evaluate data structure and organization shows promise for improving HRV interpretation. The objective of this study was to evaluate the inclusion of nonlinear HRV measures in response to an acute heat episode. Twenty 12- to 14-week-old growing pigs were individually housed for 7 days and acclimated to thermoneutral conditions (20.35°C ± 0.01°C; 67.6% ± 0.2% RH) before undergoing one of the two treatments: (1) thermoneutral control (TN; n = 10 pigs) or (2) acute heat stress (HS; n = 10 pigs; 32.6°C ± 0.1°C; 26.2% ± 0.1% RH). In Phase 1 of the experimental procedure (P1; 60 min), pigs underwent a baseline HRV measurement period in thermoneutral conditions before treatment [Phase 2; P2; 60 min once gastrointestinal temperature (Tg) reached 40.6°C], where HS pigs were exposed to heated conditions and TN pigs remained in thermoneutral conditions. After P2, all pigs were moved back to thermoneutral conditions (Phase 3; P3; 60 min). During each phase, Tg data were collected every 5 min and behavioural data were collected to evaluate the amount of time each pig spent in an active posture. Additionally, linear (time and frequency domain) and nonlinear [sample entropy (SampEn), de-trended fluctuation analysis, percentage recurrence, percentage determinism (%DET), mean diagonal line length in a recurrence plot] HRV measures were quantified. Heat stressed pigs exhibited greater Tg (P = 0.002) and spent less time in an active posture compared to TN pigs during P2 (P = 0.0003). Additionally, low frequency to high frequency ratio was greater in HS pigs during P3 compared to TN pigs (P = 0.02). SampEn was reduced in HS pigs during P2 (P = 0.01) and P3 (P = 0.03) compared to TN pigs. Heat stressed pigs exhibited greater %DET during P3 (P = 0.03) and tended to have greater %DET (P = 0.09) during P2 than TN pigs. No differences between treatments were detected for the remaining HRV measures. In conclusion, linear HRV measures were largely unchanged during P2. However, changes to SampEn and %DET suggest increased heat stress as a result of the acute heat episode. Future work should continue to evaluate the benefits of including nonlinear HRV measures in HRV analysis of swine heat stress.
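Sample entropy, named above as one of the nonlinear HRV measures, quantifies how unpredictable successive RR intervals are; a lower value (as observed in the heat-stressed pigs) indicates a more regular, less complex heart-rate signal. The sketch below is a minimal, generic implementation and is not the study's analysis code; the parameter defaults (m = 2, tolerance r = 0.2 × SD) and the synthetic RR series are illustrative assumptions only.

```python
import numpy as np

def sample_entropy(rr_ms, m=2, r_frac=0.2):
    """Return SampEn of a 1-D series of RR intervals (in ms)."""
    x = np.asarray(rr_ms, dtype=float)
    n = len(x)
    r = r_frac * np.std(x)  # tolerance as a fraction of the series SD (assumed convention)

    def count_matches(length):
        # Overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to all templates; drop the self-match.
            d = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(d <= r) - 1
        return count

    b = count_matches(m)      # template matches of length m
    a = count_matches(m + 1)  # template matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

# Illustrative use on a synthetic RR series (ms); lower SampEn = more regular signal.
rng = np.random.default_rng(0)
rr = 800 + 50 * rng.standard_normal(300)
print(round(sample_entropy(rr), 3))
```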
As chemical management options for weeds become increasingly limited due to selection for herbicide resistance, investigation of additional nonchemical tools becomes necessary. Harvest weed seed control (HWSC) is a methodology of weed management that targets and destroys weed seeds that are otherwise dispersed by harvesters following threshing. It is not known whether problem weeds in western Canada retain their seeds in sufficient quantities until harvest at a height suitable for collection. A study was conducted at three sites over 2 yr to determine whether retention and height criteria were met by wild oat, false cleavers, and volunteer canola. Wild oat consistently shed seeds early, but seed retention was variable, averaging 56% at the time of wheat swathing, with continued losses until direct harvest of wheat and fababean. The majority of retained seeds were >45 cm above ground level, suitable for collection. Cleavers seed retention was highly variable by site-year, but generally greater than wild oat. The majority of seed was retained >15 cm above ground level and would be considered collectable. Canola seed typically had >95% retention, with the majority of seed retained >15 cm above ground level. The suitability ranking of the species for management with HWSC was canola>cleavers>wild oat. Efficacy of HWSC systems in western Canada will depend on the target species and site- and year-specific environmental conditions.
The porcine small intestinal extracellular matrix reportedly has the potential to differentiate into viable myocardial cells. When used in tetralogy of Fallot repair, it may improve right ventricular function. We evaluated right ventricular function after repair of tetralogy of Fallot with extracellular matrix versus bovine pericardium.
Method
Subjects with non-transannular repair of tetralogy of Fallot with at least 1 year of follow-up were selected. The extracellular matrix and bovine pericardium groups were compared. We used three-dimensional right ventricular ejection fraction, right ventricle global longitudinal strain, and tricuspid annular plane systolic excursion to assess right ventricular function.
Results
The extracellular matrix group had 11 patients, whereas the bovine pericardium group had 10 patients. No differences between the groups were found regarding sex ratio, age at surgery, or cardiopulmonary bypass time. The follow-up period was 28±12.6 months in the extracellular matrix group and 50.05±17.6 months in the bovine pericardium group (p=0.001). The mean three-dimensional right ventricular ejection fraction (55.7±5.0% versus 55.3±5.2%, p=0.73), right ventricular global longitudinal strain (−18.5±3.0% versus −18.0±2.2%, p=0.44), and tricuspid annular plane systolic excursion (1.59±0.16 versus 1.59±0.2, p=0.93) were similar in the extracellular matrix group and the bovine pericardium group, respectively. Right ventricular global longitudinal strain in healthy children is reported at −29±3% in the literature.
Conclusion
In a small cohort of patients undergoing non-transannular repair of tetralogy of Fallot followed up for more than 1 year, there was no significant difference in right ventricular function between the extracellular matrix and bovine pericardium patch groups. Lower right ventricular global longitudinal strain was noted in both groups compared with healthy children.
In the midwestern United States, biotypes of giant ragweed resistant to multiple herbicide biochemical sites of action have been identified. Weeds with resistance to multiple herbicides reduce the utility of existing herbicides and necessitate the development of alternative weed control strategies. In two experiments in southeastern Minnesota, we determined the effect of six 3 yr crop-rotation systems containing corn, soybean, wheat, and alfalfa on giant ragweed seedbank depletion and emergence patterns. The six crop-rotation systems included continuous corn, soybean–corn–corn, corn–soybean–corn, soybean–wheat–corn, soybean–alfalfa–corn, and alfalfa–alfalfa–corn. The crop-rotation system had no effect on the amount of seedbank depletion when a zero-weed threshold was maintained, with an average of 96% of the giant ragweed seedbank being depleted within 2 yr. Seedbank depletion occurred primarily through seedling emergence in all crop-rotation systems. However, seedling emergence tended to account for more of the seedbank depletion in rotations containing only corn or soybean compared with rotations with wheat or alfalfa. Giant ragweed emerged early across all treatments, with on average 90% emergence occurring by June 4. Duration of emergence was slightly longer in established alfalfa compared with other cropping systems. These results indicate that corn and soybean rotations are more conducive to giant ragweed emergence than rotations including wheat and alfalfa, and that adopting a zero-weed threshold is a viable approach to depleting the weed seedbank in all crop-rotation systems.
A recent editorial claimed that the 2014 National Institute for Health and Care Excellence (NICE) guideline on psychosis and schizophrenia, unlike its equivalent 2013 Scottish Intercollegiate Guidelines Network (SIGN) guideline, is biased towards psychosocial treatments and against drug treatments. In this paper we underline that the NICE and SIGN guidelines recommend similar interventions, but that the NICE guideline has more rigorous methodology. Our analysis suggests that the authors of the editorial appear to have succumbed to bias themselves.
To determine the effect of graft choice (allograft, bone-patellar tendon-bone autograft, or hamstring autograft) on deep tissue infections following anterior cruciate ligament (ACL) reconstructions.
DESIGN
Retrospective cohort study.
SETTING AND POPULATION
Patients from 6 US health plans who underwent ACL reconstruction from January 1, 2000, through December 31, 2008.
METHODS
We identified ACL reconstructions and potential postoperative infections using claims data. A hierarchical stratified sampling strategy was used to identify patients for medical record review to confirm ACL reconstructions and to determine allograft vs autograft tissue implanted, clinical characteristics, and infection status. We estimated infection rates overall and by graft type. We used logistic regression to assess the association between infections and patients’ demographic characteristics, comorbidities, and choice of graft.
RESULTS
On review of 1,452 medical records, we found 55 deep wound infections. With correction for sampling weights, infection rates varied by graft type: 0.5% (95% CI, 0.3%–0.8%) with allografts, 0.6% (0.1%–1.5%) with bone-patellar tendon-bone autografts, and 2.5% (1.9%–3.1%) with hamstring autografts. After adjusting for potential confounders, we found an increased infection risk with hamstring autografts compared with allografts (odds ratio, 5.9; 95% CI, 2.8–12.8). However, there was no difference in infection risk between bone-patellar tendon-bone autografts and allografts (odds ratio, 1.2; 95% CI, 0.3–4.8).
CONCLUSIONS
The overall risk for deep wound infections following ACL reconstruction is low, but it varies by graft type. Infection risk was highest among hamstring autograft recipients compared with allograft and bone-patellar tendon-bone autograft recipients. An illustrative sketch of the odds-ratio arithmetic behind such comparisons follows this abstract.
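As a point of reference for the odds ratios reported above, the sketch below shows the standard unadjusted odds-ratio calculation with a Wald 95% confidence interval from a 2×2 table. The cell counts are hypothetical; the study's estimates were adjusted for confounders via logistic regression with sampling weights, which this simple arithmetic does not reproduce.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted OR and 95% Wald CI from a 2x2 table:
    a = exposed & infected, b = exposed & not infected,
    c = unexposed & infected, d = unexposed & not infected."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts: 20/800 hamstring-autograft infections vs 10/2400 allograft infections.
print(odds_ratio_ci(a=20, b=780, c=10, d=2390))
```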
As herbicide-resistant weed populations become increasingly problematic in crop production, alternative strategies of weed control are necessary. Giant ragweed, one of the most competitive agricultural weeds in row crops, has evolved resistance to multiple herbicide biochemical sites of action within the plant, necessitating the development of new and integrated methods of weed control. This study assessed the quantity and duration of seed retention of giant ragweed grown in soybean fields and adjacent field margins. Seed retention of giant ragweed was monitored weekly during the 2012 to 2014 harvest seasons using seed collection traps. Giant ragweed plants produced an average of 1,818 seeds per plant, with 66% being potentially viable. Giant ragweed on average began shattering hard (potentially viable) and soft (nonviable) seeds on September 12 and continued through October, at average rates of 0.75% and 0.44% of total seeds per day during September and October, respectively. Giant ragweed seeds remained on the plants well into the Minnesota soybean harvest season, with an average of 80% of the total seeds retained on October 11, when Minnesota soybean harvest was approximately 75% completed in the years of the study. These results suggest that there is sufficient time, by managing weed seed dispersal before or at crop harvest, to remove escaped giant ragweed from production fields and field margins before the seeds shatter. Controlling weed seed dispersal has the potential to manage herbicide-resistant giant ragweed by limiting replenishment of the weed seed bank.
Marine worms in the genus Osedax have specialized ‘root’ tissues that they use to bore into the bones of decomposing vertebrate skeletons and obtain nutrition. We investigated the borings of nine Osedax species, using micro-computed tomography to quantitatively describe the morphology of the borings and provide three-dimensional reconstructions of the space occupied by Osedax root tissues inside the bone. Each Osedax species displayed a consistent boring morphology in any given bone, but morphologies differed between bones. Where multiple species coexisted in a single bone, there was limited evidence for spatial niche partitioning by Osedax root tissues. The new morphological data may be applied to Osedax traces in fossil bones, where borings can be used to indicate minimum species richness.
To explore the feasibility of identifying anterior cruciate ligament (ACL) allograft implantations and infections using claims data.
Design.
Retrospective cohort study.
Methods.
We identified ACL reconstructions using procedure codes at 6 health plans from 2000 to 2008. We then identified potential infections using claims-based indicators of infection, including diagnoses, procedures, antibiotic dispensings, specialty consultations, emergency department visits, and hospitalizations. Patients’ medical records were reviewed to determine graft type, validate infection status, and calculate sensitivity and positive predictive value (PPV) for indicators of ACL allografts and infections.
Results.
A total of 11,778 patients with codes for ACL reconstruction were identified. After chart review, PPV for ACL reconstruction was 96% (95% confidence interval [CI], 94%–97%). Of the confirmed ACL reconstructions, 39% (95% CI, 35%–42%) used allograft tissues. The deep infection rate after ACL reconstruction was 1.0% (95% CI, 0.7%–1.4%). The odds ratio of infection for allografts versus autografts was 0.41 (95% CI, 0.19–0.78). Sensitivity of individual claims-based indicators for deep infection after ACL reconstruction ranged from 0% to 75% and PPV from 0% to 100%. Claims-based infection indicators could be combined to enhance sensitivity or PPV but not both. A brief worked example of how these validation metrics are computed follows this abstract.
Conclusions.
While claims data accurately identify ACL reconstructions, they poorly distinguish between allografts and autografts and identify infections with variable accuracy. Claims data could be useful to monitor infection trends after ACL reconstruction, with different algorithms optimized for different surveillance goals.
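The validation metrics reported above reduce to simple proportions once claims-based indicators have been checked against chart review. The sketch below shows that arithmetic with hypothetical counts; it is not drawn from the study's data.

```python
def sensitivity_ppv(tp, fp, fn):
    """tp: indicator positive and chart-confirmed infection,
    fp: indicator positive but no infection on chart review,
    fn: indicator negative despite a chart-confirmed infection."""
    sensitivity = tp / (tp + fn)  # share of true infections the indicator flags
    ppv = tp / (tp + fp)          # share of flagged cases that are real infections
    return sensitivity, ppv

# Hypothetical example: an indicator catching 9 of 12 true infections with 3 false positives.
print(sensitivity_ppv(tp=9, fp=3, fn=3))  # -> (0.75, 0.75)
```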