Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, individual weed plant biomass was positively correlated with delayed seed shatter during harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed retention by the target weed species at crop maturity, enabling collection and processing of the seed at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds at southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased toward the northern regions. At soybean maturity, percent seed shatter ranged from 1% to 70%; by 25 d after soybean maturity, that range had shifted to 5% to 100% (mean: 42%). There were considerable differences in seed-shatter onset and rate of progression among sites and years for some species, which could affect their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output in certain years.
Objective:
To review a single-centre experience with pulmonary artery sling repair and to evaluate risk factors for re-intervention.
Methods:
Patients with surgically repaired pulmonary artery sling at a single institution between 1996 and 2018 were retrospectively reviewed. A univariate Cox regression analysis was used to evaluate variables for association with freedom from re-intervention.
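As a point of reference for this type of analysis, below is a minimal, purely illustrative sketch of a univariate Cox regression in Python using the lifelines package; the library choice, column names, and toy data are assumptions for illustration and are not the authors' actual code or dataset.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical example data: one row per patient, with follow-up time in years,
# an indicator for whether re-intervention occurred, and one candidate covariate.
df = pd.DataFrame({
    "years_to_event": [0.5, 2.1, 6.3, 1.0, 8.2, 0.2, 4.4, 9.1],
    "reintervention": [1, 1, 0, 1, 0, 1, 0, 0],
    "tracheoplasty":  [1, 0, 0, 1, 1, 1, 0, 0],
})

# Fit a Cox proportional-hazards model with a single covariate (a "univariate" screen);
# in practice each candidate variable would be tested this way in turn.
cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_event", event_col="reintervention")
cph.print_summary()  # reports hazard ratios with 95% confidence intervals and p-values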
Results:
Eighteen patients had pulmonary artery sling repair. At operation, median age and weight were 6.9 months (interquartile range 4.1–18.1) and 9.5 kg (interquartile range 6.5–14.5), respectively. Median hospital length of stay was 12 days (interquartile range 5.8–55.3). Twelve patients (67%) had complete tracheal rings, of whom six (50%) underwent tracheoplasty (five concurrently with pulmonary artery sling repair). Airway re-intervention was required in five (83%) of the six patients who underwent tracheoplasty. One patient had intraoperative diagnosis and repair of pulmonary artery sling during repair of an unrelated lesion and required tracheoplasty 24 days post-operatively. One patient died 55 days after pulmonary artery sling repair and tracheoplasty following multiple arrests and re-interventions. Median post-operative follow-up for surviving patients was 6.3 years (interquartile range 11 months–13 years), at which time freedom from re-intervention was 61%. When controlling for patient and tracheal size, initial tracheoplasty was associated with decreased freedom from re-intervention (hazard ratio 21.9, 95% confidence interval 1.7–284.3, p = 0.018).
Conclusions:
In patients with pulmonary artery sling, tracheoplasty is associated with decreased freedom from re-intervention. In select patients with pulmonary artery sling and complete tracheal rings, conservative management without tracheoplasty is feasible. Further study is necessary to delineate objective indications for tracheoplasty.
Posthodiplostomum minimum utilizes a three-host life cycle with multiple developmental stages. The metacercarial stage, commonly known as ‘white grub’, infects the visceral organs of many freshwater fishes and was historically considered a host generalist due to its limited morphological variation among a wide range of hosts. In this study, infection data and molecular techniques were used to evaluate the host and tissue specificity of Posthodiplostomum metacercariae in centrarchid fishes. Eleven centrarchid species from three genera were collected from the Illinois portion of the Ohio River drainage and necropsied. Posthodiplostomum infection levels differed significantly by host age, host genus, and infection locality. Three Posthodiplostomum spp. were identified by DNA sequencing, two of which were relatively common within centrarchid hosts. Both common species were host specialists at the genus level, with one species restricted to Micropterus hosts and the other preferentially infecting Lepomis. Host specificity is likely dictated by physiological compatibility, and deviations from Lepomis host specificity may be related to host hybridization. Posthodiplostomum species also differed in their utilization of host tissues. Neither common species displayed strong genetic structure over the scale of this study, likely due to their utilization of bird definitive hosts.
Simulation-based education (SBE) is an important training strategy in emergency medicine (EM) postgraduate programs. This study sought to characterize the use of simulation in FRCPC-EM residency programs across Canada.
Methods
A national survey was administered to residents and knowledgeable program representatives (PRs) at all Canadian FRCPC-EM programs. Survey question themes included simulation program characteristics, the frequency of resident participation, the location and administration of SBE, institutional barriers, interprofessional involvement, content, assessment strategies, and attitudes about SBE.
Results
Resident and PR response rates were 63% (203/321) and 100% (16/16), respectively. Residents reported a median of 20 (range 0–150) hours of annual simulation training, with 52% of residents indicating that the time dedicated to simulation training met their needs. PRs reported the frequency of SBE sessions ranging from weekly to every 6 months, with 15 (94%) programs having an established simulation curriculum. Two (13%) of the programs used simulation for resident assessment, although 15 (94%) of PRs indicated that they would be comfortable with simulation-based assessment. The most common PR-identified barriers to administering simulation were a lack of protected faculty time (75%) and a lack of faculty experience with simulation (56%). Interprofessional involvement in simulation was strongly valued by both residents and PRs.
Conclusions
SBE is frequently used by Canadian FRCPC-EM residency programs. However, there exists considerable variability in the structure, frequency, and timing of simulation-based activities. As programs transition to competency-based medical education, national organizations and collaborations should consider the variability in how SBE is administered.
A nationwide initiative was implemented in February 2014 to decrease Clostridium difficile infections (CDI) in Veterans Affairs (VA) long-term care facilities. We report a baseline of national CDI data collected during the 2 years before the Initiative.
METHODS
Personnel at each of 122 reporting sites entered monthly retrospective CDI case data from February 2012 through January 2014 into a national database using case definitions similar to those used in the National Healthcare Safety Network Multidrug-Resistant Organism/CDI module. The data were evaluated using Poisson regression models to examine infection occurrences over time while accounting for admission prevalence and type of diagnostic test.
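To make the modelling approach concrete, here is a minimal, purely illustrative sketch of such a Poisson regression in Python with statsmodels, using resident-days as the exposure offset; the variable names, toy data, and library choice are assumptions and do not reproduce the study's actual model.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical monthly data for one facility: LTCF-onset CDI case counts,
# resident-days (exposure), a time index, admission prevalence, and an
# indicator for whether a molecular (PCR) diagnostic test was in use.
data = pd.DataFrame({
    "cdi_cases":        [3, 1, 0, 2, 4, 1, 2, 0],
    "resident_days":    [2400, 2300, 2500, 2450, 2350, 2420, 2380, 2460],
    "month_index":      [1, 2, 3, 4, 5, 6, 7, 8],
    "admit_prevalence": [0.4, 0.3, 0.5, 0.2, 0.6, 0.3, 0.4, 0.2],
    "pcr_test":         [0, 0, 0, 1, 1, 1, 1, 1],
})

# Poisson model of case counts over time, adjusted for admission prevalence and
# test type; log(resident-days) enters as an offset so coefficients describe rates.
model = smf.glm(
    "cdi_cases ~ month_index + admit_prevalence + pcr_test",
    data=data,
    family=sm.families.Poisson(),
    offset=np.log(data["resident_days"]),
).fit()
print(model.summary())  # the month_index coefficient gives the time trend in the rate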
RESULTS
During the 24-month analysis period, there were 100,800 admissions, 6,976,121 resident days, and 1,558 CDI cases. The pooled CDI admission prevalence rate (including recurrent cases) was 0.38 per 100 admissions, and the pooled nonduplicate/nonrecurrent community-onset rate was 0.17 per 100 admissions. The pooled long-term care facility–onset rate and the clinically confirmed (ie, diarrhea or evidence of pseudomembranous colitis) long-term care facility–onset rate were 1.98 and 1.78 per 10,000 resident days, respectively. Accounting for diagnostic test type, the long-term care facility–onset rate declined significantly (P=.05), but the clinically confirmed long-term care facility–onset rate did not.
CONCLUSIONS
VA long-term care facility CDI rates were comparable to those in recent reports from other long-term care facilities. The significant decline in the long-term care facility–onset rate but not in the clinically confirmed long-term care facility–onset rate may have been due to less testing of asymptomatic patients. Efforts to decrease CDI rates in long-term care facilities are necessary as part of a coordinated approach to decrease healthcare-associated infections.
Transcatheter pulmonary valve implantation is usually performed from a femoral venous (transfemoral) approach, but this may not be the optimal vascular access option in some patients. This study aimed to determine which group of patients might benefit from an internal jugular (transjugular) approach for transcatheter pulmonary valve implantation.
Methods
This multicentre retrospective study included all patients who underwent attempted transcatheter pulmonary valve placement in the right ventricular outflow tract between April 2010 and June 2012 at two large congenital heart centres. Patients were divided into two groups based on venous access site – transfemoral or transjugular. Patient characteristics, procedural outcomes, and complications were compared between groups.
Results
Eighty-one patients met the inclusion criteria (median age 16.4 years); the transjugular approach was used in 14 patients (17%). The transjugular group was younger (median age 11.9 versus 17.3 years), had a lower body surface area (mean 1.33 versus 1.61 m²), more often had moderate or greater tricuspid regurgitation (29% versus 7%), and had a higher ratio of right ventricular to systemic systolic pressure (mean 82.4 versus 64.7). Patients who required a transjugular approach after an unsuccessful transfemoral attempt had longer fluoroscopy times and procedure durations.
Conclusions
The transjugular approach for transcatheter pulmonary valve implantation is used infrequently, most often in younger and smaller patients. Technical limitations with a transfemoral approach may be anticipated in patients with moderate or greater tricuspid regurgitation or elevated right ventricular pressure. In these patients, a transjugular approach should be considered early.
Objectives: The aim of this study was to develop a decision support tool to assess the potential benefits and costs of new healthcare interventions.
Methods: The Canadian Partnership Against Cancer (CPAC) commissioned the development of a Cancer Risk Management Model (CRMM)—a computer microsimulation model that simulates individual lives one at a time, from birth to death, taking account of Canadian demographic and labor force characteristics, risk factor exposures, and health histories. Information from all the simulated lives is combined to produce aggregate measures of health outcomes for the population or for particular subpopulations.
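The microsimulation idea itself is simple to sketch. The following toy example, written in Python purely for illustration, simulates individual lives year by year and then aggregates the results; all probabilities, names, and outputs are invented and bear no relation to the CRMM's actual structure or parameters.

import random

def simulate_life(annual_cancer_prob=0.002, base_death_prob=0.003, max_age=110):
    """Simulate one life year by year; return (age at death, whether cancer occurred)."""
    had_cancer = False
    for age in range(max_age):
        if random.random() < annual_cancer_prob:
            had_cancer = True
        # Crude mortality that rises with age (illustrative only).
        if random.random() < base_death_prob * (1 + age / 10):
            return age, had_cancer
    return max_age, had_cancer

def simulate_population(n=100_000, seed=1):
    """Run many individual lives and combine them into aggregate population measures."""
    random.seed(seed)
    lives = [simulate_life() for _ in range(n)]
    mean_age_at_death = sum(age for age, _ in lives) / n
    lifetime_cancer_proportion = sum(1 for _, cancer in lives if cancer) / n
    return mean_age_at_death, lifetime_cancer_proportion

if __name__ == "__main__":
    mean_age, cancer_prop = simulate_population()
    print(f"Mean age at death: {mean_age:.1f} years")
    print(f"Lifetime cancer proportion: {cancer_prop:.3f}")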
Results: The CRMM can project the population health and economic impacts of cancer control programs in Canada, as well as the impacts of major risk factors, cancer prevention and screening programs, and new cancer treatments on population health and healthcare system costs. It estimates the direct costs of medical care as well as lost earnings and impacts on tax revenues. The lung and colorectal modules are available through the CPAC Web site (www.cancerview.ca/cancerrriskmanagement) to registered users, where structured scenarios can be explored for their projected impacts. Advanced users will be able to specify new scenarios or change existing modules by varying input parameters or by accessing the open-source code. Model development is now being extended to cervical and breast cancers.
A passive seismology experiment was conducted across the main overdeepening of Storglaciären in the Tarfala valley, northern Sweden, to investigate the spatial and temporal distribution of basal microseismic waveforms in relation to known dynamics of this small polythermal sub-arctic glacier. The high ablation rate made it difficult to keep geophones buried and well coupled to the glacier during the experiment and reduced the number of days of good-quality data collection. The characterization of typical and atypical waveforms showed that the dominant waveforms were from near-surface events such as crevassing. Waveforms resembling basal microseismic signals were very rare, and seldom observed on more than two seismic stations simultaneously. The analysis of waveforms, amplitudes and particle motions suggested a near-field origin for most events. Even though basal sliding is known to occur in the overdeepening, no convincing examples of basal waveforms were detected, suggesting basal microseismic signals are rare or difficult to detect beneath polythermal glaciers like Storglaciären. We discuss the reasons for failing to locate basal signals, consider the origin of common waveforms and make recommendations for setting up passive seismology experiments on glaciers with high ablation rates.
Edited by
Alex S. Evers, Washington University School of Medicine, St Louis; Mervyn Maze, University of California, San Francisco; Evan D. Kharasch, Washington University School of Medicine, St Louis