Many clinical trials leverage real-world data. Typically, these data are manually abstracted from electronic health records (EHRs) and entered into electronic case report forms (eCRFs), a time- and labor-intensive process that is also error-prone and may miss information. Automated transfer of data from EHRs to eCRFs has the potential to reduce the abstraction and entry burden as well as improve data quality and safety.
We conducted a test of automated EHR-to-eCRF data transfer for 40 participants in a clinical trial of hospitalized COVID-19 patients. We determined what proportion of coordinator-entered data could be populated automatically from the EHR (coverage) and how often the values from the automated EHR feed matched the values entered by study personnel for the actual study (concordance).
The automated EHR feed populated 10,081 of 11,952 (84%) coordinator-completed values. For fields where both the automation and study personnel provided data, the values matched exactly 89% of the time. Concordance was highest for daily laboratory results (94%), which also required the most personnel resources (30 minutes per participant). In a detailed analysis of 196 instances where the personnel-entered and automation-entered values differed, both a study coordinator and a data analyst agreed that 152 (78%) were the result of data entry error.
An automated EHR feed has the potential to significantly decrease study personnel effort while improving the accuracy of eCRF data.
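The coverage and concordance metrics above are simple ratios; a minimal sketch (field names and values are illustrative, not from the study) might compute them like this:

```python
# Minimal sketch of the coverage and concordance metrics.
# Field names and values below are illustrative, not from the study.

def coverage_and_concordance(crf, ehr):
    """crf, ehr: dicts mapping field name -> entered value (None if absent)."""
    coordinator_fields = [f for f, v in crf.items() if v is not None]
    automated = [f for f in coordinator_fields if ehr.get(f) is not None]
    coverage = len(automated) / len(coordinator_fields)  # fields the feed can populate
    matches = [f for f in automated if ehr[f] == crf[f]]
    concordance = len(matches) / len(automated) if automated else 0.0  # exact-match rate
    return coverage, concordance

crf = {"heart_rate": 88, "temp_c": 37.2, "wbc": 9.1, "note": "stable"}
ehr = {"heart_rate": 88, "temp_c": 37.1, "wbc": 9.1}  # no automated source for "note"
cov, conc = coverage_and_concordance(crf, ehr)
print(f"coverage={cov:.2f}, concordance={conc:.2f}")  # coverage=0.75, concordance=0.67
```

In the study's terms, the 84% figure corresponds to `coverage` aggregated over all fields, and the 89% figure to `concordance` over fields where both sources supplied a value.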
Recently, defaults have become celebrated as a low-cost, easy-to-implement nudge for promoting positive outcomes at both the individual and societal levels. In the present research, we conducted a large-scale field experiment (N = 32,508) in an educational context to test the effectiveness of a default intervention in promoting participation in a potentially beneficial achievement test. We found that a default manipulation increased the rate at which high school students registered to take the test but failed to produce a significant change in students’ actual rate of test-taking. These results join past literature documenting robust effects of default framings on initial choice but marked variability in the extent to which those choices ultimately translate to real-world outcomes. We suggest that this variability is attributable to differences in choice-to-outcome pathways – the extent to which the initial choice is causally determinative of the outcome.
Surgeons may offer different treatments for similar conditions on the basis of their compensation mechanism. This study examined differences in surgical practices between salaried and fee-for-service (FFS) surgeons for two common degenerative spine conditions.
The study assessed the practices of 63 spine surgeons across eight Canadian provinces (39 FFS and 24 salaried) who performed surgery for two lumbar conditions: stable spinal stenosis and degenerative spondylolisthesis. The study included a multicenter, ambispective review of consecutive spine surgery patients enrolled in the Canadian Spine Outcomes and Research Network registry between October 2012 and July 2018. The primary outcome was the difference in the types of procedures performed by the two groups. Secondary study variables included surgical characteristics, baseline patient factors, and patient-reported outcomes.
For stable spinal stenosis (n = 2234), salaried surgeons performed significantly fewer uninstrumented fusions (p < 0.05) than FFS surgeons. For degenerative spondylolisthesis (n = 1292), salaried surgeons performed significantly more instrumentation plus interbody fusions (p < 0.05). There were no significant differences in patient-reported outcomes between the two groups.
Surgeon compensation was associated with different approaches to stable lumbar spinal stenosis and degenerative lumbar spondylolisthesis: salaried surgeons chose a more conservative approach to spinal stenosis and a more aggressive approach to degenerative spondylolisthesis. Even so, remuneration is likely only a minor determinant of the differences in spinal surgery practice in Canada. Further research is needed to elucidate which variables, other than patient demographics and financial incentives, influence surgical decision-making.
Background: Cervical spondylotic myelopathy (CSM) is the leading cause of spinal cord impairment. In a public healthcare system, wait times to see spine specialists and eventually access surgical treatment for CSM can be substantial. The goals of this study were to determine consultation wait times (CWT) and surgical wait times (SWT), and to identify predictors of wait time length.
Methods: Consecutive patients enrolled in the Canadian Spine Outcomes and Research Network (CSORN) prospective observational CSM study from March 2015 to July 2017 were included. A data-splitting technique was used to develop and internally validate multivariable models of potential predictors.
Results: A CSORN query returned 264 CSM patients for CWT analysis; the median CWT was 46 days. CSM was mild in 31%, moderate in 35%, and severe in 33% of patients. There was a statistically significant difference in median CWT between the moderate and severe groups. Of these patients, 207 underwent surgical treatment; the median SWT was 42 days. There was a statistically significant difference in SWT between the mild/moderate and severe groups. Shorter symptom duration, less pain, lower BMI, and a lower SF-12 physical component score were predictive of shorter CWT; only baseline pain and medication duration were predictive of SWT. Both CWT and SWT were shorter than in a concurrent cohort of lumbar stenosis patients (p < 0.001).
Conclusions: Patients with shorter symptom or medication duration and less neck pain waited less time to see a spine specialist in Canada and to undergo surgical treatment. This study highlights some of the obstacles to overcome in expediting care for this patient population.
To determine the feasibility and value of developing a regional antibiogram for community hospitals.
Multicenter retrospective analysis of antibiograms.
A total of 20 community hospitals in central and eastern North Carolina and south central Virginia participated in this study.
We combined antibiogram data from participating hospitals for 13 clinically relevant gram-negative pathogen–antibiotic combinations. From this combined antibiogram, we developed a regional antibiogram based on the mean susceptibilities of the combined data.
We combined a total of 69,778 bacterial isolates across 13 clinically relevant gram-negative pathogen–antibiotic combinations (median per combination, 1,100; range, 174–27,428). Across all combinations, 69% of local susceptibility rates fell within 1 SD of the regional mean susceptibility rate, and 97% fell within 2 SD. No individual hospital had more than one pathogen–antibiotic combination with a local susceptibility rate more than 2 SD from the regional mean. All hospitals’ local susceptibility rates were within 2 SD of the regional mean for low-prevalence pathogens (<500 isolates cumulatively for the region).
Small community hospitals frequently cannot develop an accurate antibiogram due to a paucity of local data. A regional antibiogram is likely to provide clinically useful information to community hospitals for low-prevalence pathogens.
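The aggregation the study describes can be sketched in a few lines: pool local susceptibility rates for a pathogen–antibiotic combination, take the regional mean and SD, and flag hospitals outside 2 SD. The hospital identifiers and rates below are hypothetical, and this is not the study's actual analysis code:

```python
# Illustrative sketch (not the study's analysis code): pooling local
# susceptibility rates for one pathogen-antibiotic combination into a
# regional mean, then flagging hospitals more than 2 SD from that mean.
# Hospital IDs and rates below are hypothetical.
import statistics

local = {f"H{i:02d}": 0.90 for i in range(1, 10)}  # nine similar hospitals
local["H10"] = 0.50  # one hospital with markedly lower susceptibility

regional_mean = statistics.mean(local.values())
regional_sd = statistics.stdev(local.values())
outliers = {h for h, r in local.items() if abs(r - regional_mean) > 2 * regional_sd}

print(f"mean={regional_mean:.2f}, SD={regional_sd:.3f}, outliers={sorted(outliers)}")
# mean=0.86, SD=0.126, outliers=['H10']
```

A hospital with too few isolates for a reliable local rate could instead report the pooled regional mean for that combination, which is the paper's central suggestion for low-prevalence pathogens.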
The porcine small intestinal extracellular matrix reportedly has the potential to differentiate into viable myocardial cells. When used in tetralogy of Fallot repair, it may improve right ventricular function. We evaluated right ventricular function after repair of tetralogy of Fallot with extracellular matrix versus bovine pericardium.
Subjects with non-transannular repair of tetralogy of Fallot with at least 1 year of follow-up were selected. The extracellular matrix and bovine pericardium groups were compared. We used three-dimensional right ventricular ejection fraction, right ventricle global longitudinal strain, and tricuspid annular plane systolic excursion to assess right ventricular function.
The extracellular matrix group had 11 patients, whereas the bovine pericardium group had 10 patients. No differences between the groups were found regarding sex ratio, age at surgery, or cardiopulmonary bypass time. The follow-up period was 28±12.6 months in the extracellular matrix group and 50.05±17.6 months in the bovine pericardium group (p=0.001). The mean three-dimensional right ventricular ejection fraction (55.7±5.0% versus 55.3±5.2%, p=0.73), right ventricular global longitudinal strain (−18.5±3.0% versus −18.0±2.2%, p=0.44), and tricuspid annular plane systolic excursion (1.59±0.16 versus 1.59±0.2, p=0.93) were similar in the extracellular matrix and bovine pericardium groups, respectively. Right ventricular global longitudinal strain in healthy children is reported at −29±3% in the literature.
In a small cohort of patients undergoing non-transannular repair of tetralogy of Fallot followed up for more than 1 year, there was no significant difference in right ventricular function between the extracellular matrix and bovine pericardium patch groups. Lower right ventricular global longitudinal strain was noted in both groups compared with healthy children.
Giant ragweed has been increasing as a major weed of row crops in the last 30 yr, but quantitative data regarding its pattern and mechanisms of spread in crop fields are lacking. To address this gap, we conducted a Web-based survey of certified crop advisors in the U.S. Corn Belt and Ontario, Canada. Participants were asked questions regarding giant ragweed and crop production practices for the county of their choice. Responses were mapped, and correlation analyses were conducted among the responses to determine factors associated with giant ragweed populations. Respondents rated giant ragweed as the most or one of the most difficult weeds to manage in 45% of the 421 responding U.S. counties, and 57% of responding counties reported giant ragweed populations with herbicide resistance to acetolactate synthase inhibitors, glyphosate, or both. Results suggest that giant ragweed is increasing in crop fields outward from the east-central U.S. Corn Belt in most directions. Crop production practices associated with giant ragweed populations included minimum tillage, continuous soybean, and multiple-application herbicide programs; ecological factors included giant ragweed presence in noncrop edge habitats, early and prolonged emergence, and the presence of the seed-burying common earthworm in crop fields. Managing giant ragweed in noncrop areas could reduce its migration from noncrop habitats into crop fields and slow its spread. Where giant ragweed is already established in crop fields, a more diverse combination of crop species, tillage practices, and herbicide sites of action will be critical to reduce populations, disrupt emergence patterns, and select against herbicide-resistant genotypes. Incorporating a cereal grain into the crop rotation may help suppress early giant ragweed emergence and provide chemical or mechanical control options for late-emerging giant ragweed.
The Glacier Bay region of southeast Alaska, USA, and British Columbia, Canada, has undergone major glacier retreat since the Little Ice Age (LIA). We used airborne laser altimetry elevation data acquired between 1995 and 2011 to estimate the mass loss of the Glacier Bay region over four time periods (1995–2000, 2000–05, 2005–09, 2009–11). For each glacier, we extrapolated from center-line profiles to the entire glacier to estimate glacier-wide mass balance, and then averaged these results over the entire region using three different methods (normalized elevation, area-weighted method, and simple average). We found large interannual variability in the mass loss since 1995 compared with the long-term (post-LIA) average. For the full period (1995–2011) the average mass loss was 3.93 ± 0.89 Gt a−1 (0.6 ± 0.1 m w.e. a−1), compared with 17.8 Gt a−1 for the post-LIA (1770–1948) rate. Our mass loss rate is consistent with GRACE gravity signal changes for the 2003–10 period. Our results also show that there is a lower bias due to center-line profiling than was previously found by a digital elevation model difference method.
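Two of the three regional averaging schemes mentioned (the simple average and the area-weighted method) reduce to straightforward sums over glacier-wide specific balances. The sketch below uses made-up glacier areas and balances, plus the unit conversion from m w.e. over km² to Gt:

```python
# Illustrative sketch of two regional averaging schemes: a simple average
# and an area-weighted average of glacier-wide specific mass balances.
# Areas and balances below are made up, not the paper's data.

glaciers = [  # (area in km^2, specific balance in m w.e. a^-1)
    (120.0, -0.8),
    (45.0, -0.4),
    (300.0, -0.6),
]

simple_avg = sum(b for _, b in glaciers) / len(glaciers)
total_area = sum(a for a, _ in glaciers)
area_weighted = sum(a * b for a, b in glaciers) / total_area

# Regional mass change: 1 m w.e. * 1 km^2 = 1e6 m^3 w.e. = 1e9 kg = 1e-3 Gt
mass_change_gt = sum(a * b for a, b in glaciers) * 1e-3

print(f"simple={simple_avg:.2f} m w.e./a, weighted={area_weighted:.2f} m w.e./a, "
      f"mass change={mass_change_gt:.3f} Gt/a")
```

The area-weighted mean is pulled toward the balances of the largest glaciers, which is one reason the paper reports results from several averaging methods rather than one.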
The effects of vascular rarefaction (the loss of small arteries) on the circulation of blood are studied using a multiscale mathematical model that can predict blood flow and pressure in the systemic and pulmonary arteries. We augmented a model originally developed for the systemic arteries by Olufsen and coworkers and Ottesen et al. (2004) to (a) predict flow and pressure in the pulmonary arteries, and (b) predict pressure propagation along the small arteries in the vascular beds. The systemic and pulmonary arteries are modelled as separate bifurcating trees of compliant and tapering vessels. Each tree is divided into two parts representing the ‘large’ and ‘small’ arteries. Blood flow and pressure in the large arteries are predicted using a nonlinear cross-sectional-area-averaged model for a Newtonian fluid in an elastic tube with inflow obtained from magnetic resonance measurements. Each terminal vessel within the network of the large arteries is coupled to a vascular bed of small ‘resistance’ arteries, which are modelled as asymmetric structured trees with specified area and asymmetry ratios between the parent and daughter arteries. For the systemic circulation, each structured tree represents a specific vascular bed corresponding to major organs and limbs. For the pulmonary circulation, there are four vascular beds supplied by the interlobar arteries. This paper presents the first theoretical calculations of the propagation of the pressure and flow waves along systemic and pulmonary large and small arteries. Results for all networks are in agreement with published observations. Two studies were done with this model. First, we showed how rarefaction can be modelled by pruning the tree of arteries in the microvascular system. This was done by modulating parameters used for designing the structured trees. Results showed that rarefaction leads to increased mean and decreased pulse pressure in the large arteries. 
Second, we investigated the impact of decreasing vessel compliance in both large and small arteries. Results showed that the effects of decreased compliance in the large arteries far outweigh the effects observed when decreasing the compliance of the small arteries. We further showed that a decrease of compliance in the large arteries results in pressure increases consistent with observations of isolated systolic hypertension, as occurs in ageing.
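The structured-tree vascular beds described above lend themselves to a recursive input-resistance calculation. The sketch below is a heavily simplified steady-flow (Poiseuille) analogue of such a tree, not the authors' wave-propagation model; the viscosity, length-to-radius ratio, daughter scaling factors, and minimum radius are all illustrative assumptions:

```python
# Heavily simplified steady-flow sketch of the structured-tree idea, not
# the authors' wave-propagation model. Each vessel of radius r branches
# into daughters of radii ALPHA*r and BETA*r until a minimum radius is
# reached; total resistance combines the Poiseuille resistance of each
# segment in series with the parallel resistance of its daughter subtrees.
# All parameter values below are illustrative assumptions.
import math

MU = 3.5e-3             # blood viscosity, Pa*s (assumed)
L_RR = 50.0             # vessel length-to-radius ratio (assumed)
ALPHA, BETA = 0.9, 0.6  # asymmetric daughter radius scaling factors (assumed)
R_MIN = 30e-6           # terminal radius in m, below which recursion stops (assumed)

def poiseuille(r):
    """Poiseuille resistance of one segment of radius r and length L_RR * r."""
    return 8.0 * MU * (L_RR * r) / (math.pi * r**4)

def tree_resistance(r):
    """Total input resistance of the structured tree rooted at radius r."""
    if r < R_MIN:  # terminal vessel: segment resistance only
        return poiseuille(r)
    left = tree_resistance(ALPHA * r)
    right = tree_resistance(BETA * r)
    return poiseuille(r) + 1.0 / (1.0 / left + 1.0 / right)

# A narrower root vessel yields a higher total bed resistance, as expected.
print(tree_resistance(500e-6) < tree_resistance(300e-6))  # True
```

In this simplified picture, rarefaction (removing small-artery pathways) leaves fewer parallel branches to carry flow, which is why the paper finds it raises mean pressure in the large arteries.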