Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
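To make the comparison concrete, here is a minimal sketch of a probabilistic sensitivity analysis of this kind: study characteristics are drawn from broad ranges and data-related costs are compared under each design. Every distribution, coverage fraction, and cost figure below is an illustrative placeholder, not an input from the published model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims = 10_000  # iterations of the probabilistic sensitivity analysis

# Draw study characteristics from broad ranges (all placeholder values).
patients = rng.integers(100, 5_000, n_sims)             # enrolled patients
elements = rng.integers(20, 500, n_sims)                 # data elements per patient
abstraction_sec = rng.uniform(2, 60, n_sims)             # manual abstraction time per field
coordinator_cost = rng.uniform(0.005, 0.02, n_sims)      # coordinator cost, $ per second
registry_overhead = rng.uniform(5_000, 50_000, n_sims)   # registry linkage/cleaning, $
registry_coverage = rng.uniform(0.5, 1.0, n_sims)        # fraction of fields in the registry

# Standard design: every field is abstracted by hand.
fields = patients * elements
standard_cost = fields * abstraction_sec * coordinator_cost
# Registry design: fixed overhead plus manual abstraction of uncovered fields.
registry_cost = registry_overhead + (1 - registry_coverage) * standard_cost

savings = standard_cost - registry_cost
print(f"registry cheaper in {(savings > 0).mean():.1%} of simulations")
print(f"median data-related savings: ${np.median(savings):,.0f}")
```

The fraction of simulations in which the registry design is cheaper is the analogue of the 98.6% figure reported below.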
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4,300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
Introduction: Emergency care serves as an important health resource for First Nations (FN) persons. Previous reporting shows that FN persons visit emergency departments at almost double the rate of non-FN persons. Working collaboratively with FN partners, academic researchers and health authority staff, this study investigates FN emergency care patient visit statistics in Alberta over a five-year period. Methods: Through a population-based retrospective cohort study for the period from April 1, 2012 to March 31, 2017, patient demographics and emergency care visit characteristics for status FN patients in Alberta were analyzed and compared to non-FN statistics. Frequencies and percentages (%) describe patients and visits by categorical variables (e.g., Canadian Triage Acuity Scale (CTAS)). Means and standard deviations (medians and interquartile ranges (IQR)) describe continuous variables (e.g., distances) as appropriate for the data distribution. These descriptions are repeated for the FN and non-FN populations separately. Results: The data set contains 11,686,288 emergency facility visits by 3,024,491 unique persons. FN people make up 4.8% of unique patients and 9.4% of emergency care visits. FN persons live farther from emergency facilities than their non-FN counterparts (FN median 6 km, IQR 1-24; vs. non-FN median 4 km, IQR 2-8). FN visits arrive more often by ground ambulance (15.3% vs. 10%). FN visits are more commonly triaged as less acute (59% CTAS levels 4 and 5, compared to non-FN 50.4%). More FN visits end in leaving without completing treatment (6.7% vs. 3.6%). FN visits occur more often in the evening, 4:01pm to 12:00am (43.6% vs. 38.1%). Conclusion: In a collaborative validation session, FN Elders and health directors contextualized emergency care presentation in the evenings and receiving less acute triage scores as related to difficulties accessing primary care. They explained presentation in the evenings, arrival by ambulance, and leaving without completing treatment in terms of issues accessing transport to and from emergency facilities. Many factors interact to determine FN patients’ emergency care visit characteristics and outcomes. Further research needs to separate the impact of FN identity from factors such as reasons for visiting emergency facilities, distance traveled to care, and the size of facility where care is provided.
Neurocognitive impairments robustly predict functional outcome. However, heterogeneity in neurocognition is common within diagnostic groups, and data-driven analyses reveal homogeneous neurocognitive subgroups cutting across diagnostic boundaries.
To determine whether data-driven neurocognitive subgroups of young people with emerging mental disorders are associated with 3-year functional course.
Model-based cluster analysis was applied to neurocognitive test scores across nine domains from 629 young people accessing mental health clinics. Cluster groups were compared on demographic, clinical and substance-use measures. Mixed-effects models explored associations between cluster-group membership and socio-occupational functioning (using the Social and Occupational Functioning Assessment Scale) over 3 years, adjusted for gender, premorbid IQ, level of education, depressive, positive, negative and manic symptoms, and diagnosis of a primary psychotic disorder.
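As a sketch of the model-based clustering step (the general technique, not the study's actual pipeline), one can fit Gaussian mixture models over a range of component counts and select among them by BIC; the data below are synthetic stand-ins for the real nine-domain test scores.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Stand-in data: z-scores for 629 participants across nine
# neurocognitive domains (the study used real test batteries).
scores = rng.normal(size=(629, 9))

# Model-based clustering: fit Gaussian mixtures with 1-6 components
# and keep the lowest-BIC solution.
fits = [GaussianMixture(n_components=k, n_init=5, random_state=0).fit(scores)
        for k in range(1, 7)]
best = min(fits, key=lambda m: m.bic(scores))
labels = best.predict(scores)
print(best.n_components, np.bincount(labels))
```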
Cluster analysis of neurocognitive test scores derived three subgroups described as ‘normal range’ (n = 243, 38.6%), ‘intermediate impairment’ (n = 252, 40.1%), and ‘global impairment’ (n = 134, 21.3%). The major mental disorder categories (depressive, anxiety, bipolar, psychotic and other) were represented in each neurocognitive subgroup. The global impairment subgroup had lower functioning throughout the 3 years of follow-up; however, neither the global impairment (B = 0.26, 95% CI −0.67 to 1.20; P = 0.581) nor the intermediate impairment (B = 0.46, 95% CI −0.26 to 1.19; P = 0.211) subgroup differed from the normal range subgroup in its rate of change in functioning over time.
Neurocognitive impairment may follow a continuum of severity across the major syndrome-based mental disorders, with data-driven neurocognitive subgroups predictive of functional course. Of note, the global impairment subgroup had longstanding functional impairment despite continuing engagement with clinical services.
Cover crop residue can act as a mulch that will suppress weeds, but as the residue degrades, weed suppression diminishes. Biomass of cover crop residue is positively correlated to weed suppression, but little research is available regarding the composition of cover crop residue and its effect on weed suppression. Field experiments were conducted to determine the impact of cover crop residue properties (i.e., total carbon, total nitrogen, lignin, cellulose, and hemicellulose) on summer annual weed suppression and cash crop yield. Cover crop monocultures and mixtures were planted in the fall and designed to provide a range of biomass and residue properties. Cover crops were followed by corn (Zea mays L.) or soybean [Glycine max (L.) Merr.]. At termination, cover crop biomass and residue components were determined. Biomass ranged from 3,640 to 8,750 kg ha−1, and the carbon-to-nitrogen (C:N) ratio ranged from 12:1 to 36:1. As both cover crop biomass and C:N ratio increased, weed suppression and duration of suppression increased. For example, a C:N ratio of 9:1 is needed to suppress redroot pigweed (Amaranthus retroflexus L.) 50% at 4 wk after termination (WAT), and that increases to 16:1 and 20:1 to have 50% suppression at 6 and 8 WAT, respectively. Similarly, with biomass, 2,800 kg ha−1 is needed for 50% A. retroflexus suppression at 4 WAT, which increases to 5,280 kg ha−1 and 6,610 kg ha−1 needed for 50% suppression at 6 and 8 WAT, respectively. In general, similar trends were observed for pitted morningglory (Ipomoea lacunosa L.) and large crabgrass [Digitaria sanguinalis (L.) Scop.]. Corn and soybean yield increased as both cover crop biomass and C:N ratio increased where no weed control measures were implemented beyond cover crop. The same trend was observed with cash crop yield in the weed-free subblocks, with one exception. This research indicates that cover crop residue composition is important for weed control in addition to biomass.
To determine the effect of an electronic medical record (EMR) nudge in reducing total and inappropriate testing orders for hospital-onset Clostridioides difficile infection (HO-CDI).
An interrupted time series analysis of HO-CDI orders 2 years before and 2 years after the implementation of an EMR intervention designed to reduce inappropriate HO-CDI testing. Orders for C. difficile testing were considered inappropriate if the patient had received a laxative or stool softener in the previous 24 hours.
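For readers unfamiliar with the method, below is a minimal sketch of segmented Poisson regression, the usual model behind such interrupted time series analyses, run on simulated monthly counts; the model form and data are our assumptions, not the study's code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated monthly order counts: 24 months pre- and 24 post-intervention.
rng = np.random.default_rng(2)
df = pd.DataFrame({"month": np.arange(48)})
df["post"] = (df["month"] >= 24).astype(int)         # level change at intervention
df["months_post"] = np.maximum(df["month"] - 24, 0)  # change in trend after intervention
df["orders"] = rng.poisson(np.where(df["post"] == 1, 80, 100))

# Segmented Poisson regression; exponentiating the 'post' and
# 'months_post' coefficients gives the level-change and trend-change
# rate ratios reported in studies of this design.
fit = smf.glm("orders ~ month + post + months_post", data=df,
              family=sm.families.Poisson()).fit()
print(np.exp(fit.params))
```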
Four hospitals in an academic healthcare network.
All patients with a C. difficile order after hospital day 3.
Orders for C. difficile testing in patients administered a laxative or stool softener within the previous 24 hours triggered an EMR alert defaulting to cancellation of the order (“nudge”).
Of the 17,694 HO-CDI orders, 7% were inappropriate (8% preintervention vs 6% postintervention; P < .001). Monthly HO-CDI orders decreased by 21% postintervention (level-change rate ratio [RR], 0.79; 95% confidence interval [CI], 0.73–0.86), and the rate continued to decrease (postintervention trend change RR, 0.99; 95% CI, 0.98–1.00). The intervention was not associated with a level change in inappropriate HO-CDI orders (RR, 0.80; 95% CI, 0.61–1.05), but the postintervention inappropriate order rate decreased over time (RR, 0.95; 95% CI, 0.93–0.97).
An EMR nudge to minimize inappropriate ordering for C. difficile was effective at reducing HO-CDI orders, and likely contributed to decreasing the inappropriate HO-CDI order rate after the intervention.
A desired closure property in Bayesian probability is that an updated posterior distribution be in the same class of distributions – say Gaussians – as the prior distribution. When the updating takes place via a statistical model, one calls the class of prior distributions the ‘conjugate priors’ of the model. This paper gives (1) an abstract formulation of this notion of conjugate prior, using channels, in a graphical language, (2) a simple abstract proof that such conjugate priors yield Bayesian inversions and (3) an extension to multiple updates. The theory is illustrated with several standard examples.
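As a concrete instance of the closure property (our choice of illustration; the paper works with its own standard examples): a Beta prior on a Bernoulli success probability $\theta$ updates to another Beta distribution, so the Beta family is a conjugate prior for the Bernoulli model:

$$\pi(\theta) = \mathrm{Beta}(\theta \mid \alpha, \beta) \propto \theta^{\alpha-1}(1-\theta)^{\beta-1},$$

$$\pi(\theta \mid k \text{ successes in } n \text{ trials}) \propto \theta^{k}(1-\theta)^{n-k}\,\theta^{\alpha-1}(1-\theta)^{\beta-1} = \mathrm{Beta}(\theta \mid \alpha + k,\ \beta + n - k).$$

The paper's contribution is to state and prove this pattern abstractly, for channels in a graphical language, rather than case by case.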
Tomasello describes how the sense of moral obligation emerges from a shared perspective with collaborative partners and in-group members. Our commentary expands this framework to accommodate multiple social identities, where the normative standards associated with diverse group memberships can often conflict with one another. Reconciling these conflicting obligations is argued to be a central part of human morality.
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. All together, these programs recorded 20 000 h, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The science of studying diamond inclusions for understanding Earth history has developed significantly over the past decades, with new instrumentation and techniques applied to diamond sample archives revealing the stories contained within diamond inclusions. This chapter reviews what diamonds can tell us about the deep carbon cycle over the course of Earth’s history. In particular, it examines how the geochemistry of diamonds and their inclusions informs us about the deep carbon cycle, the origin of diamonds in Earth’s mantle, and the evolution of diamonds through time.
On many Australian commercial pig farms, groups of growing pigs are mass-medicated through their drinking water with selected antimicrobials for short periods to manage herd health. However, delivery of medication in drinking water cannot be assumed to deliver an equal dose to all animals in a group. There is substantial between-animal variability in systemic exposure to an antimicrobial (i.e. the antimicrobial concentration in plasma), resulting in under-dosing or over-dosing of many pigs. Three sources of this between-animal variability during a water medication dosing event are differences in: (1) concentration of the active constituent of the antimicrobial product in water available to pigs at drinking appliances in each pen over time, (2) medicated water consumption patterns of pigs in each pen over time, and (3) pharmacokinetics (i.e. oral bioavailability, volume of distribution and clearance between pigs and within pigs over time). It is essential that the factors operating on each farm that influence the range of systemic exposures of pigs to an antimicrobial be taken into account in antimicrobial administration regimens to reduce under-dosing and over-dosing.
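As context for point (3), the standard pharmacokinetic relationship for average steady-state plasma concentration under repeated oral dosing (a textbook identity, not a formula from this article) makes the dependence explicit:

$$C_{\mathrm{ss,avg}} = \frac{F \cdot D}{CL \cdot \tau}$$

where $F$ is oral bioavailability, $D$ the dose ingested per dosing interval, $CL$ clearance, and $\tau$ the dosing interval. Between-animal differences in $F$ and $CL$ scale systemic exposure directly, and variable medicated-water intake makes $D$ itself differ from pig to pig.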
Recent years have seen an exponential increase in the variety of healthcare data captured across numerous sources. However, mechanisms to leverage these data sources to support scientific investigation have remained limited. In 2013 the Pediatric Heart Network (PHN), funded by the National Heart, Lung, and Blood Institute, developed the Integrated CARdiac Data and Outcomes (iCARD) Collaborative with the goals of leveraging available data sources to aid in efficiently planning and conducting PHN studies; supporting integration of PHN data with other sources to foster novel research otherwise not possible; and mentoring young investigators in these areas. This review describes lessons learned through the development of iCARD, initial efforts and scientific output, challenges, and future directions. This information can aid in the use and optimisation of data integration methodologies across other research networks and organisations.
Antipseudomonal carbapenems are an important target for antimicrobial stewardship programs. We evaluated the impact of formulary restriction and preauthorization on relative carbapenem use for medical and surgical intensive care units at a large, urban academic medical center using interrupted time-series analysis.
Background: In preparation for the July 2019 rollout of Competence by Design (CBD) in Canadian neurosurgery residency training, the University of Calgary launched a pilot program of five representative entrustable professional activities (EPAs) using the One45 platform. Our study objectives were to examine the uptake of CBD among residents and faculty and to quantify barriers to CBD implementation. Methods: Phase one of the One45-based CBD pilot program launched on November 1st, 2018 and ended on January 8th, 2019, after which a questionnaire was sent to each participating resident. The questionnaire examined the number of EPAs initiated; measures of favourability, importance, and ease of use; and barriers encountered. Results: The survey response rate was 93.8% (15/16 residents). 66.7% of residents felt that CBD was at least moderately important to their education. Over the 10 study weeks, only 8 EPAs were completed (50 were expected), five of which were completed by a single resident. The major reported barriers to CBD implementation were the time involved (50.0%) and technical unfamiliarity with the platform itself (50.0%). Conclusions: This study demonstrates the critical importance of piloting a CBD program prior to official implementation, as buy-in was considerably slower than anticipated. Technical and time barriers exist that need to be rectified in advance of July 2019.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study allocated 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significant proportion of patients with moderate/severe pain (45.8%) demonstrated an improvement into mild/no pain, whereas 27.2% with mild/no pain demonstrated worsening into moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, with improvements reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
Horseweed is a problematic weed to control, especially in no-tillage production. Increasing cases of herbicide resistance have exacerbated the problem, necessitating alternative control options and an integrated weed management approach. Field experiments were conducted to evaluate horseweed suppression from fall-planted cover crop monocultures and mixtures as well as two fall-applied residual herbicide treatments. Prior to cover crop termination, horseweed density was reduced by 88% to 96% from cover crops. At cover crop termination in late spring, cereal rye biomass was 7,671 kg ha−1, which was similar to cereal rye–containing mixtures (7,720 kg ha−1) but greater than legumes in monoculture (3,335 kg ha−1). After cover crops were terminated in late spring using a roller crimper, corn and soybeans were planted and horseweed was evaluated using density counts, visible ratings, and biomass collection until harvest. Forage radish winterkilled, offering no competition in late winter or biomass to contribute to horseweed suppression after termination. Excluding forage radish in monoculture, no difference in horseweed suppression was detected between cereal rye–containing cover crops and legumes (crimson clover and hairy vetch) in monoculture. Likewise, horseweed suppression was similar between monocultures and mixtures, with the exception of one site-year in which mixtures provided better suppression. In this experiment, the cover crop treatments performed as well as or better than the fall-applied residual herbicides, flumioxazin + paraquat and metribuzin + chlorimuron-ethyl. These results indicate that fall-planted cover crops are a viable option to suppress horseweed and can be an effective part of an integrated weed management program. Furthermore, cover crop mixtures can be used to gain the benefits of legume or brassica cover crop species without sacrificing horseweed suppression.
A significant proportion of adults admitted to psychiatric hospitals are homeless, yet little is known about their outcomes after discharge from a psychiatric hospitalisation. The aim of this study was to assess the impact of being homeless at the time of discharge from a psychiatric hospitalisation on psychiatric hospital readmission, mental health-related emergency department (ED) visits and physician-based outpatient care.
This was a population-based cohort study using health administrative databases. All patients discharged from a psychiatric hospitalisation in Ontario, Canada, between 1 April 2011 and 31 March 2014 (N = 91 028) were included and categorised as homeless or non-homeless at the time of discharge. Psychiatric hospitalisation readmission rates, mental health-related ED visits and physician-based outpatient care were measured within 30 days following hospital discharge.
There were 2052 (2.3%) adults identified as homeless at discharge. Homeless individuals at discharge were significantly more likely to have a readmission within 30 days following discharge (17.1 v. 9.8%; aHR = 1.43 (95% CI 1.26–1.63)) and to have an ED visit (27.2 v. 11.6%; aHR = 1.87 (95% CI 1.68–2.0)). Homeless individuals were also over 50% less likely to have a psychiatrist visit (aHR = 0.46 (95% CI 0.40–0.53)).
Homeless adults are at higher risk of readmission and ED visits following discharge. They are also much less likely to receive post-discharge physician care. Efforts to improve access to services for this vulnerable population are required to reduce acute care service use and improve care continuity.
Hospital-onset bacteremia and fungemia (HOB), a potential measure of healthcare-associated infections, was evaluated in a pilot study among 60 patients across 3 hospitals. Two-thirds of all HOB events and half of nonskin commensal HOB events were judged as potentially preventable. Follow-up studies are needed to further develop this measure.