Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits of registry utilisation with those of other aspects of pragmatic trial designs, which may not all be as broadly applicable. In this study, we sought to build a practical tool that investigators across disciplines could use to estimate the range of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000, varying with study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
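As a rough illustration of the kind of probabilistic sensitivity analysis described above, the sketch below draws hypothetical study characteristics from wide ranges and compares manual data-abstraction costs against a fixed registry linkage-and-cleaning cost. Every parameter name, range, and cost figure here is an assumption for illustration only, not an input from the study's model.

```python
import random

def simulate_cost_difference(n_trials=10_000, seed=0):
    """Monte Carlo sketch: draw study characteristics from wide ranges and
    compare data-handling costs under each trial design.
    All parameter ranges below are illustrative, not the paper's inputs."""
    rng = random.Random(seed)
    savings = []
    for _ in range(n_trials):
        n_patients = rng.randint(100, 5000)           # study size
        fields_per_patient = rng.randint(10, 200)     # registry data elements
        seconds_per_field = rng.uniform(2, 30)        # manual abstraction speed
        coordinator_rate = rng.uniform(20, 45)        # USD per hour
        registry_fixed_cost = rng.uniform(5_000, 50_000)  # linkage/cleaning

        n_fields = n_patients * fields_per_patient
        manual_cost = n_fields * seconds_per_field / 3600 * coordinator_rate
        savings.append(manual_cost - registry_fixed_cost)

    frac_registry_cheaper = sum(s > 0 for s in savings) / n_trials
    return frac_registry_cheaper, savings

frac, savings = simulate_cost_difference()
print(f"registry cheaper in {frac:.1%} of simulated studies")
```

Varying one parameter range at a time while holding the others fixed would give the one-way sensitivity analyses the abstract describes.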
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of clinical and environmental isolates suspected in those clusters. Recent advances in genomic sequencing and cloud computing now allow rapid molecular typing of infecting isolates.
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Our work supports the assertion that integrating genomic and clinical epidemiologic data can augment infection control surveillance, both by identifying cross-transmission events that would otherwise be missed and by excluding misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise for reducing healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
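One common way to turn pairwise single-nucleotide variant (SNV) counts into putative transmission clusters is single-linkage grouping under a distance threshold. The sketch below illustrates that general idea with a union-find structure; the function name, threshold, and isolate distances are all invented for illustration and are not the study's pipeline.

```python
def cluster_isolates(distances, threshold=15):
    """distances: dict mapping (isolate_a, isolate_b) -> pairwise SNV count.
    Returns clusters (sets of isolate names) linked at <= threshold SNVs.
    The threshold of 15 is a placeholder, not a validated cutoff."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for (a, b), snvs in distances.items():
        find(a); find(b)                   # register both isolates
        if snvs <= threshold:
            union(a, b)                    # single linkage: any close pair merges

    groups = {}
    for iso in parent:
        groups.setdefault(find(iso), set()).add(iso)
    return [g for g in groups.values() if len(g) > 1]  # clusters of >= 2

# Hypothetical distances: two closely related S. aureus isolates, one distant.
example = {("SA01", "SA02"): 4, ("SA01", "SA03"): 210, ("SA02", "SA03"): 205}
print([sorted(c) for c in cluster_isolates(example)])  # [['SA01', 'SA02']]
```

In practice such genomic clusters would then be prioritized for epidemiologic review using the clinical data, as the abstract emphasizes.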
The 2015-2016 academic year was the fourth year since the Accreditation Council for Graduate Medical Education (ACGME; Chicago, Illinois USA) accredited Emergency Medical Services (EMS) fellowships, and the first year an in-training examination was given. Soon, ACGME-accredited fellowship education will be the sole path to EMS board certification when the practice pathway closes after 2019. This project aimed to describe the current class of EMS fellows at ACGME-accredited programs and their current educational opportunities to better understand current and future needs in EMS fellowship education.
This was a cross-sectional survey of EMS fellows in ACGME-accredited programs in conjunction with the first EMS In-Training Examination (EMSITE) between April and June 2016. Fellows completed a 14-question survey composed of multiple-choice and free-response questions. Basic frequency statistics were performed on their responses.
Fifty fellows from 35 ACGME-accredited programs completed the survey. The response rate was 100%. Forty-eight (96%) fellows reported previous training in emergency medicine. Twenty (40%) were undergoing fellowship training at the same institution as their prior residency training. Twenty-five (50%) fellows performed direct patient care aboard a helicopter during their fellowship. Thirty-three (66%) fellows had a dedicated physician response vehicle for fellows. All fellows reported using the National Association of EMS Physicians (NAEMSP; Overland Park, Kansas USA) textbooks as their primary reference. Fellows felt most prepared for the Clinical Aspects questions and least prepared for Quality Management and Research questions on the board exam.
These data provide insight into the characteristics of EMS fellows in ACGME-accredited programs.
Clemency B, Martin-Gill C, Rall N, Patel D, Myers J. US Emergency Medical Services Fellows. Prehosp Disaster Med. 2018;33(3):339–341.
Although procedural sedation for cardioversion is a common event in emergency departments (EDs), there is limited evidence surrounding medication choices. We sought to evaluate geographic and temporal variation in sedative choice at multiple Canadian sites, and to estimate the risk of adverse events due to sedative choice.
This is a secondary analysis of one health records review, the Recent Onset Atrial Fibrillation or Flutter-0 (RAFF-0 [n=420, 2008]), and one prospective cohort study, the Recent Onset Atrial Fibrillation or Flutter-1 (RAFF-1 [n=565, 2010–2012]), at eight and six Canadian EDs, respectively. Sedative choices within and among EDs were quantified, and the risk of adverse events was examined with adjusted and unadjusted comparisons of sedative regimens.
In RAFF-0 and RAFF-1, the combination of propofol and fentanyl was most popular (63.8% and 52.7%) followed by propofol alone (27.9% and 37.3%). There were substantially more adverse events in the RAFF-0 data set (13.5%) versus RAFF-1 (3.3%). In both data sets, the combination of propofol/fentanyl was not associated with increased adverse event risk compared to propofol alone.
There is marked variability in procedural sedation medication choice for direct current cardioversion in Canadian EDs, with increased use of propofol alone as a sedation agent over time. The risk of adverse events from procedural sedation during cardioversion is low but not insignificant. We did not identify an increased risk of adverse events with the addition of fentanyl as an adjunctive analgesic to propofol.
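For readers reproducing the kind of unadjusted comparison described above, an odds ratio with a Woolf (log-normal) confidence interval can be computed directly from a 2×2 table. The counts below are illustrative placeholders, not the RAFF data.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-normal) 95% CI for a 2x2 table:
        a = events with propofol/fentanyl,  b = non-events with propofol/fentanyl,
        c = events with propofol alone,     d = non-events with propofol alone."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only:
or_, lo, hi = odds_ratio_ci(12, 288, 10, 255)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An adjusted comparison, as in the study, would instead condition on patient and site characteristics (for example via logistic regression).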
This roundtable reflects on the processes of de-centering from multiple lenses and temporal placements inside the research and creative process. It is based on a collaborative, intermedia, and multitemporal contemporary performance/art installation informed by long-term ethnographic research of dance and ritual in Ghana and Cuba. Roundtable participants will excavate the process of conducting the research and creating the installation that continues to exhibit internationally at venues ranging from art galleries and libraries to rural research field sites. The installation offers a matrix of layered artistic exploration grounded in ethnographic inquiry that does not sit squarely inside a singular discipline. Inherently transdisciplinary, with multiple entanglements and porous boundaries, it offers “interpretive frictions” at the borders of ethnography, performance, material culture, research-based choreography, and embodiment of lived experience.
The current study examines the impact of a nutrition rating system on consumers’ food purchases in supermarkets.
Aggregate sales data for 102 categories of food (over 60 000 brands) on a weekly basis for 2005–2007 from a supermarket chain of over 150 stores are analysed. Change in weekly sales of nutritious and less nutritious foods, after the introduction of a nutrition rating system on store shelves, is calculated, controlling for seasonality and time trends in sales.
One hundred and sixty-eight supermarket stores in the north-east USA, from January 2005 to December 2007.
Consumers purchasing goods at the supermarket chain during the study period.
After the introduction of the nutrition ratings, overall weekly food sales declined by an average of 3637 units per category (95 % CI –5961, –1313; P<0·01). Sales of less nutritious foods fell by 8·31 % (95 % CI –13·50, –2·80 %; P=0·004), while sales of nutritious foods did not change significantly (P=0·21); as a result, the percentage of food purchases rated as nutritious rose by 1·39 % (95 % CI 0·58, 2·20 %; P<0·01). The decrease in sales of less nutritious foods was greatest in the categories of canned meat and fish, soda pop, bakery and canned vegetables.
The introduction of the nutrition ratings led shoppers to buy a more nutritious mix of products. Interestingly, it did so by reducing purchases of less nutritious foods rather than by increasing purchases of nutritious foods. In evaluating nutrition information systems, researchers should focus on the entire market basket, not just sales of nutritious foods.
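The "controlling for seasonality and time trends" approach described above can be sketched as an interrupted time-series regression: weekly sales on an intercept, a linear trend, month dummies, and a post-introduction indicator. Everything below (the data-generating numbers, the week-to-month mapping, the helper name `ols`) is synthetic and hypothetical, not the study's data or code.

```python
def ols(X, y):
    """Solve the normal equations (X'X) beta = X'y by Gaussian elimination
    with partial pivoting; X is a list of rows, y a list of outcomes."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)]
         for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for p in range(k):                         # forward elimination
        piv = max(range(p, k), key=lambda r: abs(A[r][p]))
        A[p], A[piv] = A[piv], A[p]
        b[p], b[piv] = b[piv], b[p]
        for r in range(p + 1, k):
            f = A[r][p] / A[p][p]
            for c in range(p, k):
                A[r][c] -= f * A[p][c]
            b[r] -= f * b[p]
    beta = [0.0] * k
    for p in range(k - 1, -1, -1):             # back substitution
        beta[p] = (b[p] - sum(A[p][c] * beta[c]
                              for c in range(p + 1, k))) / A[p][p]
    return beta

# Synthetic weekly sales: level 1000, trend -2/week, monthly seasonal bumps,
# and a -50 unit drop after a rating system "appears" at week 52.
SEASON = [30, 20, 10, 0, -10, -20, -30, -20, -10, 0, 10, 20]
rows, sales = [], []
for week in range(104):                        # two synthetic years
    month = (week * 12 // 52) % 12
    post = 1 if week >= 52 else 0
    rows.append([1, week, post] + [1 if month == m else 0 for m in range(1, 12)])
    sales.append(1000 - 2 * week - 50 * post + SEASON[month])
beta = ols(rows, sales)
print(f"estimated post-introduction change: {beta[2]:.1f} units per category")
```

Because the month dummies absorb the seasonal bumps and the trend term absorbs the decline, the post-introduction coefficient recovers the built-in drop.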
This article will narrate the process of working in artistic collaboration utilizing traditional Ghanaian dance forms and Western dance-making methods while incorporating a common artistic thread between the two cultures. Invited to create a new work of choreography for the Ghana Dance Ensemble (GDE), the author as guest artist choreographer explored ways of creating a hybrid dance work that honored the artistic footprint of GDE. The choreographer engaged company members, consisting of both dancers and musicians, in daily rehearsals and dialogue about the artistic process and the aesthetic roots from which each artist was grounded. Compositional structures were explored cross-culturally. Traveling out to several field sites, the choreographer was able to view and participate in sacred ceremony for more grounding and artistic information. From this process, a dialogic space was created in which new meanings were shared between cultures and traditional artistic values re-imagined. Dialogue through conversation was not the only exchange of importance. An additional dialogue was that of dancing bodies viewing each other, adapting and integrating change firmly grounded in each other's originating aesthetic footprint. Equally important was the exchange in a culture where it is inherent that the music sounds the dance and the dance moves the music. Thus the dialogue extended itself where the choreographer tried on new ways of thinking about the sounding body just as GDE integrated the choreographer's approaches and made it their own through their own processes of creative invention.
The 2000 presidential election made various electoral institutions—from ballot format to voting mechanisms—suddenly prominent in public debate. One institution that garnered little attention, but nonetheless affected the outcome, was apportionment. A few commentators, looking ahead to 2004, noticed that Bush would have won more comfortably had the apportionment based on the 2000 census already been in place for the 2000 election. Little attention, however, was paid to the method by which 1990 census data were used to generate the 1992–2000 apportionment, even though there are many ways to perform that allocation, the United States has used different methods over its history, and the precise algorithm turned out, in this instance, to matter. More generally, previous discussions of apportionment methods have neglected the point that allocation to states of US House seats simultaneously determines Electoral College weights. Since the Electoral College has built-in biases favoring small states, an apportionment method that partially offsets this bias might be justifiable. We revisit some criteria by which one might prefer one apportionment rule to another, in light of this double duty.
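The method of equal proportions (Huntington-Hill), used for US House apportionment since 1941, illustrates how the precise algorithm can matter: after each state receives one seat, remaining seats go one at a time to the state with the highest priority value pop / sqrt(n(n+1)), where n is its current seat count. A minimal sketch with made-up populations (the state names and numbers below are illustrative):

```python
import heapq
from math import sqrt

def huntington_hill(populations, seats):
    """Apportion `seats` among states by the method of equal proportions.
    populations: dict of state name -> population."""
    alloc = {s: 1 for s in populations}            # every state gets one seat
    heap = [(-pop / sqrt(1 * 2), s) for s, pop in populations.items()]
    heapq.heapify(heap)                            # max-priority via negation
    for _ in range(seats - len(populations)):
        _, s = heapq.heappop(heap)                 # highest priority state
        alloc[s] += 1
        n = alloc[s]
        heapq.heappush(heap, (-populations[s] / sqrt(n * (n + 1)), s))
    return alloc

# Toy example: 10 seats among 3 hypothetical states.
result = huntington_hill({"A": 6_000_000, "B": 3_000_000, "C": 1_000_000}, 10)
print(result)  # {'A': 6, 'B': 3, 'C': 1}
```

Swapping the divisor sqrt(n(n+1)) for, say, n + 1 (Jefferson) or n + 0.5 (Webster) changes which states gain the marginal seats, which is exactly the sensitivity the abstract highlights.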
To determine the timing of community-onset Clostridium difficile–associated disease (CDAD) relative to the patient's last healthcare facility discharge, the association of postdischarge cases with healthcare facility–onset cases, and the influence of postdischarge cases on overall rates and interhospital comparison of rates of CDAD.
Retrospective cohort study for the period January 1, 2005, through December 31, 2005.
Catchment areas of 6 acute care hospitals in North Carolina.
We reviewed medical and laboratory records to determine the date of symptom onset, the dates of hospitalization, and stool C. difficile toxin assay results for patients with CDAD who had diarrhea and positive toxin assay results. Cases were classified as healthcare facility–onset if they were diagnosed more than 48 hours after admission. Cases were defined as community-onset if they were diagnosed in the community or within 48 hours after admission, and were also classified on the basis of the time since the last discharge: if within 4 weeks, community-onset, healthcare facility–associated (CO-HCFA); if 4–12 weeks, indeterminate exposure; and if more than 12 weeks, community-associated. Pearson's correlation coefficient was used to assess the association between monthly rates of healthcare facility–onset, healthcare facility–associated (HO-HCFA) cases and CO-HCFA cases. We performed interhospital rate comparisons using HO-HCFA cases only and using both HO-HCFA and CO-HCFA cases.
Of 1046 CDAD cases, 442 (42%) were HO-HCFA cases and 604 (58%) were community-onset cases. Of the 604 community-onset cases, 94 (15%) were CO-HCFA, 40 (7%) were of indeterminate exposure, and 208 (34%) were community-associated. A modest correlation was found between monthly rates of HO-HCFA cases and CO-HCFA cases across the 6 hospitals (r = 0.63, P<.001). Interhospital rankings changed for 6 of 11 months if CO-HCFA cases were included.
A substantial proportion of community-onset cases of CDAD occur less than 4 weeks after discharge from a healthcare facility, and inclusion of CO-HCFA cases influences interhospital comparisons. Our findings support the use of a proposed definition of healthcare facility–associated CDAD that includes cases that occur within 4 weeks after discharge.
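The surveillance definitions above (diagnosis more than 48 hours after admission; discharge windows of 4 and 12 weeks) translate directly into a small classifier. This is a sketch of those rules, not the study's code; in particular, treating a case with no prior facility discharge as community-associated is an assumption made here for completeness.

```python
def classify_cdad_case(hours_after_admission=None, weeks_since_discharge=None):
    """Classify a CDAD case using the surveillance definitions described above.
    hours_after_admission: hours from admission to diagnosis
        (None if diagnosed in the community).
    weeks_since_discharge: weeks since last healthcare facility discharge
        (None if no prior discharge; assumed community-associated here)."""
    if hours_after_admission is not None and hours_after_admission > 48:
        return "healthcare facility-onset (HO-HCFA)"
    # Otherwise the case is community-onset; classify by time since discharge.
    if weeks_since_discharge is None:
        return "community-associated"
    if weeks_since_discharge <= 4:
        return "community-onset, healthcare facility-associated (CO-HCFA)"
    if weeks_since_discharge <= 12:
        return "indeterminate exposure"
    return "community-associated"

print(classify_cdad_case(hours_after_admission=72))
# -> healthcare facility-onset (HO-HCFA)
print(classify_cdad_case(weeks_since_discharge=2))
# -> community-onset, healthcare facility-associated (CO-HCFA)
print(classify_cdad_case(weeks_since_discharge=20))
# -> community-associated
```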
Bird vetch is a perennial Eurasian plant that, unlike many exotic weed species, can invade low-fertility areas that have not been disturbed. It is also common in pastures, woodland, and tall forb communities. Bird vetch is expanding along Alaskan roadsides, in urbanized areas, and in low-density forests. A greenhouse study was conducted to determine the efficacy of six herbicide treatments applied at reduced rates in 2005 and again in 2006 for bird vetch seedling control. Bird vetch seedlings were tolerant of reduced rates of chlorsulfuron and 2,4-DB; however, complete control was achieved with rates of clopyralid, dicamba plus diflufenzopyr, triclopyr, and 2,4-D that were one-fourth to one-eighth of the full registered rate. These results will be important for developing effective, low-cost methods for controlling bird vetch in Alaska, especially on the outer margins of infestations.
The integration of vector (and operator) valued functions with respect to vector (and operator) valued measures can be simplified by assuming that the measures involved take values in the positive elements of a Banach lattice.
The connection between Clifford analysis and the Weyl functional calculus for a d-tuple of bounded selfadjoint operators is used to prove a geometric condition due to J. Bazer and D. H. Y. Yen for a point to be in the support of the Weyl functional calculus for a pair of hermitian matrices. Examples are exhibited in which the support has gaps.
The effects of supplementing a basal diet of silage and hay with increasing amounts of harvested spring pasture, or with lupin and wheat, on the composition of milk and the consequent effects on cheese composition and yield were investigated in an indoor feeding study. Milk was collected from five groups of eight cows in mid lactation offered different diets and manufactured into Cheddar cheese on a pilot scale. Milk from cows given the lupin–wheat (LW) and the high pasture level (HP) diets produced low moisture cheese. Cheese produced with milk from cows given the control diet was high in moisture content compared with that made with milk from cows offered the LW diet. Cheese yields from the milk of cows offered the HP and LW diets were greater than from the milk of cows on the control diet, and were associated with the higher casein concentrations of these milks. Casein number was higher in milk from diets supplemented with pasture but was not an indicator of the functional properties of milk that affected cheese moisture. The proportion of β-casein in milk from cows offered the HP diet was higher and that of γ-casein lower than in milk from cows given the LW supplement, although cheese moisture content was similar with both diets. Milk from cows offered the HP diet had a greater inorganic P concentration than that from cows given the LW diet, although the dietary intake of P was higher for the LW diet. The significance of the effect of dietary P intake on the concentration of inorganic P in milk and hence its suitability for cheesemaking was apparent when dietary P intake was low, as shown in milk produced by cows offered the control diet.
The effects of supplementing cows' diets with protein and energy on milk composition and the composition and yield of Cheddar cheese were investigated. This research addresses the problems of seasonal reduction in the capacity of cheese curds to expel moisture as observed in parts of south-eastern Australia. Milk was collected from cows offered a basal diet of silage and hay supplemented with different sources and levels of dietary protein and energy. The protein supplements were sunflower, canola, cottonseed meal and lupin, and the energy supplements were maize grain, oats, wheat and barley. This milk was used to manufacture Cheddar cheese on a pilot scale. Cheese moisture content was dependent on the source and level of dietary protein and energy. Milk from cows offered the lupin protein supplements and wheat energy supplements consistently produced cheese with a lower moisture content and moisture in fat-free matter. Milk from these supplemented diets had increased casein concentrations and higher proportions of αs2-casein than milk from the poor quality control diet. Cheese yield was directly related to the total casein concentration of milk, but was not influenced by differences in casein composition. Supplementing the cows' diets increased the inorganic P, Mg and Ca concentrations in milk. A low inorganic P concentration in milk from cows offered the control diet was caused by a low intake of dietary P. These findings showed that changes in the mineral and casein composition of milk, associated with diet, could influence the composition of