Successful pigweed management requires an integrated strategy to delay the development of resistance to any single control tactic. Field trials were implemented during 2017 and 2018 in three counties in Kansas on dryland (limited rainfall, nonirrigated), glufosinate-resistant soybean. The objective was to assess pigweed control with combinations of a winter wheat cover crop (CC), three soybean row widths (76, 38, and 19 cm), row-crop cultivation 2.5 weeks after planting (WAP), and an herbicide program to develop integrated pigweed management recommendations. All combinations of the four components were assessed across 16 treatments. All treatments with the herbicide program resulted in excellent (>97%) pigweed control and were analyzed separately from the other components. Treatments containing row-crop cultivation reduced pigweed density and biomass 3 and 8 WAP in all locations compared with the 76-cm row width plus no CC treatment. CC impacts were mixed: in Riley County, Palmer amaranth density and biomass were reduced; in Reno County, no additional Palmer amaranth control was observed; and in Franklin County, treatments containing the CC had greater waterhemp density and biomass than those without. Narrow row widths achieved the most consistent results of all cultural components when data were pooled across locations: decreasing row width from 76 to 38 cm resulted in a 23% reduction in pigweed biomass 8 WAP, and decreasing it from 38 to 19 cm achieved a further 15% reduction. Row-crop cultivation should be incorporated where possible as a mechanical option to manage pigweed, and narrow row widths should be used to suppress late-season pigweed growth when feasible. Pigweed control from the CC was inconsistent, so cover crops should be given special consideration before implementation. Integrating these components with an herbicide program as a system is recommended to achieve the best pigweed control and reduce the risk of developing herbicide resistance.
Despite United States national learning objectives referencing research fundamentals and the critical appraisal of medical literature, many paramedic programs are not meeting these objectives with substantive content.
The objective was to develop and implement a self-contained journal club educational module for paramedic training programs that could be distributed to Emergency Medical Services (EMS) educators and EMS medical directors to use as a framework to adapt to their own programs.
Four two-hour journal club sessions were designed. First, the educator provided students with four types of articles on a student-chosen topic and discussed differences in methodology and structure. Next, after a lecture about peer review, students used search engines to verify the references of a trade magazine article. Third, the educator gave a statistics lecture and critiqued the results sections of several articles found by students on a topic. Finally, students found an article on a topic of personal interest and presented it to their classmates, as if telling their paramedic partner about it at work. Before and after the series, students from two cohorts (2017, 2018) completed a survey with questions about demographics and perceptions of research. Students from one cohort (2017) received a follow-up survey one year later.
For the 2016 cohort, 13 students participated and provided qualitative feedback. For the 2017 and 2018 cohorts, 33 students participated. After the series, students reported an increased ability to find, evaluate, and apply medical research articles, as well as increasingly positive opinions of participating in prehospital research and of its importance. This ability was demonstrated by every student during the final journal club session. McNemar’s and Related-Samples Cochran’s Q testing of questionnaire responses suggested a statistically significant improvement in student approval of exceptions from informed consent.
The framework for this paramedic journal club series could be adapted by EMS educators and medical directors to enable paramedics to search for, critically appraise, and discuss the findings of medical literature.
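The paired pre/post comparison described above relies on McNemar's test, which looks only at the discordant pairs (students who changed their answer). A minimal sketch of the exact version of that test follows; the response vectors are invented for illustration, not the study's data.

```python
# A minimal sketch of McNemar's exact test for paired pre/post binary survey
# responses (e.g., approves / does not approve of exception from informed
# consent, before and after the journal club series). Data are hypothetical.
from math import comb

def mcnemar_exact_p(b: int, c: int) -> float:
    """Two-sided exact McNemar p-value from the two discordant counts:
    b = pre-no/post-yes, c = pre-yes/post-no."""
    n = b + c
    if n == 0:
        return 1.0
    k = min(b, c)
    # two-sided binomial tail under H0: switches are 50/50
    p = sum(comb(n, i) for i in range(0, k + 1)) / 2 ** n
    return min(1.0, 2 * p)

# Hypothetical paired responses: 1 = approves of the exception
pre  = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0]
post = [1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1]
b = sum(1 for x, y in zip(pre, post) if x == 0 and y == 1)  # switched to approve
c = sum(1 for x, y in zip(pre, post) if x == 1 and y == 0)  # switched away
print(b, c, mcnemar_exact_p(b, c))
```

With seven students switching toward approval and none away, the exact two-sided p-value is small, mirroring the kind of significant shift the study reports.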
To evaluate the association between novel pre- and post-operative biomarker levels and 30-day unplanned readmission or mortality after paediatric congenital heart surgery.
Children aged 18 years or younger undergoing congenital heart surgery (n = 162) at Johns Hopkins Hospital from 2010 to 2014 were enrolled in the prospective cohort. Collected novel pre- and post-operative biomarkers include soluble suppression of tumorgenicity 2, galectin-3, N-terminal prohormone of brain natriuretic peptide, and glial fibrillary acidic protein. A model based on clinical variables from the Society of Thoracic Surgery database was developed and evaluated against two augmented models.
Unplanned readmission or mortality within 30 days of cardiac surgery occurred among 21 (13%) children. The clinical model augmented with pre-operative biomarkers demonstrated a statistically significant improvement over the clinical model alone with a receiver-operating characteristics curve of 0.754 (95% confidence interval: 0.65–0.86) compared to 0.617 (95% confidence interval: 0.47–0.76; p-value: 0.012). The clinical model augmented with pre- and post-operative biomarkers demonstrated a significant improvement over the clinical model alone, with a receiver-operating characteristics curve of 0.802 (95% confidence interval: 0.72–0.89; p-value: 0.003).
Novel biomarkers add significant predictive value when assessing the likelihood of unplanned readmission or mortality after paediatric congenital heart surgery. Further exploration of these novel biomarkers during the pre- and post-operative periods to identify early risk of mortality or readmission will aid in determining their clinical utility and application in routine risk assessment.
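The comparison above hinges on the area under the receiver-operating characteristic curve (AUC) of a clinical model versus biomarker-augmented models. A small sketch of the rank-based (Mann-Whitney) AUC computation follows; the outcome labels and model scores are illustrative, not the study's data or models.

```python
# A sketch of comparing a base clinical model with a biomarker-augmented
# model by AUC. Scores and outcomes below are invented for illustration.

def auc(scores, labels):
    """AUC as the Mann-Whitney probability P(score_pos > score_neg),
    counting ties as 1/2."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels          = [1, 1, 1, 0, 0, 0, 0, 0]          # 1 = readmission/mortality
clinical_score  = [0.4, 0.2, 0.6, 0.3, 0.5, 0.1, 0.4, 0.2]  # base model
augmented_score = [0.7, 0.5, 0.8, 0.3, 0.4, 0.1, 0.2, 0.3]  # + biomarkers
print(auc(clinical_score, labels), auc(augmented_score, labels))
```

In practice the study compared AUCs with a formal significance test (the quoted p-values); this sketch only shows how each AUC is obtained from predicted risks and observed outcomes.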
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
The purpose of this study was to examine whether vehicle type based on size (car vs. other = truck/van/SUV) had an impact on the speeding, acceleration, and braking patterns of older male and female drivers (70 years and older) from a Canadian longitudinal study. The primary hypothesis was that older adults driving larger vehicles (e.g., trucks, SUVs, or vans) would be more likely to speed than those driving cars. Participants (n = 493) had a device installed in their vehicles that recorded their everyday driving. The findings suggest that the type of vehicle driven had little or no impact on per cent of time speeding or on the braking and accelerating patterns of older drivers. Given that the propensity for exceeding the speed limit was high among these older drivers, regardless of vehicle type, future research should examine what effect this behaviour has on older-driver road safety.
Using existing data from clinical registries to support clinical trials and other prospective studies has the potential to improve research efficiency. However, little has been reported about staff experiences and lessons learned from implementation of this method in pediatric cardiology.
We describe the process of using existing registry data in the Pediatric Heart Network Residual Lesion Score Study, report stakeholders’ perspectives, and provide recommendations to guide future studies using this methodology.
The Residual Lesion Score Study, a 17-site prospective, observational study, piloted the use of existing local surgical registry data (collected for submission to the Society of Thoracic Surgeons-Congenital Heart Surgery Database) to supplement manual data collection. A survey regarding processes and perceptions was administered to study site and data coordinating center staff.
Survey response rate was 98% (54/55). Overall, 57% perceived that using registry data saved research staff time in the current study, and 74% perceived that it would save time in future studies; 55% noted significant upfront time in developing a methodology for extracting registry data. Survey recommendations included simplifying data extraction processes and tailoring to the needs of the study, understanding registry characteristics to maximise data quality and security, and involving all stakeholders in design and implementation processes.
Use of existing registry data was perceived to save time and promote efficiency. Consideration must be given to the upfront investment of time and resources needed. Ongoing efforts focussed on automating and centralising data management may aid in further optimising this methodology for future studies.
Epistaxis is the most common ENT emergency. This study aimed to assess one-year mortality rates in patients admitted to a large teaching hospital.
This study was a retrospective case note analysis of all patients admitted to the Queen Elizabeth University Hospital in Glasgow with epistaxis over a 12-month period.
The one-year overall mortality for a patient admitted with epistaxis was 9.8 per cent. The patients who died were older (mean age 77.2 vs 68.8 years; p = 0.002), had a higher Cumulative Illness Rating Scale-Geriatric score (9.9 vs 6.7; p < 0.001) and had a higher performance status score (2 or higher vs less than 2; p < 0.001). Other risk factors were a low admission haemoglobin level (less than 128 g/l vs 128 g/l or higher; p = 0.025), abnormal coagulation (p = 0.004), low albumin (less than 36 g/l vs more than 36 g/l; p < 0.001) and longer length of stay (p = 0.046).
There are a number of risk factors associated with increased mortality after admission with epistaxis. This information could help with risk stratification of patients at admission and enable the appropriate patient support to be arranged.
Residual stresses in dental castings are widely held to be the cause of distortion and change of fit in ceramic bonded to metal dental restorations. Residual stresses are thought to result from the casting process and from ceramic/metal mismatch of thermal expansion coefficients. Such stresses have not been confirmed experimentally. The purpose of this study was to measure residual stress with x-ray diffraction at the various porcelain application steps for two noble dental alloys with two dental opaque porcelains.
We sought to address prior limitations of symptom checker accuracy by analysing the diagnostic and triage performance of online symptom checkers using a consecutive series of real-life emergency department (ED) patient encounters, focusing on a complex patient population – those with hepatitis C or HIV. We aimed to study the diagnostic and triage accuracy of these symptom checkers in relation to an emergency room physician-determined diagnosis. An ED retrospective analysis was performed on 8363 consecutive adult patients. Eligible patients included: 90 HIV, 67 hepatitis C, 11 both HIV and hepatitis C. Five online symptom checkers were utilised for diagnosis (Mayo Clinic, WebMD, Symptomate, Symcat, Isabel), three with triage capabilities. Symptom checker output was compared with the ED physician-determined diagnosis with regard to diagnostic accuracy and differential diagnosis listing, along with triage advice. All symptom checkers, whether for combined HIV and hepatitis C, HIV alone or hepatitis C alone, had poor diagnostic accuracy: Top1 (<20%), Top3 (<35%), Top10 (<40%), Listed at All (<45%). Significant variations existed for each individual symptom checker, as some appeared more accurate at listing the diagnosis at the top of the differential, whereas others were more apt to list the diagnosis at all. With regard to ED triage data, a significantly higher percentage of hepatitis C patients (59.7%; 40/67) than HIV patients (35.6%; 32/90) were found to have an initial diagnosis meeting emergent criteria. Symptom checker diagnostic capabilities are quite inferior to physician diagnostic capabilities. Complex patients such as those with HIV or hepatitis C may carry a more specific differential diagnosis, warranting symptom checkers to have diagnostic algorithms accounting for such complexity.
Symptom checkers carry the potential for real-time epidemiologic monitoring of patient symptoms, as symptom entries and subsequent symptom checker diagnosis could allow health officials a means to track illnesses in specific patient populations and geographic regions. In order to do this, accurate and reliable symptom checkers are warranted.
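The Top1/Top3/Top10/"Listed at All" figures above come from scoring each symptom checker's ranked differential against the physician diagnosis. A sketch of that scoring follows; the differentials and gold diagnoses below are invented for illustration.

```python
# A sketch of Top-k scoring of symptom-checker differentials against a
# physician-determined diagnosis. All diagnoses here are hypothetical.

def topk_rates(differentials, truths, ks=(1, 3, 10)):
    """differentials: list of ranked diagnosis lists; truths: gold diagnoses.
    Returns {k: fraction with the truth in the first k} plus 'listed'
    (truth appears anywhere in the differential)."""
    n = len(truths)
    rates = {k: sum(t in d[:k] for d, t in zip(differentials, truths)) / n
             for k in ks}
    rates["listed"] = sum(t in d for d, t in zip(differentials, truths)) / n
    return rates

diffs = [
    ["influenza", "HIV", "mononucleosis"],                 # truth ranked 2nd
    ["cellulitis", "abscess"],                             # truth not listed
    ["hepatitis C", "cirrhosis", "fatty liver"],           # truth ranked 1st
    ["GERD", "gastritis", "peptic ulcer", "hepatitis C"],  # truth ranked 4th
]
truths = ["HIV", "HIV", "hepatitis C", "hepatitis C"]
print(topk_rates(diffs, truths))
```

The gap between a checker's Top1 rate and its "listed" rate captures the distinction the study draws between checkers that rank the diagnosis highly and those that merely mention it somewhere.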
OBJECTIVES/SPECIFIC AIMS: In a randomized controlled trial in participants with HIV infection, recombinant human growth hormone (rhGH) reduced visceral adipose tissue (VAT); addition of rosiglitazone to rhGH prevented the accompanying decline in insulin sensitivity (SI). Within this parent RCT, we sought to determine the effect of rosiglitazone and rhGH intervention on alpha-1-acid glycoprotein (AGP), a biomarker of inflammation. We also investigated AGP as an independent risk factor for SI and VAT changes along with any potential effect modification by AGP of the intervention. METHODS/STUDY POPULATION: Participants with HIV infection (n = 72) with abdominal adiposity and insulin resistance were randomized to rosiglitazone, rhGH, combination, or placebo for 12 weeks (NCT00130286). SI was determined by frequently sampled intravenous glucose tolerance test, and VAT by whole body MRI. AGP concentrations were determined by immunoturbidimetric assay in available serum samples at baseline (time 0), 4, and 12 weeks (n = 41 participants with samples at all 3 time points). A linear mixed model was used to assess the impact of intervention over time on AGP concentrations. General linear models were used to assess baseline AGP concentrations as an independent predictor of SI and VAT changes by treatment group, with the model initially including age quartile, gender, race, ethnicity, BMI, HIV RNA <400 copies/mL, antiretroviral regimen, CD4 count, stavudine use, and zidovudine use, with stepwise removal of the least significant predictors. Effect modification was assessed by adding an interaction term between AGP and assigned intervention. RESULTS/ANTICIPATED RESULTS: AGP did not differ among treatment groups at baseline; overall median (Q1, Q3): 0.608 (0.526, 0.727) g/L, P = 0.92. Treatment with rosiglitazone, rhGH, or the combination significantly reduced AGP concentrations from baseline to week 12, compared to placebo (time by treatment interaction, P = 0.0038).
Baseline AGP was not a significant predictor or effect modifier of SI change in response to treatment (P ≥ 0.50). Baseline AGP (g/L) was an independent predictor of VAT change (L) (β = 1.91, SE = 0.89, P = 0.038) in addition to a treatment effect (P < 0.001) and age quartile effect (P < 0.001). No other predictors or interactions were significant, including effect modification of AGP (AGP by treatment interaction P = 0.50). DISCUSSION/SIGNIFICANCE OF IMPACT: It is known that immune and metabolic pathways are highly integrated, and biomarkers of inflammation have predictive abilities for cardiovascular and metabolic disease outcomes. This analysis provides data showing that treatment with rosiglitazone or rhGH in the context of HIV reduces AGP concentrations, indicating efficacy in reducing systemic inflammation. Baseline AGP was an independent risk factor for VAT changes as those with lower AGP at baseline showed a greater reduction in VAT in response to treatment. Biomarkers of inflammation may provide prognostic information for individualized patient outcomes to help guide treatment and follow-up.
Double-crop grain sorghum after winter wheat harvest is a common cropping system in the southern plains region. Palmer amaranth is a troublesome weed in double-crop grain sorghum in Kansas. Populations resistant to various herbicides (e.g., atrazine, glyphosate, metsulfuron, pyrasulfotole) have made Palmer amaranth management even more difficult for producers. To evaluate control of atrazine-resistant and atrazine-susceptible Palmer amaranth in double-crop grain sorghum, we assessed 14 herbicide programs, of which 8 were PRE only and 6 were PRE followed by (fb) POST applications. Visible ratings of Palmer amaranth control were taken at 3 and 8 wk after planting (WAP) grain sorghum. PRE treatments containing very-long-chain fatty acid (VLCFA)–inhibiting herbicides provided 91% control of atrazine-resistant Palmer amaranth 3 WAP, and reduced weed density 8 WAP compared to atrazine-only PRE treatments. PRE fb POST treatments, especially those that included VLCFA-inhibiting herbicides, provided greater control (71% to 93%) of atrazine-resistant and atrazine-susceptible Palmer amaranth, respectively, at 8 WAP compared to PRE treatments alone (59% to 79%). These results demonstrated the utility of VLCFA-inhibiting herbicides applied PRE and in a layered PRE fb POST approach in controlling atrazine-resistant Palmer amaranth, as well as the importance of an effective POST application following residual PRE herbicides for controlling both atrazine-resistant and atrazine-susceptible Palmer amaranth in double-crop grain sorghum.
Double-crop soybean after winter wheat is a component of many cropping systems across eastern and central Kansas. Until recently, control of Palmer amaranth and common waterhemp has been both easy and economical with the use of sequential applications of glyphosate in glyphosate-resistant soybean. Many populations of Palmer amaranth and common waterhemp have become resistant to glyphosate. During 2015 and 2016, a total of five field experiments were conducted near Manhattan, Hutchinson, and Ottawa, KS, to assess various non-glyphosate herbicide programs at three different application timings for the control of Palmer amaranth and waterhemp in double-crop soybean after winter wheat. Spring-POST treatments of pyroxasulfone (119 g ai ha–1) and pendimethalin (1065 g ai ha–1) were applied to winter wheat to evaluate residual control of Palmer amaranth and waterhemp. Less than 40% control of Palmer amaranth and waterhemp was observed in both treatments 2 wk after planting (WAP) double-crop soybean. Preharvest treatments of 2,4-D (561 g ae ha–1) and flumioxazin (107 g ai ha–1) were also applied to the winter wheat to assess control of emerged Palmer amaranth and waterhemp. 2,4-D resulted in highly variable Palmer amaranth and waterhemp control, whereas flumioxazin resulted in control similar to PRE treatments that contained paraquat (841 g ai ha–1) plus residual herbicide(s). Excellent control of both species was observed 2 WAP with a PRE paraquat application; however, reduced control of Palmer amaranth and waterhemp was noted 8 WAP due to subsequent emergence. Results indicate that Palmer amaranth and waterhemp control was 85% or greater 8 WAP for PRE treatments that included a combination of paraquat plus residual herbicide(s). PRE treatments that did not include both paraquat and residual herbicide(s) did not provide acceptable control.
Polyphenol oxidase (PPO) in red clover (RC) has been shown to reduce both lipolysis and proteolysis in silo and has been implicated (in vitro) in the rumen. However, all in vivo comparisons have compared RC with other forages, typically with lower levels of PPO, which introduces other confounding factors as to the cause of the greater protection of dietary nitrogen (N) and C18 polyunsaturated fatty acids (PUFA) on RC silage. This study compared two RC silages with contrasting PPO activities when ensiled (RC+ and RC−) against a control of perennial ryegrass silage (PRG) to ascertain the effect of PPO activity on dietary N digestibility and PUFA biohydrogenation. Two studies were performed: the first investigated rumen and duodenal flow in six Hereford × Friesian steers prepared with rumen and duodenal cannulae, and the second investigated whole-tract N balance in six Holstein-Friesian non-lactating dairy cows. All diets were offered at a restricted level based on animal live weight, with each experiment consisting of two 3×3 Latin squares using big bale silages ensiled in 2010 and 2011, respectively. In the first experiment, digesta flow at the duodenum was estimated using a dual-phase marker system with ytterbium acetate and chromium ethylenediaminetetraacetic acid as particulate and liquid phase markers, respectively. Total N intake was higher on the RC silages in both experiments and higher on RC− than RC+. Rumen ammonia-N reflected intake, with ammonia-N per unit of N intake lower on RC+ than RC−. Microbial N duodenal flow was comparable across all silage diets, with non-microbial N higher on RC than PRG and no difference between RC+ and RC−, even when reported on a N intake basis. C18 PUFA biohydrogenation was lower on RC silage diets than PRG, but with no difference between RC+ and RC−. The N balance trial showed a greater retention of N on RC+ over RC−; however, this response is more likely related to the difference in N intake than to any PPO-driven protection.
The lack of difference between the RC silages, despite contrasting levels of PPO, may reflect the similar level of protein-bound-phenol complexing determined in each RC silage. Previously this complexing has been associated with PPO's protection mechanism; however, this study has shown that protection is not related to total PPO activity.
The north-west European population of Bewick’s Swan Cygnus columbianus bewickii declined by 38% between 1995 and 2010 and is listed as ‘Endangered’ on the European Red List of birds. Here, we combined information on food resources within the landscape with long-term data on swan numbers, habitat use, behaviour and two complementary measures of body condition, to examine whether changes in food type and availability have influenced the Bewick’s Swans’ use of their main wintering site in the UK, the Ouse Washes and surrounding fens. The maximum number of Bewick’s Swans rose from 620 in winter 1958/59 to a high of 7,491 in winter 2004/05, before falling to 1,073 birds in winter 2013/14. Between winters 1958/59 and 2014/15 the Ouse Washes supported between 0.5% and 37.9% of the total population wintering in north-west Europe (mean ± 95% CI = 18.1 ± 2.4%). Swans fed on agricultural crops, shifting from post-harvest remains of root crops (e.g. sugar beet and potatoes) in November and December to winter-sown cereals (e.g. wheat) in January and February. Inter-annual variation in the area cultivated for these crops did not result in changes in the peak numbers of swans occurring on the Ouse Washes. Behavioural and body condition data indicated that food supplies on the Ouse Washes and surrounding fens remain adequate to allow the birds to gain and maintain good body condition throughout winter with no increase in foraging effort. Our findings suggest that the recent decline in numbers of Bewick’s Swans at this internationally important site was not linked to inadequate food resources.
To describe the process by which the 12 community-based primary health care (CBPHC) research teams worked together and fostered cross-jurisdictional collaboration, including collection of common indicators with the goal of using the same measures and data sources.
A pan-Canadian mechanism for common measurement of the impact of primary care innovations across Canada is lacking. The Canadian Institutes for Health Research and its partners funded 12 teams to conduct research and collaborate on development of a set of commonly collected indicators.
A working group representing the 12 teams was established. They undertook an iterative process to consider existing primary care indicators identified from the literature and by stakeholders. Indicators were agreed upon with the intention of addressing three objectives across the 12 teams: (1) describing the impact of improving access to CBPHC; (2) examining the impact of alternative models of chronic disease prevention and management in CBPHC; and (3) describing the structures and context that influence the implementation, delivery, cost, and potential for scale-up of CBPHC innovations.
Nineteen common indicators within the core dimensions of primary care were identified: access, comprehensiveness, coordination, effectiveness, and equity. We also agreed to collect data on health care costs and utilization within each team. Data sources include surveys, health administrative data, interviews, focus groups, and case studies. Collaboration across these teams sets the foundation for a unique opportunity for new knowledge generation, over and above any knowledge developed by any one team. Keys to success are each team’s willingness to engage and commitment to working across teams, funding to support this collaboration, and distributed leadership across the working group. Reaching consensus on collection of common indicators is challenging but achievable.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
The effect of transportation and lairage on the faecal shedding and post-slaughter contamination of carcasses with Escherichia coli O157 and O26 in young calves (4–7 days old) was assessed in a cohort study at a regional calf-processing plant in the North Island of New Zealand, following 60 calves as cohorts from six dairy farms to slaughter. Multiple samples from each animal at pre-slaughter (recto-anal mucosal swab) and carcass at post-slaughter (sponge swab) were collected and screened using real-time PCR and culture isolation methods for the presence of E. coli O157 and O26 (Shiga toxin-producing E. coli (STEC) and non-STEC). Genotype analysis of E. coli O157 and O26 isolates provided little evidence of faecal–oral transmission of infection between calves during transportation and lairage. Increased cross-contamination of hides and carcasses with E. coli O157 and O26 between co-transported calves was confirmed at pre-hide removal and post-evisceration stages but not at pre-boning (at the end of dressing prior to chilling), indicating that good hygiene practices and application of an approved intervention effectively controlled carcass contamination. This study was the first of its kind to assess the impact of transportation and lairage on the faecal carriage and post-harvest contamination of carcasses with E. coli O157 and O26 in very young calves.
Dengue is the fastest spreading mosquito-transmitted disease in the world. In China, Guangzhou City is believed to be the most important epicenter of dengue outbreaks, although the transmission patterns are still poorly understood. We developed an autoregressive integrated moving average model incorporating external regressors to examine the association between the monthly number of locally acquired dengue infections and imported cases, mosquito densities, temperature and precipitation in Guangzhou. In multivariate analysis, imported cases and minimum temperature (both at lag 0) were associated with the number of locally acquired infections (P < 0.05). This multivariate model performed best, with the lowest fitting root mean squared error (RMSE) (0.7520), AIC (393.7854) and test RMSE (0.6445), as well as the best outbreak-detection performance in model validation, with a sensitivity of 1.0000, a specificity of 0.7368 and a consistency rate of 0.7917. Our findings suggest that imported cases and minimum temperature are two key determinants of local dengue transmission in Guangzhou. The modelling method can be used to predict dengue transmission in non-endemic countries and to inform dengue prevention and control strategies.
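The abstract selects among candidate models using fitting RMSE, AIC and held-out (test) RMSE. A minimal sketch of those metrics follows, using a Gaussian least-squares AIC; the monthly case counts and predictions are invented, not the Guangzhou data, and the actual study fitted the ARIMA model itself with statistical software.

```python
# A sketch of the model-selection metrics quoted above: RMSE on fitted and
# held-out months, plus a Gaussian AIC computed from fitting residuals.
# All counts and predictions below are hypothetical.
from math import log, sqrt

def rmse(obs, pred):
    """Root mean squared error between observed and predicted series."""
    return sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def gaussian_aic(obs, pred, n_params):
    """AIC = n*ln(SSE/n) + 2k for a least-squares fit with k parameters."""
    n = len(obs)
    sse = sum((o - p) ** 2 for o, p in zip(obs, pred))
    return n * log(sse / n) + 2 * n_params

fit_obs,  fit_pred  = [3, 8, 21, 55, 13, 2], [4, 7, 19, 50, 16, 3]  # training
test_obs, test_pred = [5, 12, 30], [6, 10, 27]                      # held out
print(rmse(fit_obs, fit_pred),
      rmse(test_obs, test_pred),
      gaussian_aic(fit_obs, fit_pred, n_params=4))
```

Comparing candidate models on all three numbers, as the study does, guards against choosing a model that fits the training months well but generalises poorly.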
Introduction: The ECG diagnosis of acute coronary occlusion (ACO) in the setting of ventricular paced rhythm (VPR) is purported to be impossible. However, VPR has a similar ECG morphology to LBBB. The validated Smith-modified Sgarbossa criteria (MSC) have high sensitivity (Sens) and specificity (Spec) for ACO in LBBB. The MSC consist of at least one of the following in at least one lead: concordant ST elevation (STE) ≥1 mm, concordant ST depression ≥1 mm in V1-V3, or an ST/S ratio <−0.25 (in leads with ≥1 mm STE). We hypothesized that the MSC would have higher Sens for diagnosis of ACO in VPR when compared to the original Sgarbossa criteria. We report preliminary findings of the Paced Electrocardiogram Requiring Fast Emergency Coronary Therapy (PERFECT) study. Methods: The PERFECT study is a retrospective, multicenter, international investigation of ED patients from 1/2008 - 12/2016 with VPR on the ECG and symptoms suggestive of acute coronary syndrome (e.g. chest pain or shortness of breath). Data from four sites are presented. Acute myocardial infarction (AMI) was defined by the Third Universal Definition of AMI. A blinded cardiologist adjudicated ACO, defined as a Thrombolysis In Myocardial Infarction (TIMI) score of 0 or 1 on coronary angiography; a pre-defined subgroup of ACO patients with peak cardiac troponin (cTn) >100 times the 99% upper reference limit (URL) of the cTn assay was also analyzed. Another blinded physician measured all ECGs. Statistics were by Mann-Whitney U, chi-square, and McNemar's tests. Results: The ACO and No-AMI groups consisted of 15 and 79 encounters, respectively. For the ACO and No-AMI groups, median age was 78 [IQR 72-82] vs. 70 [61-75] years, and 13 (86%) vs. 48 (61%) patients were male. The median peak cTn ratio (cTn/URL) was 260 [33-663] and 0.5 [0-1.3] for ACO vs. No-AMI. The Sens and Spec for the MSC and the original Sgarbossa criteria were 67% (95% CI 39-87) vs. 46% (22-72; p=0.25) and 99% (92-100) vs. 99% (92-100; p=0.5).
In a pre-defined subgroup analysis of ACO patients with peak cTn >100 times the URL (n=10), the Sens was 90% (54-100) for the MSC vs. 60% (27-86) for the original Sgarbossa criteria (p=0.25). Conclusion: ACO in VPR is an uncommon condition. The MSC showed good Sens for diagnosis of ACO in the presence of VPR, especially among patients with high peak cTn, and Spec was excellent. These methods and results are consistent with studies that have used the MSC to diagnose ACO in LBBB.
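The sensitivity/specificity figures above are binomial proportions with confidence intervals. A sketch of one common way to compute such intervals, the Wilson score interval, follows; the counts match the abstract's group sizes (15 ACO, 79 No-AMI), but the abstract does not state which interval method was used, so the exact CI bounds here may differ slightly from the reported ones.

```python
# A sketch of sensitivity/specificity with Wilson score 95% confidence
# intervals. Counts are illustrative (10/15 criteria-positive ACO cases,
# 78/79 criteria-negative No-AMI controls); the study's CI method is assumed.
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96):
    """Wilson score interval for a binomial proportion successes/n."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

sens_hits, aco_n = 10, 15   # MSC-positive among occlusions (hypothetical)
spec_hits, ctrl_n = 78, 79  # MSC-negative among No-AMI controls (hypothetical)
lo, hi = wilson_ci(sens_hits, aco_n)
print(f"Sens {sens_hits/aco_n:.0%} (95% CI {lo:.0%}-{hi:.0%})")
lo, hi = wilson_ci(spec_hits, ctrl_n)
print(f"Spec {spec_hits/ctrl_n:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```

The very wide sensitivity interval with only 15 occlusion cases illustrates why the abstract's Sens comparisons did not reach statistical significance despite large point-estimate differences.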