Alzheimer’s disease (AD) studies are increasingly targeting earlier (pre)clinical populations, in which the expected degree of observable cognitive decline over a certain time interval is reduced as compared to the dementia stage. Consequently, endpoints to capture early cognitive changes require refinement. We aimed to determine the sensitivity to decline of widely applied neuropsychological tests at different clinical stages of AD as outlined in the National Institute on Aging – Alzheimer’s Association (NIA-AA) research framework.
Amyloid-positive individuals (as determined by positron emission tomography or cerebrospinal fluid) with longitudinal neuropsychological assessments available were included from four well-defined study cohorts and subsequently classified among the NIA-AA stages. For each stage, we investigated the sensitivity to decline of 17 individual neuropsychological tests using linear mixed models.
1103 participants (age = 70.54 ± 8.7, 47% female) were included: n = 120 Stage 1, n = 206 Stage 2, n = 467 Stage 3 and n = 309 Stage 4. Neuropsychological tests were differentially sensitive to decline across stages. For example, Category Fluency captured significant 1-year decline as early as Stage 1 (β = −.58, p < .001). Word List Delayed Recall (β = −.22, p < .05) and Trail Making Test (β = 6.2, p < .05) became sensitive to 1-year decline in Stage 2, whereas the Mini-Mental State Examination did not capture 1-year decline until Stage 3 (β = −1.13, p < .001) and 4 (β = −2.23, p < .001).
We demonstrated that commonly used neuropsychological tests differ in their ability to capture decline depending on clinical stage within the AD continuum (preclinical to dementia). This implies that stage-specific cognitive endpoints are needed to accurately assess disease progression and increase the chance of successful treatment evaluation in AD.
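The modelling step described above can be illustrated with a minimal Python sketch: a linear mixed model with a random intercept per participant, whose fixed slope for time is the 1-year decline (the β reported per test and stage). All data here are simulated, and the column names, effect size, and noise level are illustrative rather than the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_visits = 120, 3  # e.g. a Stage 1 sample with annual visits

df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_visits),
    "years":   np.tile(np.arange(n_visits, dtype=float), n_subj),
})
# Simulate scores with subject-specific baselines and a true 1-year
# decline of -0.58 points (the Category Fluency beta quoted above).
baseline = rng.normal(20, 3, n_subj)[df["subject"]]
df["score"] = baseline - 0.58 * df["years"] + rng.normal(0, 1.0, len(df))

# Random intercept per participant; the fixed "years" coefficient
# estimates the mean annual decline for this test at this stage.
fit = smf.mixedlm("score ~ years", df, groups=df["subject"]).fit()
print(round(fit.params["years"], 2))
```

Fitting the same model per test and per NIA-AA stage, as in the abstract, then amounts to comparing these slope estimates and their p-values across stages.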
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
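A minimal sketch of the probabilistic sensitivity analysis idea: draw study characteristics from input distributions, compute data-related costs under each design, and report how often the registry design is cheaper. The distributions, the $30/h coordinator wage, and the fixed registry setup cost are assumptions for illustration, not the authors' model inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
sims = 10_000

# Illustrative input distributions - not the published model's parameters.
patients       = rng.integers(100, 5000, sims).astype(float)
fields_per_pt  = rng.integers(20, 200, sims).astype(float)
sec_per_field  = rng.uniform(3, 60, sims)          # manual abstraction speed
wage_per_hour  = 30.0                              # assumed coordinator wage
registry_setup = rng.uniform(5_000, 50_000, sims)  # extraction/linkage setup

# Data-related cost of manual abstraction vs pulling from a registry.
manual_cost   = patients * fields_per_pt * sec_per_field / 3600 * wage_per_hour
registry_cost = registry_setup  # marginal per-record cost assumed ~0 here

frac_cheaper = np.mean(manual_cost > registry_cost)
print(f"registry-based design cheaper in {frac_cheaper:.0%} of simulations")
```

Thresholds like those reported (a minimum number of data elements, a minimum abstraction time per field) fall out of the same exercise by solving for where the two cost curves cross.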
Heat stress is a global issue constraining pig productivity, and it is likely to intensify under future climate change. Technological advances in earth observation have made tools available that enable identification and mapping of livestock species at risk of exposure to heat stress due to climate change. Here, we present a methodology to map current and likely future heat stress risk in pigs using R software by combining the effects of temperature and relative humidity. We applied the method to growing-finishing pigs in Uganda. We mapped monthly heat stress risk and quantified the number of pigs exposed to heat stress using 18 global circulation models, and projected impacts in the 2050s. Results show that more than 800 000 pigs in Uganda will be affected by heat stress in the future. The results can feed into evidence-based policy, planning and targeted resource allocation in the livestock sector.
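The temperature-and-humidity combination step can be sketched in a few lines of Python (the study itself used R). The temperature-humidity index (THI) formulation and the risk cut-offs below are one common choice from the livestock literature and are illustrative rather than the authors' exact method.

```python
def thi(temp_c: float, rh_pct: float) -> float:
    """One common temperature-humidity index: convert to Fahrenheit,
    then discount by a humidity-dependent term. Coefficients vary
    between studies - treat these as illustrative."""
    t_f = 1.8 * temp_c + 32
    return t_f - (0.55 - 0.0055 * rh_pct) * (t_f - 58)

def risk_class(value: float) -> str:
    # Illustrative cut-offs for growing-finishing pigs (assumed).
    if value < 75:
        return "no stress"
    if value < 79:
        return "alert"
    if value < 84:
        return "danger"
    return "emergency"

# A hot, humid month vs a mild one.
print(risk_class(thi(32, 80)), risk_class(thi(20, 50)))
```

Applied cell-by-cell to gridded monthly temperature and humidity surfaces (current climate or a global circulation model projection), this classification yields the monthly heat stress risk maps described above.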
This guidance paper from the European Psychiatric Association (EPA) aims to provide evidence-based recommendations on early intervention in clinical high risk (CHR) states of psychosis, assessed according to the EPA guidance on early detection. The recommendations were derived from a meta-analysis of current empirical evidence on the efficacy of psychological and pharmacological interventions in CHR samples. Eligible studies had to investigate conversion rate and/or functioning as a treatment outcome in CHR patients defined by the ultra-high risk and/or basic symptom criteria. Besides analyses on treatment effects on conversion rate and functional outcome, age and type of intervention were examined as potential moderators. Based on data from 15 studies (n = 1394), early intervention generally produced significantly reduced conversion rates at 6- to 48-month follow-up compared to control conditions. However, early intervention failed to achieve significantly greater functional improvements because both early intervention and control conditions produced similar positive effects. With regard to the type of intervention, both psychological and pharmacological interventions produced significant effects on conversion rates, but not on functional outcome relative to the control conditions. Early intervention in youth samples was generally less effective than in predominantly adult samples. Seven evidence-based recommendations for early intervention in CHR samples were formulated, although more studies are needed to investigate the specificity of treatment effects and potential age effects in order to tailor interventions to the individual treatment needs and risk status.
The aim of this guidance paper of the European Psychiatric Association is to provide evidence-based recommendations on the early detection of a clinical high risk (CHR) for psychosis in patients with mental problems. To this end, we conducted a meta-analysis of studies reporting on conversion rates to psychosis in non-overlapping samples meeting at least one of the main CHR criteria: ultra-high risk (UHR) and/or basic symptoms criteria. Further, effects of potential moderators (different UHR criteria definitions, single UHR criteria and age) on conversion rates were examined. Conversion rates in the identified 42 samples with altogether more than 4000 CHR patients who had mainly been identified by UHR criteria and/or the basic symptom criterion ‘cognitive disturbances’ (COGDIS) showed considerable heterogeneity. While UHR criteria and COGDIS were related to similar conversion rates until 2-year follow-up, conversion rates of COGDIS were significantly higher thereafter. Differences in onset and frequency requirements of symptomatic UHR criteria or in their different consideration of functional decline, substance use and co-morbidity did not seem to impact on conversion rates. The ‘genetic risk and functional decline’ UHR criterion was rarely met and showed only a non-significant pooled sample effect. However, age significantly affected UHR conversion rates, with lower rates in children and adolescents. Although more research into potential sources of heterogeneity in conversion rates is needed to facilitate improvement of CHR criteria, six evidence-based recommendations for an early detection of psychosis were developed as a basis for the EPA guidance on early intervention in CHR states.
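The pooling underlying such conversion-rate estimates can be sketched as a random-effects (DerSimonian-Laird) meta-analysis of proportions on the logit scale, a common approach when rates are heterogeneous across samples. The per-study counts below are hypothetical, purely to show the mechanics.

```python
import numpy as np

def pool_logit_dl(events, totals):
    """Random-effects (DerSimonian-Laird) pooling of proportions on
    the logit scale. Returns the pooled rate and tau-squared."""
    events = np.asarray(events, float)
    totals = np.asarray(totals, float)
    p = events / totals
    y = np.log(p / (1 - p))                  # logit of each study's rate
    v = 1 / events + 1 / (totals - events)   # approximate logit variance
    w = 1 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)       # Cochran's Q (heterogeneity)
    tau2 = max(0.0, (q - (len(y) - 1)) /
               (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (v + tau2)                    # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    return 1 / (1 + np.exp(-y_re)), tau2     # back-transform to a rate

# Hypothetical (converters, sample size) triples - not the 42 samples.
rate, tau2 = pool_logit_dl([10, 25, 8], [50, 80, 60])
print(f"pooled conversion rate = {rate:.1%}, tau2 = {tau2:.2f}")
```

A non-zero tau² here corresponds to the "considerable heterogeneity" the abstract reports; moderator effects (e.g. age, criteria definition) are then probed by subgroup pooling or meta-regression on the same logit scale.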
Given the common view that pre-exercise nutrition/breakfast is important for performance, the present study investigated whether breakfast influences resistance exercise performance via a physiological or psychological effect. Twenty-two resistance-trained, breakfast-consuming men completed three experimental trials, consuming water-only (WAT), or semi-solid breakfasts containing 0 g/kg (PLA) or 1·5 g/kg (CHO) maltodextrin. PLA and CHO meals contained xanthan gum and low-energy flavouring (approximately 122 kJ), and subjects were told both ‘contained energy’. At 2 h post-meal, subjects completed four sets of back squat and bench press to failure at 90 % ten repetition maximum. Blood samples were taken pre-meal, 45 min and 105 min post-meal to measure serum/plasma glucose, insulin, ghrelin, glucagon-like peptide-1 and peptide tyrosine-tyrosine concentrations. Subjective hunger/fullness was also measured. Total back squat repetitions were greater in CHO (44 (sd 10) repetitions) and PLA (43 (sd 10) repetitions) than WAT (38 (sd 10) repetitions; P < 0·001). Total bench press repetitions were similar between trials (WAT 37 (sd 7) repetitions; CHO 39 (sd 7) repetitions; PLA 38 (sd 7) repetitions; P = 0·130). Performance was similar between CHO and PLA trials. Hunger was suppressed and fullness increased similarly in PLA and CHO, relative to WAT (P < 0·001). During CHO, plasma glucose was elevated at 45 min (P < 0·05), whilst serum insulin was elevated (P < 0·05) and plasma ghrelin suppressed at 45 and 105 min (P < 0·05). These results suggest that breakfast/pre-exercise nutrition enhances resistance exercise performance via a psychological effect, although a potential mediating role of hunger cannot be discounted.
Healthcare personnel (HCP) were recruited to provide serum samples, which were tested for antibodies against Ebola or Lassa virus to evaluate for asymptomatic seroconversion.
From 2014 to 2016, 4 patients with Ebola virus disease (EVD) and 1 patient with Lassa fever (LF) were treated in the Serious Communicable Diseases Unit (SCDU) at Emory University Hospital. Strict infection control and clinical biosafety practices were implemented to prevent nosocomial transmission of EVD or LF to HCP.
All personnel who entered the SCDU, who were required to measure their temperatures and complete a symptom questionnaire twice daily, were eligible.
No employee developed symptomatic EVD or LF. EVD and LF antibody studies were performed on serum samples from 42 HCP. The 6 participants who had received investigational vaccination with a chimpanzee adenovirus type 3 vectored Ebola glycoprotein vaccine had high antibody titers to Ebola glycoprotein, but none had a response to Ebola nucleoprotein or VP40, or a response to LF antigens.
Patients infected with filoviruses and arenaviruses can be managed successfully without causing occupation-related symptomatic or asymptomatic infections. Meticulous attention to infection control and clinical biosafety practices by highly motivated, trained staff is critical to the safe care of patients with an infection from a special pathogen.
Pigweed is difficult to manage in grain sorghum because of widespread herbicide resistance, a limited number of registered effective herbicides, and the synchronous emergence of pigweed with grain sorghum in Kansas. The combination of cultural and mechanical control tactics with an herbicide program is commonly recognized as a best management strategy; however, limited information is available to adapt these strategies to dryland systems. Our objective for this research was to assess the influence of four components on pigweed control in a dryland system: a winter wheat cover crop (CC), row-crop cultivation, three row widths, and an herbicide program. Field trials were implemented during 2017 and 2018 at three locations for a total of 6 site-years. The herbicide program component resulted in excellent control (>97%) in all treatments at 3 and 8 weeks after planting (WAP). The CC provided approximately 50% reductions in pigweed density and biomass at both timings in half of the site-years; however, mixed results were observed in the remaining site-years, ranging from no attributable difference to a 170% increase in weed density at 8 WAP in 1 site-year. Treatments including row-crop cultivation reduced pigweed biomass and density in most site-years at 3 and 8 WAP. An herbicide program is required to achieve pigweed control and should be integrated with row-crop cultivation or narrow row widths to reduce the risk of herbicide resistance. Additional research is required to optimize the use of a CC as an integrated pigweed management strategy in dryland grain sorghum.
Successful pigweed management requires an integrated strategy to delay the development of resistance to any single control tactic. Field trials were implemented during 2017 and 2018 in three counties in Kansas on dryland (limited rainfall, nonirrigated), glufosinate-resistant soybean. The objective was to assess pigweed control with combinations of a winter wheat cover crop (CC), three soybean row widths (76, 38, and 19 cm), row-crop cultivation 2.5 weeks after planting (WAP), and an herbicide program to develop integrated pigweed management recommendations. All combinations of the four components were assessed in 16 treatments. All treatments with the herbicide program resulted in excellent (>97%) pigweed control and were analyzed separately from the other components. Treatments containing row-crop cultivation reduced pigweed density and biomass 3 and 8 WAP in all locations compared with the 76-cm row width plus no CC treatment. CC impacts were mixed: in Riley County, Palmer amaranth density and biomass were reduced; in Reno County, no additional Palmer amaranth control was observed; and in Franklin County, treatments containing the CC had greater waterhemp density and biomass than those without. Narrow row widths achieved the most consistent results of all cultural components when data were pooled across locations: decreasing row width from 76 to 38 cm resulted in a 23% reduction in pigweed biomass 8 WAP, and decreasing row width from 38 to 19 cm achieved a further 15% reduction. Row-crop cultivation should be incorporated where possible as a mechanical option to manage pigweed, and narrow row widths should be used to suppress late-season pigweed growth when feasible. The CC provided inconsistent pigweed control and should be given special consideration before implementation. Integrating these components with an herbicide program as a system is recommended to achieve the best pigweed control and reduce the risk of developing herbicide resistance.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
The purpose of this study was to examine whether vehicle type based on size (car vs. other = truck/van/SUV) had an impact on the speeding, acceleration, and braking patterns of older male and female drivers (70 years and older) from a Canadian longitudinal study. The primary hypothesis was that older adults driving larger vehicles (e.g., trucks, SUVs, or vans) would be more likely to speed than those driving cars. Participants (n = 493) had a device installed in their vehicles that recorded their everyday driving. The findings suggest that the type of vehicle driven had little or no impact on per cent of time speeding or on the braking and accelerating patterns of older drivers. Given that the propensity for exceeding the speed limit was high among these older drivers, regardless of vehicle type, future research should examine what effect this behaviour has on older-driver road safety.
Using existing data from clinical registries to support clinical trials and other prospective studies has the potential to improve research efficiency. However, little has been reported about staff experiences and lessons learned from implementation of this method in pediatric cardiology.
We describe the process of using existing registry data in the Pediatric Heart Network Residual Lesion Score Study, report stakeholders’ perspectives, and provide recommendations to guide future studies using this methodology.
The Residual Lesion Score Study, a 17-site prospective, observational study, piloted the use of existing local surgical registry data (collected for submission to the Society of Thoracic Surgeons-Congenital Heart Surgery Database) to supplement manual data collection. A survey regarding processes and perceptions was administered to study site and data coordinating center staff.
Survey response rate was 98% (54/55). Overall, 57% perceived that using registry data saved research staff time in the current study, and 74% perceived that it would save time in future studies; 55% noted significant upfront time in developing a methodology for extracting registry data. Survey recommendations included simplifying data extraction processes and tailoring to the needs of the study, understanding registry characteristics to maximise data quality and security, and involving all stakeholders in design and implementation processes.
Use of existing registry data was perceived to save time and promote efficiency. Consideration must be given to the upfront investment of time and resources needed. Ongoing efforts focussed on automating and centralising data management may aid in further optimising this methodology for future studies.
Epistaxis is the most common ENT emergency. This study aimed to assess one-year mortality rates in patients admitted to a large teaching hospital.
This study was a retrospective case note analysis of all patients admitted to the Queen Elizabeth University Hospital in Glasgow with epistaxis over a 12-month period.
The one-year overall mortality for a patient admitted with epistaxis was 9.8 per cent. The patients who died were older (mean age 77.2 vs 68.8 years; p = 0.002), had a higher Cumulative Illness Rating Scale-Geriatric score (9.9 vs 6.7; p < 0.001) and had a higher performance status score (2 or higher vs less than 2; p < 0.001). Other risk factors were a low admission haemoglobin level (less than 128 g/l vs 128 g/l or higher; p = 0.025), abnormal coagulation (p = 0.004), low albumin (less than 36 g/l vs more than 36 g/l; p < 0.001) and longer length of stay (p = 0.046).
There are a number of risk factors associated with increased mortality after admission with epistaxis. This information could help with risk stratification of patients at admission and enable the appropriate patient support to be arranged.
OBJECTIVES/SPECIFIC AIMS: In a randomized controlled trial in participants with HIV infection, recombinant human growth hormone (rhGH) reduced visceral adipose tissue (VAT); addition of rosiglitazone to rhGH prevented the accompanying decline in insulin sensitivity (SI). Within this parent RCT, we sought to determine the effect of rosiglitazone and rhGH intervention on alpha-1-acid glycoprotein (AGP), a biomarker of inflammation. We also investigated AGP as an independent risk factor for SI and VAT changes along with any potential effect modification by AGP of the intervention. METHODS/STUDY POPULATION: Participants with HIV infection (n = 72) with abdominal adiposity and insulin resistance were randomized to rosiglitazone, rhGH, combination, or placebo for 12 weeks (NCT00130286). SI was determined by frequently sampled intravenous glucose tolerance test, and VAT by whole body MRI. AGP concentrations were determined by immunoturbidimetric assay in available serum samples at baseline (time 0), 4, and 12 weeks (n = 41 participants with samples at all 3 time points). A linear mixed model was used to assess the impact of intervention over time on AGP concentrations. General linear models were used to assess baseline AGP concentrations as an independent predictor of SI and VAT changes by treatment group with the model initially including age quartile, gender, race, ethnicity, BMI, HIV RNA <400 copies/mL, antiretroviral regimen, CD4 count, stavudine use, and zidovudine use with step-by-step removal of least significant predictors. Effect modification was assessed by adding an interaction term between AGP and assigned intervention. RESULTS/ANTICIPATED RESULTS: AGP did not differ among treatment groups at baseline; overall median (Q1, Q3): 0.608 (0.526, 0.727) g/L, P = 0.92. Treatment with rosiglitazone, rhGH, or the combination significantly reduced AGP concentrations from baseline to week 12, compared to placebo (time by treatment interaction, P = 0.0038).
Baseline AGP was not a significant predictor or effect modifier of SI change in response to treatment (P ≥ 0.50). Baseline AGP (g/L) was an independent predictor of VAT change (L) (β = 1.91, SE = 0.89, P = 0.038) in addition to a treatment effect (P < 0.001) and age quartile effect (P < 0.001). No other predictors or interactions were significant, including effect modification of AGP (AGP by treatment interaction P = 0.50). DISCUSSION/SIGNIFICANCE OF IMPACT: It is known that immune and metabolic pathways are highly integrated, and biomarkers of inflammation have predictive abilities for cardiovascular and metabolic disease outcomes. This analysis provides data showing that treatment with rosiglitazone or rhGH in the context of HIV reduces AGP concentrations, indicating efficacy in reducing systemic inflammation. Baseline AGP was an independent risk factor for VAT changes as those with lower AGP at baseline showed a greater reduction in VAT in response to treatment. Biomarkers of inflammation may provide prognostic information for individualized patient outcomes to help guide treatment and follow-up.
Double-crop grain sorghum after winter wheat harvest is a common cropping system in the southern plains region. Palmer amaranth is a troublesome weed in double-crop grain sorghum in Kansas. Populations resistant to various herbicides (e.g., atrazine, glyphosate, metsulfuron, pyrasulfotole) have made Palmer amaranth management even more difficult for producers. To evaluate control of atrazine-resistant and atrazine-susceptible Palmer amaranth in double-crop grain sorghum, we assessed 14 herbicide programs, of which 8 were PRE only and 6 were PRE followed by (fb) POST applications. Visible ratings of Palmer amaranth control were taken at 3 and 8 wk after planting (WAP) grain sorghum. PRE treatments containing very-long-chain fatty acid (VLCFA)–inhibiting herbicides provided 91% control of atrazine-resistant Palmer amaranth 3 WAP, and reduced weed density 8 WAP compared to atrazine-only PRE treatments. PRE fb POST treatments, especially those that included VLCFA-inhibiting herbicides, provided greater control of atrazine-resistant and atrazine-susceptible Palmer amaranth (71% and 93%, respectively) at 8 WAP compared to PRE treatments alone (59% to 79%). These results demonstrated the utility of VLCFA-inhibiting herbicides applied PRE and in a layered PRE fb POST approach in controlling atrazine-resistant Palmer amaranth, as well as the importance of an effective POST application following residual PRE herbicides for controlling both atrazine-resistant and atrazine-susceptible Palmer amaranth in double-crop grain sorghum.
Polyphenol oxidase (PPO) in red clover (RC) has been shown to reduce both lipolysis and proteolysis in silo and implicated (in vitro) in the rumen. However, all in vivo comparisons have compared RC with other forages, typically with lower levels of PPO, which brings in other confounding factors as to the cause for the greater protection of dietary nitrogen (N) and C18 polyunsaturated fatty acids (PUFA) on RC silage. This study compared two RC silages which when ensiled had contrasting PPO activities (RC+ and RC−) against a control of perennial ryegrass silage (PRG) to ascertain the effect of PPO activity on dietary N digestibility and PUFA biohydrogenation. Two studies were performed: the first investigated rumen and duodenal flow with six Hereford×Friesian steers, prepared with rumen and duodenal cannulae, and the second investigated whole-tract N balance using six Holstein-Friesian non-lactating dairy cows. All diets were offered at a restricted level based on animal live weight with each experiment consisting of two 3×3 Latin squares using big bale silages ensiled in 2010 and 2011, respectively. For the first experiment digesta flow at the duodenum was estimated using a dual-phase marker system with ytterbium acetate and chromium ethylenediaminetetraacetic acid as particulate and liquid phase markers, respectively. Total N intake was higher on the RC silages in both experiments and higher on RC− than RC+. Rumen ammonia-N reflected intake, with ammonia-N per unit of N intake lower on RC+ than RC−. Microbial N duodenal flow was comparable across all silage diets, with non-microbial N higher on RC than the PRG and no difference between RC+ and RC−, even when reported on a N intake basis. C18 PUFA biohydrogenation was lower on RC silage diets than PRG but with no difference between RC+ and RC−. The N balance trial showed a greater retention of N on RC+ over RC−; however, this response is more likely related to the difference in N intake than to any PPO-driven protection.
The lack of difference between RC silages, despite contrasting levels of PPO, may reflect a similar level of protein-bound-phenol complexing determined in each RC silage. Previously this complexing has been associated with PPO's protection mechanism; however, this study has shown that protection is not related to total PPO activity.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
The effect of transportation and lairage on the faecal shedding and post-slaughter contamination of carcasses with Escherichia coli O157 and O26 in young calves (4–7-day-old) was assessed in a cohort study at a regional calf-processing plant in the North Island of New Zealand, following 60 calves as cohorts from six dairy farms to slaughter. Multiple samples from each animal at pre-slaughter (recto-anal mucosal swab) and carcass at post-slaughter (sponge swab) were collected and screened using real-time PCR and culture isolation methods for the presence of E. coli O157 and O26 (Shiga toxin-producing E. coli (STEC) and non-STEC). Genotype analysis of E. coli O157 and O26 isolates provided little evidence of faecal–oral transmission of infection between calves during transportation and lairage. Increased cross-contamination of hides and carcasses with E. coli O157 and O26 between co-transported calves was confirmed at pre-hide removal and post-evisceration stages but not at pre-boning (at the end of dressing prior to chilling), indicating that good hygiene practices and application of an approved intervention effectively controlled carcass contamination. This study was the first of its kind to assess the impact of transportation and lairage on the faecal carriage and post-harvest contamination of carcasses with E. coli O157 and O26 in very young calves.
Dengue is the fastest spreading mosquito-transmitted disease in the world. In China, Guangzhou City is believed to be the most important epicenter of dengue outbreaks, although the transmission patterns are still poorly understood. We developed an autoregressive integrated moving average model incorporating external regressors to examine the association between the monthly number of locally acquired dengue infections and imported cases, mosquito densities, temperature and precipitation in Guangzhou. In multivariate analysis, imported cases and minimum temperature (both at lag 0) were both associated with the number of locally acquired infections (P < 0.05). This multivariate model performed best, featuring the lowest fitting root mean squared error (RMSE) (0.7520), AIC (393.7854) and test RMSE (0.6445), as well as the best outbreak-detection performance in validation, with a sensitivity of 1.0000, a specificity of 0.7368 and a consistency rate of 0.7917. Our findings suggest that imported cases and minimum temperature are two key determinants of dengue local transmission in Guangzhou. The modelling method can be used to predict dengue transmission in non-endemic countries and to inform dengue prevention and control strategies.
Introduction: The ECG diagnosis of acute coronary occlusion (ACO) in the setting of ventricular paced rhythm (VPR) is purported to be impossible. However, VPR has a similar ECG morphology to left bundle branch block (LBBB). The validated Smith-modified Sgarbossa criteria (MSC) have high sensitivity (Sens) and specificity (Spec) for ACO in LBBB. MSC consist of ≥1 of the following in ≥1 lead: concordant ST elevation (STE) ≥1 mm, concordant ST depression ≥1 mm in V1-V3, or ST/S ratio <−0.25 (in leads with ≥1 mm STE). We hypothesized that the MSC will have higher Sens for diagnosis of ACO in VPR when compared to the original Sgarbossa criteria. We report preliminary findings of the Paced Electrocardiogram Requiring Fast Emergency Coronary Therapy (PERFECT) study. Methods: The PERFECT study is a retrospective, multicenter, international investigation of ED patients from 1/2008 - 12/2016 with VPR on the ECG and symptoms suggestive of acute coronary syndrome (e.g. chest pain or shortness of breath). Data from four sites are presented. Acute myocardial infarction (AMI) was defined by the Third Universal Definition of AMI. A blinded cardiologist adjudicated ACO, defined as a thrombolysis in myocardial infarction (TIMI) flow score of 0 or 1 on coronary angiography; a pre-defined subgroup of ACO patients with peak cardiac troponin (cTn) >100 times the 99th percentile upper reference limit (URL) of the cTn assay was also analyzed. Another blinded physician measured all ECGs. Statistics were by Mann-Whitney U, Chi-square, and McNemar's test. Results: The ACO and No-AMI groups consisted of 15 and 79 encounters, respectively. For the ACO and No-AMI groups, median age was 78 [IQR 72-82] vs. 70 [61-75] and 13 (86%) vs. 48 (61%) patients were male. The median peak cTn ratio (cTn/URL) was 260 [33-663] and 0.5 [0-1.3] for ACO vs. no-AMI. The Sens and Spec for the MSC and the original Sgarbossa criteria were 67% (95%CI 39-87) vs. 46% (22-72; p=0.25) and 99% (92-100) vs. 99% (92-100; p=0.5).
In pre-defined subgroup analysis of ACO patients with peak cTn >100 times the URL (n=10), the Sens was 90% (54-100) for the MSC vs. 60% (27-86) for the original Sgarbossa criteria (p=0.25). Conclusion: ACO in VPR is an uncommon condition. The MSC showed good Sens for diagnosis of ACO in the presence of VPR, especially among patients with high peak cTn, and Spec was excellent. These methods and results are consistent with studies that have used the MSC to diagnose ACO in LBBB.
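Sens/Spec figures with confidence intervals like those above can be reproduced from the underlying 2×2 counts. The sketch below assumes counts consistent with the reported percentages (10/15 ACO detected, 78/79 No-AMI negative - a reconstruction, not stated in the abstract) and uses a Wilson score interval; the abstract's intervals may come from an exact (Clopper-Pearson) method, so the bounds differ slightly.

```python
from math import sqrt

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Assumed counts consistent with the reported MSC performance.
sens, sens_ci = 10 / 15, wilson_ci(10, 15)   # detected ACO / all ACO
spec, spec_ci = 78 / 79, wilson_ci(78, 79)   # negative / all No-AMI
print(f"Sens {sens:.0%} ({sens_ci[0]:.0%}-{sens_ci[1]:.0%})")
print(f"Spec {spec:.0%} ({spec_ci[0]:.0%}-{spec_ci[1]:.0%})")
```

With samples this small, the choice of interval method visibly moves the bounds, which is why reporting the method alongside the CI matters.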