Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. In order to decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but increased for subjects who had activation (48% versus 58%; 95% CI for the difference, 4.6%-15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The effect of the intervention was to decrease variability between medical centers. There was no associated change in average FMC2B.
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
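The arithmetic behind the headline figures can be sketched as follows. The counts below are hypothetical, chosen only to reproduce the reported PPVs; the paper's OR of 1.4 may reflect model adjustment, so the unadjusted OR computed here differs.

```python
def ppv(true_pos, activations):
    """Positive predictive value: fraction of activations confirmed by PCI/CABG."""
    return true_pos / activations

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio from 2x2 counts: (a/b) / (c/d)."""
    return (a * d) / (b * c)

# Hypothetical counts, chosen only to reproduce the reported PPVs:
# pre-period 1000 activations with 379 confirmed by PCI/CABG;
# post-period 500 activations with 243 confirmed.
print(round(ppv(379, 1000), 3))   # 0.379
print(round(ppv(243, 500), 3))    # 0.486
# Unadjusted OR for confirmation (post vs. pre); the paper's 1.4 is likely adjusted.
print(round(odds_ratio(243, 500 - 243, 379, 1000 - 379), 2))  # 1.55
```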
Introduction: Patients presenting to the emergency department (ED) with hypotension have a high mortality rate and require careful yet rapid resuscitation. The use of cardiac point of care ultrasound (PoCUS) in the ED has progressed beyond the basic indications of detecting pericardial fluid and activity in cardiac arrest. We examined whether finding left ventricular dysfunction (LVD) on emergency physician-performed PoCUS reliably predicts the presence of cardiogenic shock in hypotensive ED patients. Methods: We prospectively collected PoCUS findings performed in 135 ED patients with undifferentiated hypotension as part of an international study. Patients with clearly identified etiologies for hypotension were excluded, along with other specific presumptive diagnoses. LVD was defined as identification of a generally hypodynamic LV in the setting of shock. PoCUS findings were collected using a standardized protocol and data collection form. All scans were performed by PoCUS-trained emergency physicians. Final shock type was defined as cardiogenic or non-cardiogenic by independent specialist blinded chart review. Results: All 135 patients had complete follow-up. Median age was 56 years and 53% of patients were male. Disease prevalence for cardiogenic shock was 12% and the mortality rate was 24%. The presence of LVD on PoCUS had a sensitivity of 62.50% (95% CI 35.43% to 84.80%), specificity of 94.12% (88.26% to 97.60%), positive-LR 10.62 (4.71 to 23.95), negative-LR 0.40 (0.21 to 0.75) and accuracy of 90.37% (84.10% to 94.77%) for detecting cardiogenic shock. Conclusion: Detecting left ventricular dysfunction on PoCUS in the ED may be useful in confirming the underlying shock type as cardiogenic in otherwise undifferentiated hypotensive patients.
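As a quick sanity check, the reported accuracy statistics can be reproduced from a 2×2 table back-calculated from the abstract's figures; the cell counts below are an inference from the stated prevalence and accuracy, not data taken directly from the paper.

```python
# Back-calculated 2x2 table (inferred, not from the paper):
# 16 cardiogenic patients (TP=10, FN=6), 119 non-cardiogenic (FP=7, TN=112).
tp, fn, fp, tn = 10, 6, 7, 112

sensitivity = tp / (tp + fn)               # 0.625  -> 62.50%
specificity = tn / (tn + fp)               # ~0.941 -> 94.12%
lr_pos = sensitivity / (1 - specificity)   # ~10.6
lr_neg = (1 - sensitivity) / specificity   # ~0.40
accuracy = (tp + tn) / (tp + fn + fp + tn) # ~0.904 -> 90.37%

print(sensitivity, round(specificity, 4), round(lr_pos, 2),
      round(lr_neg, 2), round(accuracy, 4))
```

These counts reproduce every point estimate in the abstract, which suggests the back-calculation is consistent, though the confidence intervals would require the interval method the authors used.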
Introduction: Although use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP < 100 or shock index > 1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses recorded at 60 minutes after ED presentation were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered, including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550 – 2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458 – 1916 mL); and cardiogenic control 768 mL (194 – 1341 mL) vs. cardiogenic PoCUS 981 mL (341 – 1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive.
Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
Introduction: Point of care ultrasound has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared the diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (No PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses including shock category were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: 273 patients were enrolled, with follow-up for the primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity 95.5% (90.0 to 98.1%), LR+ve 17.9 (7.34 to 43.8), LR-ve 0.21 (0.08 to 0.58), Diagnostic OR 85.6 (18.2 to 403.6) and accuracy 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity 93.8% (87.8 to 97.0%), LR+ve 14.8 (7.1 to 30.9), LR-ve 0.09 (0.01 to 0.58), Diagnostic OR 166.6 (18.7 to 1481) and accuracy of 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (-11.7% (-37.8 to 18.3%)) or specificity (1.73% (-4.67 to 8.29%)).
Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
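For readers comparing the two arms, the reported quantities are related in the standard way. The helper below (an illustrative sketch, not the study's code) derives the likelihood ratios and diagnostic odds ratio from sensitivity and specificity; the small drift against the published values is consistent with rounding of the published point estimates.

```python
def diagnostic_summary(sens, spec):
    """Derive LR+, LR- and the diagnostic odds ratio from sensitivity and specificity."""
    lr_pos = sens / (1 - spec)
    lr_neg = (1 - sens) / spec
    return lr_pos, lr_neg, lr_pos / lr_neg

# Reported PoCUS-group point estimates for cardiogenic shock: sens 80.0%, spec 95.5%.
lr_pos, lr_neg, dor = diagnostic_summary(0.800, 0.955)
# Close to the published 17.9 / 0.21 / 85.6 (differences are rounding).
print(round(lr_pos, 1), round(lr_neg, 2), round(dor, 1))
```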
The effect of transportation and lairage on the faecal shedding and post-slaughter contamination of carcasses with Escherichia coli O157 and O26 in young calves (4–7 days old) was assessed in a cohort study at a regional calf-processing plant in the North Island of New Zealand, following 60 calves as cohorts from six dairy farms to slaughter. Multiple samples from each animal at pre-slaughter (recto-anal mucosal swab) and carcass at post-slaughter (sponge swab) were collected and screened using real-time PCR and culture isolation methods for the presence of E. coli O157 and O26 (Shiga toxin-producing E. coli (STEC) and non-STEC). Genotype analysis of E. coli O157 and O26 isolates provided little evidence of faecal–oral transmission of infection between calves during transportation and lairage. Increased cross-contamination of hides and carcasses with E. coli O157 and O26 between co-transported calves was confirmed at the pre-hide removal and post-evisceration stages but not at pre-boning (at the end of dressing prior to chilling), indicating that good hygiene practices and application of an approved intervention effectively controlled carcass contamination. This study was the first of its kind to assess the impact of transportation and lairage on the faecal carriage and post-harvest contamination of carcasses with E. coli O157 and O26 in very young calves.
The prevalence and spatial distribution of Escherichia coli serogroups O26, O103, O111 and O145 in calves <7 days old in New Zealand and their relationship with serum IgG, weight and sex were determined by collecting recto-anal mucosal swabs (RAMS) (n = 299) and blood samples (n = 299) from two slaughter plants in the North Island. Real-time PCR of RAMS enrichment cultures revealed that 134/299 samples were positive for O26, 68/299 for O103 and 47/299 for O145, but none were positive for O111. Processing of positive enrichment cultures resulted in 49 O26, four O103 and five O145 isolates. Using multiplex PCR, 25/49 (51%) O26 isolates were positive for stx1, eae, ehxA, 17/49 (34·7%) for eae, ehxA and 7/49 (14·2%) for eae only. All O103 and O145 isolates were positive for eae, ehxA only. O26 isolates were grouped into four clusters (>70% similarity) using pulsed-field gel electrophoresis. Mapping of the farms showed the presence of farms positive for O26, O103 and O145 in three important dairy producing regions of the North Island. Calves positive for O103 were more likely to be positive for O26 and vice versa (P = 0·04). Similarly, calves positive for O145 were more likely to be positive for O103 and vice versa (P = 0·03). This study demonstrates that non-O157 E. coli serogroups of public health and economic importance containing clinically relevant virulence factors are present in calves in the North Island of New Zealand.
The aim of this study was to examine the population structure, transmission and spatial relationship between genotypes of Shiga toxin-producing Escherichia coli (STEC) and Campylobacter jejuni, on 20 dairy farms in a defined catchment. Pooled faecal samples (n = 72) obtained from 288 calves were analysed by real-time polymerase chain reaction (rtPCR) for E. coli serotypes O26, O103, O111, O145 and O157. The number of samples positive for E. coli O26 (30/72) was high compared to E. coli O103 (7/72), O145 (3/72), O157 (2/72) and O111 (0/72). Eighteen E. coli O26 and 53 C. jejuni isolates were recovered from samples by bacterial culture. E. coli O26 and C. jejuni isolates were genotyped using pulsed-field gel electrophoresis and multilocus sequence typing, respectively. All E. coli O26 isolates could be divided into four clusters and the results indicated that E. coli O26 isolates recovered from calves on the same farm were more similar than isolates recovered from different farms in the catchment. There were 11 different sequence types of C. jejuni isolated from the cattle and 22 from water. An analysis of the population structure of C. jejuni isolated from cattle provided evidence of clustering of genotypes within farms, and among groups of farms separated by road boundaries.
Paranoia is one of the commonest symptoms of psychosis but has rarely been studied in a population at risk of developing psychosis. Based on existing theoretical models, including the proposed distinction between ‘poor me’ and ‘bad me’ paranoia, we aimed to test specific predictions about associations between negative cognition, metacognitive beliefs and negative emotions and paranoid ideation and the belief that persecution is deserved (deservedness).
We used data from 117 participants from the Early Detection and Intervention Evaluation for people at risk of psychosis (EDIE-2) trial of cognitive–behaviour therapy, comparing them with samples of psychiatric in-patients and healthy students from a previous study. Multi-level modelling was utilized to examine predictors of both paranoia and deservedness, with post-hoc planned comparisons conducted to test whether person-level predictor variables were associated differentially with paranoia or with deservedness.
Our sample of at-risk mental state participants was less paranoid than the psychiatric in-patients but reported higher levels of ‘bad-me’ deservedness. We found several predictors of paranoia and deservedness. Negative beliefs about self were related to deservedness but not paranoia, whereas negative beliefs about others were positively related to paranoia but negatively related to deservedness. Both depression and negative metacognitive beliefs about paranoid thinking were specifically related to paranoia but not deservedness.
This study provides evidence for the role of negative cognition, metacognition and negative affect in the development of paranoid beliefs, which has implications for psychological interventions and our understanding of psychosis.
The objective of this study was to determine the distribution of Shiga toxin-producing Escherichia coli (STEC) virulence markers (stx1, stx2, eae, ehxA) in E. coli strains isolated from young calves less than 7 days old (bobby calves). In total, 299 recto-anal mucosal swabs were collected from animals at two slaughter plants and inoculated onto tryptone bile X-glucuronide and sorbitol MacConkey agar supplemented with cefixime and potassium tellurite. Isolates were analysed using multiplex polymerase chain reaction to detect the stx1, stx2, eae and ehxA genes. The most common combination of virulence markers was eae, ehxA (n = 35), followed by eae (n = 9). In total, STEC and atypical enteropathogenic E. coli (aEPEC) were isolated from 8/299 (2·6%) and 37/299 (12·3%) calves, respectively. All the isolates could be assigned to 15 genotype clusters with a >70% similarity cut-off using XbaI pulsed-field gel electrophoresis. It may be concluded that healthy calves from the dairy industry in New Zealand are asymptomatic carriers of a diverse population of STEC and aEPEC.
We present a simplified approximate model showing how even small changes in the dielectric response result in substantial variations in the Hamaker coefficient of the van der Waals interactions. Since each term in the Matsubara summation depends on the dielectric response spectrum at one particular frequency, the total change in the Hamaker coefficient depends on the spectral changes not only at that frequency but also, properly weighted, at the rest of the spectrum. The Matsubara terms most affected by the addition of a single peak are not those close to the position of the added peak, but are distributed over the entire range of frequencies. We comment on the possibility of eliminating van der Waals interactions and/or drastically reducing them by spectral variation in a narrow regime of frequencies.
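For reference, the quantity being varied is the Hamaker coefficient in its standard non-retarded Lifshitz (Matsubara-sum) form; the notation below is generic textbook notation for two half-spaces 1 and 2 across medium 3, not taken from the paper (the primed sum indicates that the n = 0 term is halved):

```latex
A_{123} \simeq \frac{3 k_B T}{2} \sum_{n=0}^{\infty}{}'\,
\Delta_{13}(i\xi_n)\,\Delta_{23}(i\xi_n),
\qquad
\Delta_{j3}(i\xi_n) = \frac{\varepsilon_j(i\xi_n) - \varepsilon_3(i\xi_n)}
                          {\varepsilon_j(i\xi_n) + \varepsilon_3(i\xi_n)},
\qquad
\xi_n = \frac{2\pi n k_B T}{\hbar}
```

Because each ε(iξ_n) is a Kramers–Kronig-weighted integral over the whole real-frequency absorption spectrum, a spectral change at a single real frequency shifts every Matsubara term, which is the mechanism the abstract describes.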
The faecal-pat prevalence (as estimated by culture) of Campylobacter fetus from cattle and sheep on 19 farms in rural Lancashire was investigated using standard Campylobacter culture techniques and PCR during a 2-year longitudinal study. C. fetus was isolated from 9·48% [95% confidence interval (CI) 8·48–10·48] of cattle faecal pats and 7·29% (95% CI 6·21–9·62) of sheep faecal pats. There was evidence of significant differences in shedding prevalence between geographical regions; cows in geographical zone 3 had an increased risk of shedding C. fetus compared to cows in geographical zones 1 and 2 (OR 6·64, 95% CI 1·67–26·5, P = 0·007), as did cows at pasture (OR 1·66, 95% CI 1·01–2·73, P = 0·046) compared to when housed. Multiple logistic regression modelling demonstrated underlying seasonal periodicity in both species.
In a 2-year longitudinal study of adult animals on 15 dairy farms and four sheep farms in Lancashire, UK, Arcobacter spp. were isolated from all farms although not at every sampling occasion. Faecal samples were collected and cultured using standard techniques for isolation of campylobacters. Assignment to species was via PCR assays. Apparent prevalence of Arcobacter spp. was higher in dairy cattle compared to sheep (40·1% vs. 8%, P < 0·001) and in housed cattle compared to cattle at pasture (50·1% vs. 20·9%, P < 0·001). This was reflected in the higher prevalence observed in herds that were housed (n = 4) all year compared to herds that grazed cattle on pasture in the summer and housed cattle in the winter (n = 11) (55·5% vs. 36%, P < 0·001). In the case of sheep, peak prevalence was observed in autumn with increased prevalence also being associated with improving pasture quality. There was an apparent inverse association between the faecal pat prevalence of Arcobacter spp. and Campylobacter jejuni although this may in part be an artefact of laboratory test method sensitivity, whereby a relative increase in the frequency of one bacterial species would reduce the sensitivity of detecting the other.
We compared Campylobacter jejuni/coli multilocus sequence types (STs) from pets (dogs/cats) and their owners and investigated risk factors for pet-associated human campylobacteriosis using a combined source-attribution and case-control analysis. In total, 132/687 pet stools were Campylobacter-positive, resulting in 499 strains isolated (320 C. upsaliensis/helveticus, 100 C. jejuni, 33 C. hyointestinalis/fetus, 10 C. lari, 4 C. coli, 32 unidentified). There were 737 human and 104 pet C. jejuni/coli strains assigned to 154 and 49 STs, respectively. Dog, particularly puppy, owners were at increased risk of infection with pet-associated STs. In 2/68 cases vs. 0·134/68 expected by chance, a pet and its owner were infected with an identical ST (ST45, ST658). Although common sources of infection and directionality of transmission between pets and humans were unknown, dog ownership significantly increased the risk for pet-associated human C. jejuni/coli infection and isolation of identical strains in humans and their pets occurred significantly more often than expected.
The lifetime performance and reliability of photovoltaic (PV) modules are critical factors in their successful deployment. Interfaces in thin film PV, such as that between the transparent conductive oxide (TCO) electrode and the absorber layer, are frequently an avenue for degradation; this degradation is promoted by exposure to environmental stressors such as irradiance, heat and humidity. Understanding and suppressing TCO degradation is critical to improving stability and extending the lifetime. Commercially available indium tin oxide (ITO), fluorine-doped tin oxide (FTO) and aluminum-doped zinc oxide (AZO) were exposed to damp heat (DH), ASTM G154 cycle 4, and modified ASTM G154 for up to 1000 hours. The TCOs’ electrical and optical properties and surface energies were determined before and after each exposure and their relative degradation classified. Data demonstrate that AZO degraded most rapidly of all the TCOs, whereas ITO and FTO degraded at lower to non-quantifiable rates. One approach to suppress degradation could be to use interfacial layers (IFLs), including organofunctional silane layers, to modify the TCO. We modified the TCO surfaces using a variety of organofunctional silanes and determined that a range of surface energies could be obtained without affecting the electrical and optical properties of the TCO. Degradation studies of TCOs with a silane layer were also conducted. We found that an inhomogeneous silane layer was able to delay the resistivity increase for ITO in DH.
Although antipsychotic medication is the first line of treatment for schizophrenia, many service users choose to refuse or discontinue their pharmacological treatment. Cognitive therapy (CT) has been shown to be effective when delivered in combination with antipsychotic medication, but has yet to be formally evaluated in its absence. This study evaluates CT for people with psychotic disorders who have not been taking antipsychotic medication for at least 6 months.
Twenty participants with schizophrenia spectrum disorders received CT in an open trial. Our primary outcome was psychiatric symptoms measured using the Positive and Negative Syndromes Scale (PANSS), which was administered at baseline, 9 months (end of treatment) and 15 months (follow-up). Secondary outcomes were dimensions of hallucinations and delusions, self-rated recovery and social functioning.
t tests and Wilcoxon signed-rank tests revealed significant beneficial effects on all primary and secondary outcomes at end of treatment and follow-up, with the exception of self-rated recovery at end of treatment. Cohen's d effect sizes were moderate to large [for PANSS total, d=0.85, 95% confidence interval (CI) 0.32–1.35 at end of treatment; d=1.26, 95% CI 0.66–1.84 at follow-up]. A response rate analysis found that 35% and 50% of participants achieved at least a 50% reduction in PANSS total scores by end of therapy and follow-up, respectively. No patients deteriorated significantly.
This study provides preliminary evidence that CT is an acceptable and effective treatment for people with psychosis who choose not to take antipsychotic medication. An adequately powered randomized controlled trial is warranted.
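As a reminder of the effect-size convention used above, Cohen's d for a pre/post contrast can be computed as below. The formula choice (pooled SD of the two time points) and the PANSS values are illustrative assumptions; the paper does not state its exact formula or the underlying means and SDs.

```python
import math

def cohens_d(mean_a, mean_b, sd_a, sd_b):
    """Cohen's d using the pooled SD of the two time points
    (one common convention; the paper does not state its exact formula)."""
    sd_pooled = math.sqrt((sd_a ** 2 + sd_b ** 2) / 2)
    return (mean_a - mean_b) / sd_pooled

# Hypothetical PANSS totals (not the study's data): baseline mean 75 (SD 14)
# vs. end-of-treatment mean 63 (SD 14) gives d in the 'large' range,
# comparable to the reported d = 0.85.
print(round(cohens_d(75, 63, 14, 14), 2))  # 0.86
```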
Multi-locus sequence typing was performed on 1003 Campylobacter jejuni isolates collected in a 2-year longitudinal study of 15 dairy farms and four sheep farms in Lancashire, UK. There was considerable farm-level variation in occurrence and prevalence of clonal complexes (CC). Clonal complexes ST61, ST21, ST403 and ST45 were most prevalent in cattle while in sheep CC ST42, ST21, ST48 and ST52 were most prevalent. CC ST45, a complex previously shown to be more common in summer months in human cases, was more prevalent in summer in our ruminant samples. Gene flow analysis demonstrated a high level of genetic heterogeneity at the within-farm level. Sequence-type diversity was greater in cattle compared to sheep, in cattle at pasture vs. housed, and in isolates from farms on the Pennines compared to the Southern Fylde. Sequence-type diversity was greatest in isolates belonging to CC ST21, ST45 and ST206.
In a 2-year longitudinal study of adult animals on 15 dairy farms and four sheep farms in Lancashire, UK, C. jejuni was isolated from all farms, although not on every occasion. Faecal samples were collected and cultured using standard techniques for isolation of Campylobacter. Assignment to species was via PCR assays. Peak prevalence of C. jejuni in both cattle and sheep was observed during the summer and in cattle this apparent seasonality was associated with grazing pasture [odds ratio (OR) 2·14], while in sheep it was independent of grazing. Increased prevalence was associated with increased milk yield (OR 1·05) and herd size (OR 1·01) in dairy cattle, and with increased stocking density (OR 1·29) and pasture quality (OR 2·16) in sheep. There was considerable variation in prevalence between farms but no evidence of large-scale spatial variation. The association between C. jejuni prevalence and diet in dairy cattle deserves further investigation.
Using data from a cohort study conducted by the Veterinary Laboratories Agency (VLA), evidence of spatial clustering at distances up to 30 km was found for S. Agama and S. Dublin (P values of 0·001) and borderline evidence was found for spatial clustering of S. Typhimurium (P=0·077). The evolution of infection status of study farms over time was modelled using a Markov Chain model with transition probabilities describing changes in status at each of four visits, allowing for the effect of sampling visit. The degree of geographical clustering of infection, having allowed for temporal effects, was assessed by comparing the residual deviance from a model including a measure of recent neighbourhood infection levels with one excluding this variable. The number of cases arising within a defined distance and time period of an index case was higher than expected. This provides evidence for spatial and spatio-temporal clustering, which suggests either a contagious process (e.g. through direct or indirect farm-to-farm transmission) or geographically localized environmental and/or farm factors which increase the risk of infection. The results emphasize the different epidemiology of the three Salmonella serovars investigated.
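The Markov chain approach described above can be sketched as a two-state (positive/negative) transition model over sampling visits. The transition probabilities below are invented for illustration; the study estimated them from the VLA cohort data and allowed them to vary by visit and by neighbourhood infection levels.

```python
import random

# Illustrative two-state Markov chain for a farm's infection status across
# visits. These transition probabilities are made up for the sketch, not
# estimates from the study.
P = {
    "negative": {"negative": 0.85, "positive": 0.15},
    "positive": {"negative": 0.40, "positive": 0.60},
}

def simulate_visits(start, n_visits, rng):
    """Simulate a farm's infection status over a sequence of sampling visits."""
    state, path = start, [start]
    for _ in range(n_visits - 1):
        state = "positive" if rng.random() < P[state]["positive"] else "negative"
        path.append(state)
    return path

# Four visits, matching the study design; seeded for reproducibility.
print(simulate_visits("negative", 4, random.Random(0)))
```

In the study's formulation, comparing models with and without a recent-neighbourhood-infection covariate in the transition probabilities is what quantifies the spatial clustering.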