Newton's Principia is perhaps the second most famous work of mathematics, after Euclid's Elements. Originally published in 1687, it gave the first systematic account of the fundamental concepts of dynamics, as well as three beautiful derivations of Newton's law of gravitation from Kepler's laws of planetary motion. As a book of great insight and ingenuity, it has raised our understanding of the power of mathematics more than any other work. This heavily annotated translation of the third and final edition (1726) of the Principia will enable any reader with a good understanding of elementary mathematics to easily grasp the meaning of the text, either from the translation itself or from the notes, and to appreciate some of its significance. All forward references are given to illuminate the structure and unity of the whole, and to clarify the parts. The mathematical prerequisites for understanding Newton's arguments are given in a brief appendix.
Observations from several crime analysis meetings of a major city police department provide insight into questions of the sources and importance of knowledge in policing. This observable knowledge is used by the police as they grapple with the ongoing complexities of public safety, in this case in an urban setting.
At each meeting, two forms of knowledge framed discussions of crime and disorder problems. Framing as a sociological and communications theory relies on ideas associated with understanding how attitudes and behavior are shaped by the information or form of knowledge presented (Goffman, 1986; Scheufele & Iyengar, 2014).
Oxidative stress is implicated in the etiology of schizophrenia, and the antioxidant defense system may be protective in this illness. We examined the major antioxidant glutathione (GSH) in prefrontal brain, and its correlates with clinical and demographic variables, in schizophrenia.
GSH levels were measured in the dorsolateral prefrontal region of 28 patients with chronic schizophrenia using a magnetic resonance spectroscopy sequence specifically adapted for GSH. We examined correlations of GSH levels with age, age at onset of illness, duration of illness, and clinical symptoms.
We found a negative correlation between GSH levels and age at onset (r=-.46, p=.015), and a trend-level positive relationship between GSH and duration of illness (r=.34, p=.076).
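The reported r values are Pearson product-moment correlations. As a minimal illustration of the computation (toy data, not the study's values):

```python
# Pearson correlation coefficient from first principles (toy sketch).
from math import sqrt

def pearson_r(x, y):
    """Pearson r between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))   # covariance numerator
    sx = sqrt(sum((a - mx) ** 2 for a in x))               # sqrt of sum of squares
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

In practice one would use `scipy.stats.pearsonr`, which also returns the p-value reported alongside each r.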
Our findings are consistent with a possible compensatory upregulation of the antioxidant defense system with longer duration of illness, and suggest that the antioxidant defense system may play a role in schizophrenia.
Most of the water humans consume is for agriculture. Rapidly increasing water demand has led to overexploitation of water resources in many important food-producing regions. In particular, growing groundwater-based irrigation causes potentially damaging depletion. Food systems are increasingly globalized, leading to large export-oriented production. Much research has focused on quantifying the amount of water resources embedded in traded products, but less attention has been given to the role of groundwater use and the related sustainability of agriculture globally. We assess current knowledge of virtual water trade in light of groundwater use and sustainability and highlight remaining challenges in this field.
Older adults with dementia are particularly vulnerable to adverse outcomes resulting from anticholinergic use. We aimed to: (i) Examine the anticholinergic burden of patients with dementia attending a Psychiatry of Later Life (PLL) service (ii) Examine concomitant prescription of acetylcholinesterase inhibitors (AChEIs) and anticholinergics and (iii) Compare the Anticholinergic Cognitive Burden (ACB) scale with a recently published composite list of anticholinergics.
Retrospective chart review of new referrals with a diagnosis of dementia (n = 66) seen by the PLL service, Tallaght University Hospital, Dublin, Ireland, over a consecutive period of 4 months.
The mean ACB score was 2.2 (range = 0–9, SD = 2.1). Overall, 37.9% (n = 25) had a clinically significant ACB score (>3), and 42.1% (n = 8) of those taking AChEIs had a clinically significant ACB score. A significantly greater number of medications with anticholinergic activity were identified using the composite list versus the traditional ACB scale (2.3 v. 1.5, p = 0.001).
We demonstrated a significant anticholinergic burden amongst patients with dementia attending a specialist PLL service. There was no difference in anticholinergic burden between groups prescribed and not prescribed AChEIs, indicating that these medications are being prescribed without discontinuation of potentially inappropriate medications with anticholinergic activity. The true anticholinergic burden experienced by patients may be underestimated by the use of the ACB score alone, although the clinical significance of this finding is unclear. Calculation of true clinical anticholinergic burden load and its translation to a specific rating scale remains a challenge.
Introduction: Previous systematic reviews suggest early mobilization in the intensive care unit (ICU) population is feasible, safe, and may improve outcomes. Only one review investigated mobilization specifically in trauma ICU patients and failed to identify any relevant articles. The objective of the present systematic review was to conduct an up-to-date search of the literature to assess the effect of early mobilization in adult trauma ICU patients on mortality, length of stay (LOS) and duration of mechanical ventilation. Methods: We performed a systematic search of four electronic databases (Ovid MEDLINE, Embase, CINAHL, Cochrane Library) and the grey literature. To be included, studies must have compared early mobilization to delayed or no mobilization among trauma patients admitted to the ICU. Meta-analysis was performed to determine the effect of early mobilization on mortality, hospital LOS, ICU LOS, and duration of mechanical ventilation. Results: The search yielded 2,975 records from the 4 databases and 7 records from grey literature and bibliographic searches; of these, 9 articles met all eligibility criteria and were included in the analysis. There were 7 studies performed in the United States, 1 study from China and 1 study from Norway. Study populations included neurotrauma (3 studies), blunt abdominal trauma (2 studies), mixed injury types (2 studies) and burns (1 study). Cohorts ranged in size from 15 to 1,132 patients (median, 63) and varied in inclusion criteria. Most studies used some form of stepwise progressive mobility protocol. Two studies used simple ambulation as the mobilization measure, and 1 study employed upright sitting as their only intervention. Time to commencement of the intervention was variable across studies, and only 2 studies specified the timing of mobilization initiation. We did not detect a difference in mortality with early mobilization, although the pooled risk ratio (RR) was reduced (RR 0.90, 95% CI 0.74 to 1.09). 
Hospital LOS and ICU LOS were decreased with early mobilization, though these differences did not reach significance. Duration of mechanical ventilation was significantly shorter in the early mobilization group (mean difference −1.18, 95% CI −2.17 to −0.19). Conclusion: Our review identified few studies that examined mobilization of critically ill trauma patients in the ICU. On meta-analysis, early mobilization was found to reduce duration of mechanical ventilation, but the effects on mortality and LOS were not significant.
Introduction: Hypotension is known to be associated with increased mortality in severe traumatic brain injury (TBI) patients. Systolic blood pressure (SBP) of <90 mmHg is the threshold for hypotension in consensus TBI treatment guidelines; however, evidence suggests hypotension should be defined at higher levels for these patients. Our objective was to determine the influence of hypotension on mortality in TBI patients requiring ICU admission using different thresholds of SBP on arrival at the emergency department (ED). Methods: Retrospective cohort study of patients with severe TBI (Abbreviated Injury Scale Head score ≥3) admitted to ICU at the QEII Health Sciences Centre (Halifax, Canada) between 2002 and 2013. Patients were grouped by SBP on ED arrival (<90 mmHg, <100 mmHg, <110 mmHg). We performed multiple logistic regression analysis with mortality as the dependent variable. Models were adjusted for confounders including age, gender, Injury Severity Score (ISS), injury mechanism, and trauma team activation (TTA). Results: A total of 1233 patients sustained a severe TBI and were admitted to the ICU during the study period. The mean age was 43.4 ± 23.9 years and most patients were male (919/1233; 74.5%). The most common mechanism of injury was motor vehicle collision (491/1233; 41.2%) followed by falls (427/1233; 35.8%). Mean length of stay in the ICU was 6.1 ± 6.4 days, and the overall mortality rate was 22.7%. SBP on arrival was available for 1182 patients. The <90 mmHg group had 4.6% (54/1182) of these patients; mean ISS was 20.6 ± 7.8 and mortality was 40.7% (22/54). The <100 mmHg group had 9.3% (110/1182) of patients; mean ISS was 19.3 ± 7.9 and mortality was 34.5% (38/110). The <110 mmHg group had 16.8% (198/1182) of patients; mean ISS was 17.9 ± 8.0 and mortality was 28.8% (57/198).
After adjusting for confounders, the odds ratio for mortality associated with hypotension was 2.22 (95% CI 1.19-4.16) using a <90 mmHg cutoff, 1.79 (95% CI 1.12-2.86) using a <100 mmHg cutoff, and 1.50 (95% CI 1.02-2.21) using a <110 mmHg cutoff. Conclusion: While we found that TBI patients with an SBP <90 mmHg were over 2 times more likely to die, patients with an SBP <110 mmHg on ED arrival were still 1.5 times more likely to die from their injuries compared to patients without hypotension. These results suggest that establishing a higher threshold for clinically meaningful hypotension in TBI patients is warranted.
Introduction: Long-term immobility has detrimental effects for critically ill patients admitted to the intensive care unit (ICU), including ICU-acquired weakness. Early mobilization of patients admitted to ICU has been demonstrated to be a safe, feasible and effective strategy to improve patient outcomes. The optimal mobilization of trauma ICU patients has not been extensively studied. Our objective was to determine the impact of an early mobilization protocol on outcomes among trauma patients admitted to the ICU. Methods: We analyzed all adult trauma patients (>18 years old) admitted to ICU over a 2-year period prior to and following implementation of an early mobilization protocol, allowing for a 1-year transition period. Data were collected from the Nova Scotia Trauma Registry. We compared patient characteristics and outcomes (mortality, length of stay [LOS], ventilator days) between the pre- and post-implementation groups. Associations between early mobilization and clinical outcomes were estimated using binary and linear regression models. Results: Overall, there were 526 patients included in the analysis (292 pre-implementation, 234 post-implementation). The study population ranged in age from 18 to 92 years (mean age 49.0 ± 20.4 years) and 74.3% of all patients were male. The pre- and post-implementation groups were similar in age, sex, and injury severity. In-hospital mortality was reduced in the post-implementation group (25.3% vs. 17.5%; p = 0.031). In addition, there was a reduction in ICU mortality in the post-implementation group (21.6% vs. 12.8%; p = 0.009). We did not observe any difference in overall hospital LOS, ICU LOS, or ventilator days between the two groups. Compared to the pre-implementation period, trauma patients admitted to the ICU following protocol implementation were less likely to die in-hospital (OR = 0.52, 95% CI 0.30-0.91; p = 0.021) or in the ICU (OR = 0.40, 95% CI 0.21-0.76, p = 0.005).
Results were similar following a sensitivity analysis limited to patients with blunt or penetrating injuries. There was no difference between the pre- and post-implementation groups with respect to in-hospital LOS, ICU LOS, or the number of ventilator days. Conclusion: We found that trauma patients admitted to ICU during the post-implementation period had decreased odds of in-hospital mortality and ICU mortality. Ours is the first study to demonstrate a significant reduction in trauma mortality following implementation of an ICU mobility protocol.
Introduction: The Prehospital Evidence-Based Practice (PEP) program is an online, freely accessible, continuously updated Emergency Medical Services (EMS) evidence repository. This summary describes the research evidence for the identification and management of adult patients suffering from sepsis syndrome or septic shock. Methods: PubMed was searched in a systematic manner. One author reviewed titles and abstracts for relevance and two authors appraised each study selected for inclusion. Primary outcomes were extracted. Studies were scored by trained appraisers on a three-point Level of Evidence (LOE) scale (based on study design and quality) and a three-point Direction of Evidence (DOE) scale (supportive, neutral, or opposing findings based on the studies’ primary outcome for each intervention). LOE and DOE of each intervention were plotted on an evidence matrix (DOE x LOE). Results: Eighty-eight studies were included for 15 interventions listed in PEP. The interventions with the most evidence were related to identification tools (ID) (n = 26, 30%) and early goal directed therapy (EGDT) (n = 21, 24%). ID tools included the Systemic Inflammatory Response Syndrome (SIRS) criteria, the quick Sequential Organ Failure Assessment (qSOFA) and other unique measures. The most common primary outcomes were related to diagnosis (n = 30, 34%), mortality (n = 40, 45%) and treatment goals (e.g. time to antibiotic) (n = 14, 16%). The evidence ranks for the supported interventions were: supportive-high quality (n = 1, 7%) for crystalloid infusion, supportive-moderate quality (n = 7, 47%) for identification tools, prenotification, point of care lactate, titrated oxygen, and temperature monitoring, and supportive-low quality (n = 1, 7%) for vasopressors. The benefit of prehospital antibiotics and EGDT remains inconclusive with a neutral DOE. There is moderate level evidence opposing use of high flow oxygen.
Conclusion: EMS sepsis interventions are informed primarily by moderate quality supportive evidence. Several standard treatments are well supported by moderate to high quality evidence, as are identification tools. However, some standard in-hospital therapies, such as antibiotics and EGDT, are not supported by evidence in the prehospital setting. Based on primary outcomes, no identification tool appears superior. This evidence analysis can guide selection of appropriate prehospital therapies.
Introduction: Early and accurate diagnosis of critical conditions is essential in emergency medical services (EMS). Serum lactate testing may be used to identify patients with worse prognosis, including sepsis. Recently, the use of a point-of-care lactate (POCL) test has been evaluated in guiding treatment in patients with sepsis. Operating as part of the Prehospital Evidence Based Practice (PEP) Program, the authors sought to identify and describe the body of evidence for POCL use in EMS and the emergency department (ED) for patients with sepsis. Methods: Following PEP methodology, in May 2018, PubMed was searched in a systematic manner. Title and abstract screening were conducted by the program coordinator. These studies were collected, appraised and added to the existing body of literature contained within the PEP database. Evidence appraisal was conducted by two reviewers who assigned both a level of evidence (LOE) on a novel three tier scale and a direction of evidence (supportive, neutral or opposing; based on primary outcome). Data on setting and study design were also extracted. Results: Eight studies were included in our analysis. Three of these studies were conducted in the ED setting; each investigating the POCL test's ability to predict severe sepsis, ICU admission or death. All three studies found supportive results for POCL. A systematic review on the use of POCL in the ED determined that this test can also improve time to treatment. Five of the total 8 studies were conducted prehospitally. Two of these studies were supportive of POCL use in the prehospital setting; in terms of feasibility and the ability to predict sepsis. Both of these study sites used this early information as part of initiating a “sepsis alert” pathway. The other three prehospital studies provide neutral support for POCL. One study demonstrated moderate ability of POCL to predict severe illness. Two studies found poor agreement between prehospital POCL and serum lactate values. 
Conclusion: Limited low and moderate quality evidence suggests POCL may be feasible and helpful in predicting sepsis in the prehospital setting. However, there is sparse and inconsistent support for specific important outcomes, including accuracy.
Influenza and respiratory syncytial virus (RSV) are common causes of respiratory tract infections and place a burden on health services each winter. Systems to describe the timing and intensity of such activity will improve the public health response and deployment of interventions to these pressures. Here we develop early warning and activity intensity thresholds for monitoring influenza and RSV using two novel data sources: general practitioner out-of-hours consultations (GP OOH) and telehealth calls (NHS 111). Moving Epidemic Method (MEM) thresholds were developed for winter 2017–2018. The NHS 111 cold/flu threshold was breached several weeks in advance of other systems. The NHS 111 RSV epidemic threshold was breached in week 41, in advance of RSV laboratory reporting. Combining the use of MEM thresholds with daily monitoring of NHS 111 and GP OOH syndromic surveillance systems provides the potential to alert to threshold breaches in real-time. An advantage of using thresholds across different health systems is the ability to capture a range of healthcare-seeking behaviour, which may reflect differences in disease severity. This study also provides a quantifiable measure of seasonal RSV activity, which contributes to our understanding of RSV activity in advance of the potential introduction of new RSV vaccines.
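The threshold logic described above can be illustrated schematically. The sketch below is a heavily simplified stand-in for the Moving Epidemic Method, not the published algorithm: it assumes an epidemic threshold set as a one-sided upper confidence bound on the mean of each past season's highest pre-epidemic weekly rate, and flags the first week that breaches it (all function names and toy numbers are illustrative):

```python
# Simplified epidemic-threshold sketch inspired by MEM; NOT the full method.
from math import sqrt
from statistics import mean, stdev

def epidemic_threshold(pre_epidemic_peaks, z=1.645):
    """Upper one-sided CI bound on the mean of past seasons'
    highest pre-epidemic weekly rates (one value per season)."""
    n = len(pre_epidemic_peaks)
    return mean(pre_epidemic_peaks) + z * stdev(pre_epidemic_peaks) / sqrt(n)

def breach_week(weekly_rates, threshold):
    """First week (1-indexed) whose rate exceeds the threshold, else None."""
    for week, rate in enumerate(weekly_rates, start=1):
        if rate > threshold:
            return week
    return None
```

Run daily against a syndromic indicator such as NHS 111 cold/flu calls, a breach of this kind is what allows one system (here NHS 111) to signal weeks ahead of another.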
X-ray diffraction topography is the name given to several x-ray diffraction techniques where large area x-ray beams diffracted from a crystal provide detailed information about the surface structure and internal perfection of crystal microstructures. Since x-ray topographic techniques are based on Bragg (reflection) or Laue (transmission) diffraction from a crystal lattice, they are extremely sensitive to any atomic lattice imperfections and strains. Alterations of the interplanar spacing as small as one part in ten thousand extending over a reasonable number of atomic cell lengths can be recorded as a corresponding change in the diffracted beam intensity. Line Modified-Asymmetric Crystal Topography (LM-ACT) is one such reflection technique that shows particular promise in the field of microelectronics. The LM-ACT system is designed with low angular divergence in the x-ray beam probe. Low probe beam divergence allows details of device geometries on the order of microns to be resolved in the recorded x-ray intensity variation of the diffracted beam.
The LM-ACT system was applied here to the study of integrated circuits (IC) after specific processing steps were accomplished during IC fabrication and in the final product condition. Topographs obtained from specular crystal surfaces that were implanted through a patterned mask showed contrast variations between the implanted and non-implanted regions; details of the mask patterns have been resolved on the order of a few microns. LM-ACT topographs from annealed and unannealed implanted specimens showed marked differences; as a result, it is suggested that LM-ACT would be beneficial in optimizing the processing schedule for a particular wafer/electronic system. A significant feature of the LM-ACT technique is the capability for producing high resolution stereo-pair topographs that provide quantitative information through the depth of individual process layers in an integrated circuit.
The strength of ceramics or glasses can be increased by placing their surfaces into compression. Techniques include ion exchange, temperature glazing, surface chemical reactions and stress-induced phase transformations. Although most of these techniques are well recognized, little effort has been expended in experimentally determining the magnitude of the compressive stress and, in particular, in using experimental evidence to identify important material and process parameters that need to be controlled. The goal of this investigation was to determine some of the factors that affect the magnitude, profile and depth of the compressive layer introduced by a structural phase transformation. X-ray residual stress measurements were used to directly determine the state of the surface residual stress.
Civic improvement in Georgian Britain required significant amounts of capital. Tontines were an important means of financing projects. This article provides new evidence based largely on local newspapers that demonstrates their local and national importance for mutual assurance and building. Shifts in profitability depended on the price of Consols and this explains why building tontines increased in importance. Tontines were used to fund new leisure spaces, workhouses, prisons, bridges, streets and other improvements. Their popularity waned in the later nineteenth century but until then they were an important means of funding civic improvements.
Mismatch negativity (MMN) is an event-related potential (ERP) component reflecting auditory predictive coding. Repeated standard tones evoke increasing positivity (‘repetition positivity’; RP), reflecting strengthening of the standard's memory trace and the prediction it will recur. Likewise, deviant tones preceded by more standard repetitions evoke greater negativity (‘deviant negativity’; DN), reflecting stronger prediction error signaling. These memory trace effects are also evident in MMN difference wave. Here, we assess group differences and test-retest reliability of these indices in schizophrenia patients (SZ) and healthy controls (HC).
Electroencephalography was recorded twice, 2 weeks apart, from 43 SZ and 30 HC, during a roving standard paradigm. We examined ERPs to the third, eighth, and 33rd standards (RP), immediately subsequent deviants (DN), and the corresponding MMN. Memory trace effects were assessed by comparing amplitudes associated with the three standard repetition trains.
Compared with controls, SZ showed reduced MMNs and DNs, but normal RPs. Both groups showed memory trace effects for RP, MMN, and DN, with a trend for attenuated DNs in SZ. Intraclass correlations obtained via this paradigm indicated good to moderate reliabilities for overall MMN, DN and RP, but moderate to poor reliabilities for components associated with short, intermediate, and long standard trains, and poor reliability of their memory trace effects.
MMN deficits in SZ reflected attenuated prediction error signaling (DN), with relatively intact predictive code formation (RP) and memory trace effects. This roving standard MMN paradigm requires additional development/validation to obtain suitable levels of reliability for use in clinical trials.
To identify potential participants for clinical trials, electronic health records (EHRs) are searched at potential sites. As an alternative, we investigated using medical devices used for real-time diagnostic decisions for trial enrollment.
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or >75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph (the e-ACI-TIPI), using the same data as the original ACI-TIPI (development set n = 3,453; test set n = 2,315). We also tested both on data from emergency department electrocardiographs from across the US (n = 8,556). We then used ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared performance to cohorts from EHR data at the hospitals.
Receiver-operating characteristic (ROC) curve areas on the test set were excellent, 0.89 for ACI-TIPI and 0.84 for the e-ACI-TIPI, as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, with very good calibration. When tested for detection of patients with >75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than did EHRs.
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
There are no estimates of the heritability of phenotypic udder traits in suckler sheep, which produce meat lambs, or of whether these are associated with resilience to mastitis. Mastitis is a common disease which damages the mammary gland and reduces productivity. The aims of this study were to investigate the feasibility of collecting udder phenotypes, their heritability and their association with mastitis in suckler ewes. Udder and teat conformation, teat lesions, intramammary masses (IMM) and litter size were recorded from 10 Texel flocks in Great Britain between 2012 and 2014; 968 records were collected. Pedigree data were obtained from an online pedigree recording system. Univariate quantitative genetic parameters were estimated using animal and sire models. Linear mixed models were used to analyse continuous traits and generalised linear mixed models were used to analyse binary traits. Continuous traits had higher heritabilities than binary traits, with teat placement and teat length heritability (h²) highest at 0.35 (SD 0.04) and 0.42 (SD 0.04), respectively. Udder width, drop and separation heritabilities were lower and varied with udder volume. The heritabilities of IMM and teat lesions (sire model) were 0.18 (SD 0.12) and 0.17 (SD 0.11), respectively. All heritabilities were sufficiently high to be used in a selection programme to increase resilience to mastitis in the population of Texel sheep. Further studies are required to investigate genetic relationships between traits and to determine whether udder traits predict IMM, and the potential benefits from including traits in a selection programme to increase resilience to chronic mastitis.
A proper subgraph of a connected linear graph is said to disconnect the graph if removing it leaves a disconnected graph. In this paper we characterize, in the following sense, the disconnecting subgraphs of a fixed connected graph. We define two distinct types of disconnecting subgraphs (isthmuses and articulators) which are minimal in the sense that no proper subgraph of either type can disconnect the graph. We then show that any disconnecting subgraph must contain either an isthmus or an articulator. We also define a set of subgraphs (called dense) which form a lattice. We show that the union of the minimal dense subgraphs contains all isthmuses and articulators. In terms of these subgraphs we investigate some of the consequences of assuming that a disconnecting subgraph must contain at least m points.
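The central notion, a subgraph whose removal leaves a disconnected graph, can be checked directly: delete the subgraph's vertices and test whether a traversal of what remains reaches every remaining vertex. A minimal sketch (adjacency-dict representation and names are illustrative, not from the paper):

```python
# Test whether deleting a vertex set disconnects an undirected graph,
# using depth-first search over the remaining vertices.
def disconnects(adj, removed):
    """adj: dict mapping vertex -> list of neighbours; removed: set of vertices."""
    remaining = [v for v in adj if v not in removed]
    if not remaining:
        return False  # nothing left, so nothing to be disconnected
    seen = {remaining[0]}
    stack = [remaining[0]]
    while stack:
        v = stack.pop()
        for w in adj[v]:
            if w not in removed and w not in seen:
                seen.add(w)
                stack.append(w)
    # Disconnected iff DFS from one remaining vertex misses another.
    return len(seen) < len(remaining)
```

On the path a–b–c, removing the middle vertex disconnects the graph (b is a one-point example of a minimal disconnecting subgraph), while removing an endpoint does not.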
The prevalence of mental disorders among Black, Latino, and Asian adults is lower than among Whites. Factors that explain these differences are largely unknown. We examined whether racial/ethnic differences in exposure to traumatic events (TEs) or vulnerability to trauma-related psychopathology explained the lower rates of psychopathology among racial/ethnic minorities.
We estimated the prevalence of TE exposure and associations with onset of DSM-IV depression, anxiety and substance disorders and with lifetime post-traumatic stress disorder (PTSD) in the Collaborative Psychiatric Epidemiology Surveys, a national sample (N = 13 775) with substantial proportions of Black (35.9%), Latino (18.9%), and Asian Americans (14.9%).
TE exposure varied across racial/ethnic groups. Asians were most likely to experience organized violence – particularly being a refugee – but had the lowest exposure to all other TEs. Blacks had the greatest exposure to participation in organized violence, sexual violence, and other TEs; Latinos had the highest exposure to physical violence; and Whites were most likely to experience accidents/injuries. Racial/ethnic minorities had lower odds ratios of depression, anxiety, and substance disorder onset relative to Whites. Neither variation in TE exposure nor vulnerability to psychopathology following TEs across racial/ethnic groups explained these differences. Vulnerability to PTSD did vary across groups, however, such that Asians were less likely and Blacks more likely to develop PTSD following TEs than Whites.
Lower prevalence of mental disorders among racial/ethnic minorities does not appear to reflect reduced vulnerability to TEs, with the exception of PTSD among Asians. This highlights the importance of investigating other potential mechanisms underlying racial/ethnic differences in psychopathology.
Legionnaires’ disease (LD) incidence in the USA has quadrupled since 2000. Health departments must detect LD outbreaks quickly to identify and remediate sources. We tested the performance of a system to prospectively detect simulated LD outbreaks in Allegheny County, Pennsylvania, USA. We generated three simulated LD outbreaks based on published outbreaks. After verifying no significant clusters existed in surveillance data during 2014–2016, we embedded simulated outbreak-associated cases into 2016, assigning simulated residences and report dates. We mimicked daily analyses in 2016 using the prospective space-time permutation scan statistic to detect clusters of ≤30 and ≤180 days using 365-day and 730-day baseline periods, respectively. We used recurrence interval (RI) thresholds of ≥20, ≥100 and ≥365 days to define significant signals. We calculated sensitivity, specificity and positive and negative predictive values for daily analyses, separately for each embedded outbreak. Two large, simulated cooling tower-associated outbreaks were detected. As the RI threshold was increased, sensitivity and negative predictive value decreased, while positive predictive value and specificity increased. A small, simulated potable water-associated outbreak was not detected. Use of an RI threshold of ≥100 days minimised time-to-detection while maximising positive predictive value. Health departments should consider using this system to detect community-acquired LD outbreaks.
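The daily performance metrics described above reduce to a confusion-matrix tally: each analysis day is classified by whether a cluster signalled and whether outbreak-associated cases were actually present. A minimal sketch (toy data and function name are illustrative, not the study's code):

```python
# Daily detection performance from (signal, outbreak_present) pairs.
def daily_performance(days):
    """days: iterable of (signal, outbreak) booleans, one per analysis day."""
    tp = sum(1 for s, o in days if s and o)          # signalled, outbreak present
    fp = sum(1 for s, o in days if s and not o)      # signalled, no outbreak
    fn = sum(1 for s, o in days if not s and o)      # missed outbreak day
    tn = sum(1 for s, o in days if not s and not o)  # correctly quiet
    return {
        "sensitivity": tp / (tp + fn) if tp + fn else None,
        "specificity": tn / (tn + fp) if tn + fp else None,
        "ppv": tp / (tp + fp) if tp + fp else None,
        "npv": tn / (tn + fn) if tn + fn else None,
    }
```

Raising the RI threshold suppresses signals, which in this tally shifts days from the signalled to the unsignalled columns, exactly the trade-off reported: sensitivity and NPV fall while PPV and specificity rise.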