Introduction: Selecting appropriate patients for hospitalization following emergency department (ED) evaluation of syncope is critical for serious adverse event (SAE) identification. The primary objective of this study was to determine the association between hospitalization and SAE detection using propensity score (PS) matching. The secondary objective was to determine whether SAE identification with hospitalization varied by Canadian Syncope Risk Score (CSRS) risk category. Methods: This was a secondary analysis of two large prospective cohort studies that enrolled adults (age ≥ 16 years) with syncope at 11 Canadian EDs. Patients with a serious condition identified during the index ED evaluation were excluded. The outcome was a 30-day SAE identified either in-hospital for hospitalized patients or after ED disposition for discharged patients, and included death, ventricular arrhythmia, non-lethal arrhythmia and non-arrhythmic SAE (myocardial infarction, structural heart disease, pulmonary embolism, hemorrhage). Patients were propensity matched using age, sex, blood pressure, prodrome, presumed ED diagnosis, ECG abnormalities, troponin, heart disease, hypertension, diabetes, arrival by ambulance and hospital site. Multivariable logistic regression assessed the interaction between CSRS and SAE detection, and we report odds ratios (OR). Results: Of the 8183 patients enrolled, 743 (9.0%) were hospitalized and 658 (88.6%) were PS matched. The OR for SAE detection for hospitalized patients in comparison to those discharged from the ED was 5.0 (95% CI 3.3, 7.4); for non-lethal arrhythmia, 5.4 (95% CI 3.1, 9.6); and for non-arrhythmic SAE, 6.3 (95% CI 2.9, 13.5). Overall, the odds of any SAE identification, and specifically of non-lethal arrhythmia and non-arrhythmic SAE, were significantly higher in-hospital among hospitalized patients than among those discharged from the ED (p < 0.001). There were no significant differences in 30-day mortality (p = 1.00) or ventricular arrhythmia detection (p = 0.21). 
The interaction between ED disposition and CSRS was significant (p = 0.04), and the probability of 30-day SAEs while in-hospital was greater for medium- and high-risk CSRS patients. Conclusion: In this multicenter prospective cohort, 30-day SAE detection was greater for hospitalized than for discharged patients. CSRS low-risk patients are least likely to have SAEs identified in-hospital; outpatient monitoring for moderate-risk patients requires further study.
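As an aside on the arithmetic behind estimates like the OR of 5.0 above: an unadjusted odds ratio with a Wald 95% confidence interval can be computed directly from a 2×2 table, as in this minimal sketch. The counts used here are hypothetical illustrations, not the study's data, and the study's matched multivariable estimates would differ.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal cell counts
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts chosen only to illustrate the calculation
or_, lo, hi = odds_ratio_ci(60, 269, 14, 315)
```

With these made-up counts the function returns an OR of about 5.0; in a matched analysis like the study's, a conditional model would be used instead of this crude table.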
Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none were lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics including mean age (66.3 vs 63.4 yrs), duration of AAFL (30.1 vs 24.5 hrs), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock Only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%, 95% CI -0.6 to 14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group 21.2% of patients converted with the infusion. 
There were no statistically significant differences for time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one-fifth of patients, much less than for acute AF.
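The absolute difference and interval reported for the primary outcome can be illustrated with a simple two-proportion Wald calculation. This is a sketch only: the counts are inferred from the reported percentages (33/33 vs roughly 40/43), and the trial may have used a different interval method.

```python
import math

def prop_diff_ci(x1, n1, x2, n2, z=1.96):
    """Difference in proportions (p1 - p2) with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    # Wald standard error for the difference of two independent proportions
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Counts inferred from the reported 100% (N=33) vs 93% (N=43); illustrative only
diff, lo, hi = prop_diff_ci(33, 33, 40, 43)
print(f"{diff*100:.1f}% (95% CI {lo*100:.1f} to {hi*100:.1f})")
# prints 7.0% (95% CI -0.6 to 14.6)
```

Note that with one proportion at 100% the Wald interval is degenerate on that side; score-based intervals (e.g. Newcombe) behave better at the boundary.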
Introduction: An important challenge physicians face when treating acute heart failure (AHF) patients in the emergency department (ED) is deciding whether to admit or discharge, with or without early follow-up. The overall goal of our project was to improve care for AHF patients seen in the ED while avoiding unnecessary hospital admissions. The specific goal was to introduce hospital rapid referral clinics to ensure AHF patients were seen within 7 days of ED discharge. Methods: This prospective before-after study was conducted at two campuses of a large tertiary care hospital, including the EDs and specialty outpatient clinics. We enrolled AHF patients aged ≥50 years who presented to the ED with shortness of breath (<7 days). The 12-month before (control) period was separated from the 12-month after (intervention) period by a 3-month implementation period. Implementation included creation of rapid access AHF clinics staffed by cardiology and internal medicine, and development of referral procedures. There was extensive in-servicing of all ED staff. The primary outcome measure was hospital admission at the index visit or within 30 days. Secondary outcomes included mortality and actual access to rapid follow-up. We used segmented autoregression analysis of the monthly proportions to determine whether there was a change in admissions coinciding with the introduction of the intervention and estimated a sample size of 700 patients. Results: The patients in the before period (N = 355) and the after period (N = 374) were similar for age (77.8 vs. 78.1 years), arrival by ambulance (48.7% vs 51.1%), comorbidities, current medications, and need for non-invasive ventilation (10.4% vs. 6.7%). Comparing the before to the after periods, we observed a decrease in hospital admissions on the index visit (from 57.7% to 42.0%; P < 0.01), as well as in all admissions within 30 days (from 65.1% to 53.5%; P < 0.01). 
The autoregression analysis, however, demonstrated a pre-existing trend to fewer admissions and could not attribute the decrease to the intervention (P = 0.91). Attendance at a specialty clinic amongst those discharged increased from 17.8% to 42.1% (P < 0.01), and the median days to clinic decreased from 13 to 6 (P < 0.01). Thirty-day mortality did not change (4.5% vs. 4.0%; P = 0.76). Conclusion: Implementation of rapid-access dedicated AHF clinics led to considerably increased access to specialist care, much reduced follow-up times, and a possible reduction in hospital admissions. Widespread use of this approach could improve AHF care in Canada.
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an a priori-specified modified intention-to-treat (MITT) basis excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%, 95% CI 0.2-9.9; P = 0.04). The multivariable analyses confirmed the Drug-Shock strategy superiority (P = 0.04). 
There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
We sought to evaluate the role healthcare providers play in carbapenem-resistant Enterobacteriaceae (CRE) acquisition among hospitalized patients.
A 1:4 case-control study with incidence density sampling.
Academic healthcare center with regular CRE perirectal screening in high-risk units.
We included case patients with ≥1 negative CRE test followed by positive culture with a length of stay (LOS) >9 days. For controls, we included patients with ≥2 negative CRE tests and assignment to the same unit set as case patients with a LOS >9 days.
Controls were time-matched to each case patient. Case exposure was evaluated between days 2 and 9 before positive culture and control evaluation was based on maximizing overlap with the case window. Exposure sources were all CRE-colonized or -infected patients. Nonphysician providers were compared between study patients and sources during their evaluation windows. Dichotomous and continuous exposures were developed from the number of source-shared providers and were used in univariate and multivariate regression.
In total, 121 cases and 484 controls were included. Multivariate analysis showed odds of dichotomous exposure (≥1 source-shared provider) of 2.27 (95% confidence interval [CI], 1.25–4.15; P=.006) for case patients compared to controls. Multivariate continuous exposure showed odds of 1.02 (95% CI, 1.01–1.03; P=.009) for case patients compared to controls.
Patients who acquire CRE during hospitalization are more likely to receive care from a provider caring for a patient with CRE than those patients who do not acquire CRE. These data support the importance of hand hygiene and cohorting measures for CRE patients to reduce transmission risk.
Improving neurocognitive outcomes following treatment for brain metastases has become increasingly important. We propose that a brief telephone-based neurocognitive assessment may improve follow-up cognitive assessments in this palliative population. Aim: To prospectively assess the feasibility and reliability of a telephone-based brief neurocognitive assessment compared to the same tests delivered face-to-face. Methods: Patients with brain metastases to be treated with whole brain radiotherapy (WBRT) were assessed using a brief validated neurocognitive battery at baseline and at 1 and 3 months following WBRT (in person and over the phone). The primary outcome was feasibility and inter-procedural (in person versus telephone) reliability. The secondary objective was to evaluate the change in neurocognitive function before and after WBRT. Results: Of 39 patients enrolled, 82% completed the baseline in-person and telephone neurocognitive assessments. However, at 1 month only 41% of enrolled patients completed the in-person and telephone cognitive assessments, and at 3 months only 10% completed them. Results pertaining to reliability and change in neurocognitive function will be updated. Conclusion: The pre-defined definition of feasibility (at least 80% completion for face-to-face and telephone neurocognitive assessments) was met at baseline. However, a large proportion of participants did not complete either the telephone or the in-person neurocognitive follow-up at 1 month and at 3 months post-WBRT. Attrition remained a challenge for neurocognitive testing in this population even when a telephone-based brief assessment was used.
The terminal lake systems of central Australia are key sites for the reconstruction of late Quaternary paleoenvironments. Paleoshoreline deposits around these lakes reflect repeated lake filling episodes and such landforms have enabled the establishment of a luminescence-based chronology for filling events in previous studies. Here we present a detailed documentation of the morphology and chemistry of soils developed in four well-preserved beach ridges of late Pleistocene and mid-to-late Holocene age at Lake Callabonna to assess changes in dominant pedogenic processes. All soil profiles contain evidence for the incorporation of eolian-derived material, likely via the formation of desert pavements and vesicular horizons, and limited illuviation due to generally shallow wetting fronts. Even though soil properties in the four studied profiles also provide examples of parent material influence or site-specific processes related to the geomorphic setting, there is an overall trend of increasing enrichment of eolian-derived material since at least ~ 33 ka. Compared to the Holocene profiles, the derived average accumulation rates for the late Pleistocene profiles are significantly lower and may suggest that soils record important regional changes in paleoenvironments and dust dynamics related to shifts in the Southern Hemisphere westerlies.
Altered levels of selenium and copper have been linked with altered cardiovascular disease risk factors including changes in blood triglyceride and cholesterol levels. However, it is unclear whether this can be observed prenatally. This cross-sectional study includes 274 singleton births from 2004 to 2005 in Baltimore, Maryland. We measured umbilical cord serum selenium and copper using inductively coupled plasma mass spectrometry. We evaluated exposure levels vis-à-vis umbilical cord serum triglyceride and total cholesterol concentrations in multivariable regression models adjusted for gestational age, birth weight, maternal age, race, parity, smoking, prepregnancy body mass index, n-3 fatty acids and methyl mercury. The percent difference in triglycerides comparing those in the highest v. lowest quartile of selenium was 22.3% (95% confidence interval (CI): 7.1, 39.7). For copper this was 43.8% (95% CI: 25.9, 64.3). In multivariable models including both copper and selenium as covariates, copper, but not selenium, maintained a statistically significant association with increased triglycerides (percent difference: 40.7%, 95% CI: 22.1, 62.1). There was limited evidence of a relationship of increasing selenium with increasing total cholesterol. Our findings provide evidence that higher serum copper levels are associated with higher serum triglycerides in newborns, but should be confirmed in larger studies.
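Percent differences like those reported above are typically obtained by back-transforming a coefficient from a regression on a log-transformed outcome, via (e^β − 1) × 100. The helper below sketches this conversion; the natural-log transformation is an assumption, as the abstract does not state the model's exact form.

```python
import math

def percent_difference(beta):
    """Convert a regression coefficient on a natural-log-transformed
    outcome into a percent difference in the original units."""
    return (math.exp(beta) - 1) * 100

# A coefficient of ~0.2013 on log-triglycerides back-transforms to the
# reported 22.3% difference (illustrative reverse-engineering, not the
# study's actual fitted coefficient).
pct = percent_difference(0.2013)
```

This back-transformation is exact for a binary contrast (e.g. highest vs lowest quartile) when the outcome is modeled on the log scale.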
We consider the dynamics of actively entraining turbulent density currents on a conical sloping surface in a rotating fluid. A theoretical plume model is developed to describe both axisymmetric flow and single-stream currents of finite angular extent. An analytical solution is derived for flow dominated by the initial buoyancy flux and with a constant entrainment ratio, which serves as an attractor for solutions with alternative initial conditions where the initial fluxes of mass and momentum are non-negligible. The solutions indicate that the downslope propagation of the current halts at a critical level where there is purely azimuthal flow, and the boundary layer approximation breaks down. Observations from a set of laboratory experiments are consistent with the dynamics predicted by the model, with the flow approaching a critical level. Interpretation in terms of the theory yields an entrainment coefficient $E\propto 1/\Omega $ where the rotation rate is $\Omega $. We also derive a corresponding theory for density currents from a line source of buoyancy on a planar slope. Our theoretical models provide a framework for designing and interpreting laboratory studies of turbulent entrainment in rotating dense flows on slopes and understanding their implications in geophysical flows.
The emergence of invasive fungal wound infections (IFIs) in combat casualties led to the development of a combat trauma-specific IFI case definition and classification. Prospective data were collected from 1133 US military personnel injured in Afghanistan (June 2009–August 2011). The IFI rates ranged from 0·2% to 11·7% among ward and intensive care unit admissions, respectively (6·8% overall). Seventy-seven IFI cases were classified as proven/probable (n = 54) and possible/unclassifiable (n = 23) and compared in a case-case analysis. There was no difference in clinical characteristics between the proven/probable and possible/unclassifiable cases. Possible IFI cases had a shorter time to diagnosis (P = 0·02) and initiation of antifungal therapy (P = 0·05) and fewer operative visits (P = 0·002) compared to proven/probable cases, but clinical outcomes were similar between the groups. Although the trauma-related IFI classification scheme did not provide prognostic information, it is an effective tool for clinical and epidemiological surveillance and research.
To examine barriers to initiation and continuation of mental health treatment among individuals with common mental disorders.
Data were from the World Health Organization (WHO) World Mental Health (WMH) surveys. Representative household samples were interviewed face-to-face in 24 countries. Reasons to initiate and continue treatment were examined in a subsample (n = 63 678) and analyzed at different levels of clinical severity.
Among those with a DSM-IV disorder in the past 12 months, low perceived need was the most common reason for not initiating treatment and more common among moderate and mild than severe cases. Women and younger people with disorders were more likely to recognize a need for treatment. A desire to handle the problem on one's own was the most common barrier among respondents with a disorder who perceived a need for treatment (63.8%). Attitudinal barriers were much more important than structural barriers to both initiating and continuing treatment. However, attitudinal barriers dominated for mild-moderate cases and structural barriers for severe cases. Perceived ineffectiveness of treatment was the most commonly reported reason for treatment drop-out (39.3%), followed by negative experiences with treatment providers (26.9% of respondents with severe disorders).
Low perceived need and attitudinal barriers are the major barriers to seeking and staying in treatment among individuals with common mental disorders worldwide. Apart from targeting structural barriers, mainly in countries with poor resources, increasing population mental health literacy is an important endeavor worldwide.
Official suicide statistics for England are based on deaths given suicide verdicts and most cases given an open verdict following a coroner's inquest. Previous research indicates that some deaths given accidental verdicts are considered to be suicides by clinicians. Changes in coroners' use of different verdicts may bias suicide trend estimates. We investigated whether suicide trends may be over- or underestimated when they are based on deaths given suicide and open verdicts.
Possible suicides assessed by 12 English coroners in 1990/91, 1998 and 2005 and assigned open, accident/misadventure or narrative verdicts were rated by three experienced suicide researchers according to the likelihood that they were suicides. Details of all suicide verdicts given by these coroners were also recorded.
In 1990/91, 72.0% of researcher-defined suicides received a suicide verdict from the coroner; this decreased to 65.4% in 2005 (p for trend < 0.01); equivalent figures for combined suicide and open verdicts were 95.4% (1990/91) and 86.7% (2005). Researcher-defined suicides with a verdict of accident/misadventure doubled over that period, from 4.6% to 9.1% (p < 0.01). Narrative verdict cases rose from zero in 1990/91 to 25 in 2005 (4.2% of researcher-defined suicides that year). In 1998 and 2005, 50.0% of the medicine poisoning deaths given accidental/misadventure verdicts were rated as suicide by the researchers.
Between 1990/91 and 2005, the proportion of researcher-defined suicides given a suicide verdict by coroners decreased, largely due to an increased use of accident/misadventure verdicts, particularly for deaths involving poisoning. Consideration should be given to the inclusion of ‘accidental’ deaths by poisoning with medicines in the statistics available for monitoring suicide rates.
It is believed that when patients present to the emergency department (ED) with recent-onset atrial fibrillation or flutter (RAFF), controlling the ventricular rate before cardioversion improves the success rate. We evaluated the influence of rate control medication and other variables on the success of cardioversion.
This secondary analysis of a medical records review comprised 1,068 patients with RAFF who presented to eight Canadian EDs over 12 months. Univariate analyses were performed to identify predictors of conversion to sinus rhythm, including use of rate control medication, rhythm control medication, and other variables. Predictive variables were incorporated into a multivariate model to calculate adjusted odds ratios (ORs) associated with successful cardioversion.
A total of 634 patients underwent attempted cardioversion: 428 electrical, 354 chemical, and 148 both. Adjusted ORs for factors associated with successful electrical cardioversion were use of rate control medication, 0.39 (95% confidence interval [CI] 0.21-0.74); rhythm control medication, 0.28 (95% CI 0.15-0.53); and CHADS2 score > 0, 0.43 (95% CI 0.15-0.83). ORs for factors associated with successful chemical cardioversion were use of rate control medication, 1.29 (95% CI 0.82-2.03); female sex, 2.37 (95% CI 1.50-3.72); and use of procainamide, 2.32 (95% CI 1.43-3.74).
We demonstrated reduced successful electrical cardioversion of RAFF when patients were pretreated with either rate or rhythm control medication. Although rate control medication was not associated with increased success of chemical cardioversion, use of procainamide was. Slowing the ventricular rate prior to cardioversion should be avoided.
Wheat bran extract (WBE) is a food-grade soluble fibre preparation that is highly enriched in arabinoxylan oligosaccharides. In this placebo-controlled cross-over human intervention trial, tolerance and effects on colonic protein and carbohydrate fermentation were studied. After a 1-week run-in period, sixty-three healthy adult volunteers consumed 3, 10 and 0 g WBE/d for 3 weeks in a random order, with 2 weeks' washout between each treatment period. Fasting blood samples were collected at the end of the run-in period and at the end of each treatment period for analysis of haematological and clinical chemistry parameters. Additionally, subjects collected a stool sample for analysis of microbiota, SCFA and pH. A urine sample, collected over 48 h, was used for analysis of p-cresol and phenol content. Finally, the subjects completed questionnaires scoring occurrence frequency and distress severity of eighteen gastrointestinal symptoms. Urinary p-cresol excretion was significantly decreased after WBE consumption at 10 g/d. Faecal bifidobacteria levels were significantly increased after daily intake of 10 g WBE. Additionally, WBE intake at 10 g/d increased faecal SCFA concentrations and lowered faecal pH, indicating increased colonic fermentation of WBE into desired metabolites. At 10 g/d, WBE caused a mild increase in flatulence occurrence frequency and distress severity and a tendency for a mild decrease in constipation occurrence frequency. In conclusion, WBE is well tolerated at doses up to 10 g/d in healthy adult volunteers. Intake of 10 g WBE/d exerts beneficial effects on gut health parameters.
Prior research on whether marriage is equally beneficial to the mental health of men and women is inconsistent due to methodological variation. This study addresses some prior methodological limitations and investigates gender differences in the association of first marriage and being previously married, with subsequent first onset of a range of mental disorders.
Cross-sectional household surveys in 15 countries from the WHO World Mental Health survey initiative (n=34493), with structured diagnostic assessment of mental disorders using the Composite International Diagnostic Interview 3.0. Discrete-time survival analyses assessed the interaction of gender and marital status in the association with first onset of mood, anxiety and substance use disorders.
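Discrete-time survival analysis of the kind described above is usually implemented by expanding each respondent into person-period records (one row per year at risk) and fitting a logistic model to the expanded data. The sketch below shows only the expansion step, with hypothetical field names; for simplicity it treats covariates such as marital status as fixed, whereas the actual analysis would handle them as time-varying.

```python
def person_period_rows(subject_id, final_age, event, covariates):
    """Expand one subject into yearly person-period records for
    discrete-time survival analysis: one row per year at risk, with
    the event indicator set only in the final year (if an event occurred).
    `final_age` is age at disorder onset or censoring."""
    rows = []
    for age in range(1, final_age + 1):
        rows.append({
            "id": subject_id,
            "age": age,
            "event": int(event and age == final_age),
            **covariates,  # treated as fixed here; hypothetical names
        })
    return rows

# A hypothetical respondent with disorder onset at age 30 contributes
# 30 rows, with event = 1 only in the final row.
rows = person_period_rows("s1", 30, True, {"female": 1, "married": 1})
```

A logistic regression on the stacked rows, with age (or age intervals) as a covariate, then estimates the discrete-time hazard.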
Marriage (versus never married) was associated with reduced risk of first onset of most mental disorders in both genders; but for substance use disorders this reduced risk was stronger among women, and for depression and panic disorder it was confined to men. Being previously married (versus stably married) was associated with increased risk of all disorders in both genders; but for substance use disorders, this increased risk was stronger among women and for depression it was stronger among men.
Marriage was associated with reduced risk of the first onset of most mental disorders in both men and women but there were gender differences in the associations between marital status and onset of depressive and substance use disorders. These differences may be related to gender differences in the experience of multiple role demands within marriage, especially those concerning parenting.
Cytomegalovirus (CMV) seroprevalence was determined in 9343 first-time New Zealand blood donors between 2003 and 2006. Of 39 960 current seropositive donors, the proportion testing seropositive more than 12 months previously was calculated. Overall, seroprevalence declined from 66·1% [95% confidence interval (CI) 64·1–68·1] in 2003 to 60·6% (95% CI 58·5–62·6) in 2006. Nevertheless, these rates are significantly higher than the 47% overall seroprevalence found in a 1988 study. Seroprevalence was higher in females than males and in older than in younger age groups in all four years examined. Ethnicity appeared to be related to seroprevalence, with the highest rates found in Pacific Islanders (93·2%) and the lowest in Caucasians (54·8%). At least 38 242/39 960 (95·7%) seropositive donors were found to have seroconverted more than 12 months previously. Recent evidence suggests that such ‘remote’ seroconverters may pose a much lower risk of transfusion-transmitted CMV infection than recently infected donors who are seroconverting but still seronegative.
In February 1992, an outbreak of cholera occurred among persons who had flown on a commercial airline flight from South America to Los Angeles. This study was conducted to determine the magnitude and the cause of the outbreak. Passengers were interviewed and laboratory specimens were collected to determine the magnitude of the outbreak. A case-control study was performed to determine the vehicle of infection. Seventy-five of the 336 passengers in the United States had cholera; 10 were hospitalized and one died. Cold seafood salad, served between Lima, Peru and Los Angeles, California, was the vehicle of infection (odds ratio, 11·6; 95% confidence interval, 3·3–44·5). This was the largest airline-associated outbreak of cholera ever reported and demonstrates the potential for airline-associated spread of cholera from epidemic areas to other parts of the world. Physicians should obtain a travel history and consider cholera in patients with diarrhoea who have travelled from cholera-affected countries. This outbreak also highlights the risks associated with eating cold foods prepared in cholera-affected countries.