Nursing home residents with dementia are sensitive to detrimental auditory environments. This paper presents the first literature review of empirical research investigating (1) the (perceived) intensity and sources of sounds in nursing homes, and (2) the influence of sounds on health of residents with dementia and staff.
A systematic review was conducted in PubMed, Web of Science and Scopus. Study quality was assessed with the Mixed Methods Appraisal Tool. We used a narrative approach to present the results.
We included 35 studies. Nine studies investigated sound intensity and reported high noise intensity with an average of 55–68 dB(A) (during daytime). In four studies about sound sources, human voices and electronic devices were the most dominant sources. Five cross-sectional studies focused on music interventions and reported positive effects on agitated behaviors. Four randomized controlled trials tested noise reduction as part of an intervention. In two studies, high-intensity sounds were associated with decreased nighttime sleep and increased agitation. The third study found an association between music and less agitation compared to other stimuli. The fourth study did not find an effect of noise on agitation. Two studies reported that a noisy environment had negative effects on staff.
The need for appropriate auditory environments that are responsive to residents’ cognitive abilities and functioning is not yet recognized widely. Future research needs to place greater emphasis on intervention-based and longitudinal study design.
We examined the 2-year stability of neurological soft signs (NSS) in 29 patients after a first episode of psychosis. The numbers of NSS at inclusion and at 2 years follow-up were similar, but there was a significant increase in the numbers of NSS in the sub-group of patients whose dosage of antipsychotic medication had increased over time.
The endocannabinoid system (ECS) has been highlighted as one of the most relevant research topics by neurobiologists, pharmacists, basic scientists and clinicians (Skaper and Di Marzo, 2012). Recent work has associated major depressive disorder with the ECS (Ashton and Moore, 2011). Despite the close relationship between depression and bipolar disorders, as far as we know, there is no characterization of the ECS and its congeners in a sample of patients with bipolar disorders.
Aims and objectives
The objective of this work is to characterize the plasma levels of endocannabinoids and congeners in a sample of patients with bipolar disorders.
The clinical group comprised 19 patients with a diagnosis of bipolar disorder established using the SCID-IV (First et al., 1999). The control group consisted of 18 first- or second-degree relatives of the patients.
The following endocannabinoids and congeners were quantified: N-palmitoleoylethanolamide (POEA), N-palmitolylethanolamide (PEA), N-oleoylethanolamide (OEA), N-stearoylethanolamide (SEA), N-arachidonoylethanolamide (AEA), N-dihomo-γ-linolenoylethanolamide (DGLEA), N-docosatetraenoylethanolamide (DEA), N-linoleoylethanolamide (LEA), N-docosahexaenoylethanolamide (DHEA), 2-arachidonoylglycerol (2-AG), 2-linoleoylglycerol (2-LG), and 2-oleoylglycerol (2-OG).
The results showed statistically significant lower levels of AEA, DEA and DHEA in the clinical sample. Previous research also identified lower levels of AEA in depressed women (Hill et al., 2008, 2009). To date, it is unknown whether DEA and DHEA act on endocannabinoid receptors, or whether they have direct effects on endocannabinoids.
Further research with a larger sample would be necessary, which would allow control of potential confounding variables.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Diagnosing heart failure (HF) in primary care can be challenging, especially
in elderly patients with comorbidities. Insight in the prevalence, age,
comorbidity and routine practice of diagnosing HF in general practice may
improve the process of diagnosing HF.
To examine the prevalence of HF in relation to ageing and comorbidities, and
routine practice of diagnosing HF in general practice.
A retrospective cohort study was performed using data from electronic health
records of 56 320 adult patients of 11 general practices. HF patients were
compared with patients without HF using descriptive analyses and
χ2 tests. The following comorbidities were considered: chronic
obstructive pulmonary disease (COPD), diabetes mellitus (DM), hypertension,
anaemia and renal function disorder (RFD). Separate analyses were performed
for men and women.
The point prevalence of HF was 1.2% (95% confidence interval
1.13–1.33) and increased with each age category from 0.04%
(18–44 years) to 20.9% (⩾85 years). All studied
comorbidities were significantly (P<0.001) more
common in HF patients than in patients without HF: COPD (24.1% versus
3.1%), DM (34.7% versus 6.5%), hypertension
(52.7% versus 16.0%), anaemia (10.9% versus
2.3%) and RFD (61.8% versus 7.5%). N-terminal pro-BNP
(NT-proBNP) was recorded in 38.1% of HF patients.
HF is highly associated with ageing and comorbidities. Diagnostic use of
NT-proBNP in routine primary care seems underutilized. Instruction of GPs to
determine NT-proBNP in patients suspected of HF is recommended, especially
in elderly patients with comorbidities.
Following stage 1 palliation, delayed sternal closure may be used as a technique to enhance thoracic compliance but may also prolong the length of stay and increase the risk of infection.
We reviewed all neonates undergoing stage 1 palliation at our institution between 2010 and 2017 to describe the effects of delayed sternal closure.
During the study period, 193 patients underwent stage 1 palliation, of whom 12 died before an attempt at sternal closure. Among the 25 patients who underwent primary sternal closure, 4 (16%) had sternal reopening within 24 hours. Among the 156 infants who underwent delayed sternal closure at a median of 4 [IQR 3–6] days post-operatively, 11 (7.1%) had one or more failed attempts at sternal closure. Patients undergoing primary sternal closure had a shorter duration of mechanical ventilation and intensive care unit length of stay. Patients who failed delayed sternal closure had a longer aortic cross-clamp time (123±42 versus 99±35 minutes, p=0.029) and circulatory arrest time (39±28 versus 19±17 minutes, p=0.0009) than those who did not fail. Failure of delayed sternal closure was also closely associated with Technical Performance Score: 1.3% of patients with a score of 1 failed sternal closure compared with 18.9% of patients with a score of 3 (p=0.0028). Among the haemodynamic and ventilatory parameters studied, only superior caval vein saturation following sternal closure was different between patients who did and did not fail sternal closure (30±7 versus 42±10%, p=0.002). All patients who failed sternal closure did so within 24 hours owing to hypoxaemia, hypercarbia, or haemodynamic impairment.
When performed according to our current clinical practice, sternal closure causes transient and mild changes in haemodynamic and ventilatory parameters. Monitoring of SvO2 following sternal closure may permit early identification of patients at risk for failure.
Mineral phosphorus (P) used to fertilise crops is derived from phosphate rock, which is a finite resource. Preventing and recycling mineral P waste in the food system, therefore, are essential to sustain future food security and long-term availability of mineral P. The aim of our modelling exercise was to assess the potential of preventing and recycling P waste in a food system, in order to reduce the dependency on phosphate rock. To this end, we modelled a hypothetical food system designed to produce sufficient food for a fixed population with a minimum input requirement of mineral P. This model included representative crop and animal production systems, and was parameterised using data from the Netherlands. We assumed no import or export of feed and food. We furthermore assumed small P soil losses and no net P accumulation in soils, which is typical for northwest European conditions. We first assessed the minimum P requirement in a baseline situation, that is 42% of crop waste is recycled, and humans derived 60% of their dietary protein from animals (PA). Results showed that about 60% of the P waste in this food system resulted from wasting P in human excreta. We subsequently evaluated P input for alternative situations to assess the (combined) effect of: (1) preventing waste of crop and animal products, (2) fully recycling waste of crop products, (3) fully recycling waste of animal products and (4) fully recycling human excreta and industrial processing water. Recycling of human excreta showed most potential to reduce P waste from the food system, followed by prevention and finally recycling of agricultural waste. Fully recycling P could reduce mineral P input by 90%. Finally, for each situation, we studied the impact of consumption of PA in the human diet from 0% to 80%. 
The optimal amount of animal protein in the diet depended on whether P waste from animal products was prevented or fully recycled: if it was, then a small amount of animal protein in the human diet resulted in the most sustainable use of P; but if it was not, then the most sustainable use of P would result from a complete absence of animal protein in the human diet. Our results apply to our hypothetical situation. The principles included in our model however, also hold for food systems with, for example, different climatic and soil conditions, farming practices, representative types of crops and animals and population densities.
Pastoralists have traditional ecological knowledge (TEK), which is important for their livelihoods and for policies and interventions. Pastoralism is under pressure, however, which may result in a decline of pastoral lifestyle and its related TEK. We, therefore, addressed the following objectives (i) to inventorise and assess how pastoralists characterise and value soils and forages in their environment, (ii) to analyse how soil, forage and livestock (i.e. cattle) characteristics relate to herding decisions and (iii) to determine whether TEK underlying herding decisions differs across generations. Data were collected through focus groups and individual interviews with 72 pastoralists, belonging to three generations and to three agro-ecological zones. Using a three-point scale (high, medium, low), four grasses and three tree forages were assessed in terms of nutritional quality for milk, meat, health and strength. Using their own visual criteria, pastoralists identified five different soils, which they selected for herding at different times of the year. Pastoralists stated that Pokuri was the best soil because of its low moisture content, whereas Karaal was the worst because forage hardly grows on it. They stated that perennials, such as Andropogon gayanus and Loxoderra ledermannii, were of high nutritional quality, whereas annuals such as Andropogon pseudapricus and Hyparrhenia involucrata were of low nutritional quality. Afzelia africana was perceived of high quality for milk production, whereas Khaya senegalensis had the highest quality for meat, health and strength. Pastoralists first used soil, then forage and finally livestock characteristics in their herding decisions. Pastoralists’ TEK was not associated with their generations, but with their agro-ecological zones. This study suggests that pastoralists had common and detailed TEK about soils, forages and livestock characteristics, underlying their herding decisions. 
To conclude, pastoralists use a holistic approach, combining soil, vegetation and livestock TEK in herding decisions. Such TEK can guide restoration or improvement of grazing lands, and land use planning.
Phenylketonuria (PKU), a genetic metabolic disorder that is characterized by the inability to convert phenylalanine to tyrosine, leads to severe intellectual disability and other cerebral complications if left untreated. Dietary treatment, initiated soon after birth, prevents most brain-related complications. A leading hypothesis postulates that a shortage of brain monoamines may be associated with neurocognitive deficits that are observable even in early-treated PKU. However, there is a paucity of evidence as yet for this hypothesis.
We therefore assessed in vivo striatal dopamine D2/3 receptor (D2/3R) availability and plasma monoamine metabolite levels together with measures of impulsivity and executive functioning in 18 adults with PKU and average intellect (31.2 ± 7.4 years, nine females), most of whom were early and continuously treated. Comparison data from 12 healthy controls that did not differ in gender and age were available.
Mean D2/3R availability was significantly higher (13%; p = 0.032) in the PKU group (n = 15) than in the controls, which may reflect reduced synaptic brain dopamine levels in PKU. The PKU group had lower plasma levels of homovanillic acid (p < 0.001) and 3-methoxy-4-hydroxy-phenylglycol (p < 0.0001), the predominant metabolites of dopamine and norepinephrine, respectively. Self-reported impulsivity levels were significantly higher in the PKU group compared with healthy controls (p = 0.033). Within the PKU group, D2/3R availability showed a positive correlation with both impulsivity (r = 0.72, p = 0.003) and the error rate during a cognitive flexibility task (r = 0.59, p = 0.020).
These findings provide further support for the hypothesis that executive functioning deficits in treated adult PKU may be associated with cerebral dopamine deficiency.
Methyl isonicotinate is one of several patented 4-pyridyl carbonyl compounds being investigated for a variety of uses in thrips pest management. It is probably the most extensively studied thrips non-pheromone semiochemical, with field and glasshouse trapping experiments, and wind tunnel and Y-tube olfactometer studies in several countries demonstrating a behavioural response that results in increased trap capture of at least 12 thrips species, including the cosmopolitan virus vectors such as western flower thrips and onion thrips. Methyl isonicotinate has several of the characteristics that are required for an effective semiochemical tool and is being mainly used as a lure in combination with coloured sticky traps for enhanced monitoring of thrips in greenhouses. Research indicates that this non-pheromone semiochemical has the potential to be used for other thrips management strategies such as mass trapping, lure and kill, lure and infect, and as a behavioural synergist in conjunction with insecticides, in a range of indoor and outdoor crops.
Background: The pathophysiology of subarachnoid hemorrhage (SAH) is complex and includes disruption of the blood-brain barrier (BBB). We freshly isolated BBB endothelial cells (BECs) by 2 distinct methods after experimental SAH and then interrogated their gene expression profiles with the goal of uncovering new therapeutic targets. Methods: SAH was induced using the prechiasmatic blood injection mouse model. BBB permeability studies were performed by administering intraperitoneal cadaverine dye injections at 24h and 48h. BECs were isolated either by sequential magnetic-based sorting for CD45-CD31+ cells or by fluorescence-activated cell sorting (FACS) for Tie2+Pdgfrb- cells. Total RNA was extracted and analyzed using Affymetrix Mouse Gene 2.0 ST Arrays. Results: BBB impairment occurred at 24h and resolved by 48h after SAH. Analysis of gene expression patterns in BECs at 24h revealed clustering of SAH and sham samples. We identified 707 (2.8%) significantly differentially expressed genes (403 upregulated, 304 downregulated) out of 24,865 interrogated probe sets. Many significantly upregulated genes were involved in inflammatory pathways. These microarray results were validated with real-time polymerase chain reaction (RT-PCR). Conclusions: This study is the first to investigate, in an unbiased manner, whole-genome expression profiling of freshly isolated BECs in an SAH animal model, yielding targets for novel therapeutic intervention.
Improvements in colorectal cancer (CRC) detection and treatment have led to greater numbers of CRC survivors, for whom there is limited evidence on which to provide dietary guidelines to improve survival outcomes. Higher intake of red and processed meat and lower intake of fibre are associated with greater risk of developing CRC, but there is limited evidence regarding associations with survival after CRC diagnosis. Among 3789 CRC cases in the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort, pre-diagnostic consumption of red meat, processed meat, poultry and dietary fibre was examined in relation to CRC-specific mortality (n 1008) and all-cause mortality (n 1262) using multivariable Cox regression models, adjusted for CRC risk factors. Pre-diagnostic red meat, processed meat or fibre intakes (defined as quartiles and continuous grams per day) were not associated with CRC-specific or all-cause mortality among CRC survivors; however, a marginal trend across quartiles of processed meat in relation to CRC mortality was detected (P = 0·053). Pre-diagnostic poultry intake was inversely associated with all-cause mortality among women (hazard ratio (HR)/20 g/d 0·92; 95 % CI 0·84, 1·00), but not among men (HR 1·00; 95 % CI 0·91, 1·09) (P for heterogeneity = 0·10). Pre-diagnostic intake of red meat or fibre is not associated with CRC survival in the EPIC cohort. There is suggestive evidence of an association between poultry intake and all-cause mortality among female CRC survivors and between processed meat intake and CRC-specific mortality; however, further research using post-diagnostic dietary data is required to confirm this relationship.
The Amsterdam glacial basin was a major sedimentary sink from late Saalian until late Eemian (Picea zone, E6) times. The basin’s exemplary record makes it a potential reference area for the last interglacial stage. The cored Amsterdam-Terminal borehole was drilled in 1997 to provide a record throughout the Eemian interglacial. Integrated facies analysis has resulted in a detailed reconstruction of the sedimentary history.
After the Saalian ice mass had disappeared from the area, a large, deep lake had come into being, fed by the Rhine river. At the end of the glacial, the lake became smaller because it was cut off from the river-water supply, and eventually only a number of shallow pools remained in the Amsterdam basin. During the early Eemian (Betula zone, El), a seepage lake existed at the site. The lake deepened under the influence of a steadily rising sea level and finally evolved into a silled lagoon (late Quercus zone, E3). Initially, the lagoon water had fairly stable stratification, but as the sea level continued to rise the sill lost its significance, the lagoon becoming well mixed by the middle of the Corylus/Taxus zone (E4b). The phase of free exchange with the open sea ended in the early Carpinus zone (E5), when barriers developed in the sill area causing the lagoon to become stratified again. During the Late Eemian (late E5), a more dynamic system developed. The sandy barriers that had obstructed exchange with the open sea were no longer effective, and a tidally-influenced coastal lagoon formed.
The Eemian sedimentary history shown in the Amsterdam-Terminal borehole is intimately connected with the sea-level history. Because the site includes both a high-resolution pollen signal and a record of sea-level change, it has potential for correlation on various scales. Palaeomagnetic results show that the sediments predate the Blake Event, which confirms that this reversal excursion is relatively young. The U/Th age of the uppermost part of the Eemian sequence is 118.2±6.3 ka.
Recycling irrigation water can provide water during periods of drought for horticulture operations and can reduce nonpoint-source pollution, but water recycling increases production costs and can increase risk of disease infestation from waterborne pathogens such as Pythium and Phytophthora. This study of water recycling adoption by horticultural growers in Virginia, Maryland, and Pennsylvania finds that the potential for increased disease infestation would reduce growers’ probability of adopting water recycling. Widespread adoption of recycling irrigation water would require government incentives or coercion or growers’ ability to pass cost increases on to customers.
There is limited knowledge about the effect of livestock-associated methicillin-resistant Staphylococcus aureus (LA-MRSA) carriage on health-related quality of life (QoL). With this study, we explored whether LA-MRSA causes infections or affects health-related QoL in pig farmers. This prospective cohort study surveyed persons working on 49 farrowing pig farms in The Netherlands for 1 year (2010–2011). On six sampling moments, nasal swabs, environmental samples and questionnaires on activities and infections were collected. At the end of the study year, persons were asked about their QoL using the validated SF-36 and EQ-5D questionnaires. Of 120 persons, 44 (37%) were persistent MRSA carriers. MRSA carriage was not associated with infections, use of antimicrobials, healthcare contact and health-related QoL items in univariate or multivariate analysis, most likely due to the ‘healthy worker effect’. Despite high carriage rates, the impact of LA-MRSA carriage in this population of relatively healthy pig farmers on health and health-related QoL appears limited; more research is needed for confirmation.
The main resistance mechanism of codling moth (Cydia pomonella) in the tree fruit area of Lleida (NE Spain) is multifunction oxidases (MFO). We studied the frequency of MFO-resistant adults captured by different lures, with and without pear ester, and flights in orchards under different crop management systems. The factor year affected codling moth MFO-resistance level, particularly in the untreated orchards, highlighting the great influence of codling moth migration on the spread of resistance in field populations. Chemical treatments and adult flight were also very important, but the mating disruption technique showed no influence. The second adult flight showed the highest frequency, followed by the first flight and the third flight. In untreated orchards, there were no significant differences in the frequency of MFO-resistant individuals attracted by Combo and BioLure. Red septa lures baited with pear ester (DA) captured sufficient insects only in the first generation of 2010, obtaining a significantly lower proportion of MFO-resistant adults than Combo and BioLure. In the chemically treated orchards, in 2009 BioLure caught a significantly lower proportion of MFO-resistant adults than Combo during the first and third flight, and also than DA during the first flight. No significant differences were found between the lures or flights in 2010. These results do not support the idea that pear ester is more attractive to MFO-resistant adults in the field, but they do suggest that the response to the attractant depends strongly on orchard management, particularly the use of chemical insecticides.
In this chapter we discuss predictive modeling of time series obtained by functional magnetic resonance imaging (fMRI), representing an important case of spatiotemporal data. Following its development in the early 1990s, fMRI has become a well established approach to investigating brain activity in vivo (Huettel et al. 2004), providing temporally and spatially resolved recordings of the “blood oxygen level dependent” (BOLD) signal of neural tissue. fMRI time series consist of a temporal sequence of scans of the brain and the surrounding space (discretized into voxels), such that the resulting data sets may be stored as vector time series.
The practical work with fMRI time series poses considerable challenges in many aspects, including the huge dimensionality of the data, which usually is recorded from several tens of thousands of voxels, and the plethora of artifacts and contaminations disturbing the data (Strother 2006). Further difficulties arise from the low temporal sampling frequency, typically well below 1 Hz, and the indirect relationship between the BOLD signal and the underlying neural processes.
Currently available approaches to fMRI time series analysis may be broadly classified into three groups:
• exploratory methods, such as cluster analysis (Goutte et al. 1999), principal component analysis (PCA) (Anderson et al. 1999) and independent component analysis (ICA) (McKeown 2000);
• massively univariate (voxel-wise) regression methods, implemented in software packages such as statistical parametric mapping (SPM) (Friston et al. 1994), the FMRIB Software Library (FSL) (Smith et al. 2004), or the Analysis of Functional NeuroImages (AFNI) package (Cox 1996);
• generative dynamic models, based on specific assumptions regarding the properties of the underlying neural masses and the biophysical processes which produce the experimental data; as examples we mention dynamic causal modeling (DCM) (Friston et al. 2003) and the hemodynamic state space model (SSM) of Riera et al. (2004a).
Among these three groups of methods, the third may be interpreted as an example of predictive modeling, while for the second group this is possible only in a very limited sense, and essentially impossible for the first group.
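The first (exploratory) group of methods can be illustrated with a minimal PCA sketch on simulated data. This is not code from any of the cited packages; the dimensions, the block design, and all variable names are hypothetical, chosen only to show how a dominant temporal component can be extracted from a voxels-by-scans matrix:

```python
import numpy as np

# Simulated fMRI-like vector time series: n_voxels voxels, n_scans scans.
rng = np.random.default_rng(0)
n_voxels, n_scans = 500, 120

# Planted "activation": a slow on/off block design shared by a subset of voxels.
design = np.tile(np.repeat([0.0, 1.0], 10), 6)        # 120 time points
data = rng.normal(0.0, 1.0, size=(n_voxels, n_scans))  # background noise
data[:50] += 3.0 * design                              # first 50 voxels respond

# PCA via SVD of the mean-centred time series.
centred = data - data.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)                        # variance fractions

# The first principal component's time course should track the planted design.
pc1 = vt[0]
corr = abs(np.corrcoef(pc1, design)[0, 1])
print(f"variance explained by PC1: {explained[0]:.2f}, |corr with design|: {corr:.2f}")
```

In an exploratory analysis of real data, no design vector would be available; instead, the leading components (or clusters, or independent components) would themselves be inspected for physiologically meaningful temporal and spatial structure.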
To determine longitudinal changes in psychopathology in a cohort of patients 30–43 years after their first cardiac surgery for Congenital Heart Disease (CHD) in childhood, to compare outcomes of the 30- to 43-year follow-up with normative data, and to identify medical predictors for psychopathology.
This study is the third follow-up of this cohort. The first and second follow-ups of this same cohort were conducted in 1990 and 2001, respectively. At all three follow-ups, psychopathology was assessed with standardised, parallel questionnaires. In 2011, subjective health status was assessed by the Short Form-36. Medical predictor variables were derived from medical examinations and medical records.
In this third follow-up, a total of 252 patients participated. Of these, 152 patients participated in all three follow-ups. Over a 30-year period, proportions of patients showing psychopathology decreased significantly.
At the 30- to 43-year follow-up, overall outcomes on psychopathology for the CHD sample were similar to, or even better than, those of normative groups. Subjective health status was also better than normative data.
No differences were found between cardiac diagnostic groups. Medical variables that predicted the course of psychopathology over time were as follows: the scar, as judged by the patient, results of the first cardiac surgery, and the number of hospitalisations.
Over a 30-year period, psychopathology decreased in patients with CHD. Levels of psychopathology in these patients, who are now aged between 30 and 54 years, were comparable to, or even better than, normative data.
Se bioavailability in commercial pet foods has been shown to be highly variable. The aim of the present study was to identify dietary factors associated with in vitro accessibility of Se (Se Aiv) in pet foods. Se Aiv is defined as the percentage of Se from the diet that is potentially available for absorption after in vitro digestion. Sixty-two diets (dog, n 52; cat, n 10) were in vitro enzymatically digested: fifty-four of them were commercially available (kibble, n 20; pellet, n 8; canned, n 17; raw meat, n 6; steamed meat, n 3) and eight were unprocessed (kibble, n 4; canned, n 4) from the same batch as the corresponding processed diets. The present investigation examined if Se Aiv was affected by diet type, dietary protein, methionine, cysteine, lysine and Se content, DM, organic matter and crude protein (CP) digestibility. Se Aiv differed significantly among diet types (P < 0·001). Canned and steamed meat diets had a lower Se Aiv than pelleted and raw meat diets. Se Aiv correlated positively with CP digestibility in extruded diets (kibbles, n 19; r 0·540, P = 0·017) and negatively in canned diets (n 16; r −0·611, P = 0·012). Moreover, the canning process (n 4) decreased Se Aiv (P = 0·001), whereas extrusion (n 4) revealed no effect on Se Aiv (P = 0·297). These differences in Se Aiv between diet types warrant quantification of diet type effects on in vivo Se bioavailability.
On 23 May 2011, CDC identified a multistate cluster of Salmonella Heidelberg infections and two multidrug-resistant (MDR) isolates from ground turkey retail samples with indistinguishable pulsed-field gel electrophoresis patterns. We defined cases as isolation of outbreak strains in persons with illness onset between 27 February 2011 and 10 November 2011. Investigators collected hypothesis-generating questionnaires and shopper-card information. Food samples from homes and retail outlets were collected and cultured. We identified 136 cases of S. Heidelberg infection in 34 states. Shopper-card information, leftover ground turkey from a patient's home containing the outbreak strain and identical antimicrobial resistance profiles of clinical and retail samples pointed to plant A as the source. On 3 August, plant A recalled 36 million pounds of ground turkey. This outbreak increased consumer interest in MDR Salmonella infections acquired through United States-produced poultry and played a vital role in strengthening food safety policies related to Salmonella and raw ground poultry.
The 2013 multistate outbreaks contributed to the largest annual number of reported US cases of cyclosporiasis since 1997. In this paper we focus on investigations in Texas. We defined an outbreak-associated case as laboratory-confirmed cyclosporiasis in a person with illness onset between 1 June and 31 August 2013, with no history of international travel in the previous 14 days. Epidemiological, environmental, and traceback investigations were conducted. Of the 631 cases reported in the multistate outbreaks, Texas reported the greatest number of cases, 270 (43%). More than 70 clusters were identified in Texas, four of which were further investigated. One restaurant-associated cluster of 25 case-patients was selected for a case-control study. Consumption of cilantro was most strongly associated with illness on meal date-matched analysis (matched odds ratio 19·8, 95% confidence interval 4·0–∞). All case-patients in the other three clusters investigated also ate cilantro. Traceback investigations converged on three suppliers in Puebla, Mexico. Cilantro was the vehicle of infection in the four clusters investigated; the temporal association of these clusters with the large overall increase in cyclosporiasis cases in Texas suggests cilantro was the vehicle of infection for many other cases. However, the paucity of epidemiological and traceback information does not allow for a conclusive determination; moreover, molecular epidemiological tools for cyclosporiasis that could provide more definitive linkage between case clusters are needed.