Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks at the second patient, except for one outbreak involving >1 transmission route that was detected at the eighth patient. Up to 40 or 34 infections (78% or 66% of possible preventable infections, respectively) could have been prevented if data mining had been implemented in real time, assuming the initiation of an effective intervention within 7 or 14 days of identification of the transmission route, respectively.
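As a quick consistency check on the figures above, the two prevented-infection counts and their percentages imply roughly the same underlying pool of possible preventable infections. The helper function below is purely illustrative and not part of the study's methods:

```python
def implied_total(prevented: int, fraction: float) -> float:
    """Back out the implied total of possible preventable infections
    from a prevented count and the fraction it represents."""
    return prevented / fraction

# 40 prevented at 78% and 34 prevented at 66% both imply a pool of
# roughly 51 possible preventable infections, so the reported
# figures are mutually consistent.
print(implied_total(40, 0.78))  # ≈ 51.3
print(implied_total(34, 0.66))  # ≈ 51.5
```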
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
Effective integrated weed management in agricultural landscapes depends on the ability to identify and manage processes that drive weed dynamics. The current study reports the effects of grazing management and crop rotation strategies on the seedbank and emerged weed flora in an integrated crop-livestock system (ICLS) experiment during a 12-year period under no-tillage in sub-tropical southern Brazil. During winter, Italian ryegrass cover crops were grazed by sheep: grazing management treatments included two stocking methods (continuous and rotational) and two forage allowances (10 and 20 kg of herbage dry matter available per 100 kg animal live weight). During summer, the crop rotation treatments involved either soybean-maize or soybean-soybean in succession with winter-grazed cover crops. The treatments were part of a factorial randomized complete block design. Treatment effects were evaluated on the weed seedbank and emerged weed flora populations during winter-grazed cover crop and summer crop growth as well as during the harvest phase. The current results demonstrate that crop rotation and grazing management had interactive effects on weed outcomes in an ICLS. Overall, however, compared with moderate forage allowance, high forage allowance during the winter-grazed cover crop reduced the emerged weed flora in subsequent crops (a 20% reduction during crop growth and a 90% reduction at crop harvest) and reduced seedbank size by 48%. High forage allowance left more residue from winter-grazed cover crop biomass, which persisted through the summer crop phases and probably acted as a physical barrier to weed emergence.
Pathological worry is a hallmark feature of generalised anxiety disorder (GAD), associated with dysfunctional emotional processing. The ventromedial prefrontal cortex (vmPFC) is involved in the regulation of such processes, but the link between vmPFC emotional responses and pathological v. adaptive worry has not yet been examined.
To study the association between worry and vmPFC activity evoked by the processing of learned safety and threat signals.
In total, 27 unmedicated patients with GAD and 56 healthy controls (HC) underwent a differential fear conditioning paradigm during functional magnetic resonance imaging.
Compared with HC, the GAD group demonstrated reduced vmPFC activation to safety signals and no safety–threat processing differentiation. The vmPFC response to safety signals was positively correlated with worry severity in GAD, whereas the same variables showed a weak negative correlation in HC.
Poor vmPFC safety–threat differentiation might characterise GAD, and its distinctive association with GAD worries suggests a neural-based qualitative difference between healthy and pathological worries.
Anabolic-androgenic steroid (AAS) use is known to be associated with other psychiatric disorders, such as body image disorders, conduct disorder/sociopathy, and other substance use disorders (SUD) – but the causal pathways among these conditions remain poorly delineated.
We created a directed acyclic graph to diagram hypothesized relationships among AAS use and dependence, body image disorder (BID), conduct disorder/sociopathy, and other SUD. Using proportional hazards models, we then assessed potentially causal relationships among these variables, using a dataset of 233 male weightlifters, of whom 102 had used AAS.
BID and conduct disorder/sociopathy both strongly contributed to the development of AAS use, but did not appear to contribute further to the progression from AAS use to AAS dependence. Other SUD beginning prior to first AAS use – whether broadly defined or restricted only to opioids – failed to show an effect on AAS use or progression to AAS dependence. Conversely, AAS use contributed significantly to the subsequent first-time development of opioid use disorders but did not significantly increase the risk for first-time development of non-opioid SUD, taken as a whole.
Our analysis suggests that AAS use and other SUD are mutually attributable to underlying conduct disorder/sociopathy. SUD do not appear to represent a ‘gateway’ to subsequent AAS use. AAS use may represent a gateway to subsequent opioid use disorder, but probably not to other SUD.
Monitoring of nesting beaches is often the only feasible and low-cost approach for assessing sea turtle populations. We investigated spatio-temporal patterns of sea turtle nesting activity monitored over 17 successive years in the Lamu archipelago, Kenya. Community-based patrols were conducted on 26 stretches of beach clustered in five major locations. A total of 2,021 nests were recorded: 1,971 (97.5%) green turtle Chelonia mydas nests, 31 (1.5%) hawksbill Eretmochelys imbricata nests, 8 (0.4%) olive ridley Lepidochelys olivacea nests and 11 (0.5%) unidentified nests. Nesting occurred year-round, increasing during March–July, when 74% of nests were recorded. A stable trend in mean annual nesting densities was observed in all locations. Mean clutch sizes were 117.7 ± SE 1 eggs (range 20–189) for green turtles, 103 ± SE 6 eggs (range 37–150) for hawksbill turtles, and 103 ± SE 6 eggs (range 80–133) for olive ridley turtles. Curved carapace length for green turtles was 65–125 cm, and mean annual incubation duration was 55.5 ± SE 0.05 days. The mean incubation duration for green turtle nests differed significantly between months and seasons but not locations. The hatching success (pooled data) was 81.3% (n = 1,841) and was higher for in situ nests (81.0 ± SE 1.5%) compared to relocated nests (77.8 ± SE 1.4%). The results highlight the important contribution of community-based monitoring in Kenya to sustaining the sea turtle populations of the Western Indian Ocean region.
We design an experiment to test the hypothesis that, in violation of Bayes’ rule, some people respond more forcefully to the strength of information than to its weight. We provide incentives to motivate effort, use naturally occurring information, and control for risk attitude. We find that the strength–weight bias affects expectations but that its magnitude is significantly lower than originally reported. Controls for nonlinear utility further reduce the bias. Our results suggest that incentive compatibility and controls for risk attitude considerably affect inferences on errors in expectations.
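The strength–weight distinction tested above can be illustrated with a minimal Bayesian updating sketch. The coin-flip setup, bias value and function name below are illustrative assumptions for exposition, not the authors' actual task (the study used naturally occurring information): strength corresponds to the sample proportion, weight to the sample size, and Bayes' rule weighs both through the net evidence.

```python
from math import log

def log_odds_update(heads: int, n: int, p: float = 0.6) -> float:
    """Log posterior odds (base 10) that a coin is biased toward heads
    with P(heads) = p rather than fair, after observing `heads` in `n`
    flips, starting from even prior odds.

    For a Bernoulli sample the likelihood ratio depends only on the
    net evidence heads - tails:  LR = (p / (1 - p)) ** (heads - tails).
    """
    tails = n - heads
    return (heads - tails) * log(p / (1 - p), 10)

# High strength (80% heads) but low weight (10 flips):
sample_a = log_odds_update(heads=8, n=10)
# Lower strength (60% heads) but high weight (100 flips):
sample_b = log_odds_update(heads=60, n=100)

# Bayes' rule favours the larger sample: 20 net heads outweigh 6 net
# heads, even though an 80% proportion "looks" more convincing than 60%.
print(sample_b > sample_a)  # prints True
```

The strength–weight bias is the tendency to judge sample A as the stronger evidence because its proportion is more extreme, when the normative posterior odds favour sample B.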
To identify factors that may explain hospital-level differences in outcomes of programs to prevent central line–associated bloodstream infections (CLABSI).
Extensive qualitative case study comparing higher- and lower-performing hospitals on the basis of reduction in the rate of central line–associated bloodstream infections. In-depth interviews were transcribed verbatim and analyzed to determine whether emergent themes differentiated higher- from lower-performing hospitals.
Eight US hospitals that had participated in the federally funded On the CUSP—Stop BSI initiative.
One hundred ninety-four interviewees including administrative leaders, clinical leaders, professional staff, and frontline physicians and nurses.
A main theme that differentiated higher- from lower-performing hospitals was a distinctive framing of the goal of “getting to zero” infections. Although all sites reported this goal, at the higher-performing sites the goal was explicitly stated, widely embraced, and aggressively pursued; in contrast, at the lower-performing hospitals the goal was more of an aspiration and not embraced as part of the strategy to prevent infections. Five additional management practices were nearly exclusively present in the higher-performing hospitals: (1) top-level commitment, (2) physician-nurse alignment, (3) systematic education, (4) meaningful use of data, and (5) rewards and recognition. We present these strategies for prevention of healthcare-associated infection as a management “bundle” with corresponding suggestions for implementation.
Some of the variance associated with CLABSI prevention program outcomes may relate to specific management practices. Adding a management practice bundle may provide critical guidance to physicians, clinical managers, and hospital leaders as they work to prevent healthcare-associated infections.
To review the available literature on accountability frameworks to construct a framework that is relevant to voluntary partnerships between government and food industry stakeholders.
Between November 2012 and May 2013, a desk review of ten databases was conducted to identify principles, conceptual frameworks, underlying theories, and strengths and limitations of existing accountability frameworks for institutional performance to construct a new framework relevant to promoting healthy food environments.
Food policy contexts within high-income countries to address obesity and diet-related non-communicable diseases.
Eligible resources (n 26) were reviewed and the guiding principles of fifteen interdisciplinary frameworks were used to construct a new accountability framework.
Strengths included shared principles across existing frameworks, such as trust, inclusivity, transparency and verification; government leadership and good governance; public deliberations; independent bodies recognizing compliance and performance achievements; remedial actions to improve accountability systems; and capacity to manage conflicts of interest and settle disputes. Limitations of the three-step frameworks and the ‘mutual accountability’ approach included the absence of an empowered authority to hold all stakeholders to account for their performance.
We propose a four-step accountability framework to guide government and food industry engagement to address unhealthy food environments as part of a broader government-led strategy to address obesity and diet-related non-communicable diseases. An independent body develops clear objectives, a governance process and performance standards for all stakeholders to address unhealthy food environments. The empowered body takes account (assessment), shares the account (communication), holds to account (enforcement) and responds to the account (improvements).
Post-traumatic stress disorder (PTSD) in response to the World Trade Center (WTC) disaster of 11 September 2001 (9/11) is one of the most prevalent and persistent health conditions among both professional (e.g. police) and non-traditional (e.g. construction worker) WTC responders, even several years after 9/11. However, little is known about the dimensionality and natural course of WTC-related PTSD symptomatology in these populations.
Data were analysed from 10 835 WTC responders, including 4035 police and 6800 non-traditional responders who were evaluated as part of the WTC Health Program, a clinic network in the New York area established by the National Institute for Occupational Safety and Health. Confirmatory factor analyses (CFAs) were used to evaluate structural models of PTSD symptom dimensionality; and autoregressive cross-lagged (ARCL) panel regressions were used to examine the prospective interrelationships among PTSD symptom clusters at 3, 6 and 8 years after 9/11.
CFAs suggested that five stable symptom clusters best represent PTSD symptom dimensionality in both police and non-traditional WTC responders. This five-factor model was also invariant over time with respect to factor loadings and structural parameters, thereby demonstrating its longitudinal stability. ARCL panel regression analyses revealed that hyperarousal symptoms had a prominent role in predicting other symptom clusters of PTSD, with anxious arousal symptoms primarily driving re-experiencing symptoms, and dysphoric arousal symptoms primarily driving emotional numbing symptoms over time.
Results of this study suggest that disaster-related PTSD symptomatology in WTC responders is best represented by five symptom dimensions. Anxious arousal symptoms, which are characterized by hypervigilance and exaggerated startle, may primarily drive re-experiencing symptoms, while dysphoric arousal symptoms, which are characterized by sleep disturbance, irritability/anger and concentration difficulties, may primarily drive emotional numbing symptoms over time. These results underscore the importance of assessment, monitoring and early intervention of hyperarousal symptoms in WTC and other disaster responders.
A community outbreak of legionellosis occurred in Barrow-in-Furness, Cumbria, during July and August 2002. A descriptive study and active case-finding were instigated and all known wet cooling systems and other potential sources were investigated. Genotypic and phenotypic analysis, and amplified fragment length polymorphism of clinical human and environmental isolates confirmed the air-conditioning unit of a council-owned arts and leisure centre to be the source of infection. Subsequent sequence-based typing confirmed this link. One hundred and seventy-nine cases, including seven deaths [case fatality rate (CFR) 3·9%] were attributed to the outbreak. Timely recognition and management of the incident very likely led to the low CFR compared to other outbreaks. The outbreak highlights the responsibility associated with managing an aerosol-producing system, with the potential to expose and infect a large proportion of the local population and the consequent legal ramifications and human cost.
We use the WISE all-sky survey observations to look for counterparts of hard X-ray selected sources from the XMM-Newton–SDSS survey. We then measure the 12 μm luminosity of the AGN by decomposing their optical-to-infrared SEDs into a host and an AGN component, and compare it with the X-ray luminosity and their expected intrinsic relation. In this way we select 20 X-ray under-luminous, heavily obscured candidates and examine their X-ray and optical properties in more detail. We find evidence for a Compton-thick nucleus in six sources, a number lower than expected from X-ray background synthesis models, which shows the limitations of our method.
Longitudinal symptoms of post-traumatic stress disorder (PTSD) are often characterized by heterogeneous trajectories, which may have unique pre-, peri- and post-trauma risk and protective factors. To date, however, no study has evaluated the nature and determinants of predominant trajectories of PTSD symptoms in World Trade Center (WTC) responders.
A total of 10 835 WTC responders, including 4035 professional police responders and 6800 non-traditional responders (e.g. construction workers) who participated in the WTC Health Program (WTC-HP), were evaluated an average of 3, 6 and 8 years after the WTC attacks.
Among police responders, longitudinal PTSD symptoms were best characterized by four classes, with the majority (77.8%) in a resistant/resilient trajectory and the remainder exhibiting chronic (5.3%), recovering (8.4%) or delayed-onset (8.5%) symptom trajectories. Among non-traditional responders, a six-class solution was optimal, with fewer responders in a resistant/resilient trajectory (58.0%) and the remainder exhibiting recovering (12.3%), severe chronic (9.5%), subsyndromal increasing (7.3%), delayed-onset (6.7%) and moderate chronic (6.2%) trajectories. Prior psychiatric history, Hispanic ethnicity, severity of WTC exposure and WTC-related medical conditions were most strongly associated with symptomatic trajectories of PTSD symptoms in both groups of responders, whereas greater education and family and work support while working at the WTC site were protective against several of these trajectories.
Trajectories of PTSD symptoms in WTC responders are heterogeneous and associated uniquely with pre-, peri- and post-trauma risk and protective factors. Police responders were more likely than non-traditional responders to exhibit a resistant/resilient trajectory. These results underscore the importance of prevention, screening and treatment efforts that target high-risk disaster responders, particularly those with prior psychiatric history, high levels of trauma exposure and work-related medical morbidities.
People with psychosis demonstrate impaired response inhibition on the Stop Signal Task (SST). It is less clear if this impairment extends to reflection impulsivity, a form of impulsivity that has been linked to substance use in non-psychotic samples.
We compared 49 patients with first-episode psychosis (FEP) and 30 healthy control participants on two forms of impulsivity measured using the Information Sampling Test (IST) and the SST, along with clinical and IQ assessments. We also compared those patients who used cannabis with those who had either given up or never used.
Patients with FEP had significantly greater impairment in response inhibition but not in reflection impulsivity compared with healthy controls. By contrast, patients who reported current cannabis use demonstrated greater reflection impulsivity than those who had either given up or never used, whereas there were no differences in response inhibition.
These data suggest that abnormal reflection impulsivity is associated with substance use in psychosis but not psychosis itself; the opposite relationship may hold for response inhibition.