To assess the utility of an automated, statistically based outbreak detection system to identify clusters of hospital-acquired microorganisms.
Multicenter retrospective cohort study.
The study included 43 hospitals using a common infection prevention surveillance system.
A space–time permutation scan statistic was applied to hospital microbiology, admission, discharge, and transfer data to identify clustering of microorganisms within hospital locations and services. Infection preventionists were asked to rate the importance of each cluster. A convenience sample of 10 hospitals also provided information about clusters previously identified through their usual surveillance methods.
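The detection software itself is not described here, but as a rough, hedged sketch of the underlying idea, the Python snippet below applies a simplified space–time permutation scan (in the spirit of Kulldorff's statistic) to a hypothetical table of positive cultures. The column names ('unit', 'day'), window length, and permutation count are assumptions for illustration, not the system actually used in the study.

```python
# Simplified space-time permutation scan (Kulldorff-style), illustrative only.
# Assumes a DataFrame of positive cultures with hypothetical columns:
#   'unit' (hospital location) and 'day' (integer study day of the culture).
import numpy as np
import pandas as pd


def poisson_llr(observed, expected, total):
    """Log-likelihood ratio score for a candidate cluster under the Poisson model."""
    if observed <= expected:
        return 0.0
    inside = observed * np.log(observed / expected)
    if total == observed:
        return inside
    outside = (total - observed) * np.log((total - observed) / (total - expected))
    return inside + outside


def scan(events: pd.DataFrame, max_window: int = 14, n_perm: int = 99, seed: int = 0):
    """Find the most anomalous (unit, time-window) cluster and its permutation p-value."""
    rng = np.random.default_rng(seed)
    total = len(events)

    def best_cluster(df):
        best_score, best_info = 0.0, None
        for unit, n_unit in df.groupby("unit").size().items():
            unit_days = df.loc[df["unit"] == unit, "day"].to_numpy()
            for end in range(int(df["day"].min()), int(df["day"].max()) + 1):
                for width in range(1, max_window + 1):
                    start = end - width + 1
                    obs = int(((unit_days >= start) & (unit_days <= end)).sum())
                    n_window = int(((df["day"] >= start) & (df["day"] <= end)).sum())
                    expected = n_unit * n_window / total  # under space-time independence
                    score = poisson_llr(obs, expected, total)
                    if score > best_score:
                        best_score, best_info = score, (unit, start, end, obs, round(expected, 2))
        return best_score, best_info

    obs_score, cluster = best_cluster(events)
    # Monte Carlo: permuting culture days across events breaks any space-time association.
    null_scores = [
        best_cluster(events.assign(day=rng.permutation(events["day"].to_numpy())))[0]
        for _ in range(n_perm)
    ]
    p_value = (1 + sum(s >= obs_score for s in null_scores)) / (n_perm + 1)
    return cluster, p_value
```

Production tools implement this family of scan statistics far more efficiently; the sketch is only meant to show the expected-count and permutation logic behind flagging a location–time cluster as statistically unusual.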
We identified 230 clusters in 43 hospitals involving Gram-positive and -negative bacteria and fungi. Half of the clusters progressed after initial detection, suggesting that early detection could trigger interventions to curtail further spread. Infection preventionists reported that they would have wanted to be alerted about 81% of these clusters. Factors associated with clusters judged to be moderately or highly concerning included high statistical significance, large size, and clusters involving Clostridioides difficile or multidrug-resistant organisms. Based on comparison data provided by the convenience sample of hospitals, only 9 (18%) of 51 clusters detected by usual surveillance met statistical significance, and of the 70 clusters not previously detected, 58 (83%) involved organisms not routinely targeted by the hospitals’ surveillance programs. All infection prevention programs felt that an automated outbreak detection tool would improve their ability to detect outbreaks and streamline their work.
Automated, statistically based outbreak detection can increase the consistency, scope, and comprehensiveness of detecting hospital-associated transmission.
A growing body of research suggests that deficient emotional self-regulation (DESR) is common and morbid among attention-deficit/hyperactivity disorder (ADHD) patients. The main aim of the present study was to assess whether high and low levels of DESR in adult ADHD patients can be operationalized and whether they are clinically useful.
A total of 441 newly referred 18- to 55-year-old adults of both sexes with Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) ADHD completed self-reported rating scales. We operationalized DESR using items from the Barkley Current Behavior Scale. We used receiver operating characteristic (ROC) curves to identify the optimal cut-off on the Barkley Emotional Dysregulation (ED) Scale to categorize patients as having high- versus low-level DESR and compared demographic and clinical characteristics between the groups.
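As an illustration of how an optimal cut-off might be derived from an ROC analysis (the abstract does not specify the software or cut-point criterion used), the sketch below applies Youden's J to simulated scale scores; the arrays, group sizes, and the Youden criterion are assumptions for demonstration only.

```python
# Illustrative sketch: choosing a rating-scale cut-off via ROC analysis (Youden's J).
# 'scores' and 'has_high_desr' are simulated, not study data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
scores = np.concatenate([rng.normal(6, 2, 250), rng.normal(10, 2, 191)])  # scale scores
has_high_desr = np.concatenate([np.zeros(250), np.ones(191)])             # reference labels

fpr, tpr, thresholds = roc_curve(has_high_desr, scores)
youden_j = tpr - fpr                         # sensitivity + specificity - 1
optimal_cutoff = thresholds[np.argmax(youden_j)]

print(f"AUC = {roc_auc_score(has_high_desr, scores):.2f}")
print(f"Optimal cut-off (Youden's J) = {optimal_cutoff:.1f}")
```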
We averaged the optimal Barkley ED Scale cut-points from the ROC curve analyses across all subscales and categorized ADHD patients as having high- (N = 191) or low-level (N = 250) DESR (total Barkley ED Scale score ≥8 or <8, respectively). Those with high-level DESR had significantly more severe symptoms of ADHD, executive dysfunction, autistic traits, levels of psychopathology, and worse quality of life compared with those with low-level DESR. There were no major differences in outcomes among medicated and unmedicated patients.
High levels of DESR are common in adults with ADHD and when present represent a burdensome source of added morbidity and disability worthy of further clinical and scientific attention.
Exercise has many health benefits for individuals with Type 1 Diabetes (T1D); however, it carries the risk of hypoglycaemia. Research has indicated that intermittent high-intensity exercise reduces this risk compared with steady-state exercise, potentially via a greater anaerobic component inducing an increased lactate and catecholamine response.
Six physically active males (aged 23 ± 5 years, BMI 24.9 ± 1.8 kg•m-2, VO2 max 47.9 ± 10.1 ml•kg-1•min-1), diagnosed with Type 1 Diabetes for 9 ± 3 years and with an HbA1c concentration of 8.6 ± 0.3%, participated in a randomised counterbalanced trial. Participants exercised for 60 min on a cycle ergometer on two occasions separated by 7 days, consisting of either a moderate continuous steady-state exercise session at 40% VO2 max (CONT) or the same exercise intensity interspersed with 7 s high-intensity sprints at 100% VO2 max every 2 min (INT). Blood glucose concentration was assessed via capillary blood sampling every 10 min during exercise and at regular intervals in the 60 min post exercise (Accu-Chek Aviva, Roche, UK). Additional metabolic measures, such as blood lactate concentration and carbohydrate oxidation rates, were assessed during exercise. Participants ingested small quantities of a carbohydrate drink, if required, to avoid hypoglycaemia during exercise. Magnitude-based inferences were used to compare the two exercise trials, effect sizes (ES) were calculated using Cohen's d, and results are presented as mean ± SD.
Average blood glucose concentration was lower on the INT trial compared with CONT during both the exercise phase (CONT 8.9 ± 1.7 mmol•l-1 vs INT 7.1 ± 1.1 mmol•l-1; ES 0.55) and the 60 min post-exercise recovery phase (CONT 8.1 ± 1.9 mmol•l-1 vs INT 7.0 ± 1.8 mmol•l-1; ES 0.56). Carbohydrate oxidation was greater on the INT trial compared with CONT (1.9 ± 1.4 g•min-1 vs 1.5 ± 0.6 g•min-1; ES 0.35). Capillary blood lactate concentration was markedly elevated on INT when compared with CONT (5.0 ± 1.4 mmol•l-1 vs 2.4 ± 1.1 mmol•l-1; ES 2.48). Carbohydrate ingestion during exercise was 13 ± 11 g and differed little between trials (ES 0.18).
Despite the greater anaerobic metabolic response, the addition of intermittent high-intensity sprints to 60 min of steady-state cycle exercise resulted in greater declines in blood glucose concentration during the exercise and post-exercise periods, potentially through inhibition of the counter-regulatory hormone response expected on the INT trial. These results indicate that additional exogenous carbohydrate ingestion may be required for individuals with T1D when intermittent sprints are added to prolonged continuous exercise.
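For readers unfamiliar with the effect-size metric used above, the minimal sketch below computes Cohen's d from two sets of trial measurements using the pooled standard deviation. The glucose values are invented for illustration and do not reproduce the study's raw data or its reported ES values.

```python
# Minimal sketch of Cohen's d; the arrays are made-up glucose values, not study data.
import numpy as np

def cohens_d(x, y):
    """Cohen's d using the pooled standard deviation of the two groups."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

cont = np.array([8.2, 9.6, 7.4, 10.1, 8.8, 9.3])  # hypothetical CONT glucose (mmol/L)
intv = np.array([7.0, 7.9, 6.2, 8.4, 6.8, 7.5])   # hypothetical INT glucose (mmol/L)
print(f"Cohen's d = {cohens_d(cont, intv):.2f}")
```

Note that, depending on how within-subject variability is handled, crossover designs like this one are sometimes analysed instead with a d based on difference scores; the pooled-SD form above is only one common convention.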
We are developing the novel αIIbβ3 antagonist, RUC-4, for subcutaneously (SC)-administered first-point-of-medical-contact treatment for ST segment elevation myocardial infarction (STEMI).
We studied the (1) pharmacokinetics (PK) of RUC-4 at 1.0, 1.93, and 3.86 mg/kg intravenous (IV), intramuscular (IM), and SC in non-human primates (NHPs); (2) impact of aspirin on RUC-4 IC50 in human platelet-rich plasma (PRP); (3) effect of different anticoagulants on the RUC-4 IC50 in human PRP; and (4) relationship between αIIbβ3 receptor blockade by RUC-4 and inhibition of ADP-induced platelet aggregation.
(1) All doses of RUC-4 were well tolerated, but animals demonstrated variable temporary bruising. IM and SC RUC-4 reached dose-dependent peak levels within 5–15 minutes, with half-lives (T1/2) between 0.28 and 0.56 hours. Platelet aggregation studies in NHPs receiving IM RUC-4 demonstrated >80% inhibition of the initial slope of ADP-induced aggregation with all three doses 30 minutes post-dosing, with subsequent dose-dependent loss of inhibition over 4–5 hours. (2) The RUC-4 IC50 for ADP-induced platelet aggregation was unaffected by aspirin treatment (40±9 nM vs 37±5 nM; p = 0.39). (3) The RUC-4 IC50 was significantly higher in PRP prepared from D-phenylalanyl-prolyl-arginyl chloromethyl ketone (PPACK)-anticoagulated blood compared to citrate-anticoagulated blood using either thrombin receptor activating peptide (TRAP) (122±17 vs 66±25 nM; p = 0.05; n = 4) or ADP (102±22 vs 54±13 nM; p<0.001; n = 5). (4) There was a close correspondence between receptor blockade and inhibition of ADP-induced platelet aggregation, with aggregation inhibition beginning at ~40% receptor blockade and becoming nearly complete at >80% receptor blockade.
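The assays behind these IC50 estimates are not detailed here, but as a hedged illustration of how an IC50 can be estimated from concentration–response data, the sketch below fits a four-parameter logistic curve with SciPy. The concentrations, aggregation values, and starting guesses are hypothetical and chosen only to land near the nanomolar range discussed above.

```python
# Hedged sketch: estimating an IC50 by fitting a four-parameter logistic curve.
# Concentrations and aggregation responses below are illustrative, not study data.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic: response as a function of inhibitor concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

conc_nM = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)    # hypothetical doses
aggregation = np.array([95, 90, 78, 55, 30, 12, 5], dtype=float)   # % of control, hypothetical

params, _ = curve_fit(four_pl, conc_nM, aggregation,
                      p0=[0, 100, 40, 1])  # initial guesses: bottom, top, IC50, Hill slope
print(f"Estimated IC50 ≈ {params[2]:.0f} nM")
```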
Based on these results and others, RUC-4 has now progressed to formal preclinical toxicology studies.
In the 2015 review paper ‘Petawatt Class Lasers Worldwide’, a comprehensive overview of the current status of petawatt-class high-power facilities was presented. This was largely based on facility specifications, with some description of their uses, for instance in fundamental ultra-high-intensity interactions, secondary source generation, and inertial confinement fusion (ICF). With the 2018 Nobel Prize in Physics being awarded to Professors Donna Strickland and Gérard Mourou for the development of the technique of chirped pulse amplification (CPA), which made these lasers possible, we celebrate by providing a comprehensive update of the current status of ultra-high-power lasers and demonstrate how the technology has developed. We are now in the era of multi-petawatt facilities coming online, with 100 PW lasers being proposed and even under construction. In addition, there is a pull towards the development of industrial and multi-disciplinary applications, which demand much higher repetition rates, the delivery of high average powers with higher efficiencies, and the use of alternative wavelengths, such as mid-IR facilities. So, apart from a comprehensive update of the current global status, we want to look at what technologies are to be deployed to get to these new regimes and at some of the critical issues facing their development.
Both childhood maltreatment and insecure attachment are known to be associated with depression in adulthood. The extent to which insecure attachment increases the risk of adult clinical depression over and above that of parental maltreatment among women in the general population is explored, using a sample selected as being at high risk because of parental maltreatment together with an unselected sample.
Semi-structured interviews and investigator-based measures are employed.
Insecure attachment is highly associated with parental maltreatment, with both contributing to the risk of depression and attachment making a substantial independent contribution. Risk of depression did not vary by type of insecure attachment, but the core pathways of the dismissive and enmeshed styles spanned the whole life course, involving greater experience of a mother's physical abuse and the women's own anger as adults; for both styles, adult depression was more often provoked by a severely threatening event involving humiliation rather than loss. By contrast, depression among the fearful and withdrawn insecure styles was more closely associated with both current low self-esteem and an inadequately supportive core relationship. With respect to depression taking a chronic course, insecure attachment was again a key risk factor, now closely linked with the early experience of a chaotic lifestyle, although this applied to only a modest number of women.
Both insecure attachment and parental maltreatment contribute to an increased risk of depression with complex effects involving types of insecure attachment.
Novel approaches to improving disaster response have begun to include the use of big data and information and communication technology (ICT). However, there remains a dearth of literature on the use of these technologies in disasters. We have conducted an integrative literature review on the role of ICT and big data in disasters. Included in the review were 113 studies that met our predetermined inclusion criteria. Most studies used qualitative methods (39.8%, n=45) over mixed methods (31%, n=35) or quantitative methods (29.2%, n=33). Nearly 80% (n=88) covered only the response phase of disasters and only 15% (n=17) of the studies addressed disasters in low- and middle-income countries. The 4 most frequently mentioned tools were geographic information systems, social media, patient information, and disaster modeling. We suggest testing ICT and big data tools more widely, especially outside of high-income countries, as well as in nonresponse phases of disasters (eg, disaster recovery), to increase an understanding of the utility of ICT and big data in disasters. Future studies should also include descriptions of the intended users of the tools, as well as implementation challenges, to assist other disaster response professionals in adapting or creating similar tools. (Disaster Med Public Health Preparedness. 2019;13:353–367)
A total of eight foxhound packs in England and Wales were screened for Echinococcus species using a genus-specific coproantigen ELISA and for Echinococcus granulosus sensu lato and Echinococcus equinus by coproPCR. Main screening (n = 364 hounds) occurred during 2010–2011, during which a quarter (25.6%) of the foxhound fecal samples tested were Echinococcus coproantigen-positive (93/364). In total, five of eight (62.5%) hunts screened had coproantigen-positive hounds; coproantigen prevalence for individual foxhound packs ranged from 0 to 61.2% and was >30% in three hunts (in the counties of Powys, Wales, and Northumberland, England). Foxhound fecal samples from six of the eight tested hunts (four Welsh and two English hunts) were positive by coproPCR for E. granulosus s.l. (including one sequence confirmation of E. granulosus sensu stricto) and E. equinus DNA. Analysis of hunt questionnaire data suggested an association between poor foxhound husbandry, especially feeding practices, and Echinococcus coproantigen prevalence. Clearer guidelines regarding the risk of canine echinococcosis are required for safe management of foxhound hunts in England and Wales.
Whitehouse adapts insights from evolutionary anthropology to interpret extreme self-sacrifice through the concept of identity fusion. The model neglects the role of normative systems in shaping behaviors, especially in relation to violent extremism. In peaceful groups, increasing fusion will actually decrease extremism. Groups collectively appraise threats and opportunities, actively debate action options, and rarely choose violence toward self or others.
Dry bean (Phaseolus vulgaris) can be grown as a local food source and as an alternative to soybean (Glycine max) to diversify organic crop rotations. To understand the benefits of diversification of organic cropping systems, the effects of preceding alfalfa (Medicago sativa) and corn (Zea mays) crops on yields of five dry bean types and one soybean type, and the effect of bean type on following spring wheat (Triticum aestivum) yields, were tested at four Minnesota locations. Dry bean and soybean yields following alfalfa were 25% greater than yields following corn at two of four locations, though bean yields following corn were greater at one location. A preceding alfalfa crop benefited bean yields at locations where hog manure or no manure was applied to corn, whereas bean yields following corn fertilized with cow manure were similar to or greater than bean yields following alfalfa. Among dry bean types, black bean yielded similarly to soybean at three of four locations, but dark red kidney bean consistently yielded 25–65% lower than soybean. Navy, pinto and heirloom dry bean types yielded similarly to soybean at two of four locations. Across locations, weed biomass was 3–15 times greater in dry bean than in soybean and dry bean yield response to weed competition varied among bean types. However, dry bean, regardless of the preceding crop, demonstrated the potential to produce yields comparable with soybean in organic systems and the substitution of dry bean for soybean did not affect subsequent wheat yields. More studies are needed to identify nitrogen fertility dynamics in organic systems as they relate to dry bean yield.
Efforts to address health disparities and achieve health equity are critically dependent on the development of a diverse research workforce. However, many researchers from underrepresented backgrounds face challenges in advancing their careers, securing independent funding, and finding the mentorship needed to expand their research.
Faculty from the University of Maryland at College Park and the University of Wisconsin-Madison developed and evaluated an intensive week-long research and career-development institute—the Health Equity Leadership Institute (HELI)—with the goal of increasing the number of underrepresented scholars who can sustain their ongoing commitment to health equity research.
From 2010 to 2016, HELI brought together 145 diverse scholars (78% from an underrepresented background; 81% female) to engage with each other and learn from supportive faculty. Overall, scholar feedback was highly positive on all survey items, with average agreement ratings of 4.45 to 4.84 on a 5-point Likert scale. Eighty-five percent of scholars remain in academic positions. In the first three cohorts, 73% of HELI participants have been promoted and 23% have secured independent federal funding.
HELI includes an evidence-based curriculum to develop a diverse workforce for health equity research. For those institutions interested in implementing such an institute to develop and support underrepresented early stage investigators, a resource toolbox is provided.
In an effort to enhance education, training, and learning in the disaster health community, the National Center for Disaster Medicine and Public Health (NCDMPH) gathered experts from around the nation in Bethesda, Maryland, on September 8, 2016, for the 2016 Disaster Health Education Symposium: Innovations for Tomorrow. This article summarizes key themes presented during the disaster health symposium including innovations in the following areas: training and education that saves lives, practice, teaching, sharing knowledge, and our communities. This summary article provides thematic content for those unable to attend. Please visit http://ncdmph.usuhs.edu/ for more information. (Disaster Med Public Health Preparedness. 2017;11:160–162)
To determine the effect of graft choice (allograft, bone-patellar tendon-bone autograft, or hamstring autograft) on deep tissue infections following anterior cruciate ligament (ACL) reconstructions.
Retrospective cohort study.
The study population comprised patients from 6 US health plans who underwent ACL reconstruction from January 1, 2000, through December 31, 2008.
We identified ACL reconstructions and potential postoperative infections using claims data. A hierarchical stratified sampling strategy was used to identify patients for medical record review to confirm ACL reconstructions and to determine allograft vs autograft tissue implanted, clinical characteristics, and infection status. We estimated infection rates overall and by graft type. We used logistic regression to assess the association between infections and patients’ demographic characteristics, comorbidities, and choice of graft.
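As a rough sketch of the kind of model described (ignoring the sampling-weight correction and using simulated rather than claims data), the snippet below fits a logistic regression of deep infection on graft type and age with statsmodels. The variable names, covariates, and simulated baseline risks are assumptions for illustration, loosely mirroring the rates reported below.

```python
# Illustrative logistic regression of infection on graft type, with simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
graft = rng.choice(["allograft", "bptb_autograft", "hamstring_autograft"], size=n)
age = rng.normal(28, 8, size=n)

# Assumed underlying infection risks, roughly in the range of the rates reported here.
base_risk = np.where(graft == "hamstring_autograft", 0.025,
             np.where(graft == "bptb_autograft", 0.006, 0.005))
infection = rng.binomial(1, base_risk)

df = pd.DataFrame({"infection": infection, "graft": graft, "age": age})

# Allograft is the reference category, so the graft coefficients are relative to it.
model = smf.logit("infection ~ C(graft, Treatment('allograft')) + age", data=df).fit(disp=False)
print(np.exp(model.params))       # adjusted odds ratios
print(np.exp(model.conf_int()))   # 95% confidence intervals for the odds ratios
```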
On review of 1,452 medical records, we found 55 deep wound infections. With correction for sampling weights, infection rates varied by graft type: 0.5% (95% CI, 0.3%–0.8%) with allografts, 0.6% (0.1%–1.5%) with bone-patellar tendon-bone autografts, and 2.5% (1.9%–3.1%) with hamstring autografts. After adjusting for potential confounders, we found an increased infection risk with hamstring autografts compared with allografts (odds ratio, 5.9; 95% CI, 2.8–12.8). However, there was no difference in infection risk between bone-patellar tendon-bone autografts and allografts (odds ratio, 1.2; 95% CI, 0.3–4.8).
The overall risk for deep wound infections following ACL reconstruction is low, but it does vary by graft type. Infection risk was highest in hamstring autograft recipients compared with allograft and bone-patellar tendon-bone autograft recipients.
We analyzed birth order differences in means and variances of height and body mass index (BMI) in monozygotic (MZ) and dizygotic (DZ) twins from infancy to old age. The data were derived from the international CODATwins database. The total number of height and BMI measures from 0.5 to 79.5 years of age was 397,466. As expected, first-born twins had greater birth weight than second-born twins. With respect to height, first-born twins were slightly taller than second-born twins in childhood. After adjusting the results for birth weight, the birth order differences decreased and were no longer statistically significant. First-born twins had greater BMI than the second-born twins over childhood and adolescence. After adjusting the results for birth weight, birth order was still associated with BMI until 12 years of age. No interaction effect between birth order and zygosity was found. Only limited evidence was found that birth order influenced variances of height or BMI. The results were similar among boys and girls and also in MZ and DZ twins. Overall, the differences in height and BMI between first- and second-born twins were modest even in early childhood, while adjustment for birth weight reduced the birth order differences but did not remove them for BMI.
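To illustrate the adjustment logic (not the CODATwins analysis itself), the sketch below compares the estimated second-born height deficit before and after adding birth weight as a covariate, using simulated twin-pair data in which birth weight, rather than birth order, drives later height; all values and coefficients are invented.

```python
# Sketch of a birth-order effect attenuating after adjustment for birth weight.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_pairs = 2000

# Simulated pairs: first-borns tend to be slightly heavier at birth,
# and birth weight (not birth order itself) influences later height.
bw_first = rng.normal(2.6, 0.4, n_pairs)
bw_second = bw_first - rng.normal(0.1, 0.1, n_pairs)

def child_height(bw):
    return 110 + 4.0 * bw + rng.normal(0, 4, len(bw))  # height at a fixed age (cm)

df = pd.DataFrame({
    "height": np.concatenate([child_height(bw_first), child_height(bw_second)]),
    "second_born": np.concatenate([np.zeros(n_pairs), np.ones(n_pairs)]),
    "birth_weight": np.concatenate([bw_first, bw_second]),
})

unadjusted = smf.ols("height ~ second_born", data=df).fit()
adjusted = smf.ols("height ~ second_born + birth_weight", data=df).fit()
print(f"Second-born height difference, unadjusted: {unadjusted.params['second_born']:.2f} cm")
print(f"Second-born height difference, adjusted:   {adjusted.params['second_born']:.2f} cm")
```

In this toy setup the unadjusted second-born deficit shrinks toward zero once birth weight enters the model, which is the same attenuation pattern the height analyses describe; for BMI the abstract reports that adjustment reduced but did not remove the birth-order difference.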
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins presented up to 1.7% greater height and 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.