Background: Carbapenemase genes in carbapenem-resistant Enterobacterales (CP-CRE) may be transmitted between bacteria and spread between patients. Reported rates of carbapenemase genes vary widely, and it is unclear whether having a carbapenemase gene portends worse outcomes given that all patients with CRE infections have limited treatment options. Methods: Using active population- and laboratory-based surveillance data collected by the US CDC-funded Georgia Emerging Infections Program from 2011 to 2020, we assessed the frequency of carbapenemase genes in a convenience sample of CRE isolates using whole-genome sequencing (WGS), and we investigated risk factors for carbapenemase positivity. Only the first isolate per patient in a 30-day period was included. We compared characteristics of patients with CP-CRE and non–CP-CRE. Using multivariable log-binomial regression, we assessed the association between carbapenemase gene positivity and 90-day mortality. Results: Of 284 CRE isolates, 171 (60.2%) possessed a carbapenemase gene (Table 1); KPC-3 was the most common carbapenemase gene (80.7%), and only 7 isolates possessed NDM (Table 2). No isolate possessed >1 carbapenemase gene, and most isolates were from urine (82.4%) (Table 1). Carbapenemase gene positivity was associated with lower age, male sex, black race, infection with Klebsiella pneumoniae, polymicrobial infection, having an indwelling medical device, receiving chronic dialysis, prior stay in a long-term acute-care hospital or long-term care facility, and/or prior hospitalization in the last year. The 90-day mortality rates were similar in patients with non–CP-CRE and CP-CRE: 24.8% versus 25.7% (P = .86). In multivariable analysis, carbapenemase gene presence was not associated with 90-day mortality (adjusted risk ratio, 0.82; 95% CI, 0.50–1.35) after adjusting for CCI, infection with Klebsiella pneumoniae, and chronic dialysis use. Conclusions: The frequency of CP-CRE among CRE was high in this study, but unlike prior studies, the 90-day mortality rates were similar in patients with CP-CRE compared to non–CP-CRE. Our results provide novel associations (eg, lower age, male sex, infection with Klebsiella pneumoniae, and indwelling medical devices) that infection preventionists could use to target high-risk patients for screening or isolation prior to CP-CRE detection.
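The adjusted risk ratio above comes from a multivariable log-binomial model. The following is an illustrative sketch only, not the study's analysis code: the dataset name (cre_cohort.csv) and variable names (died_90d, carbapenemase, cci, kpn, dialysis) are assumptions chosen to mirror the covariates named in the abstract.

```python
# Hedged sketch of a log-binomial model for an adjusted risk ratio.
# File and variable names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cre_cohort.csv")  # hypothetical analytic dataset

# Outcome: 90-day mortality (0/1); covariates mirror those named in the abstract.
X = sm.add_constant(df[["carbapenemase", "cci", "kpn", "dialysis"]])
model = sm.GLM(df["died_90d"], X,
               family=sm.families.Binomial(link=sm.families.links.Log()))
result = model.fit()

# Exponentiated coefficients are risk ratios; CIs are exponentiated from the log scale.
rr = np.exp(result.params)
ci = np.exp(result.conf_int())
print(pd.concat([rr.rename("RR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```

Log-binomial models can fail to converge; a Poisson model with robust standard errors is a common fallback for estimating risk ratios.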
Background: The Centers for Disease Control and Prevention's Emerging Infections Program conducts active laboratory- and population-based surveillance for carbapenem-resistant Enterobacterales (CRE) and extended-spectrum beta-lactamase-producing Enterobacterales (ESBL-E). To better understand the U.S. epidemiology of these organisms among children, we determined the incidence of pediatric CRE and ESBL-E cases and described their clinical characteristics. Methods: Surveillance was conducted among children <18 years of age for CRE from 2016–2020 in 10 sites, and for ESBL-E from 2019–2020 in 6 sites. Among catchment-area residents, an incident CRE case was defined as the first isolation of Escherichia coli, Enterobacter cloacae complex, Klebsiella aerogenes, K. oxytoca, or K. pneumoniae in a 30-day period resistant to ≥1 carbapenem from a normally sterile site or urine. An incident ESBL-E case was defined as the first isolation of E. coli, K. pneumoniae, or K. oxytoca in a 30-day period resistant to any third-generation cephalosporin and non-resistant to all carbapenems from a normally sterile site or urine. Case records were reviewed. Results: Among 159 CRE cases, 131 (82.9%) were isolated from urine and 19 (12.0%) from blood; median age was 5 years (IQR 1–10) and 94 (59.1%) were female. Combined CRE incidence rate per 100,000 population by year ranged from 0.47 to 0.87. Among 207 ESBL-E cases, 160 (94.7%) were isolated from urine and 6 (3.6%) from blood; median age was 6 years (IQR 2–15) and 165 (79.7%) were female. Annual ESBL-E incidence rate per 100,000 population was 26.5 in 2019 and 19.63 in 2020. Incidence rates of CRE and ESBL-E were >2-fold higher in infants (children <1 year) than other age groups. Among those with data available, CRE cases were more likely than ESBL-E cases to have underlying conditions (99/158 [62.7%] versus 59/169 [34.9%], P<0.0001), prior healthcare exposures (74/158 [46.8%] versus 38/169 [22.5%], P<0.0001), and be hospitalized for any reason around the time of their culture collection (75/158 [47.5%] versus 38/169 [22.5%], P<0.0001); median duration of admission was 18 days [IQR 3–103] for CRE versus 10 days [IQR 4–43] for ESBL-E. Urinary tract infection was the most frequent infection for CRE (89/158 [56.3%]) and ESBL-E (125/169 [74.0%]) cases. Conclusion: CRE infections occurred less frequently than ESBL-E infections in U.S. children but were more often associated with healthcare risk factors and hospitalization. Infants had the highest incidence of CRE and ESBL-E. Continued surveillance, infection prevention and control efforts, and antibiotic stewardship outside and within pediatric care are needed.
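The incidence rates quoted above are straightforward to reproduce from case counts and catchment population. A minimal illustration follows; the counts and population are placeholders, not the surveillance data.

```python
# Hedged sketch: annual incidence per 100,000 population.
def incidence_per_100k(cases: int, population: int) -> float:
    return cases / population * 100_000

# Placeholder inputs, not Emerging Infections Program data.
print(round(incidence_per_100k(cases=35, population=7_400_000), 2))  # 0.47
```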
Stepwise non-pharmaceutical interventions and health system changes implemented as part of the COVID-19 response have had implications for the incidence, diagnosis, and reporting of other communicable diseases. Here, we assessed the impact of the COVID-19 outbreak response on gastrointestinal (GI) infection trends using routinely collected data from six national English laboratory, outbreak, and syndromic surveillance systems, with key dates of governmental policy used to assign phases for comparison between pandemic and historic data. Following decreases across all indicators during the first lockdown (March–May 2020), bacterial and parasitic pathogens associated with foodborne or environmental transmission routes recovered rapidly between June and September 2020, while those associated with travel and/or person-to-person transmission remained lower than expected into 2021. High out-of-season norovirus activity was observed with the easing of lockdown measures between June and October 2021, with this trend reflected in laboratory and outbreak systems and syndromic surveillance indicators. Above-expected increases in emergency department (ED) attendances may have reflected changes in health-seeking behaviour and provision. Differential reductions across specific GI pathogens are indicative of the underlying routes of transmission. These results provide further insight into the drivers of transmission, which can help inform control measures for GI infections.
A multi-disciplinary expert group met to discuss vitamin D deficiency in the UK and strategies for improving population intakes and status. Changes to UK Government advice since the 1st Rank Forum on Vitamin D (2009) were discussed, including rationale for setting a reference nutrient intake (10 µg/d; 400 IU/d) for adults and children (4+ years). Current UK data show inadequate intakes among all age groups and high prevalence of low vitamin D status among specific groups (e.g. pregnant women and adolescent males/females). Evidence of widespread deficiency within some minority ethnic groups, resulting in nutritional rickets (particularly among Black and South Asian infants), raised particular concern. Latest data indicate that UK population vitamin D intakes and status remain relatively unchanged since Government recommendations changed in 2016. Vitamin D food fortification was discussed as a potential strategy to increase population intakes. Data from dose–response and dietary modelling studies indicate dairy products, bread, hens’ eggs and some meats as potential fortification vehicles. Vitamin D3 appears more effective than vitamin D2 for raising serum 25-hydroxyvitamin D concentration, which has implications for choice of fortificant. Other considerations for successful fortification strategies include: (i) need for ‘real-world’ cost information for use in modelling work; (ii) supportive food legislation; (iii) improved consumer and health professional understanding of vitamin D’s importance; (iv) clinical consequences of inadequate vitamin D status; and (v) consistent communication of Government advice across health/social care professions, and via the food industry. These areas urgently require further research to enable universal improvement in vitamin D intakes and status in the UK population.
To describe the epidemiology of carbapenem-resistant Enterobacterales (CRE) bacteriuria and to determine whether urinary catheters increase the risk of subsequent CRE bacteremia.
Using active population- and laboratory-based surveillance we described a cohort of patients with incident CRE bacteriuria and identified risk factors for developing CRE bacteremia within 1 year.
The study was conducted among the 8 counties of Georgia Health District 3 (HD3) in Atlanta, Georgia.
Residents of HD3 with CRE first identified in urine between 2012 and 2017.
We identified 464 patients with CRE bacteriuria (mean yearly incidence, 1.96 cases per 100,000 population). Of the 425 patients with chart review, most had a urinary catheter (56%), and many resided in long-term care facilities (48%), had a Charlson comorbidity index >3 (38%), or had a decubitus ulcer (37%). Twenty-one patients (5%) developed CRE bacteremia with the same organism within 1 year. Risk factors for subsequent bacteremia included presence of a urinary catheter (odds ratio [OR], 8.0; 95% confidence interval [CI], 1.8–34.9), central venous catheter (OR, 4.3; 95% CI, 1.7–10.6) or another indwelling device (OR, 4.3; 95% CI, 1.6–11.4), urine culture obtained as an inpatient (OR, 5.7; 95% CI, 1.3–25.9), and being in the ICU in the week prior to urine culture (OR, 2.9; 95% CI, 1.1–7.8). In a multivariable analysis, urinary catheter presence increased the risk of CRE bacteremia (OR, 5.3; 95% CI, 1.2–23.6).
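The univariable odds ratios and 95% confidence intervals above are the standard 2×2-table estimates. The sketch below illustrates that calculation only; the counts are invented placeholders, not the cohort's data.

```python
# Hedged sketch: crude odds ratio with a Wald 95% confidence interval from a 2x2 table.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = exposed with/without bacteremia; c,d = unexposed with/without bacteremia."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Placeholder counts only.
print(odds_ratio_ci(a=18, b=220, c=3, d=184))
```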
In patients with CRE bacteriuria, urinary catheters increase the risk of CRE bacteremia. Future interventions should aim to reduce inappropriate insertion of urinary catheters and to promote their early removal.
Background: Due to limited therapeutic options and potential for spread, New Delhi metallo-β-lactamase (NDM)–producing carbapenem-resistant Enterobacteriaceae (CRE) are a public health priority. We investigated the epidemiology of NDM-producing CRE reported to the CDC to clarify their distribution and relative prevalence. Methods: The CDC’s Antibiotic Resistance Laboratory Network supports molecular testing of CRE for 5 carbapenemases nationally. Although KPC is the most common carbapenemase in the United States, non-KPC carbapenemases are a growing concern. We analyzed CRE with any of 4 non-KPC plasmid-mediated carbapenemases (NDM, VIM, IMP, or OXA-48 type) isolated from specimens collected from January 1, 2017, through June 30, 2019; only a patient’s first isolate per organism–carbapenemase combination was included. We excluded isolates from specimen sources associated with colonization screening (eg, perirectal). We compared the proportion of NDM-producing CRE to all non-KPC–producing CP-CRE between period A (January to June 2018) and period B (January to June 2019). Health departments and the CDC collected additional exposure and molecular information in selected states to better describe current NDM-producing CRE epidemiology. Results: Overall, 47 states reported 1,013 non–KPC-producing CP-CRE (range/state, 1–109 isolates; median, 11 isolates); 46 states reported 631 NDM-producing CRE (range/state, 1–84; median, 6). NDM-producing CRE increased quarterly from the third quarter of 2018 through the second quarter of 2019; CP-CRE isolates with other non-KPC carbapenemases remained stable (Fig. 1). In period A, 124 of 216 emerging CP-CRE had NDM (57.1%), compared with 255 of 359 emerging CP-CRE (71.0%) during period B (P = .1179). Among NDM-producing CRE, the proportion of Enterobacter spp increased from 10.5% in 2018 to 18.4% in 2019 (P = .0467) (Fig. 2). In total, 18 states reported more NDM-producing CRE in the first 6 months of 2019 than in all of 2018. Connecticut, Ohio, and Oregon were among states that conducted detailed investigations; these 3 states identified 24 NDM-producing CRE isolates from 23 patients in period B. Overall, 5 (21.7%) of 23 patients with history available traveled internationally ≤12 months prior to culture; 17 (73.9%) acquired NDM-producing CRE domestically. Among 15 isolates sequenced, 8 (53.3%) carried NDM-5 (6 E. coli, 1 Enterobacter spp, and 1 Klebsiella spp) and 7 (46.7%) carried NDM-1 (6 Enterobacter spp and 1 Klebsiella spp). Species were diverse; no single strain type was shared by >2 isolates. Conclusions: Detection of NDM-producing CRE has increased across the AR Lab Network. Among states with detailed information available, domestic acquisition was common, and no single variant or strain predominated. Aggressive public health response and further understanding of current US NDM-CRE epidemiology are needed to prevent further spread.
Background: In April 2019, the Georgia Department of Public Health (DPH) initiated whole-genome sequencing (WGS) on NDM-producing Enterobacteriaceae identified since January 2018. The WGS data analyzed at CDC identified related Klebsiella pneumoniae isolates with hypervirulence markers from 2 patients. Carbapenemase-producing hypervirulent K. pneumoniae (CP-hvKP) are rarely reported in the United States, but they can cause serious, highly resistant, invasive infections. We conducted an investigation to identify cases and prevent spread. Methods: We defined a case as NDM-producing K. pneumoniae with ≥4 hypervirulence markers identified by WGS, isolated from any specimen source from a Georgia patient. We reviewed the case patient’s medical history to identify potentially affected facilities. We also performed PCR-based colonization screening and retrospective and prospective laboratory-based surveillance. Finally, we assessed facility infection control practices. Results: Overall, 7 cases from 3 case patients (A, B, and C) were identified (Fig. 1). The index case specimen was collected from case-patient A at ventilator-capable skilled nursing facility 1 (vSNF1) in May 2018. Case-patient A had been hospitalized for 1 month in India before transfer to the United States. Case-patient B’s initial isolate was collected in January 2019 on admission to vSNF2 from a critical access hospital (CAH). The CAH laboratory retrospectively identified case-patient C, who overlapped with case-patient B at the CAH in October 2018. The CAH and vSNF2 are geographically distant from vSNF1. Case-patients B and C had no known epidemiologic links to case-patient A. Colonization screening occurred at vSNF1 in May 2018, following detection of NDM-producing K. pneumoniae from case-patient A, ∼1 year before the isolate was determined to carry hypervirulence markers. Among 30 residents screened, 1 had NDM and several had other carbapenemases. Subsequent screening did not identify additional NDM-producing organisms. Colonization screening of 112 vSNF2 residents and 13 CAH patients in 2019 did not reveal additional case patients; case-patient B resided at vSNF2 at the time of screening and remained colonized. At all 3 facilities, the DPH assessed infection control practices, issued recommendations to resolve lapses, and monitored implementation. The DPH sequenced all 27 Georgia NDM–K. pneumoniae isolates identified since January 2018; all were different multilocus sequence types from the CP-hvKP isolates, and none possessed hypervirulence markers. Conclusions: We hypothesize that CP-hvKP was imported by a patient hospitalized in India and spread to 3 Georgia facilities in 2 distinct geographic regions through indirect patient transfers. Although a response to contain NDM at vSNF1 in 2018 likely limited CP-hvKP transmission, WGS identified hvKP and established the relatedness of isolates from distinct regions, thereby directing the DPH’s additional containment activities to halt transmission.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) represent a significant antibiotic resistance threat, in part because carbapenemase genes can spread on mobile genetic elements. Here, we describe the molecular epidemiology and outcomes of patients with CRE bacteriuria from the same city in a nonoutbreak setting. Methods: The Georgia Emerging Infections Program performs active, population-based CRE surveillance in Atlanta. We studied a cohort of patients with CRE (resistant to all tested third-generation cephalosporins and ≥1 carbapenem, excluding ertapenem) first identified in urine, and not in a prior or simultaneous sterile site, between 2012 and 2015. Whole-genome sequencing (WGS) was performed on a convenience sample. We obtained epidemiologic and outcome data through chart review and Georgia Vital Statistics records (90-day mortality). Using WGS, we created a core-genome alignment-based phylogenetic tree of the Klebsiella pneumoniae isolates and calculated the SNP differences between samples. Using SAS version 9.4 software, we performed the Fisher exact test and calculated univariable odds ratios (OR) with 95% CIs to compare patient isolates with and without a carbapenemase gene. Results: Among 81 patients included, the median age was 68 years (IQR, 57–74), and most were female (58%), black (60%), and resided in a long-term care facility 4 days prior to culture isolation (53%). Organisms isolated were K. pneumoniae (84%), Escherichia coli (7%), Enterobacter cloacae (7%), and Klebsiella oxytoca (1%). WGS identified at least 1 β-lactamase gene in 91% of the isolates; 85% contained a carbapenemase gene, the most frequent of which was blaKPC-3 (94%). Patients with CRE containing a carbapenemase gene were more likely to be black (OR, 3.7; 95% CI, 1.0–13.8) and to have K. pneumoniae (OR, 8.9; 95% CI, 2.2–35.0). Using a core-genome alignment of 3,708 genes (~63% of the complete genome), we identified a median of 67 (IQR, 23–3,881) SNP differences between K. pneumoniae isolates. A phylogenetic tree showed clustering by carbapenemase gene and multilocus sequence type (84% were ST258) but not by referring laboratory or county of residence (Fig. 1). Although 7% of patients developed an invasive CRE infection within 1 year and 21% died within 90 days, having a carbapenemase gene was not associated with these outcomes. Conclusions: Molecular sequencing of a convenience sample of CRE bacteriuria isolates supports the conclusion that K. pneumoniae ST258 harboring blaKPC-3 is distributed throughout the Atlanta area, across the healthcare continuum. Overall mortality was high in this population, but the presence of carbapenemase genes was not associated with worse outcomes.
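The SNP differences reported above are pairwise counts across the core-genome alignment. The sketch below illustrates that comparison only; the file name and alignment handling are assumptions, not the study's WGS pipeline.

```python
# Hedged sketch: pairwise SNP differences between isolates in a core-genome alignment.
from itertools import combinations
from Bio import SeqIO  # Biopython

records = list(SeqIO.parse("core_genome_alignment.fasta", "fasta"))  # hypothetical file

def snp_distance(seq_a: str, seq_b: str) -> int:
    # Count aligned positions where both isolates have an unambiguous base that differs.
    return sum(1 for x, y in zip(seq_a, seq_b)
               if x != y and x in "ACGT" and y in "ACGT")

for rec_a, rec_b in combinations(records, 2):
    dist = snp_distance(str(rec_a.seq).upper(), str(rec_b.seq).upper())
    print(rec_a.id, rec_b.id, dist)
```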
Background: Chlorhexidine bathing reduces bacterial skin colonization and prevents infections in specific patient populations. As chlorhexidine use becomes more widespread, concerns about bacterial tolerance to chlorhexidine have increased; however, testing for chlorhexidine minimum inhibitory concentrations (MICs) is challenging. We adapted a broth microdilution (BMD) method to determine whether chlorhexidine MICs changed over time among 4 important healthcare-associated pathogens. Methods: Antibiotic-resistant bacterial isolates (Staphylococcus aureus from 2005 to 2019 and Escherichia coli, Klebsiella pneumoniae, and Enterobacter cloacae complex from 2011 to 2019) were collected through Emerging Infections Program surveillance in 2 sites (Georgia and Tennessee) or through public health reporting in 1 site (Orange County, California). A convenience sample of isolates was collected from facilities with varying amounts of chlorhexidine use. We performed BMD testing using laboratory-developed panels with chlorhexidine digluconate concentrations ranging from 0.125 to 64 μg/mL. After reproducibility was successfully established with quality control organisms, 3 laboratories performed MIC testing. For each organism, epidemiological cutoff values (ECVs) were established using ECOFFinder. Results: Among 538 isolates tested (129 S. aureus, 158 E. coli, 142 K. pneumoniae, and 109 E. cloacae complex), the S. aureus, E. coli, K. pneumoniae, and E. cloacae complex ECVs were 8, 4, 64, and 64 µg/mL, respectively (Table 1). Overall, 14 isolates had an MIC above the ECV (12 E. coli and 2 E. cloacae complex). The MIC50 of each species is reported over time (Table 2). Conclusions: Using an adapted BMD method, we found that chlorhexidine MICs did not increase over time among a limited sample of S. aureus, E. coli, K. pneumoniae, and E. cloacae complex isolates. Although these results are reassuring, continued surveillance for elevated chlorhexidine MICs in isolates from patients with well-characterized chlorhexidine exposure is needed as chlorhexidine use increases.
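The MIC50 values summarized in Table 2 describe the MIC at which at least half of the isolates of a species are inhibited. The sketch below illustrates that summary only; the MIC values are placeholders on the panel's doubling-dilution scale, not study data.

```python
# Hedged sketch: MIC50 = lowest tested concentration inhibiting >=50% of isolates.
import math

def mic50(mics):
    ordered = sorted(mics)
    return ordered[math.ceil(len(ordered) / 2) - 1]

example_mics = [0.5, 1, 1, 2, 2, 2, 4, 4, 8, 8]  # placeholder chlorhexidine MICs, ug/mL
print(mic50(example_mics))  # 2
```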
The first part of this chapter has been written with the patient’s journey in mind: from the time of presentation to the general practitioner (GP) with a problem such as abnormal uterine bleeding, through referral to secondary care for investigation, including hysteroscopy if appropriate, and to treatment as indicated. We hope this approach will clarify what is involved in providing such a service. In the second half of the chapter, the equipment required for providing hysteroscopy services is described in detail, making extensive use of published standards and guidelines for gynaecology and hysteroscopy specifically.
The COVID-19 pandemic is exerting major pressures on society, health and social care services and science. Understanding the progression and current impact of the pandemic is fundamental to planning, management and mitigation of future impact on the population. Surveillance is the core function of any public health system, and a multi-component surveillance system for COVID-19 is essential to understand the burden across the different strata of any health system and the population. Many countries and public health bodies utilise ‘syndromic surveillance’ (using real-time, often non-specific symptom/preliminary diagnosis information collected during routine healthcare provision) to supplement public health surveillance programmes. The current COVID-19 pandemic has revealed a series of unprecedented challenges to syndromic surveillance including: the impact of media reporting during early stages of the pandemic; changes in healthcare-seeking behaviour resulting from government guidance on social distancing and accessing healthcare services; and changes in clinical coding and patient management systems. These have impacted on the presentation of syndromic outputs, with changes in denominators creating challenges for the interpretation of surveillance data. Monitoring changes in healthcare utilisation is key to interpreting COVID-19 surveillance data, which can then be used to better understand the impact of the pandemic on the population. Syndromic surveillance systems have had to adapt to encompass these changes, whilst also innovating by taking opportunities to work with data providers to establish new data feeds and develop new COVID-19 indicators. These developments are supporting the current public health response to COVID-19, and will also be instrumental in the continued and future fight against the disease.
Chickenpox is caused by varicella-zoster virus (VZV) and is highly contagious. Immigration detention settings are a high-risk environment for primary VZV transmission, with large, rapidly changing populations in close quarters and higher susceptibility among non-UK-born individuals. During outbreaks, operational challenges arise in detention settings because of high turnover and the potential need to implement population movement restrictions for prolonged periods. Between December 2017 and February 2018, four cases of chickenpox were notified amongst 799 detainees in an immigration removal centre (IRC). Microbiological investigations included case confirmation by vesicular fluid polymerase chain reaction, and VZV serology for susceptibility testing. Control measures involved movement restrictions, isolation of cases, quarantining and cohorting of non-immune contacts, and extending VZV immunity testing to the wider detainee population to support outbreak management. Immunity was tested for 301/532 (57%) detainees, of whom 24 (8%) were non-immune. The level of non-immunity was lower than expected based on the existing literature on VZV seroprevalence in detained populations in England. Serology results identified non-immune contacts who could be cohorted and, because of the lack of isolation capacity, allowed the placement of cases with immune detainees. The widespread immunity testing of all detainees proved challenging to sustain because it required significant resources and had a severe impact on operational capacity and the ability to maintain core business activities at the IRC. Therefore, mathematical modelling was used to assess the impact of scaling back mass immunity testing. Modelling demonstrated that interrupting testing posed a risk of one additional case compared with continuing testing. As such, the decision was made to stop testing, and the outbreak was successfully controlled without excessive strain on resources. Operational challenges generated learning for future outbreaks, with implications for local and national policy on IRC staff occupational health requirements and for proposed reception screening of detainees for VZV immunity.
The cost-effectiveness of molecular pathology testing is highly context dependent. The field is fast-moving, and national health technology assessment may not be relevant or timely for local decision makers. This study illustrates a method of context-specific economic evaluation that can be carried out in a limited timescale without extensive resources.
We established a multi-disciplinary group including an oncologist, pathologists and a health economist. We set out diagnostic and treatment pathways and costs using registry data, health technology assessments, guidelines, audit data, and estimates from the group. Sensitivity analysis varied input parameters across plausible ranges. The evaluation setting was the West of Scotland, and a UK NHS perspective was adopted. The evaluation was assessed against the AdHopHTA checklist for hospital-based health technology assessment.
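The sensitivity analysis described above can be illustrated with a simple one-way approach: vary each input across its plausible range and recompute the cost difference between strategies. The sketch below is a generic illustration with invented parameter names and values, not the evaluation's actual model.

```python
# Hedged sketch: one-way sensitivity analysis of an annual cost difference.
# All parameters and values are invented placeholders.
def annual_cost_difference(patients, prop_iv_current, prop_iv_new, cost_iv, cost_oral):
    cost_current = patients * (prop_iv_current * cost_iv + (1 - prop_iv_current) * cost_oral)
    cost_new = patients * (prop_iv_new * cost_iv + (1 - prop_iv_new) * cost_oral)
    return cost_current - cost_new  # positive = saving with the new testing strategy

base = dict(patients=500, prop_iv_current=0.6, prop_iv_new=0.4, cost_iv=8000, cost_oral=5000)
print("Base-case saving:", annual_cost_difference(**base))

# Vary one parameter across a plausible range while holding the others at base case.
for prop_new in (0.3, 0.4, 0.5):
    scenario = dict(base, prop_iv_new=prop_new)
    print(f"prop_iv_new={prop_new}:", annual_cost_difference(**scenario))
```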
A context-specific economic evaluation could be carried out on a timely basis using limited resources. The evaluation met all relevant criteria in the AdHopHTA checklist. Health outcomes were expected to be at least equal to those of the current strategy. Annual cost savings of £637,000 were estimated, resulting primarily from a reduction in the proportion of patients receiving intravenous infusional chemotherapy regimens. The result was not sensitive to any parameter. The data driving the main cost saving came from a small clinical audit, and we recommended that this finding be confirmed in a larger population.
The method could be used to evaluate testing changes elsewhere. The results of the case study may be transferable to other jurisdictions where the organization of cancer services is fragmented.
Syndromic surveillance is a form of surveillance that generates information for public health action by collecting, analysing and interpreting routine health-related data on symptoms and clinical signs reported by patients and clinicians rather than being based on microbiologically or clinically confirmed cases. In England, a suite of national real-time syndromic surveillance systems (SSS) has been developed over the last 20 years, utilising data from a variety of health care settings (a telehealth triage system, general practice and emergency departments). The real-time systems in England have been used for early detection (e.g. seasonal influenza), for situational awareness (e.g. describing the size and demographics of the impact of a heatwave) and for reassurance about the lack of impact of mass gatherings on population health (e.g. the London 2012 Olympic and Paralympic Games). We highlight the lessons learnt from running SSS for nearly two decades, and propose questions and issues still to be addressed. We feel that syndromic surveillance is an example of the use of ‘big data’, but contend that the focus for sustainable and useful systems should be on the added value of such systems and the importance of people working together to maximise the public health value of syndromic surveillance services.
Treatment for hoarding disorder is typically performed by mental health professionals, potentially limiting access to care in underserved areas.
We aimed to conduct a non-inferiority trial of group peer-facilitated therapy (G-PFT) and group psychologist-led cognitive–behavioural therapy (G-CBT).
We randomised 323 adults with hoarding disorder to 15 weeks of G-PFT or 16 weeks of G-CBT and assessed them at baseline, post-treatment and longitudinally (≥3 months post-treatment: mean 14.4 months, range 3–25 months). Predictors of treatment response were examined.
G-PFT (effect size 1.20) was as effective as G-CBT (effect size 1.21; between-group difference 1.82 points, t = −1.71, d.f. = 245, P = 0.04). More homework completion and ongoing help from family and friends resulted in lower severity scores at longitudinal follow-up (t = 2.79, d.f. = 175, P = 0.006; t = 2.89, d.f. = 175, P = 0.004).
Peer-led groups were as effective as psychologist-led groups, providing a novel treatment avenue for individuals without access to mental health professionals.
Declaration of interest
C.A.M. has received grant funding from the National Institutes of Health (NIH) and travel reimbursement and speakers’ honoraria from the Tourette Association of America (TAA), as well as honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. K.D. receives research support from the NIH and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. R.S.M. receives research support from the National Institute of Mental Health, National Institute of Aging, the Hillblom Foundation, Janssen Pharmaceuticals (research grant) and the Alzheimer's Association. R.S.M. has also received travel support from the National Institute of Mental Health for Workshop participation. J.Y.T. receives research support from the NIH, Patient-Centered Outcomes Research Institute and the California Tobacco Related Research Program, and honoraria and travel reimbursement from the NIH for serving as an NIH Study Section reviewer. All other authors report no conflicts of interest.
The Public Health England (PHE; United Kingdom) Real-Time Syndromic Surveillance Team (ReSST) currently operates four national syndromic surveillance systems, including an emergency department system. A system based on ambulance data might provide an additional measure of the “severe” end of the clinical disease spectrum. This report describes the findings and lessons learned from the development and preliminary assessment of a pilot syndromic surveillance system using ambulance data from the West Midlands (WM) region in England.
Is an Ambulance Data Syndromic Surveillance System (ADSSS) feasible and of utility in enhancing the existing suite of PHE syndromic surveillance systems?
An ADSSS was designed, implemented, and piloted from September 1, 2015 through March 1, 2016. Surveillance cases were defined as calls to the West Midlands Ambulance Service (WMAS) regarding patients who were assigned any of 11 specified chief presenting complaints (CPCs) during the pilot period. The WMAS collected anonymized data on cases and transferred the dataset daily to ReSST; the dataset contained anonymized information on patients’ demographics, the partial postcode of the patient’s location, and the CPC. The 11 CPCs covered a broad range of syndromes. The dataset was analyzed descriptively each week to determine trends and key epidemiological characteristics of patients, and an automated statistical algorithm was employed daily to detect higher-than-expected numbers of calls. A preliminary assessment was undertaken to assess the feasibility, utility (including quality of key indicators), and timeliness of the system for syndromic surveillance purposes. Lessons learned and challenges were identified and recorded during the design and implementation of the system.
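The daily exceedance detection described above amounts to flagging counts that are higher than a baseline expectation. The sketch below uses a generic mean-plus-two-standard-deviations rule with invented counts; it is not the ReSST detection algorithm itself.

```python
# Hedged sketch: flag a day's CPC call count that exceeds a simple baseline expectation.
import statistics

def exceeds_expected(history, today, z=2.0):
    """history: daily counts from a recent baseline window; today: the count to test."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return today > mean + z * sd

baseline = [41, 38, 45, 52, 47, 44, 40, 39, 43, 50, 46, 42, 48, 44]  # placeholder counts
print(exceeds_expected(baseline, today=68))  # True
```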
The pilot ADSSS collected 207,331 records of individual ambulance calls (daily mean=1,133; range=923-1,350). The ADSSS was found to be timely in detecting seasonal changes in patterns of respiratory infections and increases in case numbers during seasonal events.
Further validation is necessary; however, the findings from the assessment of the pilot ADSSS suggest that selected, but not all, ambulance indicators appear to have some utility for syndromic surveillance purposes in England. There are certain challenges that need to be addressed when designing and implementing similar systems.
Todkill D, Loveridge P, Elliot AJ, Morbey RA, Edeghere O, Rayment-Bishop T, Rayment-Bishop C, Thornes JE, Smith G. Utility of Ambulance Data for Real-Time Syndromic Surveillance: A Pilot in the West Midlands Region, United Kingdom. Prehosp Disaster Med. 2017;32(6):667–672.
Computerised cognitive–behavioural therapy (cCBT) for depression has the potential to be efficient therapy but engagement is poor in primary care trials.
We tested the benefits of adding telephone support to cCBT.
We compared telephone-facilitated cCBT (MoodGYM) (n = 187) to minimally supported cCBT (MoodGYM) (n = 182) in a pragmatic randomised trial (trial registration: ISRCTN55310481). Outcomes were depression severity (Patient Health Questionnaire (PHQ)-9), anxiety (Generalized Anxiety Disorder Questionnaire (GAD)-7) and somatoform complaints (PHQ-15) at 4 and 12 months.
Use of cCBT increased by a factor of between 1.5 and 2 with telephone facilitation. At 4 months, PHQ-9 scores were 1.9 points lower (95% CI 0.5–3.3) for telephone-supported cCBT. At 12 months, the results were no longer statistically significant (0.9 PHQ-9 points, 95% CI −0.5 to 2.3). There was improvement in anxiety scores and in somatic complaints.
Telephone facilitation of cCBT improves engagement and expedites depression improvement. The effect was small to moderate and comparable with other low-intensity psychological interventions.
In preparation for the London 2012 Olympic Games, existing syndromic surveillance systems operating in England were expanded to include daily general practitioner (GP) out-of-hours (OOH) contacts and emergency department (ED) attendances at sentinel sites (the GP OOH and ED syndromic surveillance systems: GPOOHS and EDSSS).
The further development of syndromic surveillance systems in time for the London 2012 Olympic Games provided a unique opportunity to investigate the impact of a large mass-gathering event on public health and health services as monitored in near real-time by syndromic surveillance of GP OOH contacts and ED attendances. This can, in turn, aid the planning of future events.
The EDSSS and GPOOHS data for London and England from July 13 to August 26, 2012, and a similar period in 2013, were divided into three distinct time periods: pre-Olympic period (July 13-26, 2012); Olympic period (July 27 to August 12, 2012); and post-Olympic period (August 13-26, 2012). Time series of selected syndromic indicators in 2012 and 2013 were plotted, compared, and risk assessed by members of the Real-time Syndromic Surveillance Team (ReSST) in Public Health England (PHE). Student’s t test was used to test any identified changes in patterns of attendance.
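The comparison described above can be illustrated with a two-sample Student's t test on daily counts. The values below are invented placeholders, not EDSSS or GPOOHS data.

```python
# Hedged sketch: two-sample Student's t test comparing daily attendances between years.
from scipy import stats

attendances_2012 = [112, 98, 105, 120, 131, 99, 118, 109, 125, 101]  # placeholder counts
attendances_2013 = [104, 96, 101, 110, 115, 94, 108, 100, 112, 97]   # placeholder counts

t_stat, p_value = stats.ttest_ind(attendances_2012, attendances_2013)
print(f"t = {t_stat:.2f}, P = {p_value:.4f}")
```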
Very few differences were found between years or between the weeks which preceded and followed the Olympics. One significant exception was noted: a statistically significant increase (P value = .0003) in attendances for “chemicals, poisons, and overdoses, including alcohol” and “acute alcohol intoxication” was observed in London EDs coinciding with the timing of the Olympic opening ceremony (9:00 pm July 27, 2012 to 1:00 am July 28, 2012).
Syndromic surveillance was able to provide near real-time monitoring and could identify hourly changes in patterns of presentation during the London 2012 Olympic Games. Reassurance can be provided to planners of future mass-gathering events that there was no discernible impact on overall attendances at sentinel EDs or GP OOH services in the host country. The increase in attendances for alcohol-related causes during the opening ceremony, however, may provide an opportunity for future public health interventions.
Todkill D, Hughes HE, Elliot AJ, Morbey RA, Edeghere O, Harcourt S, Hughes T, Endericks T, McCloskey B, Catchpole M, Ibbotson S, Smith G. An Observational Study Using English Syndromic Surveillance Data Collected During the 2012 London Olympics – What did Syndromic Surveillance Show and What Can We Learn for Future Mass-gathering Events? Prehosp Disaster Med. 2016;31(6):628–634.
In the present work, the microstructure and mechanical properties of Gilsocarbon graphite have been characterized over a range of length-scales. Optical imaging, combined with 3D X-ray computed tomography and 3D high-resolution tomography based on focused ion beam milling, has been adopted for microstructural characterization. A range of small-scale mechanical testing approaches has been applied, including an in situ micro-cantilever technique based in a Dualbeam workstation. It was found that pores ranging in size from nanometers to tens of micrometers in diameter are present, and these modify the deformation and fracture characteristics of the material. This multi-scale mechanical testing approach revealed a significant change in mechanical properties, for example flexural strength, of this graphite over length-scales from a micrometer to tens of centimeters. Such differences emphasize why input parameters to numerical models have to be obtained at the appropriate length-scale to allow predictions of the deformation, fracture, and stochastic features of the strength of the graphite with the required confidence. Finally, the results from a multi-scale model demonstrated that data derived from the micro-scale tests can be extrapolated, with high confidence, to large components with realistic dimensions.