Laboratory identification of carbapenem-resistant Enterobacteriaceae (CRE) is a key step in controlling their spread. Our survey showed that most Veterans Affairs laboratories follow VA guidelines for initial CRE identification, while 55.0% use PCR to confirm carbapenemase production. Most respondents were knowledgeable about CRE guidelines. Barriers included staffing, training, and financial resources.
Cyber Operational Risk: Cyber risk is routinely cited, in various publications and surveys, as one of the most important sources of operational risk facing organisations today. Further, in recent years, cyber risk has entered the public consciousness through highly publicised events involving affected UK organisations such as TalkTalk, Morrisons and the NHS. Regulators and legislators are increasing their focus on this topic, with the General Data Protection Regulation (“GDPR”) a notable example. Risk actuaries and other risk management professionals at insurance companies therefore need a robust assessment of the potential losses stemming from cyber risk that their organisations may face. They should be able to do this as part of an overall risk management framework and be able to demonstrate this to stakeholders such as regulators and shareholders. Given that cyber risks are still very much new territory for insurers and there is no commonly accepted practice, this paper describes a proposed framework in which to perform such an assessment. As part of this, we leverage two existing frameworks – the Chief Risk Officer (“CRO”) Forum cyber incident taxonomy, and the National Institute of Standards and Technology (“NIST”) framework – to describe the taxonomy of a cyber incident, and the relevant cyber security and risk mitigation items for the incident in question, respectively.

Summary of Results: Three detailed scenarios have been investigated by the working party:
∙ Employee leaks data at a general (non-life) insurer: Internal attack through social engineering, causing large compensation costs and regulatory fines, driving a 1 in 200 loss of £210.5m (c. 2% of annual revenue).
∙ Cyber extortion at a life insurer: External attack through social engineering, causing large business interruption and reputational damage, driving a 1 in 200 loss of £179.5m (c. 6% of annual revenue).
∙ Motor insurer telematics device hack: External attack through software vulnerabilities, causing large remediation / device replacement costs, driving a 1 in 200 loss of £70.0m (c. 18% of annual revenue).
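As a rough consistency check, the annual revenue implied by each scenario can be backed out from the quoted 1 in 200 loss and its stated share of revenue. A minimal sketch (the shares are approximate "c." figures, so the results are indicative only):

```python
# Back out the implied revenue base for each scenario: revenue = loss / share.
# Figures are taken from the bullets above; shares are approximate.
scenarios = {
    "data leak (general insurer)":     (210.5e6, 0.02),
    "cyber extortion (life insurer)":  (179.5e6, 0.06),
    "telematics hack (motor insurer)": (70.0e6, 0.18),
}

for name, (loss_gbp, revenue_share) in scenarios.items():
    implied_revenue = loss_gbp / revenue_share
    print(f"{name}: implied annual revenue ~ £{implied_revenue / 1e9:.2f}bn")
```

This makes clear why the same order of loss can represent very different levels of severity: the motor insurer's £70.0m loss is the smallest in absolute terms but by far the largest relative to its revenue base.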
Limitations: The following are key limitations of the work presented in this paper:
∙ While the presented scenarios are deemed material at this point in time, the threat landscape moves fast and could render specific narratives and calibrations obsolete within a short time frame.
∙ There is a lack of historical data on which to base certain scenarios, so a high level of subjectivity is used to calibrate them.
∙ No attempt has been made to allow for seasonality of renewals (a cyber event coinciding with peak renewal season could exacerbate cost impacts).
∙ No consideration has been given to the impact of the event on the share price of the company.
∙ Correlation with other risk types has not been explicitly considered.
Conclusions: Cyber risk is a very real threat and should not be ignored or treated lightly in operational risk frameworks, as it has the potential to threaten the ongoing viability of an organisation. Risk managers and capital actuaries should be aware of the various sources of cyber risk and their potential impacts to ensure that the business is sufficiently prepared for such an event. Quantifying the impact of cyber risk on the operations of an insurer poses significant challenges, not least that the threat landscape is ever changing and there is a lack of historical experience on which to base assumptions. Given this uncertainty, this paper sets out a framework with which readers can bring consistency to the way scenarios are developed over time. It provides a common taxonomy to ensure that key aspects of cyber risk are considered and sets out examples of how to implement the framework. It is critical that insurers endeavour to understand cyber risk better and refine assumptions over time as new information is received. In addition to ensuring that sufficient capital is held for key operational risks, investing in understanding cyber risk now will help to educate senior management and could have benefits through influencing internal cyber security capabilities.
Cognitive behavioral therapy (CBT) is an effective treatment for many patients suffering from major depressive disorder (MDD), but predictors of treatment outcome are lacking, and little is known about its neural mechanisms. We recently identified longitudinal changes in neural correlates of conscious emotion regulation that scaled with clinical responses to CBT for MDD, using a negative autobiographical memory-based task.
We now examine the neural correlates of emotional reactivity and emotion regulation during viewing of emotionally salient images as predictors of treatment outcome with CBT for MDD, and the relationship between longitudinal change in functional magnetic resonance imaging (fMRI) responses and clinical outcomes. Thirty-two participants with current MDD underwent baseline MRI scanning followed by 14 sessions of CBT. The fMRI task measured emotional reactivity and emotion regulation on separate trials using standardized images from the International Affective Pictures System. Twenty-one participants completed post-treatment scanning. Last observation carried forward was used to estimate clinical outcome for non-completers.
Pre-treatment blood-oxygen-level-dependent (BOLD) signal during emotional reactivity within the hippocampus, including CA1, predicted worse treatment outcome. In contrast, better treatment outcome was associated with increased down-regulation of BOLD activity during emotion regulation from time 1 to time 2 in the precuneus, occipital cortex, and middle frontal gyrus.
CBT may modulate the neural circuitry of emotion regulation. The neural correlates of emotional reactivity may be more strongly predictive of CBT outcome. The finding that treatment outcome was predicted by BOLD signal in CA1 may suggest overgeneralized memory as a negative prognostic factor in CBT outcome.
Although most hospitals report very high levels of hand hygiene compliance (HHC), the accuracy of these overtly observed rates is questionable due to the Hawthorne effect and other sources of bias. In this study, we aimed (1) to compare HHC rates estimated using the standard audit method of overt observation by a known observer and a new audit method that employed a rapid (<15 minutes) “secret shopper” approach and (2) to pilot test a novel feedback tool.
Quality improvement project using a quasi-experimental stepped-wedge design.
This study was conducted in 5 acute-care hospitals (17 wards, 5 intensive care units) in the Midwestern United States.
Sites recruited a hand hygiene observer from outside the acute-care units to rapidly and covertly observe entry and exit HHC during the study period, October 2016–September 2017. After 3 months of observations, sites received a monthly feedback tool that communicated HHC information from the new audit method.
The absolute difference in HHC estimates between the standard and new audit methods was ~30%. No significant differences in HHC were detected between the baseline and feedback phases (OR, 0.92; 95% CI, 0.84–1.01), but the standard audit method had significantly higher estimates than the new audit method (OR, 9.83; 95% CI, 8.82–10.95).
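The odds ratios above come from the study's regression models. Purely as an illustration of how an odds ratio and its Wald confidence interval are formed from observation counts, the sketch below uses an invented 2×2 table (all counts are hypothetical, not the study's data):

```python
import math

# Hypothetical observation counts for two audit methods.
overt_clean, overt_missed = 900, 100    # standard audit: 90% compliance observed
covert_clean, covert_missed = 600, 400  # secret shopper: 60% compliance observed

# Odds ratio of observed compliance, standard audit vs secret shopper.
or_est = (overt_clean * covert_missed) / (overt_missed * covert_clean)

# Wald 95% CI on the log-odds-ratio scale.
se_log_or = math.sqrt(1 / overt_clean + 1 / overt_missed +
                      1 / covert_clean + 1 / covert_missed)
lo = math.exp(math.log(or_est) - 1.96 * se_log_or)
hi = math.exp(math.log(or_est) + 1.96 * se_log_or)
print(f"OR = {or_est:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An OR well above 1 with a CI excluding 1, as in the reported comparison (OR, 9.83; 95% CI, 8.82–10.95), indicates the standard audit method records substantially higher compliance odds than the covert method.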
HHC estimates obtained using the new audit method were substantially lower than estimates obtained using the standard audit method, suggesting that the rapid, secret-shopper method is less subject to bias. Providing feedback using HHC from the new audit method did not seem to impact HHC behaviors.
We evaluated the utility of vancomycin-resistant Enterococcus (VRE) surveillance by varying 2 parameters: admission versus weekly surveillance and perirectal swabbing versus stool sampling.
Prospective, patient-level surveillance program of incident VRE colonization.
Liver transplant surgical intensive care unit (SICU) of a tertiary-care referral medical center with a high prevalence of VRE.
All patients admitted to the SICU from June to August 2015.
We conducted a point-prevalence estimate followed by admission and weekly surveillance by perirectal swabbing and/or stool sampling. Incident colonization was defined as a negative screen followed by positive surveillance. VRE was detected by culture on Remel Spectra VRE chromogenic agar. Microbiologically-confirmed VRE bloodstream infections (BSIs) were tracked for 2 months. Statistical analyses were calculated using the McNemar test, the Fisher exact test, the t test, and the χ2 test.
In total, 91 patients underwent VRE surveillance testing. The point prevalence of VRE colonization was 60.9%; VRE prevalence on admission was 30.1%. Weekly surveillance identified an additional 7 of 28 patients (25.0%) with incident colonization. VRE BSIs were more common in VRE-colonized patients than in noncolonized patients (8 of 43 vs 2 of 48; P=.028). In a direct comparison, perirectal swabs were more sensitive than stool samples in detecting VRE (64 of 67 vs 56 of 67; P=.023). Compliance with perirectal swabbing was 89% (201 of 226) compared to 56% (127 of 226) for stool collection (P≤.001).
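The colonization-versus-BSI comparison can be reproduced from the reported counts with a two-sided Fisher exact test. A minimal sketch using SciPy, assuming the 2×2 table is arranged as BSI vs no BSI by colonization status:

```python
from scipy.stats import fisher_exact

# Counts from the abstract: 8 of 43 VRE-colonized patients developed a BSI,
# vs 2 of 48 noncolonized patients.
table = [[8, 43 - 8],   # VRE-colonized: BSI, no BSI
         [2, 48 - 2]]   # noncolonized:  BSI, no BSI

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, P = {p_value:.3f}")
```

The p-value lands below .05, consistent with the reported P=.028.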
We recommend weekly VRE surveillance over admission-only screening in high-burden units such as liver transplant SICUs. Perirectal swabs had greater collection compliance and sensitivity than stool samples, making them the preferred methodology. Further work may have implications for antimicrobial stewardship and infection control.
To evaluate the impact of discontinuing routine contact precautions (CP) for endemic methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE) on hospital adverse events.
Academic medical center with single-occupancy rooms.
We compared hospital reportable adverse events 1 year before and 1 year after discontinuation of routine CP for endemic MRSA and VRE (preintervention and postintervention periods, respectively). Throughout the preintervention period, daily chlorhexidine gluconate bathing was expanded to nearly all inpatients. Chart reviews were performed to identify which patients and events were associated with CP for MRSA/VRE in the preintervention period as well as the patients that would have met prior criteria for MRSA/VRE CP but were not isolated in the postintervention period. Adverse events during the 2 periods were compared using segmented and mixed-effects Poisson regression models.
There were 24,732 admissions in the preintervention period and 25,536 in the postintervention period. Noninfectious adverse events (ie, postoperative respiratory failure, hemorrhage/hematoma, thrombosis, wound dehiscence, pressure ulcers, and falls or trauma) decreased by 19% (12.3 to 10.0 per 1,000 admissions, P=.022) from the preintervention to the postintervention period. There was no significant difference in the rate of infectious adverse events after CP discontinuation (20.7 to 19.4 per 1,000 admissions, P=.33). Patients with MRSA/VRE showed the largest reduction in noninfectious adverse events after CP discontinuation, with a 72% reduction (21.4 to 6.08 per 1,000 MRSA/VRE admissions; P<.001).
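The quoted percentage reductions can be checked directly against the underlying rates; a minimal sketch:

```python
# Verify the percentage reductions quoted above from the per-1,000 rates.
def pct_reduction(before, after):
    """Percent decline from a baseline rate to a follow-up rate."""
    return 100 * (before - after) / before

print(f"{pct_reduction(12.3, 10.0):.0f}%")   # noninfectious AEs, all admissions
print(f"{pct_reduction(21.4, 6.08):.0f}%")   # noninfectious AEs, MRSA/VRE patients
```

Both recomputed values match the 19% and 72% reductions reported in the results.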
After discontinuing routine CP for endemic MRSA/VRE, the rate of noninfectious adverse events declined, especially in patients who no longer required isolation. This suggests that elimination of CP may substantially reduce noninfectious adverse events.
From 2000 to 2009, rates of multidrug-resistant Acinetobacter baumannii increased 10-fold to 0.2 per 1,000 patient days. From 2010 to 2015, however, rates markedly declined and have stayed below 0.05 per 1,000 patient days. Herein, we present a 15-year trend analysis and discuss interventions that may have led to the decline.
Rain-on-snow events trigger immediate and delayed avalanches as liquid water penetrates the snowpack. We present results from an extreme rain-on-snow event that triggered a glide avalanche near Snoqualmie Pass, Washington, USA. Snoqualmie Pass recorded 463 cm of snowfall from 13 December 2008 to 6 January 2009. This period of snowfall was followed by a strong southwesterly tropical flow that resulted in an extreme rain-on-snow event. Sensors at Snoqualmie Pass recorded 285 mm of precipitation over a 52 hour period. Flooding, slush flows, landslides and avalanches resulted from the influx of precipitation. Snow heights decreased rapidly over the period, with settlement rates approaching 80 mm h⁻¹. Liquid water infiltrated and flowed through the snowpack within a few hours of the arrival of rain, yet many of the major avalanches occurred 12–30 or more hours after the onset of rain and water outflow. A glide avalanche occurred ∼30 hours after the onset of rain and the establishment of drainage through the snowpack. Increasing glide rates correlate with periods of rapid snow settlement. Here glide rates approached 670 mm h⁻¹. Although glide and settlement rates increased during periods of intense precipitation, glide failure occurred 8 hours after peak precipitation and outflow.
We assess the gas-phase abundances of Si, C, and Fe from our recent measurements of Si++, C++, and Fe++ in the Orion Nebula by expanding on our earlier “blister” models. The Fe++ 22.9 μm line measured with the KAO yields Fe/H ~ 3 × 10⁻⁶, considerably larger than in the diffuse ISM, where, relative to solar, Fe/H is down by ~ 100. However, in Orion, Fe/H is still lower than solar by a factor ~ 10. The C and Si abundances are derived from new IUE high-dispersion spectra of the C++ 1907, 1909 Å and Si++ 1883, 1892 Å lines. Gas-phase Si/C = 0.016 in the Orion ionized volume and is particularly insensitive to uncertainties in extinction and temperature structure. The solar value is 0.098. Gas-phase C/H = 3 × 10⁻⁴ and Si/H = 4.8 × 10⁻⁶. Compared to solar, Si is depleted by a factor of 0.135 in the ionized region, while C is essentially undepleted. This suggests that most Si and Fe resides in dust grains even in the ionized volume.
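The quoted Si/C ratio and Si depletion factor can be reproduced from the gas-phase abundances given assumed solar reference values. In the sketch below, the solar Si/H and C/H are era-appropriate photospheric estimates chosen for illustration, not values taken from the paper itself:

```python
# Gas-phase abundances from the text.
si_h_gas, c_h_gas = 4.8e-6, 3e-4

# Assumed solar reference abundances (illustrative, not from the paper).
si_h_sun, c_h_sun = 3.55e-5, 3.6e-4

si_c_gas = si_h_gas / c_h_gas       # gas-phase Si/C ratio
si_depletion = si_h_gas / si_h_sun  # fraction of solar Si left in the gas
c_depletion = c_h_gas / c_h_sun     # fraction of solar C left in the gas

print(f"gas-phase Si/C = {si_c_gas:.3f}")
print(f"Si depletion factor = {si_depletion:.3f}")
print(f"C depletion factor = {c_depletion:.2f}")
```

With these reference values the gas-phase Si/C of 0.016 and Si depletion of 0.135 fall out directly, while C remains near unity, i.e. essentially undepleted.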
We apply a 2-D, axisymmetric code for modeling H II regions (Rubin, Ap. J., 287, 653, 1984) to observations of the Orion Nebula. The model solves for the ionization and thermal structure and radiative transfer for the quasi-equilibrium volume. Assuming that the Orion Nebula is viewed face-on (along the symmetry axis) and that the geometry/density distribution is plane parallel with an exponential density gradient perpendicular to the slab, we use a χ² minimization technique to best fit the radio continuum maps. The best fit to the Schraml and Mezger map (Astrophys. J., 156, 269, 1969) has a density at the star of ∼1800 cm⁻³, a scale height of ∼0.23 pc, and ∼1.5 × 10⁴⁹ ionizing photons s⁻¹, so that ∼1/3 of the ionizing photons from the exciting source are escaping the nebula through the frontal density-bounded direction. Our model for Orion requires circular symmetry in the plane of the sky; nonsymmetrical features such as the ionization bar toward the SE cannot be reproduced. Further modeling that compares with line observations has been delayed to incorporate the important role played by recombinations in populating low-lying [O II] levels (Rubin 1985, Astrophys. J., submitted).
To examine variation in antibiotic coverage and detection of resistant pathogens in community-onset pneumonia.
A total of 128 hospitals in the Veterans Affairs health system.
Hospitalizations with a principal diagnosis of pneumonia from 2009 through 2010.
We examined proportions of hospitalizations with empiric antibiotic coverage for methicillin-resistant Staphylococcus aureus (MRSA) and Pseudomonas aeruginosa (PAER) and with initial detection in blood or respiratory cultures. We compared lowest- versus highest-decile hospitals, and we estimated adjusted probabilities (AP) for patient- and hospital-level factors predicting coverage and detection using hierarchical regression modeling.
Among 38,473 hospitalizations, empiric coverage varied widely across hospitals (MRSA lowest vs highest, 8.2% vs 42.0%; PAER lowest vs highest, 13.9% vs 44.4%). Detection rates also varied (MRSA lowest vs highest, 0.5% vs 3.6%; PAER lowest vs highest, 0.6% vs 3.7%). Whereas coverage was greatest among patients with recent hospitalizations (AP for anti-MRSA, 54%; AP for anti-PAER, 59%) and long-term care (AP for anti-MRSA, 60%; AP for anti-PAER, 66%), detection was greatest in patients with a previous history of a positive culture (AP for MRSA, 7.9%; AP for PAER, 11.9%) and in hospitals with a high prevalence of the organism in pneumonia (AP for MRSA, 3.9%; AP for PAER, 3.2%). Low hospital complexity and rural setting were strong negative predictors of coverage but not of detection.
Hospitals demonstrated widespread variation in both coverage and detection of MRSA and PAER, but probability of coverage correlated poorly with probability of detection. Factors associated with empiric coverage (eg, healthcare exposure) were different from those associated with detection (eg, microbiology history). Providing microbiology data during empiric antibiotic decision making could better align coverage to risk for resistant pathogens and could promote more judicious use of broad-spectrum antibiotics.
The purpose of this study was to quantify the effect of multidrug-resistant (MDR) gram-negative bacteria and methicillin-resistant Staphylococcus aureus (MRSA) healthcare-associated infections (HAIs) on mortality following infection, regardless of patient location.
We conducted a retrospective cohort study of patients with an inpatient admission in the US Department of Veterans Affairs (VA) system between October 1, 2007, and November 30, 2010. We constructed multivariate log-binomial regressions to assess the impact of a positive culture on mortality in the 30- and 90-day periods following the first positive culture, using a propensity-score–matched subsample.
Patients identified with positive cultures due to MDR Acinetobacter (n=218), MDR Pseudomonas aeruginosa (n=1,026), and MDR Enterobacteriaceae (n=3,498) were propensity-score matched to 14,591 patients without positive cultures due to these organisms. In addition, 3,471 patients with positive cultures due to MRSA were propensity-score matched to 12,499 patients without positive MRSA cultures. Multidrug-resistant gram-negative bacteria were associated with a significantly elevated risk of mortality both for invasive (RR, 2.32; 95% CI, 1.85–2.92) and noninvasive cultures (RR, 1.33; 95% CI, 1.22–1.44) during the 30-day period. Similarly, patients with MRSA HAIs (RR, 2.77; 95% CI, 2.39–3.21) and colonizations (RR, 1.32; 95% CI, 1.22–1.50) had an increased risk of death at 30 days.
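The relative risks above come from log-binomial regressions on propensity-score-matched samples. As an illustration of the RR measure itself (not the study's actual model), the sketch below computes a relative risk and a log-scale Wald confidence interval from a hypothetical matched 2×2 table:

```python
import math

# Hypothetical 30-day mortality counts in a matched cohort (invented data).
exposed_deaths, exposed_total = 120, 1000   # positive-culture group
control_deaths, control_total = 52, 1000    # matched controls

# Relative risk: ratio of the two mortality proportions.
rr = (exposed_deaths / exposed_total) / (control_deaths / control_total)

# Wald 95% CI on the log-RR scale.
se = math.sqrt(1 / exposed_deaths - 1 / exposed_total +
               1 / control_deaths - 1 / control_total)
lo = math.exp(math.log(rr) - 1.96 * se)
hi = math.exp(math.log(rr) + 1.96 * se)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An RR above 1 with a CI excluding 1, as in the reported invasive-culture estimate (RR, 2.32; 95% CI, 1.85–2.92), indicates significantly elevated mortality in the exposed group.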
We found that HAIs due to gram-negative bacteria and MRSA conferred significantly elevated 30- and 90-day risks of mortality. This finding held true both for invasive cultures, which are likely to be true infections, and noninvasive infections, which are possibly colonizations.
Environmental exposures during pregnancy may increase breast cancer risk for mothers and female offspring. Tumor tissue assays may provide insight regarding the mechanisms. This study assessed the feasibility of obtaining tumor samples and pathology reports from mothers (F0) who were enrolled in the Child Health and Development Studies during pregnancy from 1959 to 1967 and their daughters (F1) who developed breast cancer over more than 50 years of follow-up. Breast cancer cases were identified through linkage to the California Cancer Registry and self-report. Written consent was obtained from 116 F0 and 95 F1 breast cancer survivors to access their pathology reports and tumor blocks. Of those contacted, 62% consented, 13% refused, and 24% did not respond. We obtained tissue samples for 57% and pathology reports for 75%; when the diagnosis had been made within the previous 10 years, we obtained tissue samples and pathology reports for 91% and 79%, respectively. Obtaining pathology reports and tumor tissues from two generations is feasible and will support investigation of the relationship between early-life exposures and molecular tumor markers. However, we found that more recent diagnosis increased the accessibility of tumor tissue. We recommend that cohorts request consent for obtaining future tumor tissues at study enrollment and implement real-time tissue collection to enhance success in collecting tumor samples and data.
In recent years, a series of large-scale, high-profile natural disasters and terrorist attacks have demonstrated the need for thorough and effective disaster preparedness. While these extreme events affect communities and societies as a whole, they also carry specific risks for particular population groups. Crises such as Hurricane Katrina and the 2011 earthquake and tsunami disaster in Japan have illustrated the risk of significant and disproportionate morbidity and mortality among older adults during disasters. Age does not necessarily equate to vulnerability, but many physical and psychological consequences of the aging process can increase the risk of adverse outcomes. As the older population grows, so too does the need to ensure that adequate, practical, and appropriate measures exist to offset the specific risks from extreme events associated with this subpopulation. Effective risk and crisis communication plays a key role in mitigating the extent to which older adults are differentially affected during extreme events. By identifying the specific issues affecting older adults, this review highlights important areas for action for practitioners and policy-makers, particularly in the realm of crisis communication. (Disaster Med Public Health Preparedness. 2017;11:127–134)
To evaluate the impact of discontinuation of contact precautions (CP) for methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE) and expansion of chlorhexidine gluconate (CHG) use on the health system.
We compared hospital-wide laboratory-identified clinical culture rates (as a marker of healthcare-associated infections) 1 year before and after routine CP for endemic MRSA and VRE were discontinued and CHG bathing was expanded to all units. Culture data from patients and cost data on material utilization were collected. Nursing time spent donning personal protective equipment was assessed and quantified using time-driven activity-based costing.
Average positive culture rates before and after discontinuing CP were 0.40 and 0.32 cultures/100 admissions for MRSA (P=.09) and 0.48 and 0.40 cultures/100 admissions for VRE (P=.14). When combining isolation gown and CHG costs, the health system saved $643,776 in 1 year. Before the change, 28.5% of intensive care unit beds and 19% of medicine/surgery beds were on CP for MRSA/VRE. On the basis of average room entries and donning time, estimated nursing time spent donning personal protective equipment for MRSA/VRE before the change was 45,277 hours/year (estimated cost, $4.6 million).
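The nursing-time estimate above follows the logic of time-driven activity-based costing: beds on CP × room entries per day × donning time per entry. A minimal sketch with entirely hypothetical per-unit figures (the study derived its own averages, which are not reported here):

```python
# Time-driven activity-based costing sketch. All per-unit figures are
# hypothetical; only the CP bed percentages come from the abstract.
icu_beds, ward_beds = 100, 400          # hypothetical bed counts
beds_on_cp = 0.285 * icu_beds + 0.19 * ward_beds

entries_per_bed_per_day = 30            # hypothetical room entries
donning_minutes = 1.0                   # hypothetical time per entry
nurse_cost_per_hour = 100.0             # hypothetical loaded labor rate

hours_per_year = (beds_on_cp * entries_per_bed_per_day *
                  donning_minutes / 60 * 365)
annual_cost = hours_per_year * nurse_cost_per_hour
print(f"~{hours_per_year:,.0f} nurse-hours/year, ~${annual_cost / 1e6:.1f}M/year")
```

Even with modest per-entry times, the donning burden compounds across thousands of daily room entries, which is how the study arrives at a figure in the tens of thousands of hours per year.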
Discontinuing routine CP for endemic MRSA and VRE did not result in increased rates of MRSA or VRE after 1 year. With cost savings on materials, decreased healthcare worker time, and no concomitant increase in possible infections, elimination of routine CP may add substantial value to inpatient care delivery.
Twenty-eight ¹⁴C analyses are reported for carbonized roots and other plant material collected from beneath 15 prehistoric lava flows erupted from the northeast rift zone (NERZ) of Mauna Loa Volcano (ML), utilizing the recovery techniques of Lockwood and Lipman (1980). Most samples were collected from the Hilo 7½′ quadrangle during field work for a geologic map of that quadrangle (Buchanan-Banks, unpub. data); a few sample sites are located in adjacent quadrangles: Piihonua to the west and Mountain View to the south. Altitudes are given in English units as well as metric to facilitate locating sites on USGS topographic maps.
Dates in this list have been determined at the U.S. Geological Survey radiocarbon laboratory, Washington, since our 1960 date list (USGS V). Procedures for the preparation of acetylene gas used in the counting, and the method of counting (two days in two separate counters), remain unchanged. However, the modern standard used is no longer wood grown in the 19th century, but 95% of the activity of the NBS oxalic-acid radiocarbon standard, as recommended at the 1959 Groningen Radiocarbon Conference. Measurement of the oxalic-acid standard at our laboratory indicates 6.2 ± 1% more ¹⁴C activity than our modern wood standard, so use of the new standard should make no appreciable difference when comparing samples computed by the old method. W. F. Libby's (1955) half-life average for ¹⁴C, 5568 ± 30 years, was used for the decay equation.
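The decay equation with the Libby half-life can be written out directly; a minimal sketch of the conventional-age calculation from a sample's activity ratio:

```python
import math

# Libby half-life used for the decay equation, as stated above.
LIBBY_HALF_LIFE = 5568.0  # years

def radiocarbon_age(activity_ratio):
    """Conventional age in years from the sample/modern 14C activity ratio:
    t = (T_half / ln 2) * ln(A_modern / A_sample)."""
    return (LIBBY_HALF_LIFE / math.log(2)) * math.log(1.0 / activity_ratio)

# A sample retaining half its modern activity dates to exactly one half-life.
print(radiocarbon_age(0.5))
```

This is the uncalibrated (conventional) age; calibration against tree-ring or other records is a separate step not covered by the decay equation itself.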
Ninety-six new ¹⁴C dates are reported for carbonized roots and other plant material collected from beneath prehistoric lava flows and ash deposits from Mauna Loa (ML) and Kilauea volcanoes. Before 1976, only 10 flows from these volcanoes had been dated by radiocarbon methods. Collection of dateable material has been facilitated by an improved understanding of the conditions of charcoal formation and preservation beneath basaltic lavas (Lockwood & Lipman, 1979).