Background: Adults are at risk of being exposed to influenza from many sources. Healthcare personnel (HCP) have the additional risk of being exposed to ill patients.
Objective: To determine whether HCP were at higher risk of influenza than adults working in nonhealthcare roles (non-HCP).
Design: Prospective cohort study.
Setting: Acute-care hospitals and other businesses in Toronto, Ontario, Canada.
Methods: Adults aged 18–69 years were enrolled for 1 or more of the 2010/2011, 2011/2012, and 2012/2013 influenza seasons. Swabs collected during acute respiratory illnesses were tested for influenza, and pre- and postseason blood samples were tested for influenza-specific immune response.
The adjusted odds of influenza were similar for HCP and non-HCP (odds ratio [OR], 1.29; 95% confidence interval [CI], 0.63–2.63). Older adults and those vaccinated against influenza had lower odds, and those who shared their workspace and who used corrective eyewear had higher odds of influenza.
HCP and other working adults are at similar risk of influenza infection.
Participants: Prescribers who wrote at least 1 antibiotic prescription filled at a retail pharmacy in Tennessee in 2016.
Methods: Multivariable logistic regression, including prescriber gender, birth decade, specialty, and practice location, as well as patient gender and age group, was used to determine associations with high prescribing.
In 2016, 7,949,816 outpatient oral antibiotic prescriptions were filled in Tennessee: 1,195 prescriptions per 1,000 total population. Moreover, 50% of Tennessee’s outpatient oral antibiotic prescriptions were written by 9.3% of prescribers. Specific specialties and prescriber types were associated with high prescribing: urology (odds ratio [OR], 3.249; 95% confidence interval [CI], 3.208–3.289), nurse practitioners (OR, 2.675; 95% CI, 2.658–2.692), dermatologists (OR, 2.396; 95% CI, 2.365–2.428), physician assistants (OR, 2.382; 95% CI, 2.364–2.400), and pediatric physicians (OR, 2.340; 95% CI, 2.320–2.361). Prescribers born in the 1960s were most likely to be high prescribers (OR, 2.574; 95% CI, 2.532–2.618). Prescribers in rural areas were more likely than prescribers in all other practice locations to be high prescribers. High prescribers were more likely to prescribe broader-spectrum antibiotics (P < .001).
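On the log-odds scale, an adjusted odds ratio and its Wald confidence interval are just the exponentiated regression coefficient and its error band. A minimal sketch of that back-transformation (the coefficient and standard error below are illustrative values chosen to roughly reproduce the urology estimate quoted above, not figures taken from the study):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Turn a logistic-regression coefficient (log-odds scale) and its
    standard error into an odds ratio with a 95% Wald confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical inputs: a coefficient of log(3.249) with SE 0.00636
# approximately matches the urology OR and CI reported above.
or_est, lo, hi = odds_ratio_ci(math.log(3.249), 0.00636)
print(f"OR {or_est:.3f} (95% CI, {lo:.3f}-{hi:.3f})")
```

The very narrow intervals in the abstract reflect the enormous sample of prescriptions; the same formula applies regardless of scale.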
Targeting high prescribers, independent of specialty, degree, practice location, age, or gender, may be the best strategy for implementing cost-conscious, effective outpatient antimicrobial stewardship interventions. More information about high prescribers, such as patient volumes, clinical scope, and specific barriers to intervention, is needed.
Objective: To determine infection prevention and control (IPAC) practices for carbapenemase-producing Enterobacteriaceae (CPE), an emerging threat, at acute-care hospitals in Ontario, Canada.
Design: A descriptive cross-sectional survey.
Methods: We surveyed IPAC directors and managers at all acute-care hospitals in Ontario, Canada, to gather information on IPAC practices related to CPE, including admission screening, other patient screening, environmental testing, use of precautions to prevent transmission, and outbreak management.
Of 116 acute-care hospitals, 105 (91%) responded. Admission screening included patients previously colonized or infected with CPE (n = 64, 61%), patients recently hospitalized outside of Canada (Indian subcontinent, n = 62, 59%; other countries, n = 56, 53%), and patients recently hospitalized in Canada (n = 22, 21%). Fifty-one hospitals (49%) screened patients for colonization during an outbreak. Almost all hospitals (n = 101, 96%) used precautions to prevent transmission from patients with CPE colonization or infection; most hospitals (n = 54, 53%) continued precautions indefinitely. Few hospitals (n = 19, 18%) performed environmental cultures. Eight hospitals (8%) reported at least 1 outbreak, and 6 hospitals (6%) reported transmission from sink or shower drains to patients.
Variability in practices may result from lack of evidence and challenges in updating guidelines as evidence emerges. A coordinated approach to slow the emergence of CPE should be considered in our population.
Healthcare workers (HCWs) are at risk of acquiring and transmitting respiratory viruses while working in healthcare settings.
Objective: To investigate the incidence of, and factors associated with, HCWs working during an acute respiratory illness (ARI).
Methods: HCWs from 9 Canadian hospitals were prospectively enrolled in active surveillance for ARI during the 2010–2011 to 2013–2014 influenza seasons. Daily illness diaries during ARI episodes collected information on symptoms and work attendance.
At least 1 ARI episode was reported by 50.4% of participants each study season. Overall, 94.6% of ill individuals reported working at least 1 day while symptomatic, resulting in an estimated 1.9 days of working while symptomatic and 0.5 days of absence per ARI per participant-season. In multivariable analysis, the adjusted relative risk of working while symptomatic was higher for physicians and lower for nurses relative to other HCWs. Participants were more likely to work if symptoms were less severe, and on the illness onset date compared with subsequent days. The most frequently cited reason for working while symptomatic was that symptoms were mild and the HCW felt well enough to work (67%). Younger participants and those without paid sick leave were more likely to state that they could not afford to stay home.
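The relative risks above are adjusted estimates from a multivariable model, but the unadjusted version of such a comparison is simple to compute: a ratio of two proportions, with a confidence interval formed on the log scale. A sketch using made-up counts (not the study's data):

```python
import math

def relative_risk(events1, n1, events2, n2, z=1.96):
    """Unadjusted relative risk (group 1 vs group 2) with a 95% CI
    computed on the log scale."""
    p1, p2 = events1 / n1, events2 / n2
    rr = p1 / p2
    se_log = math.sqrt((1 - p1) / events1 + (1 - p2) / events2)
    return (rr,
            math.exp(math.log(rr) - z * se_log),
            math.exp(math.log(rr) + z * se_log))

# Hypothetical counts: 90 of 100 physicians vs 80 of 100 other HCWs
# worked while symptomatic.
rr, lo, hi = relative_risk(90, 100, 80, 100)
print(f"RR {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```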
HCWs worked during most episodes of ARI, most often because their symptoms were mild. Further data are needed to understand how best to balance the costs and risks of absenteeism versus those associated with working while ill.
Cyber Operational Risk: Cyber risk is routinely cited as one of the most important sources of operational risk facing organisations today, in various publications and surveys. Further, in recent years, cyber risk has entered the public consciousness through highly publicised events involving affected UK organisations such as TalkTalk, Morrisons and the NHS. Regulators and legislators are increasing their focus on this topic, with the General Data Protection Regulation (“GDPR”) a notable example of this. Risk actuaries and other risk management professionals at insurance companies therefore need a robust assessment of the potential losses stemming from cyber risk that their organisations may face. They should be able to do this as part of an overall risk management framework and be able to demonstrate this to stakeholders such as regulators and shareholders. Given that cyber risks are still very much new territory for insurers and there is no commonly accepted practice, this paper describes a proposed framework in which to perform such an assessment. As part of this, we leverage two existing frameworks – the Chief Risk Officer (“CRO”) Forum cyber incident taxonomy, and the National Institute of Standards and Technology (“NIST”) framework – to describe the taxonomy of a cyber incident, and the relevant cyber security and risk mitigation items for the incident in question, respectively.
Summary of Results: Three detailed scenarios have been investigated by the working party:
∙Employee leaks data at a general (non-life) insurer: Internal attack through social engineering, causing large compensation costs and regulatory fines, driving a 1 in 200 loss of £210.5m (c. 2% of annual revenue).
∙Cyber extortion at a life insurer: External attack through social engineering, causing large business interruption and reputational damage, driving a 1 in 200 loss of £179.5m (c. 6% of annual revenue).
∙Motor insurer telematics device hack: External attack through software vulnerabilities, causing large remediation / device replacement costs, driving a 1 in 200 loss of £70.0m (c. 18% of annual revenue).
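The 1-in-200 figures above are expert scenario calibrations rather than model output, but the underlying quantity is the 99.5th percentile of an annual aggregate loss distribution. A generic frequency–severity Monte Carlo sketch of that percentile (every parameter here is an assumed illustration, not a calibration from this paper):

```python
import math
import random

random.seed(2024)

def poisson(lam):
    """Knuth's Poisson sampler (adequate for small lambda)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def one_in_200_loss(lam, mu, sigma, n_sims=20_000):
    """Estimate the 99.5th-percentile (1-in-200) annual loss under a
    Poisson-frequency / lognormal-severity model via Monte Carlo."""
    annual = sorted(
        sum(random.lognormvariate(mu, sigma) for _ in range(poisson(lam)))
        for _ in range(n_sims)
    )
    return annual[int(0.995 * n_sims)]

# Assumed calibration for illustration only: ~0.5 material incidents per
# year, median severity GBP 5m, heavy right tail (sigma = 1.5).
var_995 = one_in_200_loss(lam=0.5, mu=math.log(5e6), sigma=1.5)
print(f"1-in-200 annual loss ~ GBP {var_995 / 1e6:.0f}m")
```

In practice an insurer would calibrate frequency and severity to its own scenario narratives and data, and validate the tail against expert judgement.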
Limitations: The following are key limitations of the work presented in this paper:
∙While the presented scenarios are deemed material at this point in time, the threat landscape moves fast and could render specific narratives and calibrations obsolete within a short time frame.
∙There is a lack of historical data to base certain scenarios on and therefore a high level of subjectivity is used to calibrate them.
∙No attempt has been made to make an allowance for seasonality of renewals (a cyber event coinciding with peak renewal season could exacerbate cost impacts).
∙No consideration has been given to the impact of the event on the share price of the company.
∙Correlation with other risk types has not been explicitly considered.
Conclusions: Cyber risk is a very real threat and should not be ignored or treated lightly in operational risk frameworks, as it has the potential to threaten the ongoing viability of an organisation. Risk managers and capital actuaries should be aware of the various sources of cyber risk and their potential impacts, to ensure that the business is sufficiently prepared for such an event. When it comes to quantifying the impact of cyber risk on the operations of an insurer, there are significant challenges, not least that the threat landscape is ever-changing and there is little historical experience on which to base assumptions. Given this uncertainty, this paper sets out a framework with which readers can bring consistency to the way scenarios are developed over time. It provides a common taxonomy to ensure that key aspects of cyber risk are considered, and sets out examples of how to implement the framework. It is critical that insurers endeavour to understand cyber risk better and look to refine assumptions over time as new information is received. In addition to ensuring that sufficient capital is held for key operational risks, the investment in understanding cyber risk now will help to educate senior management and could have benefits through influencing internal cyber security capabilities.
The emergence and spread of extensively multidrug-resistant organisms is a public health crisis, and long-term care settings have been identified as a reservoir for the cultivation of these organisms. Long-term care settings are now taking on increasingly ill residents with complicated medical problems, indwelling devices, and significant healthcare exposure, all of which are considered risk factors selecting for resistant organisms. Despite this, guidelines addressing infection prevention procedures in long-term care remain vague, and implementation of these guidelines is challenging, largely due to staff turnover, limited resources, knowledge gaps, and lack of organizational support. Human factors engineering approaches have emerged as an important innovation to address patient safety issues and develop interventions in the healthcare work system (ie, tools and technologies, tasks, organization, physical environment) that support human performance, which, in turn, lead to improvements in processes (eg, compliance with infection prevention guidelines) and outcomes (eg, reduced infection rates). We propose the concept of using the methods and approaches from the scientific field of human factors engineering to address the unique challenges of implementing infection prevention in the long-term care setting.
Objective: To describe the process by which the 12 community-based primary health care (CBPHC) research teams worked together and fostered cross-jurisdictional collaboration, including collection of common indicators with the goal of using the same measures and data sources.
Background: A pan-Canadian mechanism for common measurement of the impact of primary care innovations is lacking. The Canadian Institutes of Health Research and its partners funded 12 teams to conduct research and collaborate on development of a set of commonly collected indicators.
A working group representing the 12 teams was established. They undertook an iterative process to consider existing primary care indicators identified from the literature and by stakeholders. Indicators were agreed upon with the intention of addressing three objectives across the 12 teams: (1) describing the impact of improving access to CBPHC; (2) examining the impact of alternative models of chronic disease prevention and management in CBPHC; and (3) describing the structures and context that influence the implementation, delivery, cost, and potential for scale-up of CBPHC innovations.
Nineteen common indicators within the core dimensions of primary care were identified: access, comprehensiveness, coordination, effectiveness, and equity. We also agreed to collect data on health care costs and utilization within each team. Data sources include surveys, health administrative data, interviews, focus groups, and case studies. Collaboration across these teams sets the foundation for a unique opportunity for new knowledge generation, over and above any knowledge developed by any one team. Keys to success are each team’s willingness to engage and commitment to working across teams, funding to support this collaboration, and distributed leadership across the working group. Reaching consensus on collection of common indicators is challenging but achievable.
Background: A previously healthy 26-year-old male presented with confusion and recurrent hypoglycemia (blood glucose lows of 2.5 mmol/L) while on vacation in Las Vegas. He denied substance or heavy alcohol use and the toxicology screen was negative. He was transferred home to Winnipeg for further care and was found to have only patchy memories of his trip and the days leading up to the trip, consistent with mixed anterograde and retrograde amnesia. MoCA score at presentation was 16/30, with points lost on orientation, delayed recall and visuospatial-executive tasks. MRI revealed T2 hyperintensities and diffusion abnormalities in the bilateral hippocampi and globi pallidi. Electroencephalography showed triphasic waves. The patient was found to have a pancreatic insulinoma, which was surgically resected. At follow-up nine weeks later he was near his cognitive baseline, though he had ongoing difficulties with delayed recall. Repeat MRI showed improvement but not resolution of the hippocampal and pallidal signal change, with mild hippocampal atrophy.
Neuropathological and animal studies have shown that the structures most sensitive to hypoglycemic neural injury include the hippocampus, basal ganglia, and neocortex. The clinical and radiographic findings in this case illustrate an unusual presentation of insulinoma and the effects of hypoglycemia on the brain.
Magnetic field measurements in turbulent plasmas are often difficult to perform. Here we show that for kG magnetic fields, a time-resolved Faraday rotation measurement can be made at the OMEGA laser facility. This diagnostic has been implemented using the Thomson scattering probe beam, and the resultant path-integrated magnetic field has been compared with that obtained from proton radiography. Accurate measurement of magnetic fields is essential for satisfying the scientific goals of many current laser–plasma experiments.
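For a probe beam crossing a magnetized plasma in the high-frequency limit, the Faraday rotation angle scales as the wavelength squared times the path integral of n_e B∥, with the familiar SI prefactor of roughly 2.63 × 10⁻¹³. A sketch of that estimate (the plasma parameters and the 263 nm probe wavelength below are assumptions for illustration, not values from this experiment):

```python
import math

E = 1.602176634e-19      # elementary charge [C]
EPS0 = 8.8541878128e-12  # vacuum permittivity [F/m]
M_E = 9.1093837015e-31   # electron mass [kg]
C = 2.99792458e8         # speed of light [m/s]

def faraday_rotation(wavelength, n_e, b_par, path_len):
    """Faraday rotation angle [rad] for a probe of the given wavelength [m]
    crossing a plasma with uniform electron density n_e [m^-3] and
    path-parallel magnetic field b_par [T] over path_len [m]
    (cold-plasma, high-frequency limit)."""
    prefactor = E**3 / (8 * math.pi**2 * EPS0 * M_E**2 * C**3)  # ~2.63e-13 SI
    return prefactor * wavelength**2 * n_e * b_par * path_len

# Assumed illustrative values: 263 nm probe, n_e = 1e25 m^-3,
# B_parallel = 5 T (50 kG), 1 mm path length.
phi = faraday_rotation(263e-9, 1e25, 5.0, 1e-3)
print(f"rotation angle ~ {phi:.2e} rad")
```

Inverting a measured rotation angle yields the path-integrated field, which is the quantity compared against proton radiography above.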
We explore morphological, kinematic and chemical trends of boxy/peanut (b/p) bulges of Milky Way (MW)-type galaxies, to better understand the formation history of the MW’s bulge. We show, using N-body simulations with both a kinematically cold and a kinematically hot disc, that colder populations develop a more prominent bar and X-shaped peanut than their hotter counterparts. Colder discs also exhibit lower line-of-sight velocities (when viewed edge-on) at the edges of the b/p than hot discs, in agreement with what is seen for the MW bulge. Furthermore, we explore an N-body model with three co-spatial discs whose metallicities correspond to the stellar populations of the inner Milky Way, where the α-enhanced thick-disc populations are massive and centrally concentrated. The metallicity trends seen in observations of the bulge can be reproduced in the model without the need to add any additional components, which hints at a disc origin for the MW’s bulge.
Haiti has the highest human rabies burden in the Western Hemisphere. There is no published literature describing the public's perceptions of rabies in Haiti, information that is critical to developing effective interventions and government policies. We conducted a knowledge, attitudes and practices survey of 550 community members and 116 health professionals in Pétionville, Haiti in 2013 to understand the perception of rabies in these populations. The majority of respondents (85%) knew that dogs were the primary reservoir for rabies, yet only 1% were aware that bats and mongooses could transmit rabies. Animal bites were recognized as a mechanism of rabies transmission by 77% of the population and 76% were aware that the disease could be prevented by vaccination. Of 172 persons reporting a bite, only 37% sought medical treatment. The annual bite incidence rate in respondents was 0·9%. Only 31% of bite victims reported that they started the rabies vaccination series. Only 38% of respondents reported that their dog had been vaccinated against rabies. The majority of medical professionals recognized that dogs were the main reservoir for rabies (98%), but only 28% reported bats and 14% reported mongooses as posing a risk for rabies infection. Bites were reported as a mechanism of rabies transmission by 73% of respondents; exposure to saliva was reported by 20%. Thirty-four percent of medical professionals reported they would wash a bite wound with soap and water and 2·8% specifically mentioned rabies vaccination as a component of post-bite treatment. The majority of healthcare professionals recommended some form of rabies assessment for biting animals; 68·9% recommended a 14-day observation period, 60·4% recommended a veterinary consultation, and 13·2% recommended checking the vaccination status of the animal. 
Fewer than 15% of healthcare professionals had ever received training on rabies prevention and 77% did not know where to go to procure rabies vaccine for bite victims. Both study populations had a high level of knowledge about the primary reservoir for rabies and the mode of transmission. However, there is a need to improve the level of knowledge regarding the importance of seeking medical care for dog bites and additional training on rabies prevention for healthcare professionals. Distribution channels for rabies vaccines should be evaluated, as the majority of healthcare providers did not know where rabies vaccines could be obtained. Canine rabies vaccination is the primary intervention for rabies control programmes, yet most owned dogs in this population were not vaccinated.
Local abiotic and biotic conditions can alter the strength of exotic species impacts. To better understand the effects of exotic species on invaded ecosystems and to prioritize management efforts, it is important that exotic species impacts are put in local environmental context. We studied how differences in plant community composition, photosynthetically active radiation (PAR), and available soil N associated with Russian olive presence are conditioned by local environmental variation within a western U.S. riparian ecosystem. In four sites along the South Fork of the Republican River in Colorado, we established 200 pairs of plots (underneath and apart from Russian olive) to measure the effects of invasion across the ecosystem. We used a series of a priori mixed models to identify environmental variables that altered the effects of Russian olive. For all response variables, models that included the interaction of environmental characteristics, such as presence/absence of an existing cottonwood canopy, with the presence/absence of Russian olive canopy were stronger candidate models than those that just included Russian olive canopy presence as a factor. Compared with reference plots outside of Russian olive canopy, plots underneath Russian olive had higher relative exotic cover (exotic/total cover), lower perennial C4 grass cover, and higher perennial forb cover. These effects were reduced, however, in the presence of a cottonwood canopy. As expected, Russian olive was associated with reduced PAR and increased N, but these effects were reduced under cottonwood canopy. Our results demonstrate that local abiotic and biotic environmental factors condition the effects of Russian olive within a heterogeneous riparian ecosystem and suggest that management efforts should be focused in open areas where Russian olive impacts are strongest.
Objective: To measure transmission frequencies and risk factors for household acquisition of community-associated and healthcare-associated (HA-) methicillin-resistant Staphylococcus aureus (MRSA).
Design: Prospective cohort study from October 4, 2008, through December 3, 2012.
Setting: Seven acute-care hospitals in or near Toronto, Canada.
Participants: A total of 99 MRSA-colonized or MRSA-infected case patients and 183 household contacts.
Methods: Baseline interviews were conducted, and surveillance cultures were collected monthly for 3 months from household members, pets, and 8 prespecified high-use environmental locations. Isolates underwent pulsed-field gel electrophoresis and staphylococcal cassette chromosome mec typing.
Overall, 89 of 183 household contacts (49%) were MRSA colonized, with 56 (31%) detected at baseline. MRSA transmission from the index case to contacts who were negative at baseline occurred in 27 (40%) of 68 followed-up households. Strains were identical within households. The transmission risk for HA-MRSA was 39%, compared with 40% (P=.95) for community-associated MRSA. HA-MRSA index cases were more likely to be older and not to practice infection control measures (P=.002–.03). Household acquisition risk factors included requiring assistance and sharing bath towels (P=.001–.03). Environmental contamination was identified in 78 (79%) of 99 households and was more common in HA-MRSA households.
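The comparison of transmission risks (39% vs 40%, P = .95) is the kind of result a two-proportion test produces. A sketch with hypothetical counts chosen only to mimic the reported proportions (the study's actual denominators are not reproduced here):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions,
    using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts giving ~39% vs ~40% transmission risk.
z, pval = two_proportion_z(13, 33, 14, 35)
print(f"z = {z:.3f}, two-sided P = {pval:.2f}")
```

A p-value near 1, as reported, simply means the observed difference is tiny relative to its sampling error.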
Household transmission of community-associated and HA-MRSA strains was common and the difference in transmission risk was not statistically significant.
The superheating that usually occurs when a solid is melted by volumetric heating can produce irregular solid–liquid interfaces. Such interfaces can be visualised in ice, where they are sometimes known as Tyndall stars. This paper describes some of the experimental observations of Tyndall stars and a mathematical model for the early stages of their evolution. The modelling is complicated by the strong crystalline anisotropy, which results in an anisotropic kinetic undercooling at the interface; it leads to an interesting class of free boundary problems that treat the melt region as infinitesimally thin.
We present the initial performance of the Gaia Radial Velocity Spectrometer, which is essentially nominal in terms of spectral resolution, throughput and operation, except for the presence of unexpectedly high levels of scattered background light. This background is mainly solar in origin and reduces the limiting magnitude for radial velocity measurements by ∼1 magnitude, to V ∼ 16. Radial velocity calibration accuracies are compliant with requirements.
Post-traumatic stress disorder (PTSD) in response to the World Trade Center (WTC) disaster of 11 September 2001 (9/11) is one of the most prevalent and persistent health conditions among both professional (e.g. police) and non-traditional (e.g. construction worker) WTC responders, even several years after 9/11. However, little is known about the dimensionality and natural course of WTC-related PTSD symptomatology in these populations.
Data were analysed from 10 835 WTC responders, including 4035 police and 6800 non-traditional responders who were evaluated as part of the WTC Health Program, a clinic network in the New York area established by the National Institute for Occupational Safety and Health. Confirmatory factor analyses (CFAs) were used to evaluate structural models of PTSD symptom dimensionality; and autoregressive cross-lagged (ARCL) panel regressions were used to examine the prospective interrelationships among PTSD symptom clusters at 3, 6 and 8 years after 9/11.
CFAs suggested that five stable symptom clusters best represent PTSD symptom dimensionality in both police and non-traditional WTC responders. This five-factor model was also invariant over time with respect to factor loadings and structural parameters, thereby demonstrating its longitudinal stability. ARCL panel regression analyses revealed that hyperarousal symptoms had a prominent role in predicting other symptom clusters of PTSD, with anxious arousal symptoms primarily driving re-experiencing symptoms, and dysphoric arousal symptoms primarily driving emotional numbing symptoms over time.
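The cross-lagged logic can be made concrete with standardized regression algebra: for an outcome regressed on two standardized predictors, the path weights follow directly from the pairwise correlations. A toy sketch of one such path, re-experiencing at wave t+1 regressed on anxious arousal and re-experiencing at wave t (all correlation values below are invented for illustration, not estimates from this study):

```python
def standardized_betas(r1y, r2y, r12):
    """Standardized regression weights for an outcome y regressed on two
    standardized predictors x1, x2, given the correlations r(x1,y),
    r(x2,y), and r(x1,x2)."""
    denom = 1.0 - r12 ** 2
    return (r1y - r12 * r2y) / denom, (r2y - r12 * r1y) / denom

# Invented correlations: anxious arousal (t) with re-experiencing (t+1),
# re-experiencing (t) with re-experiencing (t+1), and the concurrent
# correlation between the two wave-t predictors.
b_cross, b_auto = standardized_betas(0.55, 0.60, 0.65)
print(f"cross-lagged beta {b_cross:.3f}, autoregressive beta {b_auto:.3f}")
```

A nonzero cross-lagged weight after controlling for the autoregressive path is what supports statements such as "anxious arousal drives re-experiencing over time".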
Results of this study suggest that disaster-related PTSD symptomatology in WTC responders is best represented by five symptom dimensions. Anxious arousal symptoms, which are characterized by hypervigilance and exaggerated startle, may primarily drive re-experiencing symptoms, while dysphoric arousal symptoms, which are characterized by sleep disturbance, irritability/anger and concentration difficulties, may primarily drive emotional numbing symptoms over time. These results underscore the importance of assessment, monitoring and early intervention of hyperarousal symptoms in WTC and other disaster responders.
Longitudinal symptoms of post-traumatic stress disorder (PTSD) are often characterized by heterogeneous trajectories, which may have unique pre-, peri- and post-trauma risk and protective factors. To date, however, no study has evaluated the nature and determinants of predominant trajectories of PTSD symptoms in World Trade Center (WTC) responders.
A total of 10 835 WTC responders, including 4035 professional police responders and 6800 non-traditional responders (e.g. construction workers) who participated in the WTC Health Program (WTC-HP), were evaluated an average of 3, 6 and 8 years after the WTC attacks.
Among police responders, longitudinal PTSD symptoms were best characterized by four classes, with the majority (77.8%) in a resistant/resilient trajectory and the remainder exhibiting chronic (5.3%), recovering (8.4%) or delayed-onset (8.5%) symptom trajectories. Among non-traditional responders, a six-class solution was optimal, with fewer responders in a resistant/resilient trajectory (58.0%) and the remainder exhibiting recovering (12.3%), severe chronic (9.5%), subsyndromal increasing (7.3%), delayed-onset (6.7%) and moderate chronic (6.2%) trajectories. Prior psychiatric history, Hispanic ethnicity, severity of WTC exposure and WTC-related medical conditions were most strongly associated with symptomatic trajectories of PTSD symptoms in both groups of responders, whereas greater education and family and work support while working at the WTC site were protective against several of these trajectories.
Trajectories of PTSD symptoms in WTC responders are heterogeneous and associated uniquely with pre-, peri- and post-trauma risk and protective factors. Police responders were more likely than non-traditional responders to exhibit a resistant/resilient trajectory. These results underscore the importance of prevention, screening and treatment efforts that target high-risk disaster responders, particularly those with prior psychiatric history, high levels of trauma exposure and work-related medical morbidities.