Preparing for and responding to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE) is a national need. These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of CBRNE science as a distinct competency and the establishment of the CBRNE medical operations science support expert inform the public of the enormous progress made, broadcast opportunities for new talent, and enhance the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Increased post-traumatic stress disorder (PTSD) rates have been documented in children exposed to war. However, the contributions of childhood adversities and environmental sensitivity to children's responses to adversities and trauma are still far from settled.
To evaluate the relative roles of war, childhood adversities and sensitivity in the genesis of PTSD.
Data on childhood adversities and sensitivity were collected from 549 Syrian refugee children in Lebanon. PTSD symptoms were assessed using the PTSD Reaction Index.
Although childhood adversities, war events and sensitivity were all significantly related to PTSD in bivariate analyses, multivariate analyses showed that childhood adversities were the most important variable in predicting PTSD. The effect of war on PTSD was found to be dependent on the interplay between childhood adversities and sensitivity, and was most prominent in highly sensitive children with lower levels of adversities; in sensitive children experiencing high levels of adversities, the effects of war exposure on PTSD were less pronounced.
When considering the effects of war on PTSD in refugee children, it is important to take account of the presence of other adversities as well as of children's sensitivity. Sensitive children may be more vulnerable to the negative effects of war exposure, but only in contexts that are characterised by low childhood adversities.
Smoking is the largest single contributor to poor physical health and increased mortality in people with serious mental illnesses. The aim of the study was to investigate the utility of electronic cigarettes (e-cigarettes) as a harm reduction intervention in this population.
Fifty tobacco smokers with a psychotic disorder were enrolled onto a 24-week pilot study (ClinicalTrials.gov: NCT02212041) investigating the efficacy of a 6-week free e-cigarette intervention to reduce smoking. Cigarette and e-cigarette use was self-reported at weekly visits, and verified using carbon monoxide tests. Psychopathology, e-cigarette acceptability and adverse effects were assessed using standardised scales.
There was a significant (⩾50%) reduction in cigarettes consumed per day between baseline and week 6 [F(2.596,116.800) = 25.878, p < 0.001], and e-cigarette use was stable during this period [F(2.932,46.504) = 2.023, p = 0.115]. These changes were verified by significant carbon monoxide reductions between these time points [F(3.335,126.633) = 5.063, p = 0.002].
The provision of e-cigarettes is a potentially useful harm reduction intervention in smokers with a psychotic disorder.
The commercial release of crops with engineered resistance to 2,4-D and dicamba will alter the spatial and temporal use of these herbicides. This, in turn, has elicited concerns about off-target injury to sensitive crops. In 2014 and 2015, studies were conducted in Tifton, GA, to describe how herbicide (2,4-D and dicamba), herbicide rate (1/75 and 1/250 of the field use rate), and application timing (20, 40, and 60 days after planting [DAP]) influence watermelon injury, vine development, yield, and the accumulation of herbicide residues in marketable fruit. In general, greater visual injury and reductions in vine growth, relative to the non-treated check, were observed when herbicide applications were made before watermelon plants had begun to flower. Although the main effects of herbicide and rate were less influential than the timing of applications with respect to plant development, the 1/75 rates were more injurious than the 1/250 rates; dicamba was more injurious than 2,4-D. In 2014, the 1/75 and 1/250 rates of each herbicide reduced marketable fruit numbers 13 to 20%, but only for the 20 DAP application. The 1/75 rate of each herbicide when applied at either 20 or 40 DAP reduced the number of fruit harvested per plot in 2015. Dicamba residues were detected in marketable fruit when the 1/75 rate in 2014 and 2015 and the 1/250 rate in 2015 were applied to plants at 40 or 60 DAP. Residues of 2,4-D were detected in 2015 when the 1/75 and 1/250 rates were applied at 60 DAP. Across both years, the maximum level of residue detected was 0.030 ppm. While early season injury may reduce watermelon yields, herbicide residue detection is more likely in marketable fruit when an off-target contact incident occurs closer to harvest.
Primary voters are frequently characterized as an ideologically extreme subset of their party, and thus partially responsible for party polarization in government. This study uses a combination of administrative records on primary turnout and five recent surveys from 2008–14 to show that primary voters have demographic attributes and policy attitudes similar to those of rank-and-file voters in their party. These similarities do not vary according to the openness of the primary. These results suggest that the composition of primary electorates does not exert a polarizing effect above what might arise from voters in the party as a whole.
Few studies have focussed on the health and immunity of triploid Atlantic salmon and therefore much is still unknown about their response to commercially significant pathogens. This is important if triploid stocks are to be considered for full-scale commercial production. This study aimed to investigate and compare the response of triploid and diploid Atlantic salmon to an experimental challenge with Neoparamoeba perurans, causative agent of amoebic gill disease (AGD). This disease is economically significant for the aquaculture industry. The results indicated that ploidy had no significant effect on gross gill score or gill filaments affected, while infection and time had significant effects. Ploidy, infection and time did not affect complement or anti-protease activities. Ploidy had a significant effect on lysozyme activity at 21 days post-infection (while infection and time did not), although activity was within the ranges previously recorded for salmonids. Stock did not significantly affect any of the parameters measured. Based on the study results, it can be suggested that ploidy does not affect the manifestation or severity of AGD pathology or the serum innate immune response. Additionally, the serum immune response of diploid and triploid Atlantic salmon may not be significantly affected by amoebic gill disease.
Obsessive-compulsive disorder (OCD) is associated with variable risk of suicide and prevalence of suicide attempt (SA). The present study aimed to assess the prevalence of SA and associated sociodemographic and clinical features in a large international sample of OCD patients.
A total of 425 OCD outpatients, recruited through the International College of Obsessive-Compulsive Spectrum Disorders (ICOCS) network, were assessed and categorized in groups with or without a history of SA, and their sociodemographic and clinical features compared through Pearson’s chi-squared and t tests. Logistic regression was performed to assess the impact of the collected data on the SA variable.
A total of 14.6% of our sample reported at least one SA during their lifetime. Patients with an SA had significantly higher rates of comorbid psychiatric disorders (60 vs. 17%, p<0.001; particularly tic disorder), medical disorders (51 vs. 15%, p<0.001), and previous hospitalizations (62 vs. 11%, p<0.001) than patients with no history of SA. With respect to geographical differences, European and South African patients showed significantly higher rates of SA history (40 and 39%, respectively) compared to North American and Middle-Eastern individuals (13 and 8%, respectively) (χ2=11.4, p<0.001). The logistic regression did not show any statistically significant predictor of SA among selected independent variables.
Our international study found a history of SA prevalence of ~15% in OCD patients, with higher rates of psychiatric and medical comorbidities and previous hospitalizations in patients with a previous SA. Along with potential geographical influences, the presence of the abovementioned features should recommend additional caution in the assessment of suicide risk in OCD patients.
Substantial policy, communication and operational gaps exist between mental health services and the police for individuals with enduring mental health needs.
To map and cost pathways through mental health and police services, and to model the cost impact of implementing key policy recommendations.
Within a case-linkage study, we estimated 1-year individual-level healthcare and policing costs. Using decision modelling, we then estimated the potential impact on costs of three recommended service enhancements: street triage, Mental Health Act assessments for all Section 136 detainees and outreach custody link workers.
Under current care, average 1-year mental health and police costs were £10 812 and £4552 per individual, respectively (n = 55). The cost per police incident was £522. Models suggested that each service enhancement would alter per incident costs by between −8% and +6%.
Recommended enhancements to care pathways only marginally increase individual-level costs.
Hip and knee arthroplasty infections are associated with considerable healthcare costs. The merits of reducing the postoperative surveillance period from 1 year to 90 days have been debated.
To report the first pan-Canadian hip and knee periprosthetic joint infection (PJI) rates and to describe the implications of a shorter (90-day) postoperative surveillance period.
Prospective surveillance for infection following hip and knee arthroplasty was conducted by hospitals participating in the Canadian Nosocomial Infection Surveillance Program (CNISP) using standard surveillance definitions.
Overall hip and knee PJI rates were 1.64 and 1.52 per 100 procedures, respectively. Deep incisional and organ-space hip and knee PJI rates were 0.96 and 0.71, respectively. In total, 93% of hip PJIs and 92% of knee PJIs were identified within 90 days, with a median time to detection of 21 days. However, 11%–16% of deep incisional and organ-space infections were not detected within 90 days. This rate was reduced to 3%–4% at 180 days post procedure. Anaerobic and polymicrobial infections had the shortest median time from procedure to detection (17 and 18 days, respectively) compared with infections due to other microorganisms, including Staphylococcus aureus.
PJI rates were similar to those reported elsewhere, although differences in national surveillance systems limit direct comparisons. Our results suggest that a postoperative surveillance period of 90 days will detect the majority of PJIs; however, up to 16% of deep incisional and organ-space infections may be missed. Extending the surveillance period to 180 days could allow for a better estimate of disease burden.
Seclusion may be harmful and traumatic to patients, detrimental to therapeutic relationships, and can result in physical injury to staff. Further, strategies to reduce seclusion have been identified as a potential method of improving cost-effectiveness of psychiatric services. However, developing alternative strategies to seclusion can be difficult. Interventions to reduce seclusion do not lend themselves to evaluation using randomized controlled trials (RCTs), though comprehensive literature reviews have demonstrated considerable non-RCT evidence for interventions to reduce seclusion in psychiatric facilities. In the UK, a recent 5-year evaluation of seclusion practice in a high secure hospital revealed reduced rates of seclusion without an increase in adverse incidents. To assess the effect of a novel intervention strategy for reduction of long-term segregation on a high secure, high dependency forensic psychiatry ward in the UK, we introduced a pilot program involving stratified levels of seclusion (“long-term segregation”), multidisciplinary feedback and information sharing, and a bespoke occupational therapy program. Reduced seclusion was demonstrated and staff feedback was mainly positive, indicating increased dynamism and empowerment on the ward. A more structured, stratified approach to seclusion, incorporating multidisciplinary team-working, senior administrative involvement, dynamic risk assessment, and bespoke occupational therapy may lead to a more effective model of reducing seclusion in high secure hospitals and other psychiatric settings. While lacking an evidence base at the level of RCTs, innovative, pragmatic strategies are likely to have an impact at a clinical level and should guide future practice and research.
Consistently large differences occur in the calibrated 14C ages of stratigraphically associated shell and charcoal samples from Kilometer 4, an Archaic Period archaeological site located on the extreme south coast of Peru. A series of nine shell and charcoal samples were collected from a Late Archaic Period (~6000–4000 BP) sector of the site. After calibration, the intercepts of the charcoal dates were ~100–750 years older than the paired shell samples. Due to the hyper-arid conditions in this region that promote long-term preservation of organic material, we argue that the older charcoal dates are best explained by people using old wood for fuel during the Middle Holocene. Given this “old wood” problem, marine shell may actually be preferable to wood charcoal for dating archaeological sites in coastal desert environments as in southern Peru and Northern Chile.
We have refined marine reservoir age estimates for eastern Pacific coastal waters with radiocarbon measurements of mollusk shells collected prior to 1950. We have also investigated interspecific variability in 14C ages for historic and ancient shells from San Francisco Bay.
The spatial and temporal distribution of 145 radiocarbon dates on 66 Australian stick-nest rat middens (Muridae: Leporillus spp.) ranges from modern to 10,900 ± 90 BP. As in American packrat middens, age frequency follows a logarithmic decay, both continentally and at major sites. This is probably a result of natural decay processes. Unlike American middens of similar age, relatively few range changes in plant distribution have been detected in Australia. The distribution of 14C ages and the associated midden materials provide important paleoenvironmental information from the arid interior of Australia. The middens record subtle changes in vegetation and dramatic changes in the fauna unlike those interpreted from sites on the coastal rim or the southeastern periphery of the arid zone.
We demonstrate variable radiocarbon content within 2 historic (AD 1936) and 2 prehistoric (about 8200 BP and 3500 BP) Mytilus californianus shells from the Santa Barbara Channel region, California, USA. Historic specimens from the mainland coast exhibit a greater range of intrashell variability (i.e. 180–240 14C yr) than archaeological specimens from Daisy Cave on San Miguel Island (i.e. 120 14C yr in both shells). δ13C and δ18O profiles are in general agreement with the upwelling of deep ocean water depleted in 14C as a determinant of local marine reservoir correction (ΔR) in the San Miguel Island samples. Upwelling cycles are difficult to identify in the mainland specimens, where intrashell variations in 14C content may be a complex product of oceanic mixing and periodic seasonal inputs of 14C-depleted terrestrial runoff. Though the mechanisms controlling ΔR at subannual to annual scales are not entirely clear, the fluctuations represent significant sources of random dating error in marine environments, particularly if a small section of shell is selected for accelerator mass spectrometry (AMS) dating. For maximum precision and accuracy in AMS dating of marine shells, we recommend that archaeologists, paleontologists, and 14C lab personnel average out these variations by sampling across multiple increments of growth.
Public agencies at all levels of government and other organizations that manage archaeological resources often face the problem of many undertakings that collectively impact large numbers of individually significant archaeological resources. Such situations arise when an agency is managing a large area, such as a national forest, land management district, park unit, wildlife refuge, or military installation. These situations also may arise in regard to large-scale development projects, such as energy developments, highways, reservoirs, transmission lines, and other major infrastructure projects that cover substantial areas. Over time, the accumulation of impacts from small-scale projects to individual archaeological resources may degrade landscape or regional-scale cultural phenomena. Typically, these impacts are mitigated at the site level without regard to how the impacts to individual resources affect the broader population of resources. Actions to mitigate impacts rarely are designed to do more than avoid resources or ensure some level of data recovery at single sites. Such mitigation activities are incapable of addressing research questions at a landscape or regional scale.
To explore whether surgical teams with greater stability among their members (ie, members have worked together more in the past) experience lower rates of sharps-related percutaneous blood and body fluid exposures (BBFE) during surgical procedures.
A 10-year retrospective cohort study.
A single large academic teaching hospital.
Surgical teams participating in surgical procedures (n=333,073) performed during 2001–2010, along with 2,113 reported percutaneous BBFE, were analyzed.
A social network measure (referred to as the team stability index) was used to quantify the extent to which surgical team members worked together in the previous 6 months. Poisson regression was used to examine the effect of team stability on the risk of BBFE while controlling for procedure characteristics and accounting for procedure duration. Separate regression models were generated for percutaneous BBFE involving suture needles and those involving other surgical devices.
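The abstract does not give the formula for the team stability index, so as a hedged illustration only, one plausible pairwise formulation is the fraction of member pairs on the current team who worked together on at least one procedure in the prior window (the names and the exact definition below are hypothetical, not taken from the study):

```python
from itertools import combinations

def team_stability_index(team, prior_procedures):
    """Hypothetical formulation: fraction of member pairs in the current
    team that appeared together on at least one procedure in the prior
    window (e.g., the previous 6 months)."""
    pairs = list(combinations(sorted(team), 2))
    if not pairs:
        return 0.0
    # Collect every pair of personnel who co-worked in the prior window.
    seen = {frozenset(p)
            for proc in prior_procedures
            for p in combinations(sorted(proc), 2)}
    together = sum(1 for p in pairs if frozenset(p) in seen)
    return together / len(pairs)

# Illustrative history: two prior procedures with hypothetical staff IDs.
history = [{"A", "B", "C"}, {"B", "D"}]
print(team_stability_index({"A", "B", "D"}, history))  # 2 of 3 pairs co-worked
```

Under such a formulation, the resulting index would then enter the Poisson model described above as a covariate, with log procedure duration as an offset.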
The team stability index was associated with the risk of percutaneous BBFE (adjusted rate ratio, 0.93 [95% CI, 0.88–0.97]). However, the association was stronger for percutaneous BBFE involving devices other than suture needles (adjusted rate ratio, 0.92 [95% CI, 0.85–0.99]) than for exposures involving suture needles (0.96 [0.88–1.04]).
Greater team stability may reduce the risk of percutaneous BBFE during surgical procedures, particularly for exposures involving devices other than suture needles. Additional research should be conducted on the basis of primary data gathered specifically to measure qualities of relationships among surgical team personnel.
To use a unique multicomponent administrative data set assembled at a large academic teaching hospital to examine the risk of percutaneous blood and body fluid (BBF) exposures occurring in operating rooms.
A 10-year retrospective cohort design.
A single large academic teaching hospital.
All surgical procedures (n=333,073) performed in 2001–2010, as well as 2,113 reported BBF exposures, were analyzed.
Crude exposure rates were calculated; Poisson regression was used to analyze risk factors and account for procedure duration. BBF exposures involving suture needles were examined separately from those involving other device types to examine possible differences in risk factors.
The overall rate of reported BBF exposures was 6.3 per 1,000 surgical procedures (2.9 per 1,000 surgical hours). BBF exposure rates increased with estimated patient blood loss (17.7 exposures per 1,000 procedures with 501–1,000 cc blood loss and 26.4 exposures per 1,000 procedures with >1,000 cc blood loss), number of personnel working in the surgical field during the procedure (34.4 exposures per 1,000 procedures having ≥15 personnel ever in the field), and procedure duration (14.3 exposures per 1,000 procedures lasting 4 to <6 hours, 27.1 exposures per 1,000 procedures lasting ≥6 hours). Regression results showed associations were generally stronger for suture needle–related exposures.
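The headline crude rate follows directly from the cohort counts reported above; as a quick back-of-the-envelope check (not a calculation from the paper itself):

```python
# Counts reported in the study cohort
exposures = 2113        # reported BBF exposures
procedures = 333_073    # surgical procedures, 2001-2010

# Crude rate per 1,000 procedures
rate_per_1000 = exposures / procedures * 1000
print(f"{rate_per_1000:.1f} BBF exposures per 1,000 procedures")  # 6.3
```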
Results largely support other studies found in the literature. However, additional research should investigate differences in risk factors for BBF exposures associated with suture needles and those associated with all other device types.