Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
To compare risk of surgical site infection (SSI) following cesarean delivery between women covered by Medicaid and private health insurance.
Cesarean deliveries covered by Medicaid or private insurance and reported to the National Healthcare Safety Network (NHSN) and state inpatient discharge databases by hospitals in California (2011–2013).
Deliveries reported to NHSN and state inpatient discharge databases were linked to identify SSIs in the 30 days following cesarean delivery, primary payer, and patient and procedure characteristics. Additional hospital-level characteristics were obtained from public databases. Relative risk of SSI by primary payer was assessed using multivariable logistic regression adjusting for patient, procedure, and hospital characteristics, accounting for facility-level clustering.
Of 291,757 cesarean deliveries included, 48% were covered by Medicaid. SSIs were detected following 1,055 deliveries covered by Medicaid (0.75%) and 955 deliveries covered by private insurance (0.63%) (unadjusted odds ratio, 1.2; 95% confidence interval [CI], 1.1–1.3; P < .0001). The adjusted odds of SSI following cesarean deliveries covered by Medicaid was 1.4 (95% CI, 1.2–1.6; P < .0001) times the odds of those covered by private insurance.
In this, the largest and only multicenter study to investigate SSI risk following cesarean delivery by primary payer, Medicaid-insured women had a higher risk of infection than privately insured women. These findings suggest the need to evaluate and better characterize the quality of maternal healthcare for and needs of women covered by Medicaid to inform targeted infection prevention and policy.
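As a rough consistency check, the unadjusted odds ratio reported above can be reproduced from the abstract's figures. The group denominators below are reconstructed from the stated 48% Medicaid share of 291,757 deliveries and are assumptions for illustration, not the study's exact group sizes; a minimal Python sketch:

```python
import math

total = 291_757
medicaid_deliveries = round(total * 0.48)          # reconstructed, not the study's exact count
private_deliveries = total - medicaid_deliveries
medicaid_ssi, private_ssi = 1_055, 955             # SSI counts reported in the abstract

# Odds = events / non-events in each payer group
odds_medicaid = medicaid_ssi / (medicaid_deliveries - medicaid_ssi)
odds_private = private_ssi / (private_deliveries - private_ssi)
odds_ratio = odds_medicaid / odds_private          # ≈ 1.2, matching the reported unadjusted OR

# Wald 95% CI on the log-odds-ratio scale
se = math.sqrt(1 / medicaid_ssi + 1 / (medicaid_deliveries - medicaid_ssi)
               + 1 / private_ssi + 1 / (private_deliveries - private_ssi))
ci_lo = math.exp(math.log(odds_ratio) - 1.96 * se)  # ≈ 1.1
ci_hi = math.exp(math.log(odds_ratio) + 1.96 * se)  # ≈ 1.3
```

The reconstructed counts reproduce the reported OR of 1.2 (95% CI, 1.1–1.3); the adjusted estimate of 1.4 cannot be recovered without the patient-level covariates.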
There is renewed interest in the inverse association between psychiatric hospital and prison places, with reciprocal time trends shown in more than one country. We hypothesised that the numbers of admissions to psychiatric hospitals and committals to prisons in Ireland would also correlate inversely over time (i.e. dynamic measures of admission and committal rather than static, cross-sectional numbers of places).
Publicly available activity statistics for psychiatric hospitals and prisons in Ireland were collated from 1986 to 2010.
There was a reciprocal association between psychiatric admissions and prison committals (Pearson r=−0.788, p<0.001), with an increase of 91 prison committals for every 100 psychiatric hospital admissions foregone.
Penrose’s hypothesis applies to admissions to psychiatric hospitals and prisons in Ireland over time (dynamic measures), just as it does to the numbers of places in psychiatric hospitals and prisons in Ireland and elsewhere (static, cross-sectional measures). Although no causal connection can be definitively established yet, mentally disordered prisoners are usually known to community mental health services. Psychiatric services for prisons and the community should be linked to ensure that the needs of those currently accessing care through prisons can also be met in the community.
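The Pearson coefficient underlying the result above can be computed directly. The series below are hypothetical stand-ins with an inverse trend, since the actual 1986–2010 Irish activity statistics are not reproduced here; a minimal sketch:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical annual admissions vs. committals (inverse trend, for illustration only)
admissions = [30_000, 27_000, 24_000, 21_000, 18_000]
committals = [5_000, 7_600, 10_500, 13_000, 15_800]
r = pearson_r(admissions, committals)   # strongly negative, as in the study's r = −0.788
```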
Life span bias potentially alters species abundance in death assemblages through the overrepresentation of short-lived organisms compared with their long-lived counterparts. Although previous work found that life span bias did not contribute significantly to live–dead discordance in bivalve assemblages, life span bias better explained discordance in two groups: longer-lived bivalve species and species with known life spans. More studies using local, rather than global, species-wide life spans and mortality rates would help to determine the prevalence of life span bias, especially for long-lived species with known life spans. Here, we conducted a field study at two sites in North Carolina to assess potential life span bias between Mercenaria mercenaria and Chione elevata, two long-lived bivalve species that can be aged directly. We compared the ability of directly measured local life spans with that of regional and global life spans to predict live–dead discordance between these two species. The shorter-lived species (C. elevata) was overrepresented in the death assemblage compared with its live abundance, and local life span data largely predicted the amount of live–dead discordance; local life spans predicted 43% to 88% of discordance. Furthermore, the global maximum life span for M. mercenaria resulted in substantial overpredictions of discordance (1.4 to 1.6 times the observed live–dead discordance). The results of this study suggest that life span bias should be considered as a factor affecting proportional abundances of species in death assemblages and that using life span estimates appropriate to the study locality improves predictions of discordance based on life span compared with using global life span estimates.
As the US population ages, the number of hip and knee arthroplasties is expected to increase. Because surgical site infections (SSIs) following these procedures contribute substantial morbidity, mortality, and costs, we projected SSIs expected to occur from 2020 through 2030.
We used a stochastic Poisson process to project the number of primary and revision arthroplasties and SSIs. Primary arthroplasty rates were calculated using annual estimates of hip and knee arthroplasty stratified by age and gender from the 2012–2014 Nationwide Inpatient Sample and standardized by census population data. Revision rates, dependent on time from primary procedure, were obtained from published literature and were uniformly applied for all ages and genders. Stratified complex SSI rates for arthroplasties were obtained from 2012–2015 National Healthcare Safety Network data. To evaluate the possible impact of prevention measures, we recalculated the projections with an SSI rate reduced by 30%, the national target established by the US Department of Health and Human Services (HHS).
Without a reduction in SSI rates, we projected an increase in complex SSIs following hip and knee arthroplasty of 14% between 2020 and 2030. We projected a total burden of 77,653 SSIs; however, meeting the 30% rate reduction could prevent 23,297 of these SSIs.
Given current SSI rates, we project that complex SSI burden for primary and revision arthroplasty may increase due to an aging population. Reducing the SSI rate to the national HHS target could prevent 23,000 SSIs and reduce subsequent morbidity, mortality, and Medicare costs.
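The prevention estimate above is the 30% HHS reduction target applied to the projected 2020–2030 burden. The total is taken from the abstract; the sketch below simply applies the target to that total (the abstract's 23,297 presumably comes from summing unrounded year-by-year projections, whereas rounding the total directly gives 23,296):

```python
projected_ssis = 77_653      # projected complex SSIs, 2020-2030 (from the abstract)
reduction_target = 0.30      # HHS national reduction target

prevented = round(projected_ssis * reduction_target)   # ≈ 23,300 SSIs avoided
remaining = projected_ssis - prevented                 # burden if the target is met
```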
Foodborne non-typhoidal salmonellosis causes approximately 1 million illnesses annually in the USA. In April 2015, we investigated a multistate outbreak of 65 Salmonella Paratyphi B variant L(+) tartrate(+) infections associated with frozen raw tuna imported from Indonesia, which was consumed raw in sushi. Forty-six (92%) of 50 case-patients interviewed ate sushi during the week before illness onset, and 44 (98%) of 45 who specified ate sushi containing raw tuna. Two outbreak strains were isolated from the samples of frozen raw tuna. Traceback identified a single importer as a common source of tuna consumed by case-patients; this importer issued three voluntary recalls of tuna sourced from one Indonesian processor. Four Salmonella Weltevreden infections were also linked to this outbreak. Whole-genome sequencing was useful in establishing a link between Salmonella isolated from ill people and tuna. This outbreak highlights the continuing foodborne illness risk associated with raw seafood consumption, the importance of processing seafood in a manner that minimises contamination with pathogenic microorganisms and the continuing need to ensure imported foods are safe to eat. People at higher risk for foodborne illness should not consume undercooked animal products, such as raw seafood.
Structured, empirically supported psychological interventions are lacking for patients who require organ transplantation. This stage IA psychotherapy development project developed an 8-week group cognitive behavioral stress management intervention adapted for patients with end-stage liver disease awaiting liver transplantation and tested its feasibility, acceptability, tolerability, and preliminary efficacy.
Twenty-nine English-speaking United Network for Organ Sharing–registered patients with end-stage liver disease from a single transplantation center enrolled in an 8-week group cognitive-behavioral liver stress management and relaxation training intervention adapted for patients with end-stage liver disease. Patients completed pre- and postintervention surveys that included the Beck Depression Inventory II and the Beck Anxiety Inventory. Feasibility, acceptability, tolerability, and preliminary efficacy were assessed.
Attendance rate was 69.40%. The intervention was rated as “good” to “excellent” by 100% of participants who completed the postintervention survey in teaching them new skills to relax and to cope with stress, and by 94.12% of participants in helping them feel supported while waiting for a liver transplant. No adverse events were recorded over the course of treatment. Attrition was 13.79%. Anxious and depressive symptoms were not statistically different after the intervention.
Significance of results
The liver stress management and relaxation training intervention is feasible, acceptable, and tolerable to end-stage liver disease patients within a transplant clinic setting. Anxious and depressive symptoms remained stable postintervention. Randomized controlled trials are needed to study the intervention's effectiveness in this population.
Bovine herpes virus 1 (BHV-1) establishes a latent viral infection in cattle. Understanding its effect on cattle herds is critical to maintaining sustainable beef and dairy production systems, as well as aiding in the development of herd health policies. The primary objective of the current study was, therefore, to use a whole-farm bio-economic model to evaluate the effect of herd seroprevalence to BHV-1 on the productive and economic performance of a spring calving beef cow herd. As part of a wider epidemiological study of herd pathogen status, a total of 4240 cows from 134 spring calving beef cow herds across the Republic of Ireland were blood sampled to measure seroprevalence to BHV-1. Using data from a national breeding database, productive and reproductive performance indicators were used to parameterize a single-year, static and deterministic whole-farm bio-economic model. A spring-calving, pasture-based suckler beef cow production system with an emphasis on calf-to-weanling production was simulated. The impact of BHV-1 seropositivity on whole-farm technical and economic performance was relatively small, with a marginal drop in net margin of 4% relative to a baseline seronegative herd. Risk factors for increased pathogenicity were then considered, such as total herd size, percentage of intra-herd movements and vaccination status for BHV-1. In contrast to all other scenarios, those representing herds that were either small in size or had an active vaccination policy for BHV-1 showed no reduction in net margin against the baseline as a result of seropositivity to BHV-1.
Abstract presentations of scientific information at meetings are important for broadcasting new information. Publication of these studies should be the final goal, but minimal data exist documenting publication rates, especially for paediatric sub-speciality meetings. The goal of this study was to document the manuscript publication rate for paediatric cardiac echocardiography abstracts and to determine whether there were differences between abstracts that were published versus not published.
Paediatric cardiac echocardiography abstracts presented from 2007 to 2011 at the American Society of Echocardiography Meetings were reviewed. Characteristics of the abstracts were noted. A Medline/Pubmed search was performed using keywords, first author, and senior author criteria to determine publication. Fisher’s exact tests or χ2 tests were used for analysis.
A total of 194 abstracts were reviewed. In all, 27 abstracts were oral presentations and 167 were poster presentations. A total of 124 abstracts were prospective studies and 70 were retrospective studies; 11 abstracts were basic science studies and 183 were clinical studies. Altogether, 25 abstracts dealt with three-dimensional echocardiography, 15 with fetal echocardiography, 56 with deformation analysis, 79 with standard transthoracic echocardiography, and 19 were in the other category. A total of 73 abstracts were subsequently published – with a 37.6% publication rate – 2.1±1.7 years after initial presentation. There were no significant differences in publication rates based on the above-noted variables.
A paediatric cardiac echocardiography abstract publication rate of 37.6% is comparable to previous published publication rates for other meetings. No differences in variables analysed were noted between published versus unpublished abstracts.
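The Fisher's exact tests mentioned in the methods can be carried out without a statistics library, using the hypergeometric distribution over 2×2 tables. A minimal Python sketch; the example counts are hypothetical, since the abstract reports only that no differences were significant, not the published/unpublished breakdown by category:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table [[a, b], [c, d]]."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d

    def p(k):
        # Hypergeometric probability of a table with k in the top-left cell
        return comb(r1, k) * comb(r2, c1 - k) / comb(n, c1)

    p_obs = p(a)
    k_min, k_max = max(0, c1 - r2), min(r1, c1)
    # Sum over all tables at least as extreme as the one observed
    return sum(p(k) for k in range(k_min, k_max + 1)
               if p(k) <= p_obs * (1 + 1e-9))

# Hypothetical oral vs. poster publication counts, purely for illustration
p_value = fisher_exact_two_sided(13, 14, 60, 107)
```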
To identify the factors associated with first Clostridium difficile infection (CDI) that predict fecal microbiota transplantation (FMT) for recurrent CDI.
We carried out a retrospective single-center cohort study to compare the clinical characteristics of 200 patients who underwent FMT for recurrent CDI to 75 patients who did not.
A single academic hospital in the United States
The time from first to second CDI correlated to subsequent FMT use. Concomitant inflammatory bowel disease (IBD; P=.002), use of immunosuppressive therapy (P=.04), and use of metronidazole within 2 months before the first CDI (P=.02) correlated positively to subsequent FMT in univariate analysis. The use of oral vancomycin for first CDI was more common in those who required FMT than those who did not in univariate (P=.02) and multivariate (P=.03) analyses. In contrast, intravenous vancomycin use within 2 months before the first CDI reduced the risk for FMT in univariate (P=.000003) and multivariate (P=.0001) analyses. Black patients with recurrent CDI were less likely to receive FMT than white patients (P=.00005). Patients who received FMT were also less likely to have comorbidities.
This study provides important insights into the factors predictive for FMT in patients with recurrent CDI and highlights the potential racial and medical characteristics that affect the access of the patients to FMT.
On August 25, 2017, Hurricane Harvey made landfall near Corpus Christi, Texas. The ensuing unprecedented flooding throughout the Texas coastal region affected millions of individuals.1 The statewide response in Texas included the sheltering of thousands of individuals at considerable distances from their homes. The Dallas area established large-scale general population sheltering as the number of evacuees to the area began to amass. Historically, the Dallas area is one familiar with “mega-sheltering,” beginning with the response to Hurricane Katrina in 2005.2 Through continued efforts and development, the Dallas area had been readying a plan for the largest general population shelter in Texas. (Disaster Med Public Health Preparedness. 2019;13:33–37)
Standards for organic pig production recommend that growing pigs are maintained on pasture. There is currently no information on the nutritional implications of such a system, since grazing intakes have not been recorded in pigs of this production stage. This study used n-alkane methodology previously validated in sows (Wilson et al., 1999) to measure the herbage intakes of individual pigs under such conditions.
Nose ringing is widely used in conventional outdoor pig production as the only reliable method of preventing sows destroying pasture by rooting (Edwards et al., 1998), but is prohibited by some organic sector bodies as it inhibits the sows’ behaviour. Some organic producers use a rotation policy in which the sows are moved to fresh pasture about three times a year, after green cover has been destroyed. As well as limiting nutrient leaching, frequent movement also limits parasite build-up in a system which prohibits the routine use of anthelmintics. However, it has a high labour demand. An alternative strategy is to maintain the sows on a larger area for the whole year. This abstract presents initial data on comparison of the two systems regarding annual pattern of pasture damage by sows.
The excretory behaviour of outdoor lactating sows has important implications for sow and piglet health, especially in organic systems, where use of anthelmintics and other medication is restricted. It is also important in determining the environmental impact of the system. If foraging and excretion are spatially separated this limits risk of parasite infection, but may lead to nutrient “hotspot” formation with potential for leaching and poor nutrient cycling to subsequent crops. Where nose-ringing of organic sows is not permitted by the certification scheme, pasture will be destroyed by foraging activity, further promoting nutrient losses. This study aimed to investigate the spatial distribution of excretory behaviour and patterns of pasture loss during the period from farrowing to weaning.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
During 2000–07, five giant icebergs (B15A, B15J, B15K, C16 and C25) adrift in the southwestern Ross Sea, Antarctica, were instrumented with global positioning system (GPS) receivers and other instruments to monitor their behavior in the near-coastal environment. The measurements show that collision processes can strongly influence iceberg behavior and delay their progress in drifting to the open ocean. Collisions appear to have been a dominant control on the movement of B15A, the largest of the icebergs, during the 4-year period it gyrated within the limited confines of Ross Island, the fixed Ross Ice Shelf and grounded C16. Iceberg interactions in the near-coastal regime are largely driven by ocean tidal effects, which determine the magnitude of forces generated during collision and break-up events. Estimates of forces derived from the observed drift trajectories during the iceberg-collision-induced calving of iceberg C19 from the Ross Ice Shelf, during the iceberg-induced break-off of the tip of the Drygalski Ice Tongue and during the break-up of B15A provide a crude estimate of the stress scale involved in iceberg calving. Considering the total area of the vertical face of new rifts created in the calving or break-up process, and not accounting for local stress amplification near rift tips, this estimated stress scale is 10^4 Pa.
We sought to identify and review published studies that discuss the ethical considerations, from a physician’s perspective, of managing a hunger strike in a prison setting.
A database search was conducted to identify relevant publications. We included case studies, case series, guidelines and review articles published over a 20-year period. Non-English language publications were translated.
The review found 23 papers from 12 jurisdictions published in five languages suitable for inclusion.
Key themes from included publications are identified and summarised in the context of accepted guidelines from the World Medical Association. Whilst there seems to be an overall consensus favouring autonomy over beneficence, tensions along this fine balance are magnified in jurisdictions where legislation leads to a dual loyalty conflict for the physician.
It is a widely held view that by the beginning of the Christian era the staging of full-length dramas had become very rare or had ceased altogether, and that any plays that were written during this time, like the ten 'Senecan’ plays that survive, were probably intended as no more than closet dramas. But the question of whether such plays were ever staged (and hence the question of whether they were intended to be staged or were even stageable) is complicated by the fact that the word ‘tragedy’ was also applied to other forms of dramatic performances: to the ballet of the pantomime artist (tragoedia saltata) and to short concert productions (tragoedia cantata). There was also the citharoedia, a solo performance which consisted of a tragic aria accompanied by the lyre. Finally, the traditional kind of tragedy could be recited rather than staged. Each of these kinds of performances could take place in or out of the theater, with varying degrees of elaboration. When, therefore, an ancient writer speaks of the performance of a tragedy, it is not always clear just what type of production is being alluded to. When, for instance, Dio Cassius says that the Emperor Caligula just before his assassination (A.D. 41) wished to put on a ballet and enact a tragedy (ϰαὶ ὀϱχήσασθαι ϰαὶ τϱαγῳδίαν ὑποϰϱίνασθαι ἠθέλησεν) and announced that the revels would be prolonged three more days for the purpose, what does he mean? Was there to be only one kind of performance or two? That is, were the pantomimus and his hypocritae to dance and act a single tragedy, or was a story to be rendered first in dance, and then the same or another story acted out in dialogue and song? If the latter supposition is correct, was the tragedy to be a large-scale play or a modified and shortened concert tragedy? We know from Suetonius that the same play could be conceived of as being performed in different ways.
He says that on the day before Caligula was killed, the pantomimus Mnester danced the same tragedy that the tragoedus Neoptolemus had acted in the games at which Philip of Macedon was killed. Of course, in judging this matter we must also consider the question of how accurately the conceptions and practices reported by Suetonius (ca. 120) and Dio (ca. 220) correspond to those of Caligula's time.
Research shows that cognitive rehabilitation (CR) has the potential to improve goal performance and enhance well-being for people with early stage Alzheimer’s disease (AD). This single subject, multiple baseline design (MBD) research investigated the clinical efficacy of an 8-week individualised CR intervention for individuals with early stage AD.
Three participants with early stage AD were recruited to take part in the study. The intervention consisted of eight sessions of 60–90 minutes of CR. Outcomes included goal performance and satisfaction, quality of life, cognitive and everyday functioning, mood, and memory self-efficacy for participants with AD; and carer burden, general mental health, quality of life, and mood of carers.
Visual analysis of MBD data demonstrated a functional relationship between CR and improvements in participants’ goal performance. Subjective ratings of goal performance and satisfaction increased from baseline to post-test for three participants and were maintained at follow-up for two. Baseline to post-test quality of life scores improved for three participants, whereas cognitive function and memory self-efficacy scores improved for two.
Our findings demonstrate that CR can improve goal performance, and is a socially acceptable intervention that can be implemented by practitioners with assistance from carers between sessions. This study represents a promising first step towards filling a practice gap in this area. Further research and randomised controlled trials are required.