The loggerhead turtle (Caretta caretta) is a circumglobal species and is listed as vulnerable globally. The North Pacific population nests in Japan and migrates to the Central North Pacific and the Pacific coast of North America to feed. In the Mexican Pacific, records of loggerhead presence are largely restricted to the Gulf of Ulloa along the Baja California Peninsula, where very high fisheries by-catch mortality has been reported. Records of loggerhead turtles within the Sea of Cortez, also known as the Gulf of California (GC), exist; however, their ecology in this region is poorly understood. We used satellite tracking and an analysis of environmental variables (chlorophyll-a (Chl-a) and sea surface temperature (SST)) to determine the movements and habitat use of five juvenile loggerhead turtles ranging in straight carapace length from 62.7 to 68.3 cm (mean: 66.7 ± 2.3 cm). Satellite tracking durations ranged from 73 to 293 days (mean: 149 ± 62.5 days), transmissions per turtle from 14 to 1006 (mean: 462 ± 379.5 transmissions) and total travel distance from 1237 to 5222 km (mean: 3118 ± 1490.7 km). We used travel-rate analyses to identify five foraging areas in the GC, which occurred mainly in waters 10–80 m deep, with mean Chl-a concentrations ranging from 0.28 to 13.14 mg m−3 and SST ranging from 27.8 to 34.4°C. This is the first study to describe loggerhead movements in the Gulf of California, and our data suggest that loggerhead foraging occurs in areas with eutrophic levels of Chl-a.
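For readers unfamiliar with travel-rate filtering, the sketch below shows the general idea: fixes whose computed travel rate falls below a cutoff are treated as putative foraging locations. This is a minimal illustration, not the study's code; the column names and the 2 km/h cutoff are assumptions.

```python
import math
import pandas as pd

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two fixes, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def label_fixes(fixes: pd.DataFrame, max_rate_kmh: float = 2.0) -> pd.DataFrame:
    """Flag a fix as putative 'foraging' when the travel rate from the
    previous fix falls below max_rate_kmh; otherwise 'transit'."""
    fixes = fixes.sort_values("timestamp").reset_index(drop=True)
    rates = [float("nan")]  # first fix has no preceding segment
    for i in range(1, len(fixes)):
        km = haversine_km(fixes.lat[i - 1], fixes.lon[i - 1],
                          fixes.lat[i], fixes.lon[i])
        hours = (fixes.timestamp[i] - fixes.timestamp[i - 1]).total_seconds() / 3600
        rates.append(km / hours if hours > 0 else float("nan"))
    fixes["rate_kmh"] = rates
    fixes["state"] = ["foraging" if r < max_rate_kmh else "transit" for r in rates]
    return fixes
```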
Humans are contributing to large carnivore declines around the globe, and conservation interventions should focus on increasing local stakeholder tolerance of carnivores and be informed by both biological and social considerations. In the Okavango Delta (Botswana), we tested new conservation strategies alongside a pre-existing government compensation programme. The new strategies included the construction of predator-proof livestock enclosures, the establishment of an early warning system linked to GPS satellite lion collars, depredation event investigations and educational programmes. We conducted pre- and post-assessments of villagers’ livestock management practices, attitudes towards carnivores and conservation, perceptions of human–carnivore coexistence and attitudes towards established conservation programmes. Livestock management levels were low and 50% of farmers lost livestock to carnivores, while 5–10% of owned stock was lost. Respondents had strong negative attitudes towards lions, which kill most depredated livestock. Following new management interventions, tolerance of carnivores significantly increased, although tolerance of lions near villages did not. The number of respondents who believed that coexistence with carnivores was possible significantly increased. Respondents had negative attitudes towards the government-run compensation programme, citing low and late payments, but were supportive of the new management interventions. These efforts show that targeted, intensive management can increase stakeholder tolerance of carnivores.
IMPaCT is a five-year project funded by the Department of Health, UK. Running in the UK and now Sweden, the IMPaCT project aims to target the poor physical health and excessive substance use seen in people with severe mental illness (SMI). There is evidence that behavioural interventions may be associated with improvements in physical health and substance use in this population.
IMPaCT is a randomised controlled trial of a health promotion intervention that takes a manualised, modular approach to working with people with SMI, empowering them to improve their physical health and substance use habits. It consists of The Manual, The Reference Guide and The Better Health Handbook, which together make up a therapy package to support clients in becoming healthier.
The therapy is provided by care coordinators (mental health practitioners) over a 6–9 month period and combines Cognitive Behavioural Therapy (CBT) with Motivational Interviewing (MI) principles. The aim is to work with clients to help them identify their own problem health behaviours, e.g. smoking, diet, exercise, drug and alcohol use. Realistic goals are set and revised with the client, and individual and group sessions are used to develop personal motivation to change. Information, workbooks and diaries are provided to record progress and give helpful hints, while meaningful alternative activities are introduced to replace problem health behaviours.
The increased prevalence of metabolic syndrome in people with severe mental illness (SMI) is well documented. The International Diabetes Federation (IDF) criteria for metabolic syndrome are three or more of the following: waist circumference ≥80 cm (females) or ≥94 cm (males), or BMI ≥30; triglycerides >1.7 mmol/l or on treatment; raised blood pressure (systolic >130 mm Hg or diastolic >85 mm Hg, or on treatment for hypertension); raised fasting blood glucose (≥5.6 mmol/l) or diagnosed type II diabetes; and reduced HDL cholesterol (<1.03 mmol/l) or on treatment.
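As a plain restatement of those cutoffs, here is a minimal rule-based sketch; the thresholds are taken from the criteria listed above, while the function and field names are illustrative assumptions.

```python
# Minimal check of the IDF criteria as stated above ("three or more").
# Field names are illustrative; units: cm, mmol/l, mm Hg.
def meets_idf_metabolic_syndrome(sex, waist_cm, bmi,
                                 triglycerides, on_tg_treatment,
                                 systolic, diastolic, on_bp_treatment,
                                 fasting_glucose, has_type2_diabetes,
                                 hdl, on_hdl_treatment):
    waist_cutoff = 80 if sex == "female" else 94
    criteria = [
        waist_cm >= waist_cutoff or bmi >= 30,                # central obesity
        triglycerides > 1.7 or on_tg_treatment,               # raised triglycerides
        systolic > 130 or diastolic > 85 or on_bp_treatment,  # raised blood pressure
        fasting_glucose >= 5.6 or has_type2_diabetes,         # raised fasting glucose
        hdl < 1.03 or on_hdl_treatment,                       # reduced HDL cholesterol
    ]
    return sum(criteria) >= 3
```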
The IMPaCT RCT is a Department of Health funded trial of a health promotion intervention (HPI) delivered by care co-ordinators to people with SMI across South London, Kent and Sussex. The intervention is focussed on improving health by addressing modifiable lifestyle factors such as diet, physical activity, obesity, cigarette smoking, alcohol and substance use.
We investigated the prevalence of metabolic syndrome in a sample of 212 patients for whom we had relevant baseline measures.
Data (weight, BMI, waist circumference, blood pressure, fasting HDL cholesterol, triglycerides and glucose levels) were analysed for these 212 patients.
45% of the sample met IDF criteria for metabolic syndrome. Mean BMI was 30.6, glucose 6.4 mmol/L, triglycerides 2.0 mmol/L, HDL 1.2 mmol/L, waist circumference 105.8 cm, and BP 122/82 mm Hg.
Metabolic syndrome was highly prevalent in this sample, significantly increasing the risk of physical morbidity and potentially lowering life expectancy. There is an unmet need for health promotion interventions in order to lower morbidity and mortality risk in these populations.
Enterococcus causes clinically significant bloodstream infections (BSIs). In centers with a higher prevalence of vancomycin-resistant enterococcus (VRE) colonization, a common clinical question is whether empiric treatment directed against VRE should be initiated in the setting of a suspected enterococcal BSI. Unfortunately, VRE treatment options are limited and relatively expensive, and they subject patients to the risk of adverse reactions. We hypothesized that the results of VRE colonization screening could predict vancomycin resistance in enterococcal BSI.
We reviewed 370 consecutive cases of enterococcal BSI over a 7-year period at 2 tertiary-care hospitals to determine whether vancomycin-resistant BSIs could be predicted based on known colonization status (ie, patients with swabs performed within 30 days, more remotely, or never tested). We calculated sensitivity and specificity, and we plotted negative predictive values (NPVs) and positive predictive values (PPVs) as a function of prevalence.
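The NPV/PPV-versus-prevalence curves follow directly from Bayes' rule. A minimal sketch, with placeholder sensitivity and specificity values (the study's actual estimates are not restated here):

```python
# NPV and PPV as a function of the prevalence of vancomycin resistance
# among enterococcal BSIs (Bayes' rule). The 0.90 sensitivity and 0.80
# specificity below are placeholders, not the study's estimates.
def npv(sens, spec, prev):
    return spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)

def ppv(sens, spec, prev):
    return sens * prev / (sens * prev + (1 - spec) * (1 - prev))

for prev in (0.05, 0.15, 0.27, 0.50):
    print(f"prev={prev:.2f}  NPV={npv(0.90, 0.80, prev):.2f}  "
          f"PPV={ppv(0.90, 0.80, prev):.2f}")
```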
A negative screening swab within 30 days of infection yielded NPVs of 90% and 95% in settings where <27.0% and <15.0% of enterococcal BSIs, respectively, are resistant to vancomycin. In patients with known VRE colonization, the PPV for VRE in enterococcal BSI was >50% at any prevalence exceeding 25%.
A negative VRE screening result obtained within 30 days can help eliminate unnecessary empiric therapy in patients with suspected enterococcal BSI. Conversely, patients with positive VRE screening swabs require careful consideration of empiric VRE-directed therapy when enterococcal BSI appears likely.
The impact of hurricanes on emergency services is well-known. Recent history demonstrates the need for prehospital and emergency department coordination to serve communities during evacuation, storm duration, and cleanup. The use of telehealth applications may enhance this coordination while lessening the impact on health-care systems. These applications can address triage, stabilization, and diversion and may be provided in collaboration with state and local emergency management operations through various shelters, as well as during other emergency medical responses.
Throughout history there have been theological tensions between official church teachers and church theologians, creating at times a divide between both the magisterium and theologians and also between theologians of different methodological approaches. We offer as examples of tension the declarations by the USCCB's Committee on Doctrine (CD) on the “inadequacies in the theological methodology and conclusions” of our book and of the books of three other contemporary theologians. These examples afford us the opportunity both to consider the theological tensions in general and to propose a solution to them. We establish some ecclesial context for dialogue with the CD, calling attention to four factors in this context: first, recent patterns of discourse between theologians and the magisterium in statements issued against particular theologians; second, an important change in the Catholic concept of church; third, an equally important change in how Catholic theologians set about doing theological ethics; and fourth, the reaffirmation of the importance of conscience by the Second Vatican Council in the 1960s and, more recently, by Pope Francis.
Four experiments examine how lack of awareness of inequality affects behaviour towards the rich and poor. In Experiment 1, participants who became aware that wealthy individuals donated a smaller percentage of their income switched from rewarding the wealthy to rewarding the poor. In Experiments 2 and 3, participants who played a public goods game – and were assigned incomes reflective of the US income distribution either at random or on merit – punished the poor (for small absolute contributions) and rewarded the rich (for large absolute contributions) when incomes were unknown; when incomes were revealed, participants punished the rich (for their low percentage of income contributed) and rewarded the poor (for their high percentage of income contributed). In Experiment 4, participants provided with public education contributions for five New York school districts levied additional taxes on mostly poorer school districts when incomes were unknown, but targeted wealthier districts when incomes were revealed. These results shed light on how income transparency shapes preferences for equity and redistribution. We discuss implications for policy-makers.
Computer-assisted navigation (CAN) improves the accuracy of spinal instrumentation in vertebral fractures and degenerative spine disease; however, it is not widely adopted because of lack of training, high capital costs, workflow hindrances, and accuracy concerns. We characterize shifts in the use of spinal CAN over time and across disciplines in a single-payer health system, and assess the impact of intra-operative CAN on trainee proficiency across Canada.
A prospectively maintained Ontario database of patients undergoing spinal instrumentation from 2005 to 2014 was reviewed retrospectively. Data were collected on treated pathology, spine region, surgical approach, institution type, and surgeon specialty. Trainee proficiency with CAN was assessed using an electronic questionnaire distributed across 15 Canadian orthopedic surgical and neurosurgical programs.
In our provincial cohort, 16.8% of instrumented fusions were CAN-guided. Navigation was used more frequently in academic institutions (15.9% vs. 12.3%, p<0.001) and by neurosurgeons than by orthopedic surgeons (21.0% vs. 12.4%, p<0.001). Of residents and fellows, 34.1% were fully comfortable using spinal CAN, a proportion greater among neurosurgical than orthopedic surgical trainees (48.1% vs. 11.8%, p=0.008). The use of CAN increased self-reported proficiency in thoracic instrumentation for all trainees by 11.0% (p=0.036), and in atlantoaxial instrumentation for orthopedic trainees by 18.0% (p=0.014).
Spinal CAN is used most frequently by neurosurgeons and in academic centers. Most spine surgical trainees are not fully comfortable with the use of CAN but report an increase in technical comfort with CAN guidance, particularly for thoracic instrumentation. Increased education in spinal CAN for trainees, particularly at the fellowship stage and specifically in orthopedic surgery, may improve adoption.
Due to concerns over increasing fluoroquinolone (FQ) resistance among gram-negative organisms, our stewardship program implemented a preauthorization use policy. The goal of this study was to assess the relationship between hospital FQ use and antibiotic resistance.
Large academic medical center.
We performed a retrospective analysis of FQ susceptibility of hospital isolates for 5 common gram-negative bacteria: Acinetobacter spp., Enterobacter cloacae, Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa. The primary endpoint was the change in FQ susceptibility. A Poisson regression model was used to calculate the rate of change between the preintervention period (1998–2005) and the postimplementation period (2006–2016).
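A minimal sketch of this kind of pre/post trend comparison, using synthetic counts and a Poisson GLM with a log offset for the number of isolates tested. The data, column names, and exact parameterisation are illustrative assumptions, not the study's model code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic yearly counts: n_susceptible FQ-susceptible isolates out of
# n_tested, with a declining pre-2006 trend and a recovering post-2006 trend.
years = np.arange(1998, 2017)
df = pd.DataFrame({"year": years, "n_tested": 200})
df["post"] = (df["year"] >= 2006).astype(int)
rate = np.where(df["post"] == 0,
                0.90 - 0.02 * (df["year"] - 1998),
                0.75 + 0.01 * (df["year"] - 2006))
df["n_susceptible"] = np.random.default_rng(0).binomial(df["n_tested"], rate)

# Poisson GLM with a log offset for isolates tested; the year:post term
# captures the change in the yearly susceptibility trend after 2006, and
# exponentiated coefficients are rate ratios analogous to those reported.
model = smf.glm("n_susceptible ~ year + year:post", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["n_tested"])).fit()
print(np.exp(model.params))
```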
Steep declines in FQ susceptibility began in 1998, particularly among P. aeruginosa, Acinetobacter spp., and E. cloacae. Our FQ restriction policy reduced FQ use from 173 days of therapy (DOT) per 1,000 patient days to <60 DOT per 1,000 patient days. Fluoroquinolone susceptibility increased for Acinetobacter spp. (rate ratio [RR], 1.038; 95% confidence interval [CI], 1.005–1.072), E. cloacae (RR, 1.028; 95% CI, 1.013–1.044), and P. aeruginosa (RR, 1.013; 95% CI, 1.006–1.020). No significant change in susceptibility was detected for K. pneumoniae (RR, 1.002; 95% CI, 0.996–1.008), and the susceptibility for E. coli continued to decline, although the decline was not as steep (RR, 0.981; 95% CI, 0.975–0.987).
A stewardship-driven FQ restriction program halted the overall decline in FQ susceptibility rates for all species except E. coli. For 3 species (ie, Acinetobacter spp, E. cloacae, and P. aeruginosa), susceptibility rates improved after implementation, and this improvement has been sustained over a 10-year period.
Seven half-day regional listening sessions were held between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The objective of the listening sessions was to connect with stakeholders and hear their challenges and recommendations for addressing herbicide resistance. The coordinating team hired Strategic Conservation Solutions, LLC, to facilitate all the sessions. They and the coordinating team used in-person meetings, teleconferences, and email to communicate and coordinate the activities leading up to each regional listening session. The agenda was the same across all sessions and included small-group discussions followed by reporting to the full group for discussion. The planning process was the same across all the sessions, although the selection of venue, time of day, and stakeholder participants differed to accommodate the differences among regions. The listening-session format required a great deal of work and flexibility on the part of the coordinating team and regional coordinators. Overall, the participant evaluations from the sessions were positive, with participants expressing appreciation that they were asked for their thoughts on the subject of herbicide resistance. This paper details the methods and processes used to conduct these regional listening sessions and provides an assessment of the strengths and limitations of those processes.
Herbicide resistance is ‘wicked’ in nature; therefore, results of the many educational efforts to encourage diversification of weed control practices in the United States have been mixed. It is clear that we do not sufficiently understand the totality of the grassroots obstacles, concerns, challenges, and specific solutions needed for varied crop production systems. Weed management issues and solutions vary with such variables as management styles, regions, cropping systems, and available or affordable technologies. Therefore, to help the weed science community better understand the needs and ideas of those directly dealing with herbicide resistance, seven half-day regional listening sessions were held across the United States between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide resistance management. The major goals of the sessions were to gain an understanding of stakeholders and their goals and concerns related to herbicide resistance management, to become familiar with regional differences, and to identify decision maker needs to address herbicide resistance. The messages shared by listening-session participants could be summarized by six themes: we need new herbicides; there is no need for more regulation; there is a need for more education, especially for others who were not present; diversity is hard; the agricultural economy makes it difficult to make changes; and we are aware of herbicide resistance but are managing it. The authors concluded that more work is needed to bring a community-wide, interdisciplinary approach to understanding the complexity of managing weeds within the context of the whole farm operation and for communicating the need to address herbicide resistance.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
The environmental bacterium Burkholderia pseudomallei is responsible for the potentially fatal disease melioidosis. Factors responsible for the temporospatial distribution of cases are incompletely understood, although a combination of rainfall, groundwater levels and the physicochemical properties of soil is important. The distribution of culture-positive cases of melioidosis from 1996 to 2016 in Far North Queensland, Australia, was investigated to determine the association with different soil types and landforms in Cairns, the region's largest city. Cases clustered on alluvial fan landforms with strongly bleached gradational textured and yellow massive gradational textured soils, whereas cases were less frequent on other soil types on alluvial fan landforms (despite comparable population density) and on beach ridges. This indicates that these soils may be more suitable for B. pseudomallei, increasing the risk of melioidosis in these locations. Sociodemographic characteristics of the population in cluster locations were considered. Knowledge of local soil characteristics may help predict cases of melioidosis and inform public health strategies to prevent the disease. Because melioidosis case clusters were identified, testing for the presence of B. pseudomallei across the study area is a useful target of future research.
Significant reductions recently seen in the size of wide-bandgap power electronics have not been accompanied by a relative decrease in the size of the corresponding magnetic components. To achieve this, a new generation of materials with high magnetic saturation and permeability is needed. Here, we develop gram-scale syntheses of superparamagnetic Fe/FexOy core–shell nanoparticles and incorporate them as the magnetic component in a strongly magnetic nanocomposite. Nanocomposites are typically formed by the organization of nanoparticles within a polymeric matrix. However, this approach can lead to high organic fractions and phase separation, reducing the performance of the resulting material. Here, we form aminated nanoparticles that are then cross-linked using epoxy chemistry. The result is a magnetic nanoparticle component that is covalently linked and well separated. By using this ‘matrix-free’ approach, we can substantially increase the magnetic nanoparticle fraction, while still maintaining good separation, leading to a superparamagnetic nanocomposite with strong magnetic properties.
To identify predominant dietary patterns in four African populations and examine their association with obesity.
We used data from the Africa/Harvard School of Public Health Partnership for Cohort Research and Training (PaCT) pilot study established to investigate the feasibility of a multi-country longitudinal study of non-communicable chronic disease in sub-Saharan Africa. We applied principal component analysis to dietary intake data collected with a food-frequency questionnaire (FFQ) developed for PaCT to ascertain dietary patterns in Tanzania, South Africa, and peri-urban and rural Uganda. The sample consisted of 444 women and 294 men.
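A minimal sketch of the PCA step on FFQ-style data; the intake matrix is synthetic and the number of food groups is an illustrative assumption.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Synthetic FFQ matrix: rows = 738 participants (444 women + 294 men),
# columns = intake frequencies for 20 hypothetical food groups.
X = np.random.default_rng(1).gamma(2.0, 1.0, size=(738, 20))
X_std = StandardScaler().fit_transform(X)

pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)   # per-participant pattern scores
loadings = pca.components_          # food-group loadings defining each pattern
print(pca.explained_variance_ratio_)
# Participants would then be ranked into tertiles of each pattern score
# for the overweight/obesity analyses.
```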
We identified two dietary patterns: the Mixed Diet pattern characterized by high intakes of unprocessed foods such as vegetables and fresh fish, but also cold cuts and refined grains; and the Processed Diet pattern characterized by high intakes of salad dressing, cold cuts and sweets. Women in the highest tertile of the Processed Diet pattern score were 3·00 times more likely to be overweight (95 % CI 1·66, 5·45; prevalence=74 %) and 4·24 times more likely to be obese (95 % CI 2·23, 8·05; prevalence=44 %) than women in this pattern’s lowest tertile (both P<0·0001; prevalence=47 and 14 %, respectively). We found similarly strong associations in men. There was no association between the Mixed Diet pattern and overweight or obesity.
We identified two major dietary patterns in several African populations, a Mixed Diet pattern and a Processed Diet pattern. The Processed Diet pattern was associated with obesity.
NeuroStar transcranial magnetic stimulation (TMS) is an effective acute treatment for patients with major depressive disorder (MDD). In order to further understand use of the NeuroStar in a clinical setting, Neuronetics has established a patient treatment and outcomes registry to collect and analyze utilization information on patients receiving treatment with the NeuroStar.
Individual NeuroStar providers are invited to participate in the registry and agree to provide their de-identified patient treatment data. The NeuroStar has an integrated electronic data management system (TrakStar) that allows data collection to be automated. The data collected for the registry include demographic elements (age, gender), treatment parameters, and clinical ratings. The clinical assessments are the Clinician Global Impression - Severity of Illness (CGI-S) and the Patient Health Questionnaire 9-item (PHQ-9). De-identified patient data are uploaded to the registry server; an independent statistical service then creates final data reports.
Over 500 patients have entered the NeuroStar Outcomes Registry since September 2016. Mean patient age was 48.0 years (SD ± 16.0); 64% were female. Mean baseline PHQ-9 was 18.8 (SD ± 5.0). Response/remission rates were 61%/33% on the PHQ-9 and 78%/59% on the CGI-S.
For the initial 500 patients in the Outcomes Registry, approximately two-thirds of patients achieved response and one-third achieved remission with an acute course of NeuroStar. These treatment outcomes are consistent with NeuroStar open-label study data (Carpenter, 2012). The TrakStar data management system makes large-scale data collection feasible. The NeuroStar Outcomes Registry is ongoing and is expected to reach 6000 outpatients from more than 47 clinical sites in 36 months.
Despite lessons learned from the recent Ebola epidemic, attempts to survey and determine non-health care worker, industry-specific needs to address highly infectious diseases have been minimal. The aircraft rescue and fire fighting (ARFF) industry is often overlooked in highly infectious disease training and education, even though such training is critical to the field given the elevated occupational exposure risk during ARFF operations.
Supervisors perceived Frontline respondents to be more willing to encounter, and more comfortable encountering, potential highly infectious disease scenarios than the Frontline respondents themselves indicated. More than one-third of respondents incorrectly marked transmission routes of viral hemorrhagic fevers. There were discrepancies in self-reports on the existence of highly infectious disease orientation and skills demonstration, employee resources, and personal protective equipment policies, with 7.5%–24.0% more Supervisors than Frontline respondents marking activities as conducted.
There are deficits in highly infectious disease knowledge, skills, and abilities among ARFF members that must be addressed to enhance member safety, health, and well-being. (Disaster Med Public Health Preparedness. 2018;12:675-679)
Urinary catheters, many of which are placed in the emergency department (ED) setting, are often inappropriate, and they are associated with infectious and noninfectious complications. Although several studies evaluating the effect of interventions have focused on reducing catheter use in the ED setting, the organizational contexts within which these interventions were implemented have not been compared.
A total of 18 hospitals in the Ascension health system (ie, system-based hospitals) and 16 hospitals in the state of Michigan (ie, state-based hospitals led by the Michigan Health and Hospital Association) implemented ED interventions focused on reducing urinary catheter use. Data on urinary catheter placement in the ED, indications for catheter use, and presence of physician order for catheter placement were collected for interventions in both hospital types. Multilevel negative binomial regression was used to compare the system-based versus state-based interventions.
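The study fit a multilevel negative binomial model; as a simplified stand-in, the sketch below uses a negative binomial GLM with hospital-clustered standard errors on synthetic data. All data, column names, and the clustering shortcut are assumptions for illustration, not the study's analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic hospital-period records: catheter placements out of ED visits,
# with a post-period reduction confined to the system-based arm.
rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "hospital": rng.integers(0, 34, size=n),
    "system_based": rng.integers(0, 2, size=n),  # 1 = system-based arm
    "period": rng.integers(0, 2, size=n),        # 0 = pre, 1 = post
    "ed_visits": rng.integers(200, 800, size=n),
})
effect = np.where((df["system_based"] == 1) & (df["period"] == 1), 0.79, 1.0)
df["catheters"] = rng.poisson(0.06 * df["ed_visits"] * effect)

# Negative binomial GLM with a log-exposure offset; hospital-clustered
# standard errors stand in (crudely) for the study's multilevel model.
model = smf.glm("catheters ~ period * system_based", data=df,
                family=sm.families.NegativeBinomial(),
                offset=np.log(df["ed_visits"])).fit(
                    cov_type="cluster", cov_kwds={"groups": df["hospital"]})
# exp of the interaction is the incidence rate ratio comparing the
# system-based change to the state-based change (cf. the reported 0.79).
print(np.exp(model.params["period:system_based"]))
```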
A total of 13,215 patients (889 with catheters) from the system-based intervention were compared to 12,104 patients (718 with catheters) from the state-based intervention. Statistically significant and sustainable reductions in urinary catheter placement (incidence rate ratio, 0.79; P=.02) and improvements in appropriate use of urinary catheters (odds ratio [OR], 1.86; P=.004) in the ED were observed in the system-based intervention, compared to the state-based intervention. No differences by collaborative structure were observed in changes in the presence of a physician order for urinary catheter placement (OR, 1.14; P=.60).
An ED intervention consisting of establishing institutional guidelines for appropriate catheter placement and identifying clinical champions to promote adherence was associated with reducing unnecessary urinary catheter use under a system-based collaborative structure.