Implementation of genome-scale sequencing in clinical care has significant challenges: the technology is highly dimensional with many kinds of potential results, interpretation and delivery of results require expertise and coordination across multiple medical specialties, clinical utility may be uncertain, and there may be broader familial or societal implications beyond the individual participant. Transdisciplinary consortia and collaborative team science are well poised to address these challenges. However, the complex web of organizational, institutional, physical, environmental, technological, political, and societal factors that influence the effectiveness of consortia remains understudied. We describe our experience working in the Clinical Sequencing Evidence-Generating Research (CSER) consortium, a multi-institutional translational genomics consortium.
Methods:
A key aspect of the CSER consortium was the juxtaposition of site-specific measures with the need to identify consensus measures related to clinical utility and to create a core set of harmonized measures. During this harmonization process, we sought to minimize participant burden, accommodate project-specific choices, and use validated measures that allow data sharing.
Results:
Identifying platforms to ensure swift communication between teams and to manage materials and data was essential to our harmonization efforts. Funding agencies can help consortia by clarifying key study design elements across projects during the proposal preparation phase and by providing a framework for sharing data across participating projects.
Conclusions:
In summary, time and resources must be devoted to developing and implementing collaborative practices as preparatory work at the beginning of project timelines to improve the effectiveness of research consortia.
An increasing number of treatment studies focus on impaired cognition and emotion processing in schizophrenia. In Study 1, we evaluated neuronal activation with fMRI during facial emotion processing in schizophrenia patients treated with new antipsychotics. Study 2 was carried out to evaluate whether combinations of new antipsychotics with cognitive training (Cogpack) or Training of Affect Decoding (TAD) were more effective than new antipsychotics alone.
Methods
In the first study, patients with schizophrenia (n=11) and matched healthy controls (n=11) viewed facial displays of emotions. fMRI was used to measure BOLD signal changes as patients alternated between tasks requiring discrimination of the emotional valence of faces and discrimination of age. In the second study, patients with schizophrenia receiving new antipsychotics alone (n=20) were compared with randomized groups of patients receiving Cogpack training (n=20) or TAD (n=20).
Results
The same activation patterns in the amygdala were apparent in patients with schizophrenia treated with new antipsychotics and in healthy controls. The cognitive training group showed significant improvements in cognitive functions and transfer effects in skills needed for daily life. In the TAD group, significant improvements were found in the recognition of sad facial emotions.
Conclusions
New antipsychotics may improve the functionality of the networks needed for emotion processing and cognition. Cogpack training and TAD, in combination with new antipsychotics, are important treatment techniques for improving social functioning relevant for rehabilitation.
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13) suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
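The study's analysis code is not published; purely as an illustration of the stratified, adjusted logistic regression it describes, the following Python sketch estimates an adjusted odds ratio with a 95% confidence interval within each SES stratum. All variable names (salad_consumption, hospitalised, ses) and effect sizes are hypothetical simulated stand-ins, not the study's data.

```python
# Illustrative sketch only: an adjusted odds ratio for a binary exposure,
# estimated separately within each socio-economic status (SES) stratum.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "salad_consumption": rng.integers(0, 2, n),   # hypothetical binary exposure
    "age_group": rng.integers(0, 5, n),           # baseline demographic covariate
    "sex": rng.integers(0, 2, n),
    "ses": rng.integers(1, 6, n),                 # 1 = least deprived ... 5 = most
})
# Simulate an outcome loosely associated with the exposure.
logit = -1.0 + 0.45 * df["salad_consumption"] + 0.05 * df["age_group"]
df["hospitalised"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Fit one logistic model per SES stratum, adjusting for demographics.
for ses_level, stratum in df.groupby("ses"):
    fit = smf.logit("hospitalised ~ salad_consumption + age_group + sex",
                    data=stratum).fit(disp=False)
    or_est = np.exp(fit.params["salad_consumption"])
    lo, hi = np.exp(fit.conf_int().loc["salad_consumption"])
    print(f"SES {ses_level}: OR = {or_est:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```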
Introduction: Emergency department (ED) staff carry a high risk for the burnout syndrome of increased emotional exhaustion, depersonalization and decreased personal accomplishment. Previous research has shown that task-oriented coping skills were associated with reduced levels of burnout compared to emotion-oriented coping. ED staff at one hospital participated in an intervention to teach task-oriented coping skills. We hypothesized that the intervention would alter staff coping behaviors and ultimately reduce burnout. Methods: ED physicians, nurses and support staff at two regional hospitals were surveyed using the Maslach Burnout Inventory (MBI) and the Coping Inventory for Stressful Situations (CISS). Surveys were performed before and after the implementation of communication and conflict resolution skills training at the intervention facility (I), consisting of a one-day course and a small-group refresher 6 to 15 months later. Descriptive statistics and multivariate analysis assessed differences in staff burnout and coping styles compared to the control facility (C) and over time. Results: 85/143 (I) and 42/110 (C) ED staff responded to the initial survey. Post-intervention, 46 (I) and 23 (C) responded. During the two-year study period there was no statistically significant difference in CISS or MBI scores between hospitals (CISS: Pillai's trace = .02, F(3,63) = .47, p = .71, partial η2 = .02; MBI: Pillai's trace = .01, F(3,63) = .11, p = .95, partial η2 = .01) or between pre- and post-intervention groups (CISS: Pillai's trace = .01, F(3,63) = .22, p = .88, partial η2 = .01; MBI: Pillai's trace = .09, F(3,63) = 2.15, p = .10, partial η2 = .01). Conclusion: We were not able to measure improvement in coping or burnout in ED staff receiving the communication skills intervention over a two-year period. Burnout is a multifactorial problem, and environmental rather than individual factors may be more important to address. Alternatively, demonstrating a measurable effect on burnout may require more robust or inclusive interventions.
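As a rough illustration of the multivariate comparison reported above (Pillai's trace across several correlated outcome scores), the sketch below runs a one-way MANOVA on three simulated subscale scores, such as the three MBI subscales. The data, group labels and effect sizes are fabricated for demonstration only.

```python
# Hedged sketch: one-way MANOVA comparing two sites on three subscale scores,
# reporting Pillai's trace among the multivariate test statistics.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(1)
n = 70
df = pd.DataFrame({
    "site": rng.choice(["intervention", "control"], size=n),
    "emotional_exhaustion": rng.normal(25, 8, n),
    "depersonalization": rng.normal(9, 4, n),
    "personal_accomplishment": rng.normal(36, 6, n),
})
mv = MANOVA.from_formula(
    "emotional_exhaustion + depersonalization + personal_accomplishment ~ site",
    data=df,
)
print(mv.mv_test())  # output table includes Pillai's trace, F, and p for 'site'
```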
Introduction: In Nova Scotia, under the Paramedics Providing Palliative Care program, paramedics can now manage symptom crises in patients with palliative care goals, often at home and without the need to transport to hospital. Growing recognition that non-cancer conditions benefit from a palliative approach is expanding the program. Our team previously found that treatment of pain and breathlessness is not optimized, pain scores are underutilized, and paramedics were more comfortable (pre-launch) with a palliative approach in cancer versus non-cancer conditions. Our objective was to compare symptom management in the cancer versus non-cancer subgroups. Methods: We conducted a retrospective cohort study. The Electronic Patient Care Record and Special Patient Program were queried for patients with palliative goals from July 1, 2015 to July 1, 2016. Descriptive analysis was conducted and results were compared with a t-test and Bonferroni correction (α = 0.007). Results: There were 1909 unique patients: 765/1909 (40.1%) cancer and 1144/1909 (59.9%) non-cancer. Female sex: cancer 357/765 (46.7%), non-cancer 538/1144 (47.0%). Mean age: cancer 73.3 (11.65), non-cancer 77.7 (12.80). Top non-cancer conditions: COPD (495/1144, 43.3%), CHF (322/1144, 28.1%), stroke (172/1144, 15.0%) and dementia (149/1144, 13.0%). Comorbidities (range): cancer 0 to 3; non-cancer 0 to 5. The most common chief complaint (CC) for both cancer and non-cancer was respiratory distress, 10.8% vs 21.5%. Overall, there was no difference in the proportion treated, cancer vs non-cancer: 11.5% vs 10.1%, p = 0.35. There were some differences in individual therapies: morphine 83/765 (10.8%) vs 55/1144 (4.8%), p < 0.001; hydromorphone 9/765 (1.2%) vs 2/1144 (0.2%), p = 0.014; salbutamol 38/765 (5.0%) vs 5/1144 (0.4%), p < 0.001; and ipratropium 27/765 (3.5%) vs 134/1144 (11.7%), p < 0.001; in addition to any support with home medication, which is not queryable. Pre-treatment pain scores were documented more often than post-treatment scores in both groups (cancer: 58.7% vs 25.6%, p < 0.001; non-cancer: 57.4% vs 26.9%, p < 0.001). Conclusion: Non-cancer patients represent an important proportion of palliative care calls for paramedics. Cancer and non-cancer patients had very similar CCs and received similar treatment, although in low proportions, despite pre-launch findings that non-cancer conditions were likely to be undertreated. Pain scores remain underutilized. Further research into the underlying reason(s) is required to improve the support of non-cancer patients by paramedics.
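The comparison pattern above (many therapy-level group comparisons with a Bonferroni-adjusted alpha of roughly 0.05/7 ≈ 0.007) can be sketched as follows. The proportions mirror one comparison from the abstract (morphine, 10.8% vs 4.8%), but the simulated individual-level data are not the study's.

```python
# Minimal sketch: Welch t-test on a binary "treated" indicator across two
# groups, judged against a Bonferroni-corrected significance threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alpha = 0.05 / 7          # Bonferroni correction over ~7 comparisons -> 0.007
cancer = rng.binomial(1, 0.108, 765)       # e.g. morphine given: 10.8% of 765
non_cancer = rng.binomial(1, 0.048, 1144)  # 4.8% of 1144
t, p = stats.ttest_ind(cancer, non_cancer, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4g}, significant at alpha = {alpha:.3f}: {p < alpha}")
```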
Field studies were conducted near Ridgetown, ON, Canada, in 2016 and 2017 to determine the possible rate and timing of nicosulfuron to suppress annual ryegrass (ARG) seeded as a cover crop at the time of corn planting without affecting corn performance. Nicosulfuron was applied at rates from 0.8 to 50 g ai ha⁻¹ when the ARG was at the two- to three- or four- to five-leaf stage, approximately 3 or 4 wk after emergence of both corn and ARG. There were no differences between the two application timings in grain yield responses or ARG suppression. As the rate of nicosulfuron increased from 0.8 to 50 g ai ha⁻¹, ARG was suppressed 6% to 76% and 5% to 96% at 1 and 4 wk after application (WAA), respectively. At 4 WAA, ARG biomass decreased from 29 to 1 g m⁻² as the rate of nicosulfuron increased from 0.8 to 50 g ai ha⁻¹, compared to 36 g m⁻² in the untreated control. Where nicosulfuron was not applied to ARG, grain corn yield was reduced by 6% compared to the ARG-free control; similar effects on corn yield were observed with nicosulfuron at the lowest rate of 0.8 g ai ha⁻¹. Grain corn yield was reduced by 2.5% with the application of nicosulfuron at 25 g ai ha⁻¹ (the label rate for corn) compared to the ARG-free control, but this reduction was not statistically significant. This study identified rates of nicosulfuron that suppressed ARG that emerged on approximately the same day as corn, but there was evidence that grain corn yields were lowered because of interference, possibly during the critical weed control period. Based on this study, an ARG cover crop should not be seeded at the same time as corn unless one is willing to accept a risk of corn grain yield losses for the sake of the cover crop.
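Rate-titration data of this kind (suppression rising from roughly 5% to 96% across 0.8 to 50 g ai ha⁻¹) are commonly summarized with a log-logistic dose-response curve, although the abstract does not state which regression model was used. The sketch below fits a three-parameter log-logistic curve to illustrative values; the intermediate doses and responses are assumptions, not the study's measurements.

```python
# Illustrative log-logistic dose-response fit for visible ARG suppression (%)
# as a function of nicosulfuron rate (g ai/ha). Doses and responses are
# made up for demonstration; only the endpoints echo the abstract.
import numpy as np
from scipy.optimize import curve_fit

rate = np.array([0.8, 1.6, 3.1, 6.3, 12.5, 25.0, 50.0])            # g ai/ha (assumed)
suppression = np.array([5.0, 14.0, 30.0, 52.0, 74.0, 88.0, 96.0])  # % at 4 WAA (illustrative)

def log_logistic(dose, upper, ed50, slope):
    """Three-parameter log-logistic: response rises from 0 toward 'upper'."""
    return upper / (1.0 + (ed50 / dose) ** slope)

popt, _ = curve_fit(log_logistic, rate, suppression, p0=[100.0, 6.0, 1.0])
upper, ed50, slope = popt
print(f"ED50 ~ {ed50:.1f} g ai/ha, upper asymptote ~ {upper:.0f}%")
```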
The ‘Critically Endangered’ White-winged Flufftail Sarothrura ayresi is regarded as one of the rarest and most threatened rallids in Africa. The species’ low density, habitat preference, cryptic colouration, elusive behaviour and lack of auditory cues have made it one of the most challenging species to survey using traditional methods such as auditory surveys and rope dragging. Numerous data deficiencies exist regarding facets of the species’ ecology, distribution, habitat use and population status. A stratified array of nine camera localities was used within high-altitude palustrine wetland habitat to ascertain whether this non-invasive technique could successfully document the first estimate of site occupancy, fine-scale habitat use and activity patterns of this very rare species. Our study accumulated a total of 626 camera days and eight independent sightings of White-winged Flufftail across the austral summer season. Furthermore, our study confirms the applicability of camera trapping to other rare and elusive rallid species. Our results confirm that White-winged Flufftail is a low-density habitat specialist, with site occupancy influenced positively by basal and canopy vegetation cover and detection probability influenced negatively by water depth within associated wetland habitats. Activity pattern analyses showed that peak activity occurred at dawn and dusk, which yielded the highest degree of activity overlap with the only other migratory rallid recorded, Spotted Crake Porzana porzana. Our study also recorded the first apparent territorial display behaviour noted for the species. Our study supports the need for conservation initiatives focused on securing contiguous sections of suitable wetland habitat in order to accommodate the persistence of this globally threatened species.
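Camera-trap occupancy analyses of this kind jointly estimate the probability a site is occupied (psi) and the per-day detection probability (p) from detection histories. The sketch below is a minimal single-season occupancy likelihood on simulated data for nine sites; it omits the covariates the study used (vegetation cover, water depth), and all numbers are stand-ins.

```python
# Sketch of a single-season occupancy model: maximum-likelihood estimation of
# occupancy (psi) and daily detection (p) from simulated detection histories.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(3)
n_sites, n_days = 9, 70
true_psi, true_p = 0.4, 0.05
occupied = rng.random(n_sites) < true_psi
y = rng.random((n_sites, n_days)) < (true_p * occupied[:, None])  # detections

def neg_log_lik(params):
    psi, p = expit(params)          # logit scale keeps both in (0, 1)
    det = y.sum(axis=1)
    # Sites with >=1 detection are certainly occupied; undetected sites may be
    # either unoccupied or occupied-but-missed on every survey day.
    ll_detected = np.log(psi) + det * np.log(p) + (n_days - det) * np.log(1 - p)
    ll_empty = np.log(psi * (1 - p) ** n_days + (1 - psi))
    return -np.where(det > 0, ll_detected, ll_empty).sum()

res = minimize(neg_log_lik, x0=[0.0, -2.0])
print("psi, p estimates:", expit(res.x))
```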
This paper discusses the sustainability of livestock systems, emphasising bidirectional relations with animal health. We review conventional and contrarian thinking on sustainability and argue that in the most common approaches to understanding sustainability, health aspects have been under-examined. Literature review reveals deep concerns over the sustainability of livestock systems; we recognise that interventions are required to shift to more sustainable trajectories, and we explore approaches to prioritising them in different systems, focusing on interventions that lead to better health. A previously proposed three-tiered categorisation of ‘hot spots’, ‘cold spots’ and ‘worried well’ animal health trajectories provides a mental model that, by taking into consideration the different animal health status, animal health risks, service response needs and key drivers in each system, can help identify and implement interventions. Combining sustainability concepts with animal health trajectories allows for a richer analysis, and we apply this to three case studies drawn from North Africa and the Middle East; Bangladesh; and the Eastern Cape of South Africa. We conclude that the quest for sustainability of livestock production systems from the perspective of human and animal health is elusive and difficult to reconcile with the massive anticipated growth in demand for livestock products, mainly in low- and middle-income countries, as well as the aspirations of poor livestock keepers for better lives. Nevertheless, improving the health of livestock can contribute to health sustainability both by reducing the negative health impacts of livestock and by increasing the efficiency of production. However, the choice of the most appropriate options must be underpinned by an understanding of agro-ecology, economy and values. We argue that a new pillar of One Health should be added to the three traditional sustainability pillars of economics, society and environment when addressing livestock systems.
Glyphosate-resistant (GR) and multiple herbicide–resistant (groups 2 and 9) Canada fleabane have been confirmed in 30 and 23 counties in Ontario, respectively. The widespread incidence of herbicide-resistant Canada fleabane highlights the importance of developing integrated weed management strategies. One strategy is to suppress Canada fleabane using cover crops. Seventeen different cover crop monocultures or polycultures were seeded after winter wheat harvest in late summer to determine GR Canada fleabane suppression in corn grown the following growing season. All cover crop treatments seeded after wheat harvest suppressed GR Canada fleabane in corn the following year. At 4 wk after cover crop emergence (WAE), estimated cover crop ground cover ranged from 31% to 68%, density from 124 to 638 plants m⁻², and biomass from 29 to 109 g m⁻², depending on cover crop species. All of the cover crop treatments suppressed GR Canada fleabane in corn grown the following growing season from May to September compared to the no-cover-crop control. Among the treatments evaluated, annual ryegrass (ARG), crimson clover (CC)/ARG, oilseed radish (OSR)/CC/ARG, and OSR/CC/cereal rye (CR) were the best for suppression of GR Canada fleabane in corn. ARG alone or in combination with CC provided the most consistent GR Canada fleabane suppression, density reduction, and biomass reduction in corn. Grain corn yields were not affected by the use of the cover crops evaluated for Canada fleabane suppression.
Many novel therapeutic options for depression exist that are either not mentioned in clinical guidelines or recommended only for use in highly specialist services. The challenge faced by clinicians is when it might be appropriate to consider such ‘non-standard’ interventions. This analysis proposes a framework to aid this decision.
Declaration of interest
In the past 3 years R.H.M.W. has received support for research, expenses to attend conferences and fees for lecturing and consultancy work (including attending advisory boards) from various pharmaceutical companies including Astra Zeneca, Cyberonics, Eli Lilly, Janssen, LivaNova, Lundbeck, MyTomorrows, Otsuka, Pfizer, Roche, Servier, SPIMACO and Sunovion. D.M.B.C. has received fees from LivaNova for attending an advisory board. In the past 3 years A.J.C. has received fees for lecturing from Astra Zeneca and Lundbeck; fees for consulting from LivaNova, Janssen and Allergan; and research grant support from Lundbeck.
In the past 3 years A.C. has received fees for lecturing from pharmaceutical companies namely Lundbeck and Sunovion. In the past 3 years A.L.M. has received support for attending seminars and fees for consultancy work (including advisory board) from Medtronic Inc and LivaNova. R.M. holds joint research grants with a number of digital companies that investigate devices for depression including Alpha-stim, Big White Wall, P1vital, Intel, Johnson and Johnson and Lundbeck through his mindTech and CLAHRC EM roles. M.S. is an associate at Blueriver Consulting providing intelligence to NHS organisations, pharmaceutical and devices companies. He has received honoraria for presentations and advisory boards with Lundbeck, Eli Lilly, URGO, AstraZeneca, Phillips and Sanofi and holds shares in Johnson and Johnson. In the past 3 years P.R.A.S. has received support for research, expenses to attend conferences and fees for lecturing and consultancy work (including attending an advisory board) from life sciences companies including Corcept Therapeutics, Indivior and LivaNova. In the past 3 years P.S.T. has received consultancy fees as an advisory board member from the following companies: Galen Limited, Sunovion Pharmaceuticals Europe Ltd, myTomorrows and LivaNova. A.H.Y. has undertaken paid lectures and advisory boards for all major pharmaceutical companies with drugs used in affective and related disorders and LivaNova. He has received funding for investigator initiated studies from AstraZeneca, Eli Lilly, Lundbeck and Wyeth.
Objectives: This study investigated the relationship between close proximity to detonated blast munitions and cognitive functioning in OEF/OIF/OND Veterans. Methods: A total of 333 participants completed a comprehensive evaluation that included assessment of neuropsychological functions, psychiatric diagnoses and history of military and non-military brain injury. Participants were assigned to a Close-Range Blast Exposure (CBE) or Non-Close-Range Blast Exposure (nonCBE) group based on whether they had reported being exposed to at least one blast within 10 meters. Results: Groups were compared on principal component scores representing the domains of memory, verbal fluency, and complex attention (empirically derived from a battery of standardized cognitive tests), after adjusting for age, education, PTSD diagnosis, sleep quality, substance abuse disorder, and pain. The CBE group showed poorer performance on the memory component. Rates of clinical impairment were significantly higher in the CBE group on select CVLT-II indices. Exploratory analyses examined the effects of concussion and multiple blasts on test performance and revealed that the number of lifetime concussions did not contribute to memory performance. However, accumulating blast exposures at distances greater than 10 meters did contribute to poorer performance. Conclusions: Close proximity to detonated blast munitions may impact memory, and Veterans exposed to close-range blast are more likely to demonstrate clinically meaningful deficits. These findings were observed after statistically adjusting for comorbid factors. Results suggest that proximity to blast should be considered when assessing for memory deficits in returning Veterans. Comorbid psychiatric factors may not entirely account for cognitive difficulties. (JINS, 2018, 24, 466–475)
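The analysis pattern described above, deriving composite domain scores from a test battery via principal components and then comparing groups with covariate adjustment, can be sketched as follows. Test names, group labels and covariates are hypothetical placeholders; only the overall workflow mirrors the abstract.

```python
# Hedged sketch: PCA-derived cognitive domain scores, then a covariate-adjusted
# group comparison on one component. All data are simulated.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 333
tests = pd.DataFrame(rng.normal(size=(n, 6)),
                     columns=[f"test_{i}" for i in range(6)])
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(tests))
df = pd.DataFrame(scores, columns=["memory", "fluency", "attention"])
df["close_blast"] = rng.integers(0, 2, n)   # exposure within 10 m (hypothetical)
df["age"] = rng.normal(35, 9, n)
df["ptsd"] = rng.integers(0, 2, n)

# Group comparison on the memory component, adjusting for covariates.
fit = smf.ols("memory ~ close_blast + age + ptsd", data=df).fit()
print(fit.summary().tables[1])
```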
The Foodborne Diseases Active Surveillance Network (FoodNet) conducts population-based surveillance for Campylobacter infection. For 2010 through 2015, we compared patients with Campylobacter jejuni infection with patients with infections caused by other Campylobacter species. Campylobacter coli patients were more often >40 years of age (OR = 1·4), Asian (OR = 2·3), or Black (OR = 1·7), and more likely to live in an urban area (OR = 1·2), report international travel (OR = 1·5), and have an infection in autumn or winter (OR = 1·2). Campylobacter upsaliensis patients were more likely female (OR = 1·6), Hispanic (OR = 1·6), have a blood isolate (OR = 2·8), and have an infection in autumn or winter (OR = 1·7). Campylobacter lari patients were more likely to be >40 years of age (OR = 2·9) and have an infection in autumn or winter (OR = 1·7). Campylobacter fetus patients were more likely male (OR = 3·1), hospitalized (OR = 3·5), and have a blood isolate (OR = 44·1). International travel was associated with antimicrobial-resistant C. jejuni (OR = 12·5) and C. coli (OR = 12) infections. Species-level data are useful in understanding the epidemiology, sources, and resistance of infections.
Accurate and reproducible patient positioning is a critical step in radiotherapy for breast cancer, and the use of permanent skin markings has become standard practice in many centres. Permanent skin markings may have a negative impact on long-term cosmetic outcome, which may, in turn, have psychological implications in terms of body image. The aim of this study was to investigate the feasibility of using a semi-permanent tattooing device for the administration of skin marks for breast radiotherapy set-up.
Materials and methods
This was designed as a phase II double-blinded randomised controlled study comparing our standard permanent tattoos with the Precision Plus Micropigmentation (PPMS) device method. Patients referred for radical breast radiotherapy were eligible for the study. Each study participant had three marks applied using a randomised combination of the standard permanent and PPMS methods and was blinded to the type of each mark. Follow-up was at routine appointments until 24 months post-radiotherapy. Participants and a blind assessor were invited to score the visibility of each tattoo at each follow-up using a Visual Analogue Scale. Tattoo scores at each time point and the change in tattoo scores at 24 months were analysed by a general linear model using the patient as a fixed effect and the type of tattoo (standard or research) as a covariate. A simple questionnaire was used to assess radiographer feedback on using the PPMS device.
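As a rough sketch of the model just described, an ordinary least squares general linear model with patient as a categorical fixed effect and tattoo type as a covariate, the following Python fragment fits such a model to simulated visual analogue scores. The scores and effect sizes are fabricated for illustration.

```python
# Hedged sketch: OLS GLM of tattoo visibility with patient as a fixed effect
# (categorical) and tattoo type (standard vs research) as covariate.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
patients = np.repeat(np.arange(55), 3)                 # 3 marks per patient
tattoo_type = rng.choice(["standard", "research"], size=patients.size)
score = (50 + 5 * (tattoo_type == "research")
         + rng.normal(0, 10, patients.size))           # fabricated VAS scores

df = pd.DataFrame({"patient": patients, "type": tattoo_type, "score": score})
fit = smf.ols("score ~ C(patient) + C(type)", data=df).fit()
print(fit.params.filter(like="C(type)"))               # effect of tattoo type
print(f"p-value: {fit.pvalues.filter(like='C(type)').iloc[0]:.3f}")
```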
Results
In total, 60 patients were recruited to the study, of whom 55 were available for follow-up at 24 months. Semi-permanent tattoos demonstrated a greater degree of fade than the permanent tattoos at 24 months (the final time point) post completion of radiotherapy. This difference was not statistically significant, although it was more apparent in the patient scores (p=0·071) than in the blind assessor scores (p=0·27). No semi-permanent tattoos required re-marking before the end of radiotherapy, and no adverse skin reactions were observed.
Conclusion
The PPMS presents a safe and feasible alternative to our permanent tattooing method. An extended period of follow-up is required to fully assess the extent of semi-permanent tattoo fade.
Steroid nasal irrigation for chronic rhinosinusitis patients following endoscopic sinus surgery reduces symptom recurrence, but there are minimal safety data on which to recommend this treatment. This study evaluated the safety of betamethasone nasal irrigation by measuring its impact on endogenous cortisol levels.
Methods:
Participants performed daily betamethasone nasal irrigation for six weeks. The impact was assessed by comparing pre- and post-intervention serum and 24-hour urinary free cortisol levels. Efficacy was evaluated using the 22-item Sino-Nasal Outcome Test.
Results:
Thirty participants completed the study (16 females and 14 males; mean age = 53.9 ± 15.6 years). Serum cortisol levels were unchanged (p = 0.28). However, 24-hour urinary free cortisol levels decreased (47.5 vs 41.5 nmol per 24 hours; p = 0.025). Sino-Nasal Outcome Test scores improved (41.13 ± 21.94 vs 23.4 ± 18.17; p < 0.001). The minimal clinically important difference was reached in 63 per cent of participants.
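The pre/post comparisons above pair each participant's baseline measurement with their own post-intervention value. The abstract does not state which paired test was used; a paired t-test is one conventional option, sketched below on simulated cortisol values that merely echo the reported means.

```python
# Minimal sketch of a paired pre/post comparison (n = 30 participants).
# Simulated 24-h urinary free cortisol values, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
pre = rng.normal(47.5, 12, 30)            # nmol per 24 h, baseline
post = pre - rng.normal(6.0, 10, 30)      # simulated within-person decrease
t, p = stats.ttest_rel(pre, post)
print(f"mean pre = {pre.mean():.1f}, mean post = {post.mean():.1f}, p = {p:.3f}")
```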
Conclusion:
Daily betamethasone nasal irrigation is an efficacious treatment modality not associated with changes in morning serum cortisol levels. The changes in 24-hour urinary free cortisol levels are considered clinically negligible. Hence, continued use of betamethasone nasal irrigation remains a viable and safe treatment option for chronic rhinosinusitis patients following functional endoscopic sinus surgery.
The livestock sector is one of the fastest growing subsectors of the agricultural economy and, while it makes a major contribution to global food supply and economic development, it also consumes significant amounts of natural resources and alters the environment. In order to improve our understanding of the global environmental impact of livestock supply chains, the Food and Agriculture Organization of the United Nations has developed the Global Livestock Environmental Assessment Model (GLEAM). The purpose of this paper is to provide a review of GLEAM. Specifically, it explains the model architecture, methods and functionality, that is, the types of analysis that the model can perform. The model focuses primarily on the quantification of greenhouse gas emissions arising from the production of the 11 main livestock commodities. The model inputs and outputs are managed and produced as raster data sets, with a spatial resolution of 0.05 decimal degrees. The Global Livestock Environmental Assessment Model v1.0 consists of five distinct modules: (a) the Herd Module; (b) the Manure Module; (c) the Feed Module; (d) the System Module; and (e) the Allocation Module. In terms of the modelling approach, GLEAM has several advantages. For example, spatial information on livestock distributions and crop yields enables rations to be derived that reflect the local availability of feed resources in developing countries. The Global Livestock Environmental Assessment Model also contains a herd model that enables livestock statistics to be disaggregated and variation in livestock performance and management to be captured. Priorities for future development of GLEAM include: improving data quality and the methods used to perform emissions calculations; extending the scope of the model to include selected additional environmental impacts and to enable predictive modelling; and improving the utility of GLEAM output.
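The five-module chain described above can be pictured as a pipeline in which each module enriches the output of the previous one. The sketch below is a purely structural illustration: the real model operates on 0.05-degree raster inputs, whereas here each module is a placeholder function passing a dictionary downstream, and every numeric factor is a stand-in rather than a GLEAM coefficient.

```python
# Structural sketch of GLEAM's five-module chain (Herd -> Manure -> Feed ->
# System -> Allocation). All computations are illustrative stand-ins.
from typing import Dict

def herd_module(stats: Dict) -> Dict:
    """Disaggregate livestock statistics into cohorts (placeholder split)."""
    return {**stats, "cohorts": {"adult_females": 0.4 * stats["head_count"]}}

def manure_module(data: Dict) -> Dict:
    return {**data, "manure_n2o_kg": 0.01 * data["head_count"]}   # stand-in factor

def feed_module(data: Dict) -> Dict:
    return {**data, "feed_ch4_kg": 0.05 * data["head_count"]}     # stand-in factor

def system_module(data: Dict) -> Dict:
    # Combine gases into CO2-equivalents (28 and 265 are AR5 GWP100 values).
    total = data["feed_ch4_kg"] * 28 + data["manure_n2o_kg"] * 265
    return {**data, "emissions_co2e_kg": total}

def allocation_module(data: Dict) -> Dict:
    # Allocate total emissions to a commodity output, e.g. per kg of protein.
    return {**data,
            "co2e_per_kg_protein": data["emissions_co2e_kg"] / data["protein_kg"]}

result = allocation_module(system_module(feed_module(manure_module(
    herd_module({"head_count": 1000.0, "protein_kg": 5000.0})))))
print(f"{result['co2e_per_kg_protein']:.2f} kg CO2e per kg protein")
```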
This paper is the fourth in a 5-part series that focuses on educating and training the clinical and translational science workforce. The goal of this paper is to delineate components of effective career development programs that go beyond didactic training. All academic health centers with a Clinical and Translational Science Award have a KL2 career development award for junior faculty, and many also have a TL1 training program for predoctoral and postdoctoral fellows. The training across these programs varies; however, junior investigators across the United States experience similar challenges. Junior investigators can get overwhelmed with the demands of building their own research program, particularly in academia. Often, they are sidetracked by competing demands that can derail their progress. In these situations, junior investigators experience frustration and may search for alternative career paths. By providing them with additional professional skills in the 5 domains of (1) self-awareness; (2) selecting the right topic and securing funding; (3) getting adequate support; (4) working with others; and (5) managing yourself, your career, and your demands, we aim to give junior investigators additional tools to manage these demands and facilitate their own career success.