Sleep disturbances are prevalent in cancer patients, especially those with advanced disease. There are few published intervention studies that address sleep issues in advanced cancer patients during the course of treatment. This study assesses the impact of a multidisciplinary quality of life (QOL) intervention on subjective sleep difficulties in patients with advanced cancer.
This randomized trial investigated the comparative effects of a multidisciplinary QOL intervention (n = 54) vs. standard care (n = 63) on sleep quality, a secondary endpoint, in patients with advanced cancer receiving radiation therapy. The intervention group attended six intervention sessions, while the standard care group received informational material only. Sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI) and Epworth Sleepiness Scale (ESS), administered at baseline and weeks 4 (post-intervention), 27, and 52.
The intervention group showed statistically significantly greater improvement than the control group at week 4 in the PSQI total score and in two components, sleep quality and daytime dysfunction. At week 27, although both groups showed improvements in sleep measures from baseline, there were no statistically significant between-group differences in the PSQI total or component scores, or in the ESS. At week 52, the intervention group used less sleep medication than the control group relative to baseline (p = 0.04) and had a lower ESS score (7.6 vs. 9.3, p = 0.03).
Significance of results
A multidisciplinary intervention to improve QOL can also improve the sleep quality of advanced cancer patients undergoing radiation therapy. Patients who completed the intervention also reported using less sleep medication.
Total vegetation control (TVC) is an essential management practice to eliminate all vegetation for the purpose of protecting infrastructure, people, or natural resources on sites where vegetation poses major fire, visibility, and infrastructure risks. TVC is implemented on sites such as railroads, power substations, airports, roadsides, and oil and gas facilities. Current research has identified that tank-mixing two effective mechanisms of action is a superior resistance management strategy compared to rotating mechanisms of action; however, effective tank mixes for TVC have not been thoroughly evaluated. A field experiment was conducted from 2013 to 2014 at five sites in Colorado to compare 32 treatment combinations to two industry standards for TVC. Research objectives were (1) to identify herbicide tank-mix combinations for TVC with multiple effective mechanisms of action for resistance management, (2) to evaluate lower use rate alternatives to minimize nontarget impacts, and (3) to determine the efficacy of fall versus spring application timings. Seven treatments were identified as top-ranking treatments, averaging 96% bare-ground (BG) across five sites and two application timings. Four out of the seven top-ranked treatments included aminocyclopyrachlor, chlorsulfuron, and indaziflam. The industry standard diuron plus imazapyr was in the top ranking, whereas the other industry standard bromacil plus diuron performed inconsistently across sites. Probability modeling was used to predict the probability of achieving 97% or 100% BG with various treatment combinations. The combination of aminocyclopyrachlor, chlorsulfuron, indaziflam, and imazapyr had the highest predicted BG probability, with 88% predicted probability of achieving 100% BG, compared to 67% and 52% predicted probabilities for the industry standards diuron plus imazapyr and bromacil plus diuron, respectively. 
In three of the five sites, fall applications outperformed the same treatments applied in the spring. Several top-ranking treatments represent newer, lower use rate herbicide combinations that provide multiple mechanisms of action to manage herbicide-resistant weeds and minimize nontarget impacts.
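The probability comparison above can be made concrete by converting each predicted probability of achieving 100% BG to odds, the scale on which logistic-type probability models operate. A minimal sketch using only the three probabilities reported in the abstract (treatment labels are abbreviated; no model coefficients are implied):

```python
# Reported predicted probabilities of achieving 100% bare ground (BG).
predicted = {
    "aminocyclopyrachlor + chlorsulfuron + indaziflam + imazapyr": 0.88,
    "diuron + imazapyr (industry standard)": 0.67,
    "bromacil + diuron (industry standard)": 0.52,
}

def odds(p):
    """Convert a probability to odds, the natural scale of a logistic model."""
    return p / (1.0 - p)

for name, p in predicted.items():
    print(f"{name}: P = {p:.2f}, odds = {odds(p):.2f}")

# Odds ratio of the best mix relative to the weaker industry standard:
best_vs_bromacil = odds(0.88) / odds(0.52)
```

On the odds scale the gap is larger than the raw probabilities suggest: 0.88 corresponds to odds of about 7.3, versus roughly 1.1 for 0.52.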
The diurnal feeding patterns of dairy cows affect the 24 h robot utilisation of pasture-based automatic milking systems (AMS). A decline in robot utilisation between 2400 and 0600 h currently occurs in pasture-based AMS, as cow feeding activity is greatly reduced during this time. Here, we investigate the effect of a temporal variation in feed quality and quantity on cow feeding behaviour between 2400 and 0600 h as a potential tool to increase voluntary cow trafficking in an AMS at night. The day was allocated into four equal feeding periods (0600 to 1200, 1200 to 1800, 1800 to 2400 and 2400 to 0600 h). Lucerne hay cubes (CP = 19.1%, water soluble carbohydrate = 3.8%) and oat, ryegrass and clover hay cubes with 20% molasses (CP = 11.8%, water soluble carbohydrate = 10.7%) were offered as the ‘standard’ and ‘preferred’ (preference determined previously) feed types, respectively. The four treatments were (1) standard feed offered ad libitum (AL) throughout 24 h; (2) as per AL, with preferred feed replacing standard feed between 2400 and 0600 h (AL + P); (3) standard feed offered at a restricted rate, with quantity varying between each feeding period (20:10:30:60%, respectively) as a proportion of the (previously) measured daily ad libitum intake (VA); (4) as per VA, with preferred feed replacing standard feed between 2400 and 0600 h (VA + P). Eight non-lactating dairy cows were used in a 4 × 4 Latin square design. During each experimental period, treatment cows were fed for 7 days, including 3 days habituation and 4 days data collection. Total daily intake was approximately 8% greater (P < 0.001) for the AL and AL + P treatments (23.1 and 22.9 kg DM/cow) as compared with the VA and VA + P treatments (21.6 and 20.9 kg DM/cow). The AL + P and VA treatments had 21% and 90% greater (P < 0.001) dry matter intake (DMI) between 2400 and 0600 h, respectively, compared with the AL treatment. In contrast, the VA + P treatment had similar DMI to the VA treatment.
Our experiment shows that cow feeding activity at night can be increased by varying feed type and quantity, though a penalty to total DMI may occur under the VA treatments. Further research is required to determine whether implementing variable feed allocation on pasture-based AMS farms improves milking robot utilisation by increasing cow feeding activity at night.
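The 4 × 4 Latin square design mentioned above balances treatment order across animals and periods. A minimal sketch of one such square (a fixed cyclic layout for illustration only; real designs are randomised, and this is not the study's actual allocation):

```python
treatments = ["AL", "AL+P", "VA", "VA+P"]
n = len(treatments)

# Cyclic 4 x 4 Latin square: each treatment occurs exactly once in every
# row (experimental period) and every column (cow sequence).
square = [[treatments[(row + col) % n] for col in range(n)] for row in range(n)]

for row in square:
    print("  ".join(f"{t:4}" for t in row))
```

The defining property, which the balanced design relies on, is that no treatment repeats within any row or column.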
Achieving a consistent level of robot utilisation throughout 24 h maximises automatic milking system (AMS) utilisation. However, levels of robot utilisation in the early morning hours are typically low, caused by the diurnal feeding behaviour of cows, limiting the inherent capacity and total production of pasture-based AMS. Our objective was to determine robot utilisation throughout 24 h by dairy cows, based on milking frequency (MF; milking events per animal per day) in a pasture-based AMS. Milking data were collected from January and February 2013 across 56 days, from a single herd of 186 animals (Bos taurus) utilising three Lely A3 robotic milking units, located in Tasmania, Australia. The dairy herd was categorised into three equal-sized groups (n = 62 per group) according to the cow’s mean daily MF over the duration of the study. Robot utilisation was characterised by an interaction (P < 0.001) between the three MF groups and time of day, with peak milking time for high MF cows within 1 h of a fresh pasture allocation becoming available, followed by the medium MF and low MF cows 2 and 4 h later, respectively. Cows in the high MF group also presented for milking between 2400 and 0600 h more frequently (77% of nights), compared to the medium MF group (57%) and low MF group (50%). This study has shown the formation of three distinct groups of cows within a herd, based on their MF levels. Further work is required to determine if this finding is replicated across other pasture-based AMS farms.
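The MF grouping described above is a simple tertile split of the ranked herd. A sketch with hypothetical MF values (the study grouped cows by their observed mean daily MF over 56 days, not simulated values):

```python
import random

random.seed(42)
# Hypothetical mean daily milking frequencies for a herd of 186 cows.
herd_mf = [round(random.uniform(1.0, 3.5), 2) for _ in range(186)]

ranked = sorted(herd_mf)  # low -> high MF
low_mf, med_mf, high_mf = ranked[:62], ranked[62:124], ranked[124:]

print(len(low_mf), len(med_mf), len(high_mf))  # three equal-sized groups
```

Because the split is made on the ranked values, every cow in the high group has an MF at least as large as every cow in the medium group, and so on down.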
Multi-sire mating of a mob of ewes is commonly used in commercial sheep production systems. However, ram mating success (defined as the number of lambs sired by an individual) can vary between rams in the mating group. If this trait were repeatable and heritable, selection of rams capable of siring larger numbers of lambs could reduce the number of rams required for mating and ultimately lead to increased genetic gain. However, genetic correlations with other productive traits, such as growth and female fertility, could influence the potential for ram mating success to be used as a selection trait. In order to investigate this trait, parentage records (including accuracy of sire assignment) from 15 commercial ram breeding flocks of various breeds were utilised to examine the repeatability and heritability of ram mating success in multi-sire mating groups. In addition, genetic and phenotypic correlations with growth and female fertility traits were estimated using ASReml. The final model used for the ram mating success traits included age of the ram and mating group as fixed effects. Older rams (3+ years old) had 15% to 20% greater mating success than younger rams (1 or 2 years of age). Increasing the stringency of the criteria for inclusion of either an individual lamb, based on accuracy of sire assignment, or a whole mating group, based on how many lambs had an assigned sire, increased repeatability and heritability estimates of the ram mating success traits examined. With the most stringent criteria employed, where accuracy of sire assignment was >0.95 and the proportion of lambs in the progeny group that failed to have a sire assigned was <0.05, repeatability and heritability for loge(number of lambs) were 0.40 ± 0.09 and 0.26 ± 0.12, respectively. For proportion of lambs sired, repeatability and heritability were both 0.30 ± 0.09.
The two ram mating traits (loge(nlamb) and proportion) were highly correlated, both phenotypically and genetically (0.88±0.01 and 0.94±0.06, respectively). Both phenotypic and genetic correlations between ram mating success and growth and other female fertility traits were low and non-significant. In conclusion, there is scope to select rams capable of producing high numbers of progeny and thus increase selection pressure on rams to increase genetic gain.
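The repeatability and heritability estimates above are ratios of variance components from the fitted mixed model. The formulas below are standard quantitative genetics; the variance values are hypothetical, chosen only so the ratios reproduce the reported estimates for loge(number of lambs):

```python
def repeatability(var_a, var_pe, var_e):
    # (additive genetic + permanent environmental) / total phenotypic variance
    return (var_a + var_pe) / (var_a + var_pe + var_e)

def heritability(var_a, var_pe, var_e):
    # additive genetic / total phenotypic variance (narrow sense)
    return var_a / (var_a + var_pe + var_e)

# Hypothetical variance components (scaled so the total is 1.0):
var_a, var_pe, var_e = 0.26, 0.14, 0.60

r = repeatability(var_a, var_pe, var_e)   # ~0.40, matching the report
h2 = heritability(var_a, var_pe, var_e)   # ~0.26, matching the report
```

Repeatability is necessarily at least as large as heritability, since it adds the permanent environmental variance to the numerator.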
Chronic suppurative otitis media is a major public health problem in numerous low- and middle-income countries. Unfortunately, few of these countries can offer surgical therapy.
A six-month programme focused on training local surgeons in type I tympanoplasty was initiated in Cambodia. Qualitative educational and quantitative surgical outcomes were evaluated in the 12 months following programme completion. A four-month training programme in mastoidectomy and homograft ossiculoplasty was subsequently implemented, and the preliminary surgical and educational outcomes were reported.
A total of 124 patients underwent tympanoplasty by the locally trained surgeons. Tympanic membrane closure at six weeks post-operation was 88.5 per cent. Pure tone audiometry at three months showed that 80.9 per cent of patients had improved hearing, with a mean gain of 17.1 dB. The trained surgeons reported high confidence in performing tympanoplasty. Early outcomes suggest the local surgeons can perform mastoidectomy and ossiculoplasty as safely as overseas-trained surgeons, with reported surgeon confidence reflecting these positive outcomes.
The training programme has demonstrated success, as measured by surgeon confidence and operative outcomes. This approach can be emulated in other settings to help combat the global burden of chronic suppurative otitis media.
Introduction: An increasing number of studies have been published since 2011 investigating the benefits of in situ simulation as a quality improvement (QI) modality. We instituted an emergency department (ED) in situ simulation program at Kelowna General Hospital in 2015 with the aims of improving inter-professional collaboration, improving team communication, developing resident resuscitation leadership skills, educating ED professionals on resuscitation medical expertise, and identifying QI action items from each simulation session. Methods: We applied the SMART framework. Our specific, measurable, and attainable goal was to select two QI action items discovered from each simulation session. Realistic and timely follow-up on each action item was conducted by the nurse educator group, who reported back to the local ED network, pharmacy, or manager depending on the action item. This ensured sustainability of our model. Results: A total of 65 individuals participated in 2015 at program inception. This increased to 213 individuals in 2017 with an average of 24 participants/session. Attendees included nurses (31%), ED physicians (20%), ED residents (18%), paramedics (10%), and medical students, respiratory therapists, pharmacists, and others (21%). Our QI action items were grouped as (1) team/communication, (2) equipment/resources, and (3) knowledge/tasks. Examples of each category were: (1) Inability to hear paramedic bedside reports, resulting in reinforcement of one paramedic speaking while the team remains quiet, (2) Difficulty in looking up medication information in the resuscitation bay, resulting in installation of an additional computer in the resuscitation bay, and (3) Uncertainty of the local process for initiating extracorporeal membrane oxygenation (ECMO) in the ED, resulting in review of team placement, patient transfer, and initiation of ECMO lines in the ED.
Inter-professional team members have reported, through electronic feedback, on the value of these sessions, including improved inter-agency cooperation and understanding. Conclusion: This quality improvement initiative used in situ simulation as a QI tool. We were able to identify latent safety threats, test new patient care protocols, find equipment issues, and foster teamwork in a sustainable way to improve the quality of care in our ED. We hope that this serves as encouragement to others who are initiating a similar program. Our main suggestions after reflection include: (1) Engage a multidisciplinary team in the development of an in situ simulation program, (2) Start with aims and objectives, (3) Foster attendance and buy-in by making it convenient for people to attend, (4) Celebrate your successes through interdepartmental communication, and (5) Recruit individuals with expertise in simulation-based education.
In rainfed lowland rice-based systems, increasing labour scarcity due to off-farm employment is encouraging farmers to switch from transplanting to dry direct seeding (DDS). To assure stable productivity at a level comparable with or superior to transplanting, DDS management must ensure rice seedlings have access to nutrients in order to be competitive with weeds, which must also be suppressed. This paper examined farmer perceptions of DDS using a farmer survey, and used on-farm experiments to examine responses of rainfed lowland rice to integrated nutrient–weed management, based around mechanised DDS. In the survey, weeds were the biggest problem faced by farmers in using DDS (61%). In 90% of cases, farmers reported that weeds had increased under DDS, with most farmers (78%) controlling weeds by hand. All farmers said they would use DDS in the following season (100%), due to labour savings (47%), timeliness of operations, improved productivity, low investment or a combination of these (44%). In on-farm experiments, banding nutrients with the seed at sowing enhanced early dry matter of rice, while early weed dry matter was reduced. Early weed control using ducklings or hand weeding reduced weed competition and increased rice growth, with ducklings providing additional yield benefits over hand weeding. Early increases in seedling vigour of rice, and in weed suppression, carried through to greater dry matter and yield of rice at maturity. Integrated nutrient–weed management in mechanised DDS increased DDS yields, reduced DDS yield variability and contributed to sustainability of DDS rice systems.
In this essay, we discuss the under-representation of women in leadership positions in global health (GH) and the importance of mentorship to advance women's standing in the field. We then describe the mentorship model of GROW, Global Research for Women. We describe the theoretical origins of the model and an adapted theory of change explaining how the GROW model for mentorship advances women's careers in GH. We present testimonials from a range of mentees who participated in a pilot of the GROW model since 2015. These mentees describe the capability-enhancing benefits of their mentorship experience with GROW. Thus, preliminary findings suggest that the GROW mentorship model is a promising strategy to build women's leadership in GH. We discuss supplemental strategies under consideration and next steps to assess the impact of GROW, providing the evidence to inform best practices for curricula elsewhere to build women's leadership in GH.
Early nutrition of the neonatal pig has a major impact on its survival and subsequent development (Cieslak et al., 1983). Maternal nutrition trials have had limited success in improving the survival and growth performance of piglets. Milk yield and composition have been altered (Jackson et al., 1995; Averette et al., 1999), which subsequently enhanced piglet health and growth performance, but feeding supplemental fat had little or no effect on the birth weight of piglets. The aim of this study was to examine the effect of supplementing palm and/or soya oil directly to the piglet on its subsequent growth performance.
A substantial and continual economic loss within the pig industry is the 5–20% pre-weaning mortality rate that occurs during the neonatal period (MLC, 2002). The principal causes of piglet death are low birth weight in conjunction with insufficient body fat reserves (Herpin et al., 1993; Varley, 1995). Studies by Rooke et al. (2000) have demonstrated that the fatty acid profile of the sow’s diet during late pregnancy and lactation is an important factor influencing piglet performance. The benefits of dietary manipulations aimed at improving piglet survival, however, remain controversial. The aim of this study was to examine the effect of supplementing the maternal diet with palm and/or soya oil during late gestation on piglet growth performance.
In Ireland, National Clinical Programmes are being established to improve and standardise patient care throughout the Health Service Executive. In line with internationally recognised guidelines on the treatment of first episode psychosis the Early Intervention in Psychosis (EIP) programme is being drafted with a view to implementation by mental health services across the country. We undertook a review of patients presenting with a first episode of psychosis to the Dublin Southwest Mental Health Service before the implementation of the EIP. This baseline information will be used to measure the efficacy of our EIP programme.
Patients who presented with a first episode psychosis were retrospectively identified through case note reviews and consultation with treating teams. We gathered demographic and clinical information from patients as well as data on treatment provision over a 2-year period from the time of first presentation. Data included age at first presentation, duration of untreated psychosis, diagnosis, referral source, antipsychotic prescribing rates and dosing, rates of provision of psychological interventions and standards of physical healthcare monitoring. Outcome measures with regards to rates of admission over a 2-year period following initial presentation were also recorded.
In total, 66 cases were identified. The majority were male, single, unemployed and living with their family or spouse. The mean age at first presentation was 31 years with a mean duration of untreated psychosis of 17 months. Just under one-third were diagnosed with schizophrenia. Approximately half of the patients had no contact with a health service before presentation. The majority of patients presented through the emergency department. Two-thirds of all patients had a hospital admission within 2 years of presentation and almost one quarter of patients had an involuntary admission. The majority of patients were prescribed antipsychotic doses within recommended British National Formulary guidelines. Most patients received individual support through their keyworker and family intervention was provided in the majority of cases. Only a small number received formal Cognitive-Behavioural Therapy. Physical healthcare monitoring was insufficiently recorded in the majority of patients.
There is a shortage of information on the profile and treatment of patients presenting with a first episode of psychosis in Ireland. This baseline information is important in evaluating the efficacy of any new programme for this patient group. Many aspects of good practice were identified within the service, in particular with regard to the appropriate prescribing of antipsychotic medication and the rates of family intervention. Deficiencies remain, however, in the monitoring of physical health and the provision of formal psychological interventions to patients. With the implementation of an EIP programme, it is hoped that service provision will improve nationwide to internationally recognised standards.
Introduction: Palliative care is a broad approach to care for patients with serious or life-threatening illnesses. This includes relief of symptoms, such as pain, that interfere with a patient’s quality of life. It therefore falls firmly within the realm of emergency medicine (EM). Ninety-four per cent of emergency physicians report a need for education in dealing with death and dying. Nevertheless, there are no generally agreed upon competencies for Canadian EM residents with regard to palliative care and end of life care in the emergency department (ED). We performed a cross-sectional study of Canadian EM residency programs to measure the existing curricula in palliative and end of life care. Our primary outcome was the prevalence of structured educational programs for palliative and end of life care. Methods: An e-survey was e-mailed to all program directors of both CCFP(EM) and EM post-graduate training programs countrywide, using FluidSurveys™. It included questions regarding current palliative and end of life care curricula, from formal rotations to seminars and online modules. The survey was developed in consultation with the author group, including specialists in education, palliative care medicine, emergency medicine, and medical education. Translators were employed to include French-speaking programs in Canada. This study had ethical approval: Interior Health REB and UBC CREB certificate 2016-17-026-H. Results: The survey was open from October 12th to December 19th, 2016. During that time, we received 26 responses including 5 French-speaking programs, for a response rate of 72.2%. The primary outcome was present in 38.5% of programs. There was no difference between FRCP and CCFP(EM) programs in the occurrence of the primary outcome (p=1; Fisher’s exact test). However, CCFP(EM) program directors commented that many of their residents had completed palliative care rotations in their family medicine training.
The largest barriers to education included time (84.6%), curriculum development (80.8%), and availability of instructors (50.0%). Conclusion: Our preliminary analysis shows that few Canadian post-graduate EM programs have a structured educational program pertaining to palliative and end of life care. Current barriers to education that can be addressed in future curricular initiatives include lack of time, curriculum development, and instructor availability.
Introduction/Innovation Concept: In 2015, the Royal College of Physicians and Surgeons of Canada set out to redefine the CanMEDS roles, including replacement of the “manager” role with that of the “leader”. This highlighted that leadership skills are crucially important as ongoing health care improvement occurs. This educational innovation was born out of a need for formal education in leadership and administration in post graduate emergency medicine training. Methods: Few post graduate emergency medicine training programs in Canada have leadership and administrative curricula involving either longitudinal or discrete 4 week rotations. We sought to create an evidence based leadership and administrative experience based on the CanMEDS roles. We adapted components of pre-existing rotations from other universities and selected competencies from Thoma et al. in order to compile a list of objectives. This was coupled with a reading list, various departmental, hospital, and regional meetings, a physician leadership training seminar, a departmental presentation, and a leadership project. Curriculum, Tool, or Material: The curriculum involved 4 weeks combining 8 emergency department (ED) clinical shifts with a leadership and administration component. The latter involved clinical interdepartmental meetings, a hospital medical advisory council (MAC) meeting, a provincial medical directors meeting, a health authority MAC meeting, and taking part in planning for an ED quality improvement initiative focused on triage. Attendance at a 2-day physician administrator leadership training seminar was also included. The reading list included books on leadership and references to ED quality improvement. In addition, exposure to a B.C. Ministry of Health document entitled, “Setting Priorities for the Health Care System,” the KGH Medical Staff Rules, and the B.C. Health Quality Matrix occurred.
A summary presentation to the full ED on change management and leadership in residency occurred at completion. Conclusion: This innovative leadership and administration elective was the culmination of a need to see more formal post graduate leadership training in residency. The rotation was based on the CanMEDS framework, particularly the “leader” competency, and was based on recent evidence regarding leadership and administration competencies in emergency medicine. We hope this serves as a potential model for other rotation based electives or core rotations that desire to blend leadership competencies with clinical emergency medicine.
The co-existence of stroke and HIV has increased in recent years, but the impact of HIV on post-stroke outcomes is poorly understood. We examined the impact of HIV on inpatient mortality, length of acute hospital stay and complications (pneumonia, respiratory failure, sepsis and convulsions) in hospitalized strokes in Thailand. All hospitalized strokes between 1 October 2004 and 31 January 2013 were included. Data were obtained from a National Insurance Database. Characteristics and outcomes for non-HIV and HIV patients were compared, and multivariate logistic and linear regression models were constructed to assess the above outcomes. Of 610 688 patients (mean age 63.4 years, 45.4% female), 0.14% (866) had HIV infection. HIV patients were younger, a higher proportion were male, and they had a higher prevalence of anaemia (P < 0.001) compared to non-HIV patients. Traditional cardiovascular risk factors, hypertension and diabetes, were more common in the non-HIV group (P < 0.001). After adjusting for age, sex, stroke type and co-morbidities, HIV infection was significantly associated with higher odds of sepsis [odds ratio (OR) 1.75, 95% confidence interval (CI) 1.29–2.4] and inpatient mortality (OR 2.15, 95% CI 1.8–2.56) compared to patients without HIV infection. The latter did not attenuate after controlling for complications (OR 2.20, 95% CI 1.83–2.64). HIV infection is associated with increased odds of sepsis and inpatient mortality after acute stroke.
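The adjusted odds ratios above come from multivariate logistic regression, where a Wald 95% CI is exp(β ± 1.96 × SE) on the log-odds scale. As a consistency check on the reported mortality estimate (a sketch only; the standard error is back-calculated from the published interval, not taken from the paper):

```python
import math

def se_from_ci(lower, upper, z=1.96):
    """Back-calculate the standard error of log(OR) from a reported 95% CI."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def wald_ci(odds_ratio, se, z=1.96):
    """Wald 95% confidence interval for an odds ratio."""
    beta = math.log(odds_ratio)
    return math.exp(beta - z * se), math.exp(beta + z * se)

# Reported: inpatient mortality, HIV vs. no HIV, OR 2.15 (95% CI 1.8-2.56)
se = se_from_ci(1.8, 2.56)
lower, upper = wald_ci(2.15, se)
# Recomputed bounds land close to the published 1.8 and 2.56.
```

The interval is symmetric on the log scale, which is why the published bounds (1.8 and 2.56) are not equidistant from 2.15 on the raw scale.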
“Fox Glacier”, Yukon Territory, has a history of surging and is at present in a quiescent period. In 1968 a gravity survey was carried out over the glacier, in order to find ice depths. The results indicate the glacier is thin with a maximum depth of 88 m.
A continuum mixture model of coupled ice-sheet/ice-stream dynamics has been developed within a conventional three-dimensional finite-difference model framework. The ice mass is areally divided into sheet-ice and stream-ice components. Dynamic evolution of each component is solved with coupling terms to describe mass exchange between flows. In this way, ice-stream fluxes can be incorporated in a rigorous dynamical model with only a doubling of computational cost. This paper presents simple model tests using the EISMINT experimental ice block, a 1500 km × 1500 km ice sheet which rests on a flat bed. Ice-stream behaviour is investigated for a range of coupling rules and activation scenarios. In simple tests presented here, we find that the viscous response time of source ice feeding the ice stream may be a factor limiting ice-stream vigour and longevity.
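The mass-exchange coupling can be sketched as an explicit update that moves ice between the sheet and stream components while conserving total mass in each cell. Everything below (1-D grid, rates, and the simple linear transfer rule) is hypothetical and far simpler than the actual three-dimensional finite-difference model:

```python
n_cells = 10                 # 1-D grid for brevity; the real model is 3-D
dt = 1.0                     # time step (years), hypothetical
k_exchange = 0.05            # sheet-to-stream transfer rate (1/yr), hypothetical

sheet = [100.0] * n_cells    # sheet-ice thickness (m)
stream = [0.0] * n_cells     # stream-ice thickness (m)

for _ in range(50):          # 50 explicit time steps
    for i in range(n_cells):
        moved = k_exchange * sheet[i] * dt   # mass exchanged this step
        sheet[i] -= moved
        stream[i] += moved

# The coupling terms redistribute mass but never create or destroy it:
total_thickness = sum(sheet) + sum(stream)
```

Because the transfer drains the sheet exponentially, the sketch also illustrates the abstract's closing point: the rate at which source ice can be supplied limits how vigorous and long-lived the stream component can be.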