Sleep disturbances are prevalent in cancer patients, especially those with advanced disease. There are few published intervention studies that address sleep issues in advanced cancer patients during the course of treatment. This study assesses the impact of a multidisciplinary quality of life (QOL) intervention on subjective sleep difficulties in patients with advanced cancer.
This randomized trial investigated, as a secondary endpoint, the comparative effects of a multidisciplinary QOL intervention (n = 54) vs. standard care (n = 63) on sleep quality in patients with advanced cancer receiving radiation therapy. The intervention group attended six intervention sessions, while the standard care group received informational material only. Sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI) and Epworth Sleepiness Scale (ESS), administered at baseline and at weeks 4 (post-intervention), 27, and 52.
At week 4, the intervention group showed statistically significant improvements relative to the control group in the PSQI total score and in two of its components, sleep quality and daytime dysfunction. At week 27, although both groups showed improvements in sleep measures from baseline, there were no statistically significant differences between groups in the PSQI total score, any PSQI component score, or the ESS. At week 52, the intervention group used less sleep medication relative to baseline than control patients (p = 0.04) and had a lower ESS score (7.6 vs. 9.3, p = 0.03).
Significance of results
A multidisciplinary intervention to improve QOL can also improve the sleep quality of advanced cancer patients undergoing radiation therapy. Patients who completed the intervention also reported using less sleep medication.
Background: There is an emerging evidence base that mindfulness for psychosis is a safe and effective intervention. However, empirical data on the within-session effects of mindfulness meditation were hitherto lacking. Aims: The aim of the study was to assess the impact of taking part in a mindfulness for psychosis group, using a within-session self-report measure of general stress and symptom-related distress. Method: Users of a secondary mental health service (n = 34), who experienced enduring psychotic symptoms, took part in an 8-week mindfulness for psychosis group in a community setting. Mindfulness meditations were limited to 10 minutes and included explicit reference to psychotic experience arising during the practice. Participants self-rated general stress and symptom-related distress before and after each group session using a visual analogue scale. Results: Average ratings of general stress and symptom-related distress decreased from pre- to post-session for all eight sessions, although not all differences were statistically significant. There was no increase in general stress or symptom-related distress across any session. Conclusions: There was evidence of positive effects and no evidence of any harmful effects arising from people with psychotic symptoms taking part in a mindfulness for psychosis session.
The US Navy utilizes numerous resources to encourage smoking cessation. Despite these efforts, cigarette smoking among service members remains high. Electronic cigarettes (EC) have provided an additional cessation resource. Little is known regarding the utilization and efficacy of these cessation resources in the US Navy.
This study sought to explore the utilization and efficacy of ECs and other smoking cessation resources.
An anonymous cross-sectional survey was conducted at a military clinic from 2015 to 2016. Participants were active duty in the US Navy and reported demographics, smoking behaviors, and utilization of cessation resources.
Of the 977 participants in the study, 14.9% were current and 39.4% were former smokers. Most current smokers (83.6%) had previously attempted cessation, smoked an average of 2–5 cigarettes per day (34.7%), and smoked every day of the month (26.4%). The number of daily cigarettes smoked and the number of days cigarettes were smoked per month were not significantly different between cigarette-only smokers and EC dual users (p = 0.92 and p = 0.75, respectively). Resources used by current and former smokers included: ‘cold turkey’ (44.6% and 57.1%, respectively), ECs (22.3%, 24.7%), nicotine patch (8.3%, 1.3%), medicine (6.6%, 3.9%), nicotine gum (5.8%, 10.4%), and quit programs (2.5%, 2.6%).
Current and former cigarette smokers utilized similar resources to quit smoking. Electronic cigarettes are being used for cessation but do not significantly reduce the number of cigarettes smoked on a daily or monthly basis. Future studies may benefit from exploring the use of cessation resources and ECs within the military as a whole.
This study employs a comparative approach using Greek models of historical enquiry, especially those of Herodotus, to illustrate how Romans prior to the Punic Wars, and indeed as early as the fifth and fourth centuries BC, might have developed their own historical consciousness and historical traditions concerning their early past in much the same way as we know the Greeks had done by the fifth century BC. What follows is not at all new. Many have identified Roman historical and historiographical roots, connections, and even parallels with Greek history and historians.1 What follows reiterates those connections, first by assessing how Herodotus presented his inquiries to his Greek audience, laying the foundations for the discipline of historia, and then by examining specifically the story of the Fabii at the Cremera in Livy, Dionysius, and Diodorus. Through this one historical example, I hope to show that the roots of genuine historical thought can be found in the sources of our sources for early Roman traditions. Although these traditions appear in works written much later than the events they describe, the nature of the stories preserved in our extant accounts suggests historiographical roots and interests similar to those preserved by Herodotus for the Greeks in the stories he told in his Histories.
To measure transmission frequencies and risk factors for household acquisition of community-associated and healthcare-associated (HA-) methicillin-resistant Staphylococcus aureus (MRSA).
Prospective cohort study from October 4, 2008, through December 3, 2012.
Seven acute care hospitals in or near Toronto, Canada.
Total of 99 MRSA-colonized or MRSA-infected case patients and 183 household contacts.
Baseline interviews were conducted, and surveillance cultures were collected monthly for 3 months from household members, pets, and 8 prespecified high-use environmental locations. Isolates underwent pulsed-field gel electrophoresis and staphylococcal cassette chromosome mec typing.
Overall, 89 (49%) of 183 household contacts were MRSA colonized, with 56 (31%) detected at baseline. MRSA transmission from the index case to contacts who were negative at baseline occurred in 27 (40%) of 68 followed-up households. Strains were identical within households. The transmission risk for HA-MRSA was 39%, compared with 40% (P=.95) for community-associated MRSA. HA-MRSA index cases were more likely to be older and not to practice infection control measures (P=.002–.03). Household acquisition risk factors included requiring assistance and sharing bath towels (P=.001–.03). Environmental contamination was identified in 78 (79%) of 99 households and was more common in HA-MRSA households.
Household transmission of community-associated and HA-MRSA strains was common, and the difference in transmission risk between the two was not statistically significant.
The forward premium anomaly (exchange rate changes are negatively related to interest rate differentials) is one of the most robust puzzles in financial economics. We recast the underlying parity relation in terms of lagged forward interest rate differentials, documenting a reversal of the anomalous sign on the coefficient in the traditional specification. We show that this novel evidence is consistent with recent empirical models of exchange rates that imply exchange rate changes depend on two key variables: the interest rate differential and the magnitude of the deviation of the current exchange rate from that implied by purchasing power parity.
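To fix notation, the traditional specification referred to above is the Fama regression of the exchange rate change on the contemporaneous interest rate differential:

\[
s_{t+1} - s_t = \alpha + \beta\,(i_t - i_t^{*}) + \varepsilon_{t+1},
\]

where $s_t$ is the log spot exchange rate and $i_t - i_t^{*}$ is the home-minus-foreign interest rate differential. Uncovered interest parity implies $\beta = 1$, whereas estimated coefficients are typically negative; that negative sign is the anomaly. Schematically, the recasting described above replaces the contemporaneous differential with a lagged forward interest rate differential, on which the estimated coefficient takes the opposite (positive) sign; the one-equation summary here is an illustrative sketch of that idea, not the paper's full specification.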
Background: Rural living has been demonstrated to have an effect on a person’s overall health status, and rural-residing individuals often have decreased access to health and specialized rehabilitation services. Aim: The aim of this study was to determine if there are differences in recovery from stroke between urban- and rural-dwelling stroke survivors accessing an in-home, community-based, interdisciplinary stroke rehabilitation program. Methods: Data from a cohort of 1222 stroke survivors receiving care from the Community Stroke Rehabilitation Teams between January 2009 and June 2013 were analyzed. This program delivers stroke rehabilitation care directly in a person’s home and community. Functional and psychosocial outcomes were evaluated at baseline, discharge, and six-month follow-up. A series of multiple linear regression analyses was performed to determine if rural versus urban status was a significant predictor of discharge and 6-month health outcomes. Results: The mean age of the rural cohort was 68.8 (±13.1) years (53.6% male), and that of the urban cohort was 68.4 (±13.0) years (44.8% male). A total of 278 (35.4%) individuals were classified as living in a rural area using the Rurality Index for Ontario. In multivariate linear regression analysis, no significant differences on the Functional Independence Measure, the Stroke Impact Scale, the Hospital Anxiety and Depression Scale, or the Reintegration to Normal Living Index were found between urban and rural cohorts. Conclusions: When provided with access to a home-based, specialized stroke rehabilitation program, rural-dwelling stroke survivors make and maintain functional gains comparable to their urban-living counterparts.
The objective of the present study was to investigate associations between sugar intake and overweight using dietary biomarkers in the Norfolk cohort of the European Prospective Investigation into Cancer and Nutrition (EPIC-Norfolk).
Prospective cohort study.
EPIC-Norfolk in the UK, recruitment between 1993 and 1997.
Men and women (n 1734) aged 39–77 years. Sucrose intake was assessed using 7 d diet diaries. Baseline spot urine samples were analysed for sucrose by GC-MS. Sucrose concentration adjusted by specific gravity was used as a biomarker for intake. Regression analyses were used to investigate associations between sucrose intake and risk of BMI>25·0 kg/m2 after three years of follow-up.
After three years of follow-up, mean BMI was 26·8 kg/m2. Self-reported sucrose intake was significantly positively associated with the biomarker. Associations between the biomarker and BMI were positive (β=0·25; 95 % CI 0·08, 0·43), while they were inverse when using self-reported dietary data (β=−1·40; 95 % CI −1·81, −0·99). The age- and sex-adjusted OR for BMI>25·0 kg/m2 for participants in the fifth v. first quintile was 1·54 (95 % CI 1·12, 2·12; Ptrend=0·003) when using the biomarker and 0·56 (95 % CI 0·40, 0·77; Ptrend<0·001) when using self-reported dietary data.
Our results suggest that sucrose measured by objective biomarker but not self-reported sucrose intake is positively associated with BMI. Future studies should consider the use of objective biomarkers of sucrose intake.
The synthetic auxin herbicides aminocyclopyrachlor and clopyralid control dicotyledonous weeds in turf. Clippings of turfgrass treated with synthetic auxin herbicides have injured off-target plants exposed to herbicide-laden clippings. Labels of aminocyclopyrachlor and clopyralid recommend that clippings of treated turfgrass remain on the turf following a mowing event. Alternative uses for synthetic auxin-treated turfgrass clippings are needed because large quantities of clippings on the turf surface interfere with the functionality and aesthetics of golf courses, athletic fields, and residential turf. A white clover bioassay was conducted to determine the persistence and bioavailability of aminocyclopyrachlor and clopyralid in turfgrass clippings. Aminocyclopyrachlor and clopyralid were each applied at 79 g ae ha−1 to mature tall fescue at 56, 28, 14, 7, 3.5, and 1.75 d before clipping collection (DBCC). Clippings were collected, and the treated clippings were recycled onto adjacent white clover plots to determine herbicidal persistence and potential for additional weed control. Clippings of tall fescue treated with aminocyclopyrachlor produced a white clover response that was well described by nonlinear regression. Calculated values for 50% response (GR50) for visual control, for normalized difference vegetative index (NDVI), and for reduction in harvested biomass were 20.5, 17.3, and 18.7 DBCC, respectively, 8 wk after clippings were applied. Clippings of tall fescue treated with clopyralid did not demonstrate a significant pattern for white clover control, presumably because clopyralid was applied at a less-than-label rate. The persistence and bioavailability of synthetic auxin herbicides in clippings harvested from previously treated turfgrass create the opportunity to recycle clippings for additional weed control.
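The GR50 values above come from nonlinear regression of white clover response on DBCC. As an illustration of how such a value can be estimated, the sketch below fits a log-logistic response curve to hypothetical data in Python; the model form, starting values, and data are assumptions for illustration, not the authors' exact specification.

```python
# Illustrative sketch: estimating a 50%-response point (GR50) by fitting
# a log-logistic curve, as is common for dose-response-style bioassay
# data. The data, model form, and starting values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(x, lower, upper, gr50, slope):
    """Log-logistic curve: response falls from `upper` toward `lower`
    as x increases, crossing the midpoint at x = gr50."""
    return lower + (upper - lower) / (1.0 + (x / gr50) ** slope)

# Hypothetical bioassay data: days before clipping collection (DBCC)
# vs. percent visual control of white clover
dbcc = np.array([1.75, 3.5, 7.0, 14.0, 28.0, 56.0])
control = np.array([95.0, 90.0, 78.0, 60.0, 30.0, 8.0])

params, _ = curve_fit(log_logistic, dbcc, control,
                      p0=[0.0, 100.0, 20.0, 2.0])
lower, upper, gr50, slope = params
print(f"Estimated GR50 = {gr50:.1f} DBCC")
```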
Tissue plasminogen activator has been found to significantly improve patient outcomes post stroke. Previous economic evaluations have adjusted for fewer admissions to inpatient rehabilitation but not for decreased length of stay in rehabilitation. Our objective was to estimate the potential cost savings associated with a decreased length of stay in inpatient rehabilitation for patients who receive tissue plasminogen activator compared to those who do not, in a Canadian context.
Decreased length of stay in inpatient rehabilitation for patients who received tissue plasminogen activator compared to controls was reported previously in a population of 1962 patients admitted to hospital with an ischemic stroke in Ontario between July 1, 2003, and March 31, 2008. Average per diem cost savings associated with the use of tissue plasminogen activator were calculated using a literature-based cost estimate. A sensitivity analysis varying the length of stay in inpatient rehabilitation was performed.
The estimated mean per diem cost of inpatient rehabilitation derived from the literature was $626. Based on previously reported estimates for reduced length of stay, receipt of tissue plasminogen activator was estimated to result in savings of $939 per patient during inpatient rehabilitation. Sensitivity analysis suggested that these cost savings could range from $501 to $1377 per patient on average.
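As a consistency check (assuming savings are simply the per diem cost multiplied by the days of inpatient rehabilitation avoided, an assumption of this note rather than a stated formula), the reported figures imply a mean length-of-stay reduction of about 1.5 days, with the sensitivity range corresponding to roughly 0.8 to 2.2 days:

\[
\frac{\$939}{\$626/\text{day}} \approx 1.5\ \text{days}, \qquad
\frac{\$501}{\$626/\text{day}} \approx 0.8\ \text{days}, \qquad
\frac{\$1377}{\$626/\text{day}} \approx 2.2\ \text{days}.
\]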
Future economic evaluations of tissue plasminogen activator should consider adjusting for shortened length of stay in inpatient rehabilitation for patients who receive tissue plasminogen activator.
Concern has been expressed over future biogeographical expansion and habitat capitalization by species of the phylum Cnidaria, as this may have negative implications for human activities and ecosystems. There is, however, a paucity of knowledge and understanding of jellyfish ecology, in particular species distribution and seasonality. Recent studies in the UK have principally focused on the Celtic, Irish and North Seas, but all in isolation. In this study we analyse data from a publicly-driven sightings scheme across UK coastal waters (2003–2011; 9 years), with the aim of increasing knowledge on spatial and temporal patterns and trends. We describe inter-annual variability, seasonality and patterns of spatial distribution, and compare these with existing historic literature. Although incidentally-collected data lack quantification of effort, we suggest that with appropriate data management and interpretation, publicly-driven, citizen-science-based recording schemes can provide large-scale (spatial and temporal) coverage that would otherwise be logistically and financially unattainable. These schemes may also contribute to baseline data from which future changes in patterns or trends might be identified. We further suggest that findings from such schemes may be strengthened by the inclusion of some element of effort-corrected data collection.
Modern treatment of newly diagnosed multiple myeloma (MM) has led to improved responses and markedly improved survival [1,2]. However, despite excellent responses and disease control, most patients will eventually relapse and require further therapy. Management of relapsed disease is therefore a critical aspect of overall care. This chapter provides a comprehensive overview of the determinants of and general approaches to therapy, as well as a review of specific treatment regimens.
Definition of relapsed and relapsed/refractory MM
The European Group for Blood and Marrow Transplantation (EBMT) criteria and the International Myeloma Working Group (IMWG) uniform criteria define progressive disease as a ≥25% increase (or reappearance from complete response) in the measurable biochemical component (serum monoclonal protein, urine Bence Jones protein, or serum free light chain), an increase in bone marrow plasma cells to >10%, or the development of new lytic bone lesions or soft tissue plasmacytomas. Clinical relapse is defined as the development of progressive disease and/or myeloma-associated end organ dysfunction (CRAB criteria). Primary refractory myeloma refers to disease that fails to achieve at least a minimal response (MR) with initial therapy, whilst relapsed and refractory MM is defined as disease that is non-responsive to salvage therapy, or that progresses within 60 days of last treatment, in patients who previously achieved at least an MR.
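To make the biochemical portion of this definition concrete, the sketch below encodes the stated thresholds as a simple check. The function and field names are hypothetical, it omits the criteria's absolute-increase requirements and measurement details, and it is a schematic illustration only, not clinical decision logic.

```python
# Schematic encoding of the progression thresholds stated above:
# >=25% rise (or reappearance from complete response) in the measurable
# biochemical component, marrow plasma cells >10%, or new lytic bone
# lesions / soft tissue plasmacytomas. Names are hypothetical; this is
# an illustration, not clinical decision logic.
def is_progressive_disease(nadir_component: float,
                           current_component: float,
                           marrow_plasma_cell_pct: float,
                           new_lesions_or_plasmacytomas: bool) -> bool:
    rose_25_pct = (nadir_component > 0
                   and current_component >= 1.25 * nadir_component)
    reappeared = nadir_component == 0 and current_component > 0  # post-CR
    return (rose_25_pct or reappeared
            or marrow_plasma_cell_pct > 10.0
            or new_lesions_or_plasmacytomas)

# Example: a 30% rise in the serum monoclonal protein from its nadir
print(is_progressive_disease(10.0, 13.0, 5.0, False))  # True
```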
Is the nature of decision-making capacity (DMC) for treatment significantly different in medical and psychiatric patients?
To compare the abilities relevant to DMC for treatment in medical and psychiatric patients who are able to communicate a treatment choice.
A secondary analysis of two cross-sectional studies of consecutive admissions: 125 to a psychiatric hospital and 164 to a medical hospital. The MacArthur Competence Assessment Tool – Treatment and a clinical interview were used to assess decision-making abilities (understanding, appreciating and reasoning) and judgements of DMC. We limited analysis to patients able to express a choice about treatment and stratified the analysis by low and high understanding ability.
Most people scoring low on understanding were judged to lack DMC and there was no difference by hospital (P=0.14). In both hospitals there were patients who were able to understand yet lacked DMC (39% psychiatric v. 13% medical in-patients, P<0.001). Appreciation was a better ‘test’ of DMC in the psychiatric hospital (where psychotic and severe affective disorders predominated) (P<0.001), whereas reasoning was a better test of DMC in the medical hospital (where cognitive impairment was common) (P=0.02).
Among those with good understanding, the appreciation ability had more salience to DMC for treatment in a psychiatric setting and the reasoning ability had more salience in a medical setting.
Patients experience reductions in quality of life (QOL) while receiving cancer treatment and several approaches have been proposed to address QOL issues. In this project, the QOL differences between older adult (age 65+) and younger adult (age 18–64) advanced cancer patients in response to a multidisciplinary intervention designed to improve QOL were examined.
This study was registered on ClinicalTrials.gov, NCT01360814. Newly diagnosed advanced cancer patients undergoing radiation therapy were randomized to active QOL intervention or control groups. Those in the intervention group received six multidisciplinary 90-minute sessions designed to address the five major domains of QOL. Outcomes measured at baseline and weeks 4, 27, and 52 included QOL (Linear Analogue Self-Assessment (LASA), Functional Assessment of Cancer Therapy–General (FACT-G)) and mood (Profile of Mood States (POMS)). Kruskal–Wallis methodology was used to compare scores between older and younger adult patients randomized to the intervention.
Of 131 patients in the larger randomized controlled study, we report data on 54 evaluable patients (16 older adults and 38 younger adults) randomized to the intervention. Older adult patients reported better overall QOL (LASA 74.4 vs. 62.9, p = 0.040), higher social well-being (FACT-G 91.1 vs. 83.3, p = 0.045), and fewer problems with anger (POMS anger–hostility 95.0 vs. 86.4, p = 0.028). Long-term benefits for older patients were seen in the anger–hostility scale at week 27 (92.2 vs. 84.2, p = 0.027) and week 52 (96.3 vs. 85.9, p = 0.005).
Older adult patients who received a multidisciplinary intervention to improve QOL while undergoing advanced cancer treatments benefited differently in some QOL domains, compared to younger adult patients. Future studies can provide further insight on how to tailor QOL interventions for these age groups.
Aminocyclopyrachlor (AMCP) is a newly developed synthetic auxin herbicide for broadleaf weed control in turfgrass systems. AMCP has been observed to undergo rapid photodecomposition in shallow water when exposed to sunlight. Most herbicide applications on golf courses occur during the morning when dew is still present on the turfgrass canopy. These conditions could result in efficacy loss if photolysis occurred while AMCP was suspended in dew droplets. Research was conducted to determine the effect of ambient moisture on AMCP efficacy. AMCP (79 and 105 g ae ha−1), aminopyralid (280 g ae ha−1), and two AMCP granular formulations (84 g ha−1) were applied to dew-covered (WET) and dew-excluded (DRY) ‘Tifway’ bermudagrass plots. Herbicide treatments applied to WET plots had greater visually rated bermudagrass injury than respective treatments applied to DRY plots at 7 and 21 d after treatment (DAT), with the exception of aminopyralid at 21 DAT. Normalized difference vegetative index assessments of turfgrass quality complemented visual ratings, indicating greater turfgrass quality reductions when herbicides were applied to WET rather than DRY plots. These results indicate that AMCP applications made to dew-covered turfgrass can increase herbicidal efficacy, and no significant losses due to photodegradation were observed.
Systemic risk and the financial crisis of 2007 to 2009
In the fall and winter of 2008 to 2009, the worldwide economy and financial markets fell off a cliff. The stock market fell 42 percent in the United States and, on a dollar-adjusted basis, the market dropped 46 percent in the United Kingdom, 49 percent in Europe at large, 35 percent in Japan, and around 50 percent in the larger Latin American countries. Likewise, global gross domestic product (GDP) fell by 0.8 percent (the first contraction in decades), with the decline in advanced economies a sharp 3.2 percent. Furthermore, international trade fell a whopping 12 percent.
When economists bandy about the term systemic risk, this is what they mean. Financial firms play a critical role in the economy, acting as intermediaries between parties that need to borrow and parties willing to lend or invest. Without such intermediation, it is difficult for companies to get credit and conduct business, and for people to get student loans and automobile loans, to save, and to perform a range of other financial transactions. Systemic risk emerges when the financial sector as a whole has too little capital to cover its liabilities. This leads to the widespread failure of financial institutions and/or the freezing of capital markets, which greatly impairs financial intermediation, both in terms of the payments system and in terms of lending to corporations and households.
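Stated in balance-sheet terms, the definition above says systemic risk materializes when the sector's aggregate capital falls short of what its liabilities require. A minimal sketch of that aggregate-shortfall notion, assuming a stylized capital requirement equal to a fixed fraction k of assets (the figures and the ratio are illustrative assumptions, not a regulatory formula):

```python
# Minimal sketch of the aggregate capital-shortfall notion described
# above. Balance sheets and the capital ratio k are stylized
# assumptions, not a regulatory formula.
def sector_capital_shortfall(firms, k=0.08):
    """Sum each firm's shortfall relative to a required k * assets.

    firms: iterable of (equity, assets) pairs. Shortfalls are floored
    at zero so one firm's surplus cannot offset another's deficit.
    """
    return sum(max(0.0, k * assets - equity) for equity, assets in firms)

# Hypothetical three-firm sector: only the third firm is undercapitalized
firms = [(12.0, 100.0), (9.0, 100.0), (3.0, 100.0)]
print(sector_capital_shortfall(firms))  # 5.0
```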
The most important lesson from the financial crisis of 2007 to 2009 has been that failures of some large financial institutions can impose costs on the entire system. We call these systemically important financial institutions (SIFIs). Their failures invariably put regulators in a compromised situation since, absent prearranged resolution plans, they are forced to rescue the failed institutions to preserve a functioning financial system. In the recent crisis, this has involved protecting not just insured creditors, but sometimes uninsured creditors and even shareholders. The anticipation that these bailouts will occur compromises market discipline in good times, encouraging excessive leverage and risk taking. This reinforces the systemic risk in the system. It is widely accepted that systemic risk needs to be contained by making it possible for these institutions to fail, thus restraining their incentives to take excessive risks in good times. First and foremost, however, regulators need to ascertain which institutions are, in fact, systemically important. Indeed, the systemic risk of an individual institution has not yet been measured or quantified by regulators in an organized manner, even though systemic risk has always been one of the justifications for our elaborate regulatory apparatus.
There are some institutions that follow highly cyclical activities and are thus heavily correlated with aggregate economic conditions. If these institutions are also highly levered, especially with short-term debt, then they face runs in the event of sufficiently adverse news about their condition.