The 2017 solar eclipse was associated with mass gatherings in many of the 14 states along the path of totality. The Kentucky Department for Public Health implemented an enhanced syndromic surveillance system to detect increases in emergency department (ED) visits and other health care needs near Hopkinsville, Kentucky, where the point of greatest eclipse occurred.
EDs flagged visits of patients who participated in eclipse events from August 17–22. Data from 14 area emergency medical services and 26 first-aid stations were also monitored to detect health-related events occurring during the eclipse period.
Forty-four potential eclipse event-related visits were identified, primarily injuries, gastrointestinal illness, and heat-related illness. First-aid stations and emergency medical services commonly attended to patients with pain and heat-related illness.
Kentucky’s experience during the eclipse demonstrated the value of patient visit flagging to describe the disease burden during a mass gathering and to investigate epidemiological links between cases. A close collaboration between public health authorities within and across jurisdictions, health information exchanges, hospitals, and other first-response care providers will optimize health surveillance activities before, during, and after mass gatherings.
Marine plastic pollution is a global environmental concern. With reference to approaches in contemporary archaeology, object biographies and psychology, this article presents the application of a novel participatory (‘World Café’) methodology that aims both to understand how marine plastic pollution occurs and to demonstrate the value of the approach for encouraging behaviour change. As proof of concept, the authors present the preliminary results of fieldwork involving local people in the Galápagos archipelago to demonstrate the benefits of an archaeological approach in developing new frameworks to help mitigate this critical environmental threat.
Interventions that mitigate psychosocial distress in cancer patients are important. The primary aim of this study was to examine the feasibility and acceptability of an adaptation of the Mindful Self-Compassion (MSC) program among adult cancer patients. A secondary aim was to examine pre–post-program changes in psychosocial wellbeing.
The research design was a feasibility and acceptability study, with an examination of pre- to post-intervention changes in psychosocial measures. A study information pack was posted to 173 adult cancer patients 6 months–5 years post-diagnosis, with an invitation to attend an eight-week group-based adaptation of the MSC program.
Thirty-two (19%) consented to the program, with 30 commencing. Twenty-seven completed the program (mean age: 62.93 years, SD 14.04; 17 [63%] female), attending a mean of 6.93 (SD 1.11) group sessions. There were no significant differences in medico-demographic factors between program-completers and those who did not consent. However, there was a trend toward shorter time since diagnosis in the program-completers group. Program-completers rated the program highly regarding content, relevance to the concerns of cancer patients, and the likelihood of recommending the program to other cancer patients. Sixty-three percent perceived that their mental wellbeing had improved from pre- to post-program; none perceived a deterioration in mental wellbeing. Small-to-medium effects were observed for depressive symptoms, fear of cancer recurrence, stress, loneliness, body image satisfaction, mindfulness, and self-compassion.
Significance of results
The MSC program appears feasible and acceptable to adults diagnosed with non-advanced cancer. The preliminary estimates of effect sizes in this sample suggest that participation in the program was associated with improvements in psychosocial wellbeing. Collectively, these findings suggest that there may be value in conducting an adequately powered randomized controlled trial to determine the efficacy of the MSC program in enhancing the psychosocial wellbeing of cancer patients.
Historical scholarship has interpreted the Public Dance Halls Act, 1935 in a relatively uniform manner. Most works on the subject have emphasised the expanding influence of Catholic church authorities over dancing following the enactment of the legislation, as well as the increasing restrictions placed on the freedom of dancers. The act has been viewed as one element in a sequence of pieces of legislation passed by successive Free State governments that aimed to limit and control citizens, including the Censorship of Films Act, 1923, and the Censorship of Publications Act, 1929. Using previously unexamined Department of Justice records, this article questions the dominant interpretation of the Public Dance Halls Act. It analyses whether dances moved predominantly into parochial halls, as has been the common understanding, and also considers whether the supposedly harsh restrictions imposed on dancers were actually enforced or observed. The article also proposes that two largely unexamined facets of the legislation and its subsequent implementation be given more consideration. Safety concerns played a sizeable part in shaping dancing regulations, as did the interests and worries of local communities. The article concludes by suggesting that lacunae in the historiography of dance halls in the 1930s are emblematic of wider gaps in Irish social and cultural history and recommends avenues for future research.
Connectedness is a central dimension of personal recovery from severe mental illness (SMI). Research reports that people with SMI have lower social capital and poorer-quality social networks compared to the general population.
To identify personal well-being network (PWN) types and explore additional insights from mapping connections to places and activities alongside social ties.
We carried out 150 interviews with individuals with SMI and mapped social ties, places, and activities and their impact on well-being. PWN types were developed using social network analysis and hierarchical k-means clustering of these data.
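The clustering step described above can be sketched as follows. This is a minimal, hypothetical illustration only: the feature names and toy data are invented, not the study's dataset, and the study's actual "hierarchical k-means" procedure may differ from this plain k-means.

```python
import numpy as np

# Toy sketch (hypothetical data): each row summarizes one participant's
# personal well-being network as three features:
# [number of social ties, number of places, number of activities].
rng = np.random.default_rng(0)
features = np.vstack([
    rng.normal([3, 2, 1], 0.5, size=(50, 3)),   # a "formal and sparse"-like group
    rng.normal([8, 3, 2], 0.5, size=(50, 3)),   # a "family and stable"-like group
    rng.normal([10, 7, 8], 0.5, size=(50, 3)),  # a "diverse and active"-like group
])

def kmeans(X, k, iters=100, seed=0):
    """Plain k-means: assign each point to its nearest centroid,
    then recompute centroids, repeating for a fixed number of iterations."""
    init = np.random.default_rng(seed)
    centroids = X[init.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids) ** 2).sum(-1), axis=1)
        centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
    return labels, centroids

labels, centroids = kmeans(features, k=3)
```

Each participant then carries a cluster label, and the clusters can be characterized qualitatively (as the three PWN types were in the results below).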
Three PWN types were identified: formal and sparse; family and stable; and diverse and active. Well-being and social capital varied within and among types. Place and activity data indicated important contextual differences within social connections that were not found by mapping social networks alone.
Place locations and meaningful activities are important aspects of people's social worlds. Mapped alongside social networks, PWNs have important implications for person-centred recovery approaches by providing a broader understanding of individuals' lives and resources.
Anthony M. Kwasnica, Smeal College of Business, The Pennsylvania State University,
John O. Ledyard, Division of the Humanities and Social Sciences, California Institute of Technology,
David P. Porter, Economic Science Institute, Chapman University,
Christine DeMartini, Division of the Humanities and Social Sciences, California Institute of Technology
Theory, experiment and practice suggest that, when bidder valuations for multiple objects are super-additive, combinatorial auctions are needed to increase efficiency, seller revenue, and bidder willingness to participate (Bykowsky et al. 2000, Rassenti et al. 1982, Ledyard et al. 2002). A combinatorial auction is an auction in which bidders are allowed to express bids in terms of packages of objects. The now famous FCC spectrum auctions are a good example of the relevance of these issues. In 41 auction events from 1994 to 2003, the FCC used what is known as a Simultaneous Multiple Round (SMR) auction to allocate spectrum and raise over $40 billion in revenue. This auction format does not allow package bidding. The FCC auctions also divide the spectrum by geographic location. It is reasonable to expect that some bidders might receive extra benefits by obtaining larger, more contiguous portions of the spectrum. A firm might enjoy cost savings if it could purchase two adjacent locations. However, without package bidding, a bidder cannot express that preference, potentially lowering the efficiency and revenue of the auction. If a bidder attempts to acquire both licenses by bidding on them individually, it might be exposed to potential losses. The high number of bidder defaults on payments might, in part, be evidence of losses caused by the lack of package bidding. In response to these difficulties, the FCC plans to allow package bidding in future auctions (Federal Communications Commission 2002, Dunford et al. 2001). In particular, the FCC, in its auction #31 for the upper 700 MHz band, affords bidders the ability to submit bids for packages of licenses. The particular design presented in this paper was developed prior to the FCC package auction design. Indeed, one of the major features of the FCC design was clearly influenced by the pricing rules we developed herein.
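The efficiency loss from forbidding package bids can be made concrete with a small sketch. The bidder names and valuations below are invented for illustration: one bidder has super-additive values for two licences together, and the winner-determination step is a brute-force search over non-overlapping bids (the NP-hard set-packing problem at the heart of combinatorial auctions).

```python
from itertools import combinations

# Hypothetical bids: (bidder, package of items, bid value).
# Bidder b1's value for A and B together (10) exceeds the sum of its
# stand-alone values (2 + 2): a super-additive (synergy) valuation.
bids = [
    ("b1", frozenset("AB"), 10),  # package bid expressing the synergy
    ("b1", frozenset("A"), 2),
    ("b1", frozenset("B"), 2),
    ("b2", frozenset("A"), 3),
    ("b3", frozenset("B"), 3),
]

def winner_determination(bids):
    """Brute-force search for the value-maximizing set of bids
    in which no item is sold twice."""
    best_value, best_set = 0, []
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            items = [item for _, pkg, _ in combo for item in pkg]
            if len(items) == len(set(items)):  # packages are disjoint
                value = sum(v for _, _, v in combo)
                if value > best_value:
                    best_value, best_set = value, list(combo)
    return best_value, best_set

value, winners = winner_determination(bids)
# With package bids allowed, b1's bundle bid of 10 wins. Selling the
# items one at a time would give A to b2 and B to b3 for only 3 + 3 = 6,
# and b1 bidding above 2 per item to chase the bundle risks winning
# just one licence at a loss (the exposure problem noted above).
```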
Experimental Comparisons of Auction Designs
John O. Ledyard, Division of the Humanities and Social Sciences, California Institute of Technology,
David P. Porter, Economic Science Institute, Chapman University,
Antonio Rangel, Division of the Humanities and Social Sciences, California Institute of Technology
During the discussion and evaluation of proposals for the design of the Federal Communications Commission (FCC) mechanism to sell the spectrum, over 130 auctions were run under controlled conditions at Caltech for the National Telecommunications and Information Administration (NTIA), the FCC, and others. While these data were used in those debates, we do not intend to relive that process here. Instead, in this paper, we reexamine these data and try to extract some useful information for those who may, in the future, be involved in the difficult task of creating mechanisms to auction multiple items.
The two major design questions we can say something about are (1) should the items be auctioned off sequentially or simultaneously? and (2) should package bidding be allowed? Our main conclusion is that, over a very wide range of environments, package bidding mechanisms (weakly) dominate simultaneous mechanisms, which in turn (weakly) dominate sequential mechanisms. This conclusion is based on three observations derived from a close look at the data.
First, in environments with multiple items to be allocated, if those items are homogeneous and substitutes, then little coordination between buyers is needed and the only role of the mechanism is to sort bidders with high values from bidders with low values. Both the sequential and simultaneous mechanisms seem to work very well at finding efficient allocations in these “easy” environments.
Second, in environments with multiple items to be allocated, if those items are heterogeneous, then some coordination among bidders is necessary to achieve high-value allocations even if there are only low synergy values. Simultaneous auctions provide a first step at this coordination that sequential auctions might have difficulty in providing.
Third, in environments with heterogeneous goods exhibiting complementarities, significant coordination is required for an auction or allocation mechanism to perform well with respect to efficiency or revenue. Sequential auctions perform poorly. Simultaneity is clearly necessary but not sufficient to attain high efficiencies.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, delta = .011 (95 percent Confidence Interval, CI .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted delta = .029 (95 percent CI .026, .031). Both effects were consistent across risk groups.
Primary care activity increased in the intervention phase overall: adjusted delta = .011 (95 percent CI .007, .014), except for the two highest risk groups, which showed a decrease in the number of days with recorded activity.
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted δ = GBP76, 95 percent Confidence Interval, CI GBP46, GBP106), an effect that was consistent and generally increased with risk level.
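The cost-effectiveness calculation described in the methods can be sketched from the point estimates reported for this trial: the adjusted cost difference of GBP 76 per patient per year here, and the adjusted difference of .011 emergency admissions per patient per year from the companion outcome analysis. The interpretation is unusual in this case, since the intervention increased both costs and admissions rather than averting them.

```python
# Point estimates from the trial's adjusted analyses.
delta_cost = 76.0         # GBP per patient per year (intervention minus control)
delta_admissions = 0.011  # emergency admissions per patient per year

# Incremental ratio: cost difference per unit difference in the primary
# outcome, i.e. GBP per additional emergency admission.
icer = delta_cost / delta_admissions
print(round(icer))  # ~6909 GBP per additional emergency admission
```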
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
Transient Ischaemic Attack (TIA) is a neurologic event with symptom resolution within 24 hours. Early specialist assessment of TIA reduces risk of stroke and death. National United Kingdom (UK) guidelines recommend patients with TIA are seen in specialist clinics within 24 hours (high risk) or seven days (low risk).
We aimed to develop a complex intervention for patients with low risk TIA presenting to the emergency ambulance service. The intervention is being tested in the TIER feasibility trial, in line with Medical Research Council (MRC) guidance on staged development and evaluation of complex interventions.
We conducted three interrelated activities to produce the TIER intervention:
• Survey of UK Ambulance Services (n = 13) to gather information about TIA pathways already in use
• Scoping review of literature describing prehospital care of patients with TIA
• Synthesis of data and definition of intervention by specialist panel of: paramedics; Emergency Department (ED) and stroke consultants; service users; ambulance service managers.
The panel used results to define the TIER intervention, to include:
1. Protocol for paramedics to assess patients presenting with TIA and to identify and refer low risk patients for prompt (<7 days) specialist review at a TIA clinic
2. Patient Group Directive and information pack to allow paramedic administration of aspirin to patients left at home with referral to TIA clinic
3. Referral process via ambulance control room
4. Training package for paramedics
5. Agreement with TIA clinic service provider including rapid review of referred patients
We followed MRC guidance to develop a clinical intervention for the assessment and referral of low risk TIA patients attended by emergency ambulance paramedics. We are testing the feasibility of implementing and evaluating this intervention in the TIER feasibility trial, which may lead to a fully powered multicentre randomized controlled trial (RCT) if predefined progression criteria are met.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
The anticipated release of Enlist™ cotton, corn, and soybean cultivars likely will increase the use of 2,4-D, raising concerns over potential injury to susceptible cotton. An experiment was conducted at 12 locations over 2013 and 2014 to determine the impact of 2,4-D at rates simulating drift (2 g ae ha⁻¹) and tank contamination (40 g ae ha⁻¹) on cotton during six different growth stages. Growth stages at application included four leaf (4-lf), nine leaf (9-lf), first bloom (FB), FB + 2 wk, FB + 4 wk, and FB + 6 wk. Locations were grouped according to percent yield loss compared to the nontreated check (NTC), with group I having the least yield loss and group III having the most. Epinasty from 2,4-D was more pronounced with applications during vegetative growth stages. Importantly, yield loss did not correlate with visual symptomology, but more closely followed effects on boll number. The contamination rate at 9-lf, FB, or FB + 2 wk had the greatest effect across locations, reducing the number of bolls per plant when compared to the NTC, with no effect when applied at FB + 4 wk or later. A reduction of boll number was not detectable with the drift rate except in group III when applied at the FB stage. Yield was influenced by 2,4-D rate and stage of cotton growth. Over all locations, loss in yield of greater than 20% occurred at 5 of 12 locations when the drift rate was applied between 4-lf and FB + 2 wk (highest impact at FB). For the contamination rate, yield loss was observed at all 12 locations; averaged over these locations yield loss ranged from 7 to 66% across all growth stages. Results suggest the greatest yield impact from 2,4-D occurs between 9-lf and FB + 2 wk, and the level of impact is influenced by 2,4-D rate, crop growth stage, and environmental conditions.
Studies using acute tryptophan depletion (ATD) to examine the effects of a rapid reduction in serotonin function have shown a reduction in global cognitive status during ATD in Alzheimer's disease (AD) and Parkinson's disease (PD). Based on the severe cholinergic loss evident in dementia with Lewy bodies (DLB) and Parkinson's disease and dementia (PDD), we predicted that a reduction of global cognitive status during ATD would be greater in these conditions than in AD.
Patients having DLB or PDD underwent ATD in a double-blind, placebo-controlled, randomized, counterbalanced, crossover design.
Although the study intended to test 20 patients, the protocol was poorly tolerated and was terminated after six patients had attempted it; only four patients – three with DLB and one with PDD – completed the protocol. The Modified Mini-Mental State Examination (3MSE) score was reduced in all three DLB patients and unchanged in the patient with PDD during ATD compared with placebo.
This reduction in global cognitive function and the poor tolerability may fit with the hypothesis that people with dementia with Lewy bodies have sensitivity to the effects of reduced serotonin function.
We have developed high damage threshold filters to modify the spatial profile of a high energy laser beam. The filters are formed by laser ablation of a transmissive window. The ablation sites constitute scattering centers which can be removed by a subsequent spatial filter. By creating the filters in dielectric materials, we achieve an increased laser-induced damage threshold compared with previous filters created using ‘metal on glass’ lithography.
The Z-backlighter laser facility primarily consists of two high energy, high-power laser systems. Z-Beamlet laser (ZBL) (Rambo et al., Appl. Opt. 44, 2421 (2005)) is a multi-kJ-class, nanosecond laser operating at 1054 nm which is frequency doubled to 527 nm in order to provide x-ray backlighting of high energy density events on the Z-machine. Z-Petawatt (ZPW) (Schwarz et al., J. Phys.: Conf. Ser. 112, 032020 (2008)) is a petawatt-class system operating at 1054 nm delivering up to 500 J in 500 fs for backlighting and various short-pulse laser experiments (see also Figure 10 for a facility overview). With the development of the magnetized liner inertial fusion (MagLIF) concept on the Z-machine, the primary backlighting missions of ZBL and ZPW have been adjusted accordingly. As a result, we have focused our recent efforts on increasing the output energy of ZBL from 2 to 4 kJ at 527 nm by modifying the fiber front end to now include extra bandwidth (for stimulated Brillouin scattering suppression). The MagLIF concept requires a well-defined/behaved beam for interaction with the pressurized fuel. Hence we have made great efforts to implement an adaptive optics system on ZBL and have explored the use of phase plates. We are also exploring concepts to use ZPW as a backlighter for ZBL driven MagLIF experiments. Alternatively, ZPW could be used as an additional fusion fuel pre-heater or as a temporally flexible high energy pre-pulse. All of these concepts require the ability to operate the ZPW in a nanosecond long-pulse mode, in which the beam can co-propagate with ZBL. Some of the proposed modifications are complete and most of them are well on their way.