The lack of radiation knowledge among the general public continues to be a challenge for building communities prepared for radiological emergencies. This study applied a multi-criteria decision analysis (MCDA) to the results of an expert survey to identify priority risk reduction messages and challenges to increasing community radiological emergency preparedness.
Professionals with expertise in radiological emergency preparedness, state/local health and emergency management officials, and journalists/journalism academics were surveyed following a purposive sampling methodology. An MCDA was used to weight criteria of importance in a radiological emergency, and the weighted criteria were applied to topics such as sheltering-in-place, decontamination, and use of potassium iodide. Results were reviewed by respondent group and in aggregate.
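The weighted-criteria step of an MCDA can be sketched in a few lines. The criteria names, weights, and topic scores below are hypothetical placeholders, not the study's actual survey data; they only illustrate how expert-derived weights are applied to candidate messages to produce a priority ranking.

```python
# Minimal MCDA sketch: hypothetical criteria weights (from an expert
# survey) applied to candidate risk reduction topics.
criteria_weights = {"health_impact": 0.40, "feasibility": 0.35, "public_comprehension": 0.25}

topic_scores = {
    "sheltering_in_place": {"health_impact": 9, "feasibility": 8, "public_comprehension": 7},
    "decontamination":     {"health_impact": 7, "feasibility": 5, "public_comprehension": 4},
    "potassium_iodide":    {"health_impact": 6, "feasibility": 6, "public_comprehension": 3},
}

def weighted_score(scores, weights):
    """Sum each criterion score multiplied by its weight."""
    return sum(scores[c] * w for c, w in weights.items())

# Rank topics from highest to lowest weighted score.
ranking = sorted(topic_scores,
                 key=lambda t: weighted_score(topic_scores[t], criteria_weights),
                 reverse=True)
print(ranking)
```

With these placeholder numbers, sheltering-in-place ranks first, mirroring the ordering the abstract reports.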
Sheltering-in-place and evacuation plans were identified as the most important risk reduction measures to communicate to the public. Possible communication challenges during a radiological emergency included access to accurate information; low levels of public trust; public knowledge about radiation; and communications infrastructure failures.
Future assessments of community readiness for a radiological emergency should include questions about sheltering-in-place and evacuation plans to inform risk communication.
In this article David Barnett documents a practice-as-research project that employed Brechtian approaches to stage dramatic material. The Crucible by Arthur Miller is a realist text in which the protagonist, John Proctor, redeems himself for the sin of adultery by taking a heroic stand against the Salem witch-hunts. Existing scholarship has revealed a series of gendered biases in the form and content of the play, yet these findings have never been systematically realized in performance. While appearing to defend democratic values, the play’s dramaturgical strategies coerce agreement, and this represents a fundamental contradiction. Brecht offers a method that preserves the written dialogue, but interprets it critically onstage, deploying a range of devices derived from a materialist and dialectical interpretation. The aim of the production was to re-present a play with a familiar production history and problematize the political bases on which it conventionally rested. The article discusses the rationale for the theory and practice of contemporary Brechtian theatre and offers the production as a model for future critical realizations of other realist plays. David Barnett is Professor of Theatre at the University of York. His publications include A History of the Berliner Ensemble (CUP, 2015), Brecht in Practice: Theatre, Theory and Performance (Bloomsbury, 2014), and Rainer Werner Fassbinder and the German Theatre (CUP, 2005).
Digital markets offer abundant free content but exhibit extreme concentration among content aggregation intermediaries. These characteristics are linked. Weak copyright environments select against stand-alone content-delivery structures and select for bundled aggregation structures in which free content for users promotes positively priced advertising and data-collection services for firms. Dominant intermediaries promote commoditization, and the reallocation of market rents from content producers to content aggregators, through litigation and free content distribution that weaken copyright protections. The potential net welfare effects raise concern. Network effects, compounded by weak inventory constraints, scale economies, and learning effects, promote winner-takes-all outcomes in the intermediary services market while weak copyright may generate output distortions in the content production market.
In this article David Barnett explores the Berliner Ensemble's production in 1956 of Synge's classic The Playboy of the Western World. Although it was directed by Peter Palitzsch and Manfred Wekwerth, Bertolt Brecht, the company's co-founder, loomed large in planning and rehearsal. This staging serves as an example of how a politicized approach to theatre-making can bring out relationships, material conditions, and power structures that the play's production history has often ignored. In addition, Barnett aims to show how Brechtian methods can be applied more generally to plays not written in the Brechtian tradition and the effects they can achieve. David Barnett is Professor of Theatre at the University of York. He is the author of Heiner Müller's ‘The Hamletmachine’ (Routledge, 2016), A History of the Berliner Ensemble (Cambridge, 2015), and Brecht in Practice: Theatre, Theory, and Performance (Bloomsbury, 2014). His recent AHRC-funded project ‘Brecht in Practice: Staging Drama Dialectically’ led to a Brechtian production of Patrick Marber's Closer, and he offers theatre-makers and teachers workshops on using Brecht's method on stage and in the classroom.
Documentation is often descriptive, yet Brecht's approach to the task was to unpack as much of the process as possible in order to share his method and to make it available to other theatre-makers. This article initially examines how such documentation functioned and goes on to consider how and why it gradually declined over time. By the mid-1960s, the Berliner Ensemble had entered a period of crisis, yet the development of post-Brechtian practices in the early 1970s did not lead to a resurgence of documentation. However, at certain points, interest in producing the carefully compiled Notate resurfaced. The article thus investigates the relationship between documenting and transmitting Brechtian practices. Documentation becomes a yardstick by which to measure the fêted company's self-understanding as Brecht's theatre.
Imaging biomarkers for Alzheimer's disease include medial temporal lobe atrophy (MTLA) depicted on computed tomography (CT) or magnetic resonance imaging (MRI) and patterns of reduced metabolism on fluorodeoxyglucose positron emission tomography (FDG-PET).
To investigate whether MTLA on head CT predicts the diagnostic usefulness of an additional FDG-PET scan.
Participants had a clinical diagnosis of Alzheimer's disease (n = 37) or dementia with Lewy bodies (DLB; n = 30) or were similarly aged controls (n = 30). We visually rated MTLA on coronally reconstructed CT scans and, separately and blind to CT ratings, abnormal appearances on FDG-PET scans.
Using a pre-defined cut-off of MTLA ⩾5 on the Scheltens (0–8) scale, 0/30 controls, 6/30 DLB and 23/30 Alzheimer's disease participants had marked MTLA. FDG-PET performed well for diagnosing Alzheimer's disease v. DLB in the low-MTLA group (sensitivity/specificity of 71%/79%), but in the high-MTLA group the diagnostic performance of FDG-PET was not better than chance.
In the presence of a high degree of MTLA, the most likely diagnosis is Alzheimer's disease, and an FDG-PET scan will probably not provide significant diagnostic information. However, in cases without MTLA, if the diagnosis is unclear, an FDG-PET scan may provide additional clinically useful diagnostic information.
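The sensitivity/specificity figures reported above follow directly from a 2×2 confusion matrix. The counts below are illustrative placeholders (chosen so the arithmetic reproduces roughly 71%/79%), not the study's actual case numbers:

```python
# Sensitivity and specificity from a 2x2 confusion matrix
# (illustrative counts, not the study's data).
tp, fn = 10, 4   # Alzheimer's cases: FDG-PET positive / FDG-PET negative
tn, fp = 11, 3   # DLB cases: FDG-PET negative / FDG-PET positive

sensitivity = tp / (tp + fn)   # proportion of Alzheimer's cases correctly flagged
specificity = tn / (tn + fp)   # proportion of DLB cases correctly cleared

print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```

Sensitivity here measures how often a true Alzheimer's case yields a positive FDG-PET read; specificity measures how often a DLB case yields a negative one.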
A prey–predator experiment was conducted in a shallow coastal ecosystem characterized by a bare intertidal mudflat to test whether benthic biofilm resuspension, causing microalgae inputs and carbon export toward nanoflagellates, would favour the highest planktonic trophic level (i.e. mesozooplankton) when nutrient concentrations are high in the water column. Mesozooplankton predation and somatic production were studied by comparing the evolution of the prey assemblage (diversity and abundances) in the presence and absence of these predators during 24 h experiments. The results were then statistically analysed according to the cross-calculation method. Biofilm resuspension caused (i) a direct input of benthic microorganisms that changed prey structure in terms of diversity and/or size and (ii) a differential growth ability between prey taxa. Both effects implied a bottom-up control on both micro- and mesozooplankton. The carbon export toward heterotrophic nanoflagellates favoured pelagic ciliate growth, while mesozooplankton benefited from the largest diatoms with high growth rates, both benthic and r-strategist pelagic species. Although these microbial and herbivorous pathways are both controlled by benthic inputs, they seemed to be largely disconnected, since ciliates represented only a small part of the mesozooplankton diet. The sensitivity of mesozooplankton production appeared species-dependent, with the most tolerant taxa dominating the zooplankton assemblages. This suggests a role of the intensities and frequencies of biofilm resuspension in the spatio-temporal structuring of mesozooplankton in macrotidal coastal ecosystems.
The ability to perform microbial detection and characterization in-field at extreme environments, rather than on returned samples, has the potential to improve the efficiency, relevance and quantity of data from field campaigns. To date, few examples of this approach have been reported. Therefore, we demonstrate that the approach is feasible in subglacial environments by deploying four techniques for microbial detection: real-time polymerase chain reaction, microscopic fluorescence cell counts, adenosine triphosphate bioluminescence assay and recombinant Factor C assay (to detect lipopolysaccharide). Each technique was applied to 12 subglacial ice samples, 12 meltwater samples and two snow samples from Engabreen, Northern Norway. Using this multi-technique approach, the detected biomarker levels were as expected, being highest in debris-rich subglacial ice, moderate in glacial meltwater and low in clean (debris-poor) ice and snow. Principal component analysis was applied to the resulting dataset and could be performed in-field to rapidly aid the allocation of resources for further sample analysis. We anticipate that in-field data collection will allow for multiple rounds of sampling, analysis, interpretation and refinement within a single field campaign, resulting in the collection of larger and more appropriate datasets, ultimately with more efficient science return.
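The in-field PCA step described above can be sketched with NumPy. The biomarker matrix below is entirely synthetic (hypothetical qPCR, cell count, ATP and LPS readings arranged to mimic the high/moderate/low pattern the abstract reports); it only shows the mechanics of standardizing the data and projecting samples onto principal components:

```python
import numpy as np

# Synthetic biomarker matrix: rows = samples, columns = hypothetical
# readings (qPCR signal, cell count, ATP, LPS). Not real field data.
X = np.array([
    [8.1, 6.4e4, 120.0, 0.90],   # debris-rich subglacial ice (high)
    [7.9, 5.9e4, 115.0, 0.80],
    [3.2, 1.1e4,  30.0, 0.30],   # glacial meltwater (moderate)
    [3.0, 1.0e4,  28.0, 0.30],
    [0.4, 8.0e2,   2.0, 0.05],   # clean ice / snow (low)
    [0.3, 7.5e2,   1.8, 0.04],
])

# Standardize each biomarker so no single unit dominates.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

# Eigendecomposition of the covariance matrix gives the components.
cov = np.cov(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort descending by variance
pcs = Xs @ eigvecs[:, order]               # sample scores on PC1, PC2, ...
explained = eigvals[order] / eigvals.sum() # fraction of variance per PC

print(f"PC1 explains {explained[0]:.0%} of variance")
```

Because the four biomarkers co-vary strongly across sample types, PC1 dominates, and a quick scatter of PC1 scores is enough to triage samples in the field.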
Positron emission tomography (PET) and single photon emission computed tomography (SPECT) brain imaging are widely used as diagnostic tools for suspected dementia but no studies have directly compared participant views of the two procedures. We used a range of methods to explore preferences for PET and SPECT.
Patients and controls (and accompanying carers) completed questionnaires immediately after undergoing PET and SPECT brain scans. Pulse rate data were collected during each scan. Scan attributes were prioritized using a card sorting exercise; carers and controls additionally answered willingness to pay (WTP) questions.
Few differences were found either between the scans or groups of participants, although carers marginally preferred SPECT. Diagnostic accuracy was prioritized over other scan characteristics. Mean heart rate during both scans was lower than baseline heart rate measured at home (p < 0.001).
Most participants viewed PET and SPECT scans as roughly equivalent and did not have a preference for either scan. Carer preference for SPECT is likely to reflect their desire to be with the patient (routine practice for SPECT but not for PET), suggesting that they should be able to accompany vulnerable patients throughout imaging procedures wherever possible. Pulse rate data indicated that brain imaging was no more stressful than a home visit from a researcher. The data do not support the anecdotal view that PET is a more burdensome procedure, and the choice between PET and SPECT scans in dementia should be based on the diagnostic accuracy of the technique.
The field of mobile health (mHealth), which includes mobile phone applications (apps), is growing rapidly and has the potential to transform healthcare by increasing its quality and efficiency. The present paper focuses particularly on mobile technology for body weight management, including mobile phone apps for weight loss and the available evidence on their effectiveness. Translation of behaviour change theory into weight management strategies, including integration in mobile technology, is also discussed. Moreover, the paper presents and discusses the myPace platform as a case in point. There is little clinical evidence on the effectiveness of currently available mobile phone apps in enabling behaviour change and improving health-related outcomes, including sustained body weight loss. Moreover, it is unclear to what extent these apps have been developed in collaboration with health professionals, such as dietitians, and the extent to which apps draw on and operationalise behaviour change techniques has not been explored. Furthermore, weight management apps are not presently built for use as part of dietetic practice, or indeed healthcare more widely, where face-to-face engagement is fundamental for instituting the building blocks for sustained lifestyle change. myPace is an innovative mobile technology for weight management designed to be embedded into and to enhance dietetic practice. Developed out of systematic, iterative stages of engagement with dietitians and consumers, it is uniquely designed to complement and support the trusted health practitioner–patient relationship. Future mHealth technology would benefit if engagement with health professionals and/or targeted patient groups, together with behaviour change theory, formed the basis for technology development. In particular, integrating technology into routine healthcare practice, rather than replacing one with the other, could be the way forward.