Long-term behaviour change is critical to addressing societal and individual challenges in areas such as sustainability and health. Current understanding of how to bring about sustained behaviour change focuses on the identification of Behaviour Change Techniques (BCTs), without explicit guidance on how these should be matched with technological solutions. Motivated by this gap, we set out to answer the research question: What is the relationship between BCTs and interactive immersive technologies with respect to long-term, sustainable behaviour? To this end, we report a literature review of technology trends in the fields of human-computer interaction, human-robot interaction, and game design. Based on this review, we develop three main contributions with implications for design theory and practice. First, we identify a set of characteristics and mechanisms of emerging immersive technologies. Second, we highlight technological pathways connected to specific BCT clusters likely to be disrupted: technology as a conveyor of information, as an augmenter of feedback, and as an embodiment of empathy. Third, we explore the connections between these BCT clusters and actual technological interventions.
The ongoing pandemic disaster erupted with the first confirmed cases in Wuhan, China, in December 2019, caused by the novel severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), with the resulting disease referred to as coronavirus disease 2019 (COVID-19). The World Health Organization (WHO) confirmed the outbreak and declared it a global pandemic. The pandemic has infected nearly 300 million people and killed over 3 million. COVID-19 is smashing every public health barrier, guardrail, and safety measure in underdeveloped and the most developed countries alike, with peaks and troughs across time. Regions experiencing conflict and war are greatly impacted. Morbidity and mortality increase logarithmically for at-risk communities that lack the ability to implement basic preventative measures. States around the globe struggle to unify responses, improve preparedness levels, and identify and symptomatically treat positive cases, while labs frantically roll out vaccines and effective surveillance and therapeutic mechanisms. The incidence and prevalence of COVID-19 may continue to increase globally as no unified disaster response materializes and disinformation spreads. Amid this failure in response, virus variants are erupting at a dizzying pace. Ungoverned spaces where nonstate actors predominate, and active war zones, may become the next epicenters of COVID-19 fatality. As incidence rates continue to rise, hospitals in North America and Europe exceed surge capacity, and post-infection immunity remains poorly characterized. Previously high-quality, robust health-care systems in the most developed economies are failing the challenge posed by COVID-19; how will less-developed economies, and health-care infrastructures destroyed by war and conflict, fare until adequate vaccine penetrance or adequate treatment is established in these communities? Ukraine and other states in the Black Sea region are under daily threat of armed Russian aggression against their territorial sovereignty. Ukraine, where Russia has been waging war since 2014, faces this specific dual threat: disaster response to violence and to a deadly infectious disease. To best serve biosurveillance, aid pandemic disaster response, and bolster health security across Europe, the North Atlantic Treaty Organization (NATO), and the Black Sea region, NATO integration with Ukraine’s disaster response structures within the Ministries of Health, Defense, and Interior must be reinforced and expanded to mitigate the COVID-19 disaster.
The purpose of the study was to investigate whether differences in levels of knowledge existed between Danish and English trainee and specialist psychiatrists. This is important in the context of the free (and growing) movement of the medical workforce across European Union (EU) countries’ borders.
A complete balanced two-way factorial study design was used. Ten trainee and ten specialist psychiatrists were recruited in each country from reputable university hospitals. They answered 50 multiple choice questions (MCQs), translated into the appropriate language, spanning four subcategories: psychology (15 MCQs), psychopharmacology (10 MCQs), neuroscience (5 MCQs) and psychopathology (20 MCQs). No memory aids or other aids were allowed during the knowledge test. A two-way analysis of variance was used to analyse the total knowledge score (number of correct answers) and the component subscores. Levene’s test of equality of error variances was used to test for variance homogeneity.
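As a minimal sketch of this analysis, assuming one row per psychiatrist and synthetic, hypothetical scores (the study’s data are not reproduced here), the two-way ANOVA and Levene’s test could be run in Python as follows:

# Balanced two-way factorial ANOVA (country x level of training) on the
# total knowledge score, plus Levene's test for homogeneity of variances.
# The scores below are randomly generated placeholders, not study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "country": ["UK"] * 20 + ["DK"] * 20,
    "level": (["trainee"] * 10 + ["specialist"] * 10) * 2,
    "score": rng.integers(20, 45, size=40),  # hypothetical totals out of 50
})

# Two-way ANOVA with interaction (type II sums of squares).
model = ols("score ~ C(country) * C(level)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Levene's test of equality of error variances across the four design cells.
cells = [g["score"].to_numpy() for _, g in df.groupby(["country", "level"])]
print(stats.levene(*cells))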
There were significant differences in total knowledge and psychology knowledge by country and level of training. UK doctors scored 3.10 points higher than Danish doctors, with 95% confidence interval (0.97, 5.23). The specialists’ knowledge was also significantly superior to that of the trainee psychiatrists, with a score 2.30 points higher, 95% confidence interval (0.17, 4.43). Among the subcategories, only the psychology scores differed significantly: UK doctors scored 2.30 points higher than Danish doctors, with 95% confidence interval (1.15, 3.45), and specialists scored 1.20 points higher than non-specialists, with 95% confidence interval (0.05, 2.35).
The results indicate a significant difference in level of knowledge between psychiatrists in these two EU countries, England and Denmark. This difference appeared to be chiefly the result of differing knowledge of psychology. The disparity could be a result of the fundamentally different post-graduate training systems in psychiatry in the two countries. Surprisingly, the differences in total knowledge and psychology knowledge between countries were larger than the differences between levels of training. The difference in knowledge is worrying, considering the free movement of the workforce, including doctors, across the EU. These results need confirmation in future studies with larger samples, more countries involved and perhaps measures in addition to MCQs.
Over the past decade, the World Health Summit (WHS) has provided a global platform for policy-makers and decision-makers to interact with academics and practitioners on global health. Recently, the WHS adopted health security into its agenda in response to transnational disease risks (eg, Ebola and antimicrobial resistance) that increasingly threaten multiple sectors. Global health engagement (GHE) focuses efforts across interdisciplinary and interorganizational lines to identify critical threats and rapidly deploy key resources at the right time to address health security risks. As subject matter experts convened at the WHS, a special side-group arose organically, with leadership and coordination from the German Institute for Defense and Strategic Studies, in support of GHE activities across governmental, academic, and industry partners. Through novel approaches and targeted methodology that maximize outcomes and streamline global health operational processes, the Global Health Security Alliance (GloHSA) was born. This short conference report describes the GloHSA in more detail.
New dietary-based concepts are needed for the treatment and effective prevention of overweight and obesity. The primary objective was to investigate whether a reduction in appetite is associated with improved weight loss maintenance. This cohort study was nested within the European Commission project Satiety Innovation (SATIN). Participants achieving ≥8% weight loss during an initial 8-week low-energy formula diet were included in a 12-week randomised double-blind parallel weight loss maintenance intervention. The intervention included food products designed to reduce appetite, or matching controls, along with instructions to follow national dietary guidelines. Appetite was assessed by ad libitum energy intake and self-reported appetite ratings on visual analogue scales during standardised appetite probe days. These were evaluated on the first day of the maintenance period relative to baseline (acute effects, after a single exposure to the intervention products) and post-maintenance relative to baseline (sustained effects, after repeated exposures), regardless of randomisation. A total of 181 participants (47 men and 134 women) completed the study. A sustained reduction in 24-h energy intake was associated with improved weight loss maintenance (R 0·37; P = 0·001), whereas no acute association was found (P = 0·91). Suppression of self-reported appetite was associated with improved weight loss maintenance both acutely (R −0·32; P = 0·033) and sustained (R −0·33; P = 0·042). A reduction in appetite thus seems to be associated with improved body weight management, making appetite-reducing food products an interesting component of dietary-based concepts.
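The reported associations are simple correlations between an appetite measure and weight change; a minimal sketch on synthetic data (the variable names and values are assumptions, not SATIN data) might look like:

# Correlate each participant's sustained change in ad libitum energy intake
# with weight change during maintenance. All values are hypothetical.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 181                                        # completers, as in the study
intake_change = rng.normal(0.0, 1.0, size=n)   # post-maintenance minus baseline
weight_change = 0.4 * intake_change + rng.normal(0.0, 1.0, size=n)

r, p = pearsonr(intake_change, weight_change)
print(f"R = {r:.2f}, P = {p:.3f}")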
In the mid-1980s, the Anthropology Division of the American Museum of Natural History (AMNH) began creating digital resources as a means of collections access. Much of the database work was a secondary component of projects funded by outside grants and driven by new accountability mandates. The upgrading process progressed sporadically, but it still accomplished the primary goals of improved housing for collections and an exhaustive database. This paper discusses how the historical complications of the data, the scale of the database, its irregular schedule of funding, and deadline-driven projects resulted in inconsistent data and difficulty of use. Although the examples provided are specific to the AMNH Anthropology database, the circumstances and issues are common to many databases and the approaches presented are broadly applicable. The discussion includes the practices used to mitigate the negative impact of these problems and the way the Division is positioning itself for the future, even as the database continues to provide unprecedented public and institutional access to, and utility for, the AMNH Anthropology collections.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance records for persons with reported acute or chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infection to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were male and black/African American, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of coinfected persons and can be used to routinely monitor health status and guide state and national public health interventions.
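To illustrate what deterministic matching involves (the jurisdictions’ actual matching keys are not specified in the abstract; the fields below are assumptions), here is a minimal sketch in Python:

# Deterministic record linkage: two records match only if they agree exactly
# on every component of a normalized composite key. Field names are assumed.
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    last_name: str
    first_name: str
    dob: str   # ISO date, e.g. "1970-01-31"
    sex: str

def match_key(r: Record) -> tuple:
    # Normalize case and whitespace before comparing; deterministic matching
    # has no tolerance for near-misses, unlike probabilistic linkage.
    return (r.last_name.strip().lower(), r.first_name.strip().lower(),
            r.dob, r.sex.upper())

def link(hiv: list, hepatitis: list) -> list:
    """Return the HIV records whose key also appears in the hepatitis registry."""
    hep_keys = {match_key(r) for r in hepatitis}
    return [r for r in hiv if match_key(r) in hep_keys]

# Toy registries with one coinfected person.
hiv_registry = [Record("Doe", "Jane", "1970-01-31", "f")]
hep_registry = [Record("DOE", " jane", "1970-01-31", "F")]
print(len(link(hiv_registry, hep_registry)))  # 1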
We numerically model the dynamics of Enceladus plume ice grains and define our nominal plume model as having a particle size distribution n(R) ~ R^−q with q = 4 and a total particulate mass rate of 16 kg s^−1. This mass rate is based on the average plume brightness observed by Cassini across a range of orbital positions. The model predicts sample volumes of ~1600 µg for a 1 m^2 collector on a spacecraft making flybys at 20–60 km altitudes above the Enceladus surface. We develop two scenarios to predict the concentration of amino acids in the plume based on these assumed sample volumes, specifically considering glycine, serine, α-alanine, α-aminoisobutyric acid and isovaline. The first, ‘abiotic’, model assumes that Enceladus has the composition of a comet and yields abundances between 2 × 10^−6 and 0.003 µg for dissolved free amino acids and between 2 × 10^−5 and 0.3 µg for particulate amino acids. The second, ‘biotic’, model assumes that the water of Enceladus’s ocean has the same amino acid composition as deep ocean water on Earth. We compute the expected captured mass of amino acids such as glycine, serine and α-alanine in the ‘biotic’ model to be between 1 × 10^−5 and 2 × 10^−5 µg for dissolved free and dissolved combined amino acids, and about 0.0002 µg for particulate amino acids. Both models consider enhancements due to bubble bursting. The expected captured mass of amino acids is calculated for a 1 m^2 collector on a spacecraft making flybys with a closest approach of 20 km during mean plume activity, for the given nominal particle size distribution.
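As a back-of-envelope check on the arithmetic, the captured mass of an amino acid is simply the collected sample mass times its mass fraction; dividing the abstract’s reported captured masses by the ~1600 µg sample therefore gives the implied mass fractions. A minimal sketch using only the figures quoted above:

# captured mass [ug] = sample mass [ug] x mass fraction, so
# mass fraction = captured mass / sample mass.
sample_mass_ug = 1600.0  # per flyby: 1 m^2 collector, 20-60 km altitude

reported_captured_ug = {
    "abiotic, dissolved free amino acids (low)": 2e-6,
    "abiotic, dissolved free amino acids (high)": 0.003,
    "abiotic, particulate amino acids (high)": 0.3,
    "biotic, particulate amino acids": 2e-4,
}

for label, captured in reported_captured_ug.items():
    fraction = captured / sample_mass_ug
    print(f"{label}: {captured:g} ug -> implied mass fraction ~{fraction:.1e}")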
The illegal killing and taking of wild birds remains a major threat on a global scale, yet there are few quantitative data on the species affected and the countries involved. We quantified the scale and scope of this issue in Northern and Central Europe and the Caucasus, using a diverse range of data sources and incorporating expert knowledge. The issue was reported to be widespread across the region, affecting almost all of the countries/territories assessed. We estimated that 0.4–2.1 million birds per year may be killed/taken illegally in the region. The highest estimate of illegal killing was for Azerbaijan (0.2–1.0 million birds per year), and of the 20 worst locations identified, 13 were in the Caucasus. Birds were reported to be illegally killed/taken primarily for sport and food in the Caucasus, and for sport and predator/pest control in both Northern and Central Europe. All 28 countries assessed are parties to the Bern Convention and 19 are also European Union Member States. There are specific initiatives under both these policy instruments to tackle this threat, yet our data showed that illegal killing and taking still occurs and is not restricted to Mediterranean European countries. Markedly increased effort is required to ensure that existing legislation is adequately implemented and complied with/enforced on the ground. Our study also highlighted the paucity of data on the illegal killing and taking of birds in the region. It is a priority, identified by relevant initiatives under the Bern Convention and the European Union, to implement systematic monitoring of illegal killing and taking and to collate robust data, allowing stakeholders to set priorities, track trends and monitor the effectiveness of responses.
Rural communities face barriers to disaster preparedness and considerable risk of disasters. Emergency preparedness among rural communities has improved with funding from federal programs and implementation of the National Incident Management System. The objective of this project was to design and implement disaster exercises that test decision making by rural response partners in order to improve regional planning, collaboration, and readiness. Six functional exercises were developed and conducted across three rural Nebraska (USA) regions by the Center for Preparedness Education (CPE) at the University of Nebraska Medical Center (Omaha, Nebraska USA), with a total of 83 command centers participating. Each exercise was designed to test regional response and command-level decision making, and each 3-hour exercise was followed by a 3-hour regional after-action conference. Participant feedback, single-agency debriefing feedback, and regional After Action Reports were analyzed. The functional exercises were able to test command-level decision making and operations at multiple agencies simultaneously with limited funding. Observations included jurisdictional barriers in emergency management to the use of unified command and the establishment of joint information centers, limited use of the documentation necessary for reimbursement, and the need to develop coordinated public messaging. Functional exercises are a key tool for testing command-level decision making and response at a higher level than is typically achieved in tabletop or short full-scale exercises. They enable evaluation of command staff, identification of areas for improvement, and advancement of regional collaboration among diverse response partners.
Obaid JM, Bailey G, Wheeler H, Meyers L, Medcalf SJ, Hansen KF, Sanger KK, Lowe JJ. Utilization of Functional Exercises to Build Regional Emergency Preparedness among Rural Health Organizations in the US. Prehosp Disaster Med. 2017;32(2):224–230.
This report outlines a 3-year health care coalition effort to advance and test community capacity for a large-scale hospital evacuation. The multi-year effort utilized a variety of workshops, seminars, webinars, tabletops, and functional exercises, and culminated in a full-scale exercise testing hospital evacuation. While most hospital evacuation exercises focus on the internal movement of patients, this exercise process tested command-level decision making as well as external partners such as transportation agencies, law enforcement, receiving hospitals, and local emergency management. The process delivered key coalition-building activities and offered a variety of training and exercise opportunities to assist a number of organizations, each at a different stage of hospital evacuation planning. The 2012 Hospital Preparedness Program outlined the incorporation of health care coalition activities to transform individual organizational preparedness into community-level readiness; the training and exercises described here were designed to advance that community capacity.
Lowe JJ, Hansen KF, Sanger KK, Obaid JM. A 3-year Health Care Coalition Experience in Advancing Hospital Evacuation Preparedness. Prehosp Disaster Med. 2016;31(6):658–662.
Background: The intelligent bed is a medical bed with several home healthcare functions, including, among others, an “out of bed” detector, a moisture detector, and a catheter bag detector. The design purpose of the intelligent bed is to assist patients in their daily living, facilitate the work of clinical staff, and improve the quality of care. The aim of this sub-study of the iCare project was to explore how health professionals (HPs) experience and use the intelligent bed in patients’ homes.
Methods: The overall research design was inspired by case study methodology. A triangulation of data collection techniques was used: logbook, documentation study, participant observations (n = 45 hr), and qualitative interviews (n = 23). The data were analyzed using NVivo 9.0.
Findings: We identified several themes: HP transformation from passive technology recipient to innovator; individualized care; workflow redesign; and sensor technology intruding on patient privacy.
Conclusions: The findings suggest that the functions of the intelligent bed can result in more individualized care, workflow redesign, and time savings for health professionals caring for elderly patients. However, the technology intruded on patients’ privacy.
The university occupies a peculiar space in democratic societies with market economies. Higher education serves the cause of democracy by fostering a more able and enlightened citizenry and the needs of the economy by producing a more skilled and creative workforce. The university likewise depends on the state and the market for its resources, for the tuitions, the grants, the contracts, the licenses, the royalties, and the gifts that are the lifeblood of every institution of higher learning.
Historically, economic development has been strongly correlated with increasing energy use and growth of greenhouse gas (GHG) emissions. Renewable energy (RE) can help decouple that correlation, contributing to sustainable development (SD). In addition, RE offers the opportunity to improve access to modern energy services for the poorest members of society, which is crucial for the achievement of any one of the eight Millennium Development Goals.
Theoretical concepts of SD can provide useful frameworks to assess the interactions between SD and RE. SD addresses concerns about relationships between human society and nature. Traditionally, SD has been framed in the three-pillar model—Economy, Ecology, and Society—allowing a schematic categorization of development goals, with the three pillars being interdependent and mutually reinforcing. Within another conceptual framework, SD can be oriented along a continuum between the two paradigms of weak sustainability and strong sustainability. The two paradigms differ in assumptions about the substitutability of natural and human-made capital. RE can contribute to the development goals of the three-pillar model and can be assessed in terms of both weak and strong SD, since RE utilization is defined as sustaining natural capital as long as its resource use does not reduce the potential for future harvest.
A total of sixty surgically castrated male pigs (Large White × Landrace) weighing 31·2 (sd 4·3) kg were used in a randomised block experiment to examine the effect of added dietary inulin (0, 20, 40 and 80 g/kg) on the occurrence of swine dysentery (SD) and on fermentation characteristics in the large intestine after experimental challenge with the causative spirochaete Brachyspira hyodysenteriae. The pigs were allowed to adapt to the diets for 2 weeks before each pig was challenged orally four times with a broth culture containing B. hyodysenteriae on consecutive days. Increasing dietary levels of inulin linearly (P = 0·001) reduced the risk of pigs developing SD; however, eight out of fifteen pigs fed the diet with 80 g/kg inulin still developed the disease. The pH values in the caecum (P = 0·072) tended to decrease, and in the upper colon, the pH values did decrease (P = 0·047) linearly with increasing inulin levels in the diets, most probably due to a linear increase in the concentration of total volatile fatty acids in the caecum (P = 0·018), upper colon (P = 0·001) and lower colon (P = 0·013). In addition, there was a linear reduction in the proportion of the branched-chain fatty acids isobutyric acid and isovaleric acid in the caecum (P = 0·015 and 0·026) and upper colon (P = 0·011 and 0·013) with increasing levels of dietary inulin. In conclusion, the present study showed that a diet supplemented with a high level of inulin (80 g/kg) but not lower levels reduced the risk of pigs developing SD, possibly acting through a modification of the microbial fermentation patterns in the large intestine.
This article offers the first comprehensive empirical assessment of a key Progressive reform, the direct primary, and its impact on competition in American elections. We begin with a review of the problems Progressives diagnosed in the American electoral system and the reasons to expect the direct primary to be a pro-competitive, democratizing reform. We then consider prior research on the direct primary and electoral contestation and describe the database of primary and general election outcomes that we constructed to trace competition in primaries for federal and statewide offices. Finally, we examine the historical trajectory of competition in primary elections, starting with the first decades after the introduction of the reform and continuing through the succeeding decades.
Consistent with the hopes of reformers, we find primary elections indeed provided a forum for contestation for federal and statewide elections. Although primaries were never broadly competitive, even at the outset, they accounted for about a third of the serious electoral tests faced by statewide officeholders and about a fifth faced by U.S. representatives. The role of primaries as a venue for robust contestation, however, was short-lived, as the competitiveness of federal and statewide primaries decreased sharply starting in the 1940s. The last section of this article explores whether two recent developments in American elections—the extension of two-party competition and the rise in the value of incumbency—conspired to temper the contribution of direct primaries to electoral competition.