When Hurricane Harvey landed along the Texas coast on August 25, 2017, it caused massive flooding and damage and displaced tens of thousands of residents of Harris County, Texas. Between August 29 and September 23, Harris County, along with community partners, operated a megashelter at NRG Center, which housed 3365 residents at its peak. Harris County Public Health conducted comprehensive public health surveillance and response at NRG, which comprised disease identification through daily medical record reviews, nightly “cot-to-cot” resident health surveys, and epidemiological consultations; messaging and communications; and implementation of control measures including stringent isolation and hygiene practices, vaccinations, and treatment. Despite the lengthy operation at the densely populated shelter, an early seasonal influenza A (H3) outbreak of 20 cases was quickly identified and confined. Influenza outbreaks in large evacuation shelters after a disaster pose a significant threat to populations already experiencing severe stressors. A holistic surveillance and response model, which consists of coordinated partnerships with onsite agencies, timely epidemiological consultations, predesigned survey tools, trained staff, enhanced isolation and hygiene practices, and sufficient vaccines, is essential for effective disease identification and control. The lessons learned and successes achieved from this outbreak may inform future disaster response settings. (Disaster Med Public Health Preparedness. 2019;13:97-101)
Ventilation with a bag valve mask (BVM) is a challenging but critical skill for airway management in the prehospital setting.
Tidal volumes received during single rescuer ventilation with a modified BVM with supplemental external handle will be higher than those delivered using a standard BVM among health care volunteers in a manikin model.
This study was a randomized crossover trial of adult health care providers performing ventilation on a manikin. Investigators randomized participants to perform single rescuer ventilation, first using either a BVM modified by addition of a supplemental external handle or a standard unmodified BVM (Spur II BVM device; Ambu; Ballerup, Denmark). Participants performed mask placement and delivery of 10 breaths per minute for three minutes, as guided by a metronome. After a three-minute rest period, they performed ventilation using the alternative device. The primary outcome measure was mean received tidal volume as measured by the manikin (IngMar RespiTrainer model; IngMar Medical; Pittsburgh, Pennsylvania USA). Secondary outcomes included subject device preference.
Of 70 recruited participants, all completed the study. The difference in mean received tidal volume between ventilations performed using the modified BVM with external handle versus standard BVM was 20 ml (95% CI, -16 to 56 ml; P=.28). There were no significant differences in mean received tidal volume based on the order of study arm allocation. The proportion of participants preferring the modified BVM over the standard BVM was 47.1% (95% CI, 35.7 to 58.6%).
The modified BVM with added external handle did not result in greater mean received tidal volume compared to standard BVM during single rescuer ventilation in a manikin model.
Reed P, Zobrist B, Casmaer M, Schauer SG, Kester N, April MD. Single Rescuer Ventilation Using a Bag Valve Mask with Removable External Handle: A Randomized Crossover Trial. Prehosp Disaster Med. 2017;32(6):625–630.
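The paired (crossover) comparison reported above can be sketched in code. The tidal volumes below are hypothetical illustrations, not the study's data; the sketch shows how a mean paired difference and its 95% confidence interval are computed, and how a CI that spans zero corresponds to the null result the trial reports.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical paired tidal volumes (ml) for 5 participants; illustrative
# assumptions only, not data from the Reed et al. trial.
modified = [450, 520, 480, 500, 470]   # modified BVM with external handle
standard = [440, 500, 490, 480, 450]   # standard BVM

diffs = [m - s for m, s in zip(modified, standard)]
mean_diff = mean(diffs)                # mean paired difference (ml)
se = stdev(diffs) / sqrt(len(diffs))   # standard error of the mean difference

# 95% CI using the t critical value for 4 degrees of freedom (t_{0.975,4} ≈ 2.776)
t_crit = 2.776
ci = (mean_diff - t_crit * se, mean_diff + t_crit * se)

print(mean_diff, ci)
```

With these illustrative numbers the interval contains zero, so the difference is not statistically significant at the 5% level, mirroring the trial's finding (20 ml; 95% CI, -16 to 56 ml; P=.28).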
In recent years two lines of research on USSR power and personnel have challenged some long-standing interpretations of the bases of Soviet political activity. In one line, historical studies dealing with the Stalin era have called into question the conventional emphasis, epitomized in the totalitarian model, on a single leader who commands an army of loyal apparatchiki and monopolizes the political agenda. A number of scholars have shown that chaos and confusion in personnel matters were the salient characteristics of this period, rather than a coordinated system for the recruitment, placement, and promotion of cadres—an image suggested by both the totalitarian model and Stalinist boasting of a “monolithic party,” a “unified state structure,” and so forth. In substantive policy, the actual results in implementing regime directives in the Stalin period regularly bore no better than the faintest resemblance to the announced policy. Absent the well-oiled machine highlighted in images of the “totalitarian party,” the regime's failure to control real policy results seems to have followed as a necessary consequence.
Under pressure to produce tangible results whilst confronting an elusive enemy, COLAR became a well-oiled body-counting machine, seemingly increasing its combat effectiveness, according to the reported KIAs, particularly between 2002 and 2008. These kills responded to organisational dynamics: enemy deaths were aggressively ordered down the chain of command; covered corpses were exposed in local news media as war trophies; commanders rewarded units reporting KIAs; kill ratios were used to compare the effectiveness of different tactical units; young officers were pitted against each other and promoted based on KIAs; rewards and decorations were granted based on enemy kills; and KIAs were reported by the Colombian Ministry of Defence (MoD) as the indicator of the results-based war effort. The official death tallies demonstrate the impact of the pressure to kill: whereas in 1999 the armed forces reportedly ‘defused’ 818 guerrillas and 35 paramilitaries, by 2002 the yearly toll of reported KIAs had more than doubled – a total of 1,690 guerrillas and 85 paramilitaries were reportedly neutralised in action by the armed forces.
This section provides a brief description of sham-KIAs. First, the practice is examined in the context of the overall kill rates reported by the MoD and numerical approximations to the sham-KIAs are presented. The second subsection offers a qualitative approximation to the killing spree and introduces categories in order to describe the variation noted in the caseload of sham-KIAs, as a result of differences observed in the modalities of commission and the selection of victims (targeting).
KILLING IN THE COLOMBIAN WAR AND SHAM-KIAs
Death is not as common in Colombia's counter-insurgent warfare as most people would expect. Even at their height, reported combat kills were a small proportion of the overall lethal violence recorded. Most killing in the country takes place outside the context of combat; for example, in 2011, the MoD reported 406 enemy KIAs, whereas the national forensic authority reported 16,554 homicides nationwide. This proportion is merely indicative, as underreporting needs to be considered.
Peatlands have long been recognised as a high priority for protection under international and national wildlife laws and agreements. Over the last half century this protection has essentially been reactive in the face of more widespread land management policy and market forces, which have encouraged damage to peatlands. This damage has been mainly to support the delivery of provisioning services, such as food, timber and pulp, or the widespread extraction of peat and oil. Across the world, peatlands of different types face a variety of pressures from land use and land-use change as well as pollution (e.g. atmospheric pollution on British blanket bogs), making them more susceptible to impacts of climate change. Within the general framework of international agreements on peatland conservation, each country has developed its own approach to tackling the threats with varying degrees of success. While established wildlife conservation policy has helped limit the extent of damage to peatlands in some countries, there is a need and opportunity for a stronger and more urgent public policy response to address the significant ongoing losses of peatland biodiversity and ecosystem services. The recognition of the multiple benefits that peatlands provide has presented new avenues to support sustainably managed peatlands, in addition to reducing peatland loss through active restoration (e.g. Bain et al. 2011; Joosten, Tapio-Biström and Tol 2012). This chapter presents an overview of the principal international and national policy drivers, with examples from selected countries across the world to highlight how new resources could be directed at wise use and conservation of peatlands.
Global overview of policy drivers for peatland conservation
While peatlands have been regarded as wastelands, and areas to be ‘improved’ for agriculture and forestry since the late eighteenth century (Chapter 2), they are now recognised for their wildlife and increasingly for their ecosystem services. Peatlands, therefore, feature in some of the world's highest-level environmental policies.
One of the earliest global agreements to recognise the importance of peatlands for protection was the Ramsar Convention (1971) that promoted the establishment and management of a network of protected wetlands. In 1996, it was reported that though peatlands represented 50% of the world's freshwater and terrestrial wetlands, less than 10% of the designated Ramsar sites had peatland as their dominant habitat (Chapter 15). Given continuing peatland loss and degradation, Contracting Parties set out guidelines to improve peatland protection (Ramsar 2003).
Habitat suitability models can guide species conservation by identifying correlates of occurrence and predicting where species are likely to occur. We created habitat suitability models for the White-breasted Thrasher Ramphocinclus brachyurus, a narrowly distributed endangered songbird that occupies dry forest in Saint Lucia and Martinique. Eighty-five percent of the global population inhabits two ranges in Saint Lucia, both of which are largely unprotected and threatened by development. We developed three habitat suitability models using Maxent techniques and published occupancy datasets collected from the species’ two Saint Lucian ranges, and used abiotic, land cover, and predator distribution predictors. We built one model with occupancy data from both ranges, and two others with occupancy data specific to each range. The best full-range model included 11 predictors; high suitability was associated with close proximity to Saint Lucia fer-de-lance Bothrops caribbaeus range, moderately low precipitation, and areas near streams. Our assessment of suitable sites island-wide was more restricted than results from a recent model that considered older land cover data and omitted predator distributions. All sites identified in our full-range model as highly suitable were in or adjacent to the species’ current designated range. The model trained on southern range occurrences predicted zero suitable habitat in the northern range, where the population is much smaller. In contrast, the model trained on northern range occurrences identified areas of moderate suitability within the southern range and patches of moderately suitable habitat in the western part of the island, where no White-breasted Thrashers currently occur.
We interpret these results as suggesting that White-breasted Thrashers currently occupy virtually all suitable habitat on the island, that birds in the northern range occupy marginal habitat, or that an important correlate of suitability is missing from the model. Our results suggest that habitat management should focus on currently occupied areas.
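The general workflow above — fitting a model to occurrence records against environmental predictors, then predicting suitability at new sites — can be sketched minimally. This is not Maxent itself, but a simple logistic presence-background model in the same spirit; all predictor values below are hypothetical assumptions, not the study's data.

```python
import math

# Hypothetical sites: (precipitation index, distance-to-stream index),
# scaled 0-1. Presence sites mimic the abstract's finding that low
# precipitation and proximity to streams correlate with occurrence.
presence   = [(0.2, 0.1), (0.3, 0.2), (0.25, 0.15), (0.2, 0.3)]
background = [(0.8, 0.9), (0.7, 0.8), (0.9, 0.7), (0.6, 0.9)]

X = presence + background
y = [1] * len(presence) + [0] * len(background)

# Fit a logistic model by stochastic gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(2000):
    for (x1, x2), yi in zip(X, y):
        p = 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
        err = yi - p
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

def suitability(x1, x2):
    """Predicted probability of occurrence at a new site."""
    return 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))

# Low precipitation and a nearby stream yield high predicted suitability.
print(round(suitability(0.2, 0.15), 2))
print(round(suitability(0.9, 0.9), 2))
```

Real Maxent additionally regularises feature weights and handles presence-only sampling bias; this sketch only illustrates the predict-from-predictors idea behind the habitat maps described above.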
We detail the influence of tapered interfaces on the nanoscale morphologies of ion-doped poly(styrene-b-oligo-oxyethylene methacrylate) block polymers (BPs). Most significantly, a double-gyroid network phase window was located in ion-doped normal-tapered materials, and a similar window was not detectable in the corresponding non-tapered and inverse-tapered BPs. Additionally, the effective interaction parameters, χeff, were reduced substantially in the tapered materials in comparison with their non-tapered counterparts. Overall, this work demonstrates that tapering between polymer blocks provides unique control over BP morphologies and improves the material processability (due to lower χeff), potentially facilitating the development of future ion-conducting devices.
Ever more agricultural economics departments are offering appointments for nine rather than twelve months, but little if any analysis of the impact of this change has been done. Our research shows that converting to nine-month contracts is an effective way to raise salaries without an initial outlay of new funds and thus meets the retention criterion. Lower ranks do not suffer significantly lower salaries (without supplements) and professors earn more. Because the nine-month alternative costs more, justification for converting all twelve-month faculty members must rest on other factors, such as enhanced grants or comparability.
Objective: To outline the evolution of school food standards and their implementation and evaluation in each of the four countries of the UK since 2000.
Design: Review of relevant policies, surveys and evaluations, including country-specific surveys and regional evaluations.
Setting: UK (England, Wales, Scotland and Northern Ireland).
Subjects: Primary and secondary schools and schoolchildren.
Results: By September 2013 standards will have been introduced in all primary and secondary schools in the UK. Evaluations have varied in their scope and timing, relating to government forward planning, appropriate baselines and funding. Where standards have been implemented, the quality and nutritional value of food provided have improved. Emerging evidence shows improved overall diet and nutrient intake by school-aged children as a result.
Conclusions: The re-introduction of school food standards in the UK has not been centrally coordinated, but by September 2013 will be compulsory across all four countries in the UK, except in England where academies are now exempt. Provision of improved school food has had a demonstrable impact on diet and nutrition beyond the school dining room and the school gate, benefiting children from all socio-economic groups. Improved school food and dining environments are associated with higher levels of school lunch take-up. Implementation of school food standards requires investment. It is critical to policy development that the value of this investment is measured and protected using planned, appropriate, robust and timely evaluations. Where appropriate, evaluations should be carried out across government departments and between countries.