Selenium (Se) is an essential element for human health. However, our knowledge of the prevalence of Se deficiency is less than for other micronutrients of public health concern such as iodine, iron and zinc, especially in sub-Saharan Africa (SSA). Studies of food systems in SSA, in particular in Malawi, have revealed that human Se deficiency risks are widespread and influenced strongly by geography. Direct evidence of Se deficiency risks includes nationally representative data of Se concentrations in blood plasma and urine as population biomarkers of Se status. Long-range geospatial variation in Se deficiency risks has been linked to soil characteristics and their effects on the Se concentration of food crops. Selenium deficiency risks are also linked to socio-economic status including access to animal source foods. This review highlights the need for geospatially-resolved data on the movement of Se and other micronutrients in food systems which span agriculture–nutrition–health disciplinary domains (defined as a GeoNutrition approach). Given that similar drivers of deficiency risks for Se, and other micronutrients, are likely to occur in other countries in SSA and elsewhere, micronutrient surveillance programmes should be designed accordingly.
Poor design often places excessive cognitive demands on users, hindering them from interacting with products smoothly. There is a lack of effective design tools and supporting materials that help designers understand human cognition and how it affects the way users experience and use products and services. This paper aims to identify current approaches that can be applied to address this issue and to examine their strengths and weaknesses, in order to identify future directions for developing and improving cognitive design supports. A literature review was conducted of research publications in the fields of both design and cognition. Four key approaches are identified: cognitive design principles/guidelines, the demand–capability approach, cognitive walkthrough and cognitive modelling. Their strengths and weaknesses are analysed from a design standpoint. The paper also analyses the underlying causes of the insufficient uptake of cognitive design approaches by designers.
To describe an adenovirus outbreak in a neonatal intensive care unit (NICU), including the use of qualitative and semiquantitative real-time polymerase chain reaction (qPCR) data to inform the outbreak response.
Mixed prospective and retrospective observational study.
A level IV NICU in the southeastern United States.
Two adenovirus cases were identified in a NICU. Screening of all inpatients with qPCR on nasopharyngeal specimens revealed 11 additional cases.
Outbreak response procedures, including enhanced infection control policies, were instituted. Serial qPCR studies were used to screen for new infections among exposed infants and to monitor viral clearance among cases. Changes to retinopathy of prematurity (ROP) exam procedures were made after an association was noted in those patients. At the end of the outbreak, a retrospective review allowed for comparison of clinical factors between the infected and uninfected groups.
There were no new cases among patients after outbreak identification. One adenovirus-infected patient died; the others returned to their clinical baselines. The ROP exams were associated with an increased risk of infection (odds ratio [OR], 84.6; 95% confidence interval [CI], 4.5–1,601). The outbreak response lasted 33 days, and the second wave of cases described in previous reports did not occur after the end of the outbreak. Revisions to infection control policies remained in effect following the outbreak.
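The reported odds ratio and its wide confidence interval follow the standard 2×2-table calculation with a Wald interval on the log scale. A minimal sketch (the counts below are illustrative only; the study's raw table is not given here):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald (log-scale) 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only: small cell counts produce the very wide
# intervals seen in outbreak investigations of this size
or_, lo, hi = odds_ratio_ci(12, 3, 1, 30)
```

With few unexposed cases, the standard error of log(OR) is dominated by the 1/c term, which is why intervals such as 4.5–1,601 arise despite a large point estimate.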
Retinopathy of prematurity exams are potential mechanisms of adenovirus transmission, and autoclaved or single-use instruments should be used to minimize this risk. Real-time molecular diagnostic and quantification data guided outbreak response procedures, which rapidly contained and fully terminated a NICU adenovirus outbreak.
In 2013, New York State mandated that, during influenza season, unvaccinated healthcare personnel (HCP) wear a surgical mask in areas where patients are typically present. We found that this mandate was associated with increased HCP vaccination and decreased HCP visits to the hospital Workforce Health and Safety Department with respiratory illnesses and laboratory-confirmed influenza.
Recent US disasters highlight the current imbalance between the high proportion of chronically ill Americans who depend on prescription medications and their lack of medication reserves for disaster preparedness. We examined barriers that Los Angeles County residents with chronic illness experience within the prescription drug procurement system to achieve recommended medication reserves.
A mixed methods design included evaluation of insurance pharmacy benefits, focus group interviews with patients, and key informant interviews with physicians, pharmacists, and insurers.
Results and Discussion
Most prescriptions are dispensed as 30-day units through retail pharmacies, with refills available after 75% of use, leaving a monthly medication reserve of about 7 days. For patients to acquire 14- to 30-day disaster medication reserves, the health professionals interviewed supported 60- to 100-day dispensing units. Barriers included restrictive insurance benefits, patients’ resistance to mail order, and higher copayments. Physicians, pharmacists, and insurers also varied widely in their preparedness planning and collective mutual-aid plans, and most believed pharmacists had the primary responsibility for patients’ medication continuity during a disaster.
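The 7-day figure follows from simple arithmetic: refilling a 30-day supply at the earliest allowed point (75% used) leaves roughly 30 × 0.25 ≈ 7 days on hand per fill. A minimal sketch of how early refills accumulate a reserve, assuming perfect adherence and one dose per day (an idealisation, not the study's model):

```python
def months_to_reserve(target_days, days_supply=30, refill_threshold=0.75):
    """Months of earliest-allowed refills needed to accumulate a target
    emergency reserve, assuming one dose is taken per day."""
    surplus_per_fill = days_supply * (1 - refill_threshold)  # 7.5 days by default
    months, reserve = 0, 0.0
    while reserve < target_days:
        reserve += surplus_per_fill
        months += 1
    return months
```

Under these assumptions a 14-day reserve takes 2 months of early refills and a 30-day reserve takes 4 months, which helps explain the interviewees' support for 60- to 100-day dispensing units as a faster route to the recommended reserves.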
To strengthen prescription drug continuity in disasters, recommendations include the following: (1) creating flexible drug-dispensing policies to help patients build reserves, (2) training professionals to inform patients about disaster planning, and (3) building collaborative partnerships among system stakeholders. (Disaster Med Public Health Preparedness. 2013;7:257-265)
The purpose of the present study was to determine the effects of short-term supplementation with the free acid form of β-hydroxy-β-methylbutyrate (HMB-FA) on indices of muscle damage, protein breakdown, recovery and hormone status following a high-volume resistance training session in trained athletes. A total of twenty resistance-trained males were recruited to participate in a high-volume resistance training session centred on full squats, bench presses and dead lifts. Subjects were randomly assigned to receive either 3 g/d of HMB-FA or a placebo. Immediately before the exercise session and 48 h post-exercise, serum creatine kinase (CK), urinary 3-methylhistidine (3-MH), testosterone, cortisol and perceived recovery status (PRS) scale measurements were taken. The results showed that CK increased to a greater extent in the placebo (329 %) than in the HMB-FA group (104 %) (P= 0·004, d= 1·6). There was also a significant change for PRS, which decreased to a greater extent in the placebo (9·1 (sem 0·4) to 4·6 (sem 0·5)) than in the HMB-FA group (9·1 (sem 0·3) to 6·3 (sem 0·3)) (P= 0·005, d= − 0·48). Muscle protein breakdown, measured by 3-MH analysis, numerically decreased with HMB-FA supplementation and approached significance (P= 0·08, d= 0·12). There were no acute changes in plasma total or free testosterone, cortisol or C-reactive protein. In conclusion, these results suggest that an HMB-FA supplement given to trained athletes before exercise can blunt increases in muscle damage and prevent declines in perceived readiness to train following a high-volume, muscle-damaging resistance-training session.
In this work we propose a particle agglomeration model for chemical mechanical planarization (CMP), motivated primarily by the need to understand the creation and behavior of agglomerated slurry abrasive particles during the CMP process, which are a major cause of defectivity and of poor consumable utility due to sedimentation.
The proposed model considers the slurry as a colloidal suspension of charged colloidal silica in an electrically neutral aqueous electrolyte. First, a theoretical relationship is presented between the measurable chemical parameters of the slurry's aqueous electrolyte, the surface potential of the abrasive particles, and the corresponding zeta potential of the abrasive particles. Second, this zeta potential is used in a modified DLVO interaction potential model to determine the particle interaction potentials due to both the attractive van der Waals forces and the repulsive electrostatic interactions. Finally, the total interaction potential is used to define a stability ratio for slow versus fast agglomeration and the corresponding agglomeration rate equations between particles; these are used in a discrete population balance framework to describe the final particle size distribution with respect to time and agglomerate composition.
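The DLVO step of such a model can be sketched numerically. All parameter values below are illustrative assumptions (typical orders of magnitude for colloidal silica in a dilute aqueous electrolyte), not the paper's fitted values, and the stability ratio uses the Reerink–Overbeek barrier approximation rather than the full hydrodynamic integral:

```python
import math

# Illustrative constants (assumed values, not from the paper)
kT = 1.38e-23 * 298            # thermal energy at 25 C, J
eps = 78.5 * 8.854e-12         # permittivity of water, F/m
A_H = 8.3e-21                  # Hamaker constant, silica in water, J
a = 50e-9                      # particle radius, m
zeta = -0.030                  # zeta potential, V
kappa = 1.0e8                  # inverse Debye length, 1/m (~1 mM 1:1 electrolyte)

def v_total(h):
    """Sphere-sphere DLVO potential at surface separation h (Derjaguin
    approximation): attractive van der Waals + repulsive double layer, J."""
    v_vdw = -A_H * a / (12 * h)
    v_edl = 2 * math.pi * eps * a * zeta**2 * math.log(1 + math.exp(-kappa * h))
    return v_vdw + v_edl

# Energy barrier: maximum of V(h) over separations of 0.1-20 nm
hs = [0.1e-9 + i * 0.01e-9 for i in range(2000)]
v_max = max(v_total(h) for h in hs)

# Reerink-Overbeek estimate of the stability ratio W:
# W >> 1 -> slow (barrier-limited) agglomeration; W ~ 1 -> fast (diffusion-limited)
W = (1 / (2 * kappa * a)) * math.exp(max(v_max, 0.0) / kT)
```

In a population balance framework, W would then scale the Smoluchowski fast-agglomeration rate constant to give the slow-agglomeration kernel between particle size classes.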
The proposed model will provide both a qualitative and quantitative description of agglomeration of abrasive slurry particles during CMP that can be extended to account for slurry composition or abrasive particle type, enabling more accurate process control, increased consumable utility, and possible defectivity reduction.
The objective of the present paper is to review the methods of measuring micronutrient intake adequacy for individuals and for populations in order to ascertain best practice. A systematic review was conducted to locate studies on the methodological aspects of measuring nutrient adequacy. The results showed that for individuals, qualitative methods (to find the probability of adequacy) and quantitative methods (to find the confidence of adequacy) have been proposed for micronutrients where there are enough data to set an average nutrient requirement (ANR). If a micronutrient does not have an ANR, an adequate intake (AI) is often defined and can be used to assess adequacy, provided the distribution of daily intake over a number of days is known. The probability of an individual's intake being excessive can also be compared with the upper level of safe intake and the confidence of this estimate determined in a similar way. At the population level, adequacy can be judged from the ANR using the probability approach or its short cut, the estimated average requirement cut-point method. If the micronutrient does not have an ANR, adequacy cannot be determined from the average intake and must be expressed differently. The upper level of safe intake can be used for populations in a similar way to that for individuals. All of the methodological studies reviewed were from the American continent and all used the methodology described in the Institute of Medicine publications. The present methodology should now be adapted for use in Europe.
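The two calculations described above are simple to state concretely. A minimal sketch, using hypothetical intake data and assuming a normally distributed requirement for the probability approach (the review itself does not prescribe an implementation):

```python
from statistics import NormalDist

def prob_adequacy(intake, anr, req_sd):
    """Full probability approach for an individual: probability that the
    usual intake exceeds a normally distributed requirement with mean
    ANR and standard deviation req_sd."""
    return NormalDist(anr, req_sd).cdf(intake)

def ear_cutpoint_prevalence(usual_intakes, anr):
    """ANR/EAR cut-point method for a population: the prevalence of
    inadequacy is estimated as the proportion of individuals whose
    usual intake falls below the ANR."""
    return sum(1 for x in usual_intakes if x < anr) / len(usual_intakes)

# Hypothetical usual intakes (mg/d) against an assumed ANR of 65 mg/d
intakes = [40, 55, 60, 70, 80, 90, 100, 110]
prevalence = ear_cutpoint_prevalence(intakes, 65)   # 3 of 8 fall below the ANR
```

An intake equal to the ANR gives a probability of adequacy of 0.5, which is why the cut-point method works as a short cut: under its symmetry assumptions, the under- and over-counting of individuals near the ANR cancel out.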
Global and regional targets to reduce the rate of biodiversity loss bring with them the need to measure the state of nature and how it is changing. A number of different biodiversity indicators have been developed in response and here we consider bird population indicators in Europe. Birds are often used as surrogates for other elements of biodiversity because they are so well known and well studied, and not for their unique intrinsic value as environmental indicators. Yet, in certain situations and at particular scales, trends in bird populations correlate with those of other taxa making them a valuable biodiversity indicator with appropriate caveats. In this paper, we look at two case studies, in the UK and Europe as a whole, where headline bird indicators, that is, summary statistics based on bird population trends, have been developed and used to inform and assist policy makers. Wild bird indicators have been adopted by many European countries and by the European Union as indicators of biodiversity and of sustainable development. In the discussion, we review the strengths and weaknesses of using bird populations in this way, and look forward to how this work might be developed and expanded.
The 26 December 2004 tsunami resulted in a death toll of >270,000 persons, making it the most lethal tsunami in recorded history. This article presents performance observations and the lessons learned by a civilian team dispatched by the Australian government to “provide clinical and surgical functions and to make public health assessments”. The team, prepared and equipped for deployment four days after the event, arrived at its destination 13 days after the tsunami. Aspiration pneumonia, tetanus, and extensive soft tissue wounds of the lower extremities were the prominent injuries encountered. Surgical techniques had to be adapted to work in the austere environment. The lessons learned included: (1) the importance of team member selection; (2) strategies for self-sufficiency; (3) personnel readiness and health considerations; (4) face-to-face handover; (5) coordination and liaison; (6) the characteristics of injuries; (7) the importance of protocols for patient discharge and hospital staffing; and (8) requirements for interpreter services.
While disaster medical relief teams will be required in the future, their composition and equipment needs will differ according to the nature of the disaster. National teams should be on standby for international response.
The lamb production system in South-European countries is characterised by the production of light carcasses (< 13 kg) from young animals, less than 90 days old, fed ewe milk supplemented with concentrates. However, there is increasing interest in forage-based production systems for growing lambs, as a consequence of the interest in diversifying products and producing healthy and safe meat. When forage is included in the fattening diet, a reduction in average daily gain is observed and carcasses have a lower degree of fatness in comparison with the drylot system. This modification of the traditional carcass type must be evaluated in order to ensure that the final product meets consumer demands. The objective of this experiment was to evaluate the effect of the lamb fattening system on carcass characteristics and meat quality, especially on instrumental analysis traits such as colour and texture.