Objective: To conduct a formative evaluation of a transitional intervention for family caregivers, with assessment of feasibility, acceptability, appropriateness, and potential benefits. Methods: The intervention aimed to provide emotional support, information on community resources, and information and support for development of coping skills for the caregivers of patients aged 65 and older who were to be discharged home from an acute medical hospital admission. We used a one-group, pre- and three-month post-test study design. Results: Ninety-one patient-caregiver dyads were recruited. Of these, 63 caregivers (69%) received all five planned intervention sessions, while 60 (66%) completed the post-test. There were significant reductions in caregiver anxiety and depression following the intervention, and high rates of satisfaction. Discussion: This transitional intervention should be further evaluated, preferably with a control group, either as a stand-alone intervention or as one component of a comprehensive transitional intervention for older patients and their caregivers.
The amount of food eaten by the pig is a balance between the needs of the animal and the ability of the food to meet those demands. Major factors influencing the needs of the animal are live weight, genotype and sex. Equations have been established to predict the intake of pigs at different live weights, but variation exists among these equations, reflecting a shift towards genotypes with smaller appetites. Rapid growth may be limited by the appetite of the pig, and differences in intake between the sexes are notable, the largest being between castrated males and boars or gilts. Intakes of boars and gilts can be similar, but gilts often eat slightly more. Dietary energy has a marked effect, with the pig attempting to adjust its daily energy intake by eating less of high-energy diets. The extent to which it can exert this physiological control is limited once a stage is reached at which it can no longer compensate by eating more of a low-quality diet because of physical capacity. The influence of protein on intake has received much less attention, but there are clear indications that pigs eat less of diets that are very high or very low in crude protein. Intake is also influenced by protein quality, and there is evidence from work at the University of Nottingham that quite small variations in lysine level relative to other essential amino acids can markedly affect intake.
Saharan trade has been much debated in modern times, but the main focus of interest remains the medieval and early modern periods, for which more abundant written sources survive. The pre-Islamic origins of Trans-Saharan trade have been hotly contested over the years, mainly due to a lack of evidence. Many of the key commodities of trade are largely invisible archaeologically, being either of high value like gold and ivory, or organic like slaves and textiles or consumable commodities like salt. However, new research on the Libyan people known as the Garamantes and on their trading partners in the Sudan and Mediterranean Africa requires us to revise our views substantially. In this volume experts re-assess the evidence for a range of goods, including beads, textiles, metalwork and glass, and use it to paint a much more dynamic picture, demonstrating that the pre-Islamic Sahara was a more connected region than previously thought.
Salmonella causes an estimated 1·2 million illnesses annually in the USA. Salmonella enterica serotype Javiana (serotype Javiana) is the fourth most common serotype isolated from humans, with the majority of illnesses occurring in southeastern states. The percentage of wetland cover by wetland type and the average incidence rates of serotype Javiana infection in selected counties of the Foodborne Disease Active Surveillance Network (FoodNet) were examined. This analysis explored the relationship between wetland environments and incidence in order to assess whether regional differences in environmental habitats may be associated with observed variations in incidence. Findings suggest that wetland habitats may support reservoirs or contribute to the persistence of serotype Javiana, and may contribute to the transmission of infection more frequently than is the case for other Salmonella serotypes.
The WAIS (West Antarctic Ice Sheet) Divide deep ice core was recently completed to a total depth of 3405 m, ending 50 m above the bed. Investigation of the visual stratigraphy and grain characteristics indicates that the ice column at the drilling location is undisturbed by any large-scale overturning or discontinuity. The climate record developed from this core is therefore likely to be continuous and robust. Measured grain-growth rates, recrystallization characteristics, and grain-size response at climate transitions fit within current understanding. Significant impurity control on grain size is indicated from correlation analysis between impurity loading and grain size. Bubble-number densities and bubble sizes and shapes are presented through the full extent of the bubbly ice. Where bubble elongation is observed, the direction of elongation is preferentially parallel to the trace of the basal (0001) plane. Preferred crystallographic orientation of grains is present in the shallowest samples measured, and increases with depth, progressing to a vertical-girdle pattern that tightens to a vertical single-maximum fabric. This single-maximum fabric switches into multiple maxima as the grain size increases rapidly in the deepest, warmest ice. A strong dependence of the fabric on the impurity-mediated grain size is apparent in the deepest samples.
Timing of weed emergence and seed persistence in the soil influence the ability to implement timely and effective control practices. Emergence patterns and seed persistence of kochia populations were monitored in 2010 and 2011 at sites in Kansas, Colorado, Wyoming, Nebraska, and South Dakota. Weekly observations of emergence were initiated in March and continued until no new emergence occurred. Seed was harvested from each site, placed into 100-seed mesh packets, and buried at depths of 0, 2.5, and 10 cm in fall of 2010 and 2011. Packets were exhumed at 6-mo intervals over 2 yr. Viability of exhumed seeds was evaluated. Nonlinear mixed-effects Weibull models were fit to cumulative emergence (%) across growing degree days (GDD) and to viable seed (%) across burial time to describe their fixed and random effects across site-years. Final emergence densities varied among site-years and ranged from as few as 4 to almost 380,000 seedlings m−2. Across 11 site-years in Kansas, cumulative GDD needed for 10% emergence were 168, while across 6 site-years in Wyoming and Nebraska, only 90 GDD were needed; on the calendar, this date shifted from early to late March. The majority (>95%) of kochia seed did not persist for more than 2 yr. Remaining seed viability was generally >80% when seeds were exhumed within 6 mo after burial in March, and declined to <5% by October of the first year after burial. Burial did not appear to increase or decrease seed viability over time but placed seed in a position from which seedling emergence would not be possible. High seedling emergence that occurs very early in the spring emphasizes the need for fall or early spring PRE weed control such as tillage, herbicides, and cover crops, while continued emergence into midsummer emphasizes the need for extended periods of kochia management.
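The Weibull emergence curve described above can be sketched in a few lines of Python. This is an illustrative example only: the observations, starting values, and parameter names below are hypothetical, not the data or fitted values reported in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Cumulative emergence (%) as a Weibull function of growing degree days (GDD):
# 'upper' is the asymptotic maximum emergence, 'scale' and 'shape' are the
# usual Weibull parameters.
def weibull(gdd, upper, scale, shape):
    return upper * (1.0 - np.exp(-(gdd / scale) ** shape))

# Hypothetical cumulative-emergence observations for one site-year.
gdd = np.array([50.0, 100.0, 150.0, 200.0, 300.0, 400.0, 600.0])
emerged = np.array([2.0, 15.0, 42.0, 65.0, 88.0, 96.0, 99.0])

params, _ = curve_fit(weibull, gdd, emerged, p0=[100.0, 200.0, 2.0])
upper, scale, shape = params

# GDD required for 10% of maximum emergence, by inverting the fitted curve:
# solve upper * (1 - exp(-(x/scale)^shape)) = 0.10 * upper for x.
gdd10 = scale * (-np.log(0.9)) ** (1.0 / shape)
```

A nonlinear mixed-effects fit, as used in the study, would additionally place random effects on these parameters across site-years; the `curve_fit` call above fits a single site-year only.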
As poultry consumption continues to increase worldwide, and as the United States accounts for about one-third of all poultry exports globally, understanding factors leading to poultry-associated foodborne outbreaks in the United States has important implications for food safety. We analysed outbreaks reported to the United States’ Foodborne Disease Outbreak Surveillance System from 1998 to 2012 in which the implicated food or ingredient could be assigned to one food category. Of 1114 outbreaks, poultry was associated with 279 (25%), accounting for the highest number of outbreaks, illnesses, and hospitalizations, and the second highest number of deaths. Of the 149 poultry-associated outbreaks caused by a confirmed pathogen, Salmonella enterica (43%) and Clostridium perfringens (26%) were the most common pathogens. Restaurants were the most commonly reported location of food preparation (37% of poultry-associated outbreaks), followed by private homes (25%), and catering facilities (13%). The most commonly reported factors contributing to poultry-associated outbreaks were food-handling errors (64%) and inadequate cooking (53%). Effective measures to reduce poultry contamination, promote safe food-handling practices, and ensure food handlers do not work while ill could reduce poultry-associated outbreaks and illnesses.
The Parkes Interferometer was designed and constructed by the CSIRO Radiophysics Laboratory. It is unique in having continuously variable North-South and East-West baselines, making it possible to plot the visibility function of a source in a single observation. From an engineering point of view it is noteworthy in that the receivers, control circuits and computer employ semiconductors exclusively.
Research was conducted from 2011 to 2014 to determine weed population dynamics and frequency of glyphosate-resistant (GR) Palmer amaranth with herbicide programs consisting of glyphosate, dicamba, and residual herbicides in dicamba-tolerant cotton. Five treatments were maintained in the same plots over the duration of the experiment: three sequential POST applications of glyphosate with or without pendimethalin plus diuron PRE; three sequential POST applications of glyphosate plus dicamba with and without the PRE herbicides; and a POST application of glyphosate plus dicamba plus acetochlor followed by one or two POST applications of glyphosate plus dicamba without PRE herbicides. Additional treatments included alternating years with three sequential POST applications of glyphosate only and glyphosate plus dicamba POST with and without PRE herbicides. The greatest population of Palmer amaranth was observed when glyphosate was the only POST herbicide throughout the experiment. Although diuron plus pendimethalin PRE in a program with only glyphosate POST improved control during the first 2 yr, these herbicides were ineffective by the final 2 yr on the basis of weed counts from soil cores. The lowest population of Palmer amaranth was observed when glyphosate plus dicamba were applied regardless of PRE herbicides or inclusion of acetochlor POST. Frequency of GR Palmer amaranth was 8% or less when the experiment was initiated. Frequency of GR Palmer amaranth varied by herbicide program during 2012 but was similar among all herbicide programs in 2013 and 2014. Similar frequency of GR Palmer amaranth across all treatments at the end of the experiment most likely resulted from pollen movement from Palmer amaranth treated with glyphosate only to any surviving female plants regardless of PRE or POST treatment. 
These data suggest that GR Palmer amaranth can be controlled by dicamba and that dicamba is an effective alternative mode of action to glyphosate in fields where GR Palmer amaranth exists.
A challenge to the development of foodborne illness prevention measures is determining the sources of enteric illness. Microbial subtyping source-attribution models attribute illnesses to various sources, requiring data characterizing bacterial isolate subtypes collected from human and food sources. We evaluated the use of antimicrobial resistance data on isolates of Salmonella enterica serotype Hadar, collected from ill humans, food animals, and from retail meats, in two microbial subtyping attribution models. We also compared model results when either antimicrobial resistance or pulsed-field gel electrophoresis (PFGE) patterns were used to subtype isolates. Depending on the subtyping model used, 68–96% of the human infections were attributed to meat and poultry food products. All models yielded similar outcomes, with 86% [95% confidence interval (CI) 80–91] to 91% (95% CI 88–96) of the attributable infections attributed to turkey, and 6% (95% CI 2–10) to 14% (95% CI 8–20) to chicken. Few illnesses (<3%) were attributed to cattle or swine. Results were similar whether the isolates were obtained from food animals during processing or from retail meat products. Our results support the view that microbial subtyping models are a flexible and robust approach for attributing Salmonella Hadar infections to food sources.
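The proportional-attribution idea behind such microbial-subtyping models can be illustrated with a toy calculation. This is a deliberately simplified sketch, not the specific models used in the study; all isolate counts and the resistance-pattern labels R1–R3 are invented.

```python
from collections import Counter

# Hypothetical resistance-pattern subtypes observed among source isolates.
source_isolates = {
    "turkey":  ["R1", "R1", "R1", "R2"],
    "chicken": ["R2", "R3"],
    "cattle":  ["R3"],
}
human_isolates = ["R1", "R1", "R1", "R2", "R3"]

# Count how often each subtype occurs in each source.
subtype_counts = {}
for source, subtypes in source_isolates.items():
    for subtype in subtypes:
        subtype_counts.setdefault(subtype, Counter())[source] += 1

# Attribute each human isolate to sources in proportion to the subtype's
# frequency among source isolates.
attribution = Counter()
for subtype in human_isolates:
    counts = subtype_counts[subtype]
    total = sum(counts.values())
    for source, n in counts.items():
        attribution[source] += n / total
```

Here the three R1 human isolates are attributed wholly to turkey (the only source carrying R1), while R2 and R3 isolates are split evenly between the two sources carrying each pattern. Full attribution models additionally weight by source prevalence and estimate uncertainty, which is where the reported confidence intervals come from.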
Salmonella enterica causes an estimated 1 million domestically acquired foodborne illnesses annually. Salmonella enterica serovar Enteritidis (SE) is among the top three serovars of reported cases of Salmonella. We examined trends in SE foodborne outbreaks from 1973 to 2009 using Joinpoint and Poisson regression. The annual number of SE outbreaks increased sharply in the 1970s and 1980s but declined significantly after 1990. Over the study period, SE outbreaks were most frequently attributed to foods containing eggs. The average rate of SE outbreaks attributed to egg-containing foods reported by states began to decline significantly after 1990, and the proportion of SE outbreaks attributed to egg-containing foods began declining after 1997. Our results suggest that interventions initiated in the 1990s to decrease SE contamination of shell eggs may have been integral to preventing SE outbreaks.
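A Poisson log-linear trend of the kind described can be sketched by maximizing the Poisson likelihood directly. The counts below are synthetic, not the reported outbreak data, and a single declining segment stands in for the multiple segments a Joinpoint analysis would identify.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic annual outbreak counts for a declining post-1990 segment.
years = np.arange(1990, 2010)
t = (years - years.min()).astype(float)
counts = np.round(np.exp(4.0 - 0.06 * t))

# Poisson log-linear model: log E[count] = b0 + b1 * t.
# Negative log-likelihood, dropping the constant log(count!) term.
def negloglik(beta):
    mu = np.exp(beta[0] + beta[1] * t)
    return np.sum(mu - counts * np.log(mu))

result = minimize(negloglik, x0=[np.log(counts.mean()), 0.0])
b0, b1 = result.x
annual_change = np.exp(b1)  # multiplicative change in outbreak rate per year
```

A negative b1 corresponds to the significant post-1990 decline reported above; exp(b1) gives the estimated year-over-year rate ratio.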