Osteoporosis was not a public health concern in black South African (SA) women until recently, when the prevalence of vertebral fractures was reported to be 9.1% in black compared with 5.0% in white SA women. Accordingly, this study aimed to measure the bone mineral density (BMD) of older black SA women and to investigate its association with risk factors for osteoporosis, including strength, muscle and fat mass, dietary intake and objectively measured physical activity (PA).
Methods and materials
Older black SA women (n = 122; age 68 (range 60–85) years) completed sociodemographic and quantitative food frequency questionnaires (QFFQ), provided fasting venous blood samples (25-hydroxycholecalciferol; vitamin D) and 24 h urine collections (to estimate protein intake), and underwent grip strength testing and PA monitoring (activPAL). Dual-energy X-ray absorptiometry (DXA) scans of the hip (femoral neck and total) and lumbar spine determined BMD, and whole-body scans determined fat and fat-free soft tissue mass (FFSTM). WHO classifications were used to define osteopenia (t-score −2.5 to −1) and osteoporosis (t-score < −2.5).
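The WHO t-score cut-offs used above can be expressed as a small classification rule. The sketch below is illustrative only (the function name is not from the study); it encodes the stated thresholds, with −2.5 falling in the osteopenia band since osteoporosis is defined strictly below −2.5:

```python
def classify_bmd(t_score: float) -> str:
    """Classify bone status from a DXA t-score using the WHO cut-offs
    cited in the study: osteoporosis t < -2.5; osteopenia -2.5 to -1."""
    if t_score < -2.5:
        return "osteoporosis"
    if t_score < -1.0:
        return "osteopenia"
    return "normal"
```

For example, a participant with a lumbar spine t-score of −1.5 would be counted in the osteopenia group, and one at −2.7 in the osteoporosis group.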
At the lumbar spine, 34.4% of the women (n = 42) had osteopenia and 19.7% (n = 24) had osteoporosis. At the left femoral neck, 32% (n = 40) had osteopenia and 13.1% (n = 16) had osteoporosis. Total left hip BMD indicated osteopenia in 27.9% (n = 34) and osteoporosis in 13.1% (n = 16) of participants. Multinomial regression revealed no differences in age (years) or frequency of falls in the past year between groups (p = 0.727). Compared with those with normal BMD, participants with osteoporosis at the femoral neck and lumbar spine were shorter, weighed less and had a lower body mass index (BMI) (all p < 0.05). When adjusted for height, the osteoporotic group (femoral neck and lumbar spine) had lower trunk fat (% whole body), FFSTM (kg) and grip strength (kg) than those with normal BMD (p < 0.05). Only protein intake (g; 24 h urine analyses) was lower in women with osteoporosis (all sites) compared with those with normal BMD. Fat, carbohydrate and micronutrient intakes (relative to total daily energy intake) and vitamin D concentrations were not associated with BMD (all sites). Daily step count and stepping time (min) were inversely associated with BMI (p < 0.05), but not with BMD (all sites; p > 0.05).
A high prevalence of osteopenia and osteoporosis was evident at the lumbar spine and hip in older black SA women. This study highlights the importance of strength, body composition, and protein intake in maintaining BMD and preventing the development of osteoporosis in older women.
In recent years, unmanned aerial vehicle (UAV) technology has expanded to include UAV sprayers capable of applying pesticides. Very little research has been conducted to optimize application parameters and to measure the potential for off-target movement from UAV-based pesticide applications. Field experiments were conducted in Raleigh, NC during spring 2018 to characterize the effect of different application speeds and nozzle types on target-area coverage and uniformity of UAV applications. The highest coverage was achieved at an application speed of 1 m s−1 and ranged from 30% to 60%, whereas applications at 7 m s−1 yielded 13% to 22% coverage. Coverage consistently decreased as application speed increased across all nozzles, with extended-range flat-spray nozzles declining at a faster rate than air-induction nozzles, likely because of higher drift. Experiments measuring the drift potential of UAV-applied pesticides with extended-range flat-spray, air-induction flat-spray, turbo air-induction flat-spray, and hollow-cone nozzles under 0, 2, 4, 7, and 9 m s−1 perpendicular wind conditions in the immediate 1.75 m above the target were conducted in the absence of natural wind. Off-target movement was observed under all perpendicular wind conditions with all nozzles tested but was nondetectable beyond 5 m from the target. Coverage from all nozzles exhibited a concave-shaped response to increasing perpendicular wind speed due to turbulence. The maximum target coverage in the drift studies was observed when the perpendicular wind was 0 and 8.94 m s−1, but higher turbulence at the two highest perpendicular wind speeds (6.71 and 8.94 m s−1) increased coverage variability, whereas the lowest variability was observed at the 2.24 m s−1 wind speed. Results suggested that air-induction flat-spray and turbo air-induction flat-spray nozzles and an application speed of 3 m s−1 provided adequate coverage of target areas while minimizing off-target movement risk.
Tanzania is commonly cited as “a success story” in which a cohesive society has been built in tandem with its nationhood. In this chapter, we offer an account of the interplay between ethnicity and social norms in the context of nation building in Tanzania and highlight the historical transformation of localized, ethnic-based mechanisms for self-protection, “trust networks”, into a national framework for trust enhancement and the resolution of conflicts at local levels. This, we argue, was key to Tanzanians' acceptance of national identity as a means of self-protection, and, hence, to a transition from divided pasts to cohesive futures. The chapter traces nation-building efforts in Tanzania and explains why Tanzania is an exception to the patterns of violence and instability experienced in Sub-Saharan Africa. It is argued that, although conflicts are sometimes inevitable, cross-cutting identities such as occupation, and particularly the all-encompassing identity of nationality, can help to decrease the likelihood that conflicts will divide the nation. Diversity may present a challenge to national unity, but it is not insuperable if the political leadership is genuinely committed to deemphasizing ethnic group identities in the public sphere and pursues policies which consider the goal of equality.
Residual herbicides are routinely applied to control troublesome weeds in pumpkin production. Fluridone and acetochlor, Group 12 and Group 15 herbicides, respectively, provide broad-spectrum PRE weed control. Field research was conducted in Virginia and New Jersey to evaluate pumpkin tolerance to, and weed control from, PRE herbicides. Treatments consisted of fomesafen at two rates, ethalfluralin, clomazone, halosulfuron, fluridone, S-metolachlor, acetochlor emulsifiable concentrate (EC), acetochlor microencapsulated (ME), and no herbicide. At one site, fluridone, acetochlor EC, acetochlor ME, and halosulfuron injured pumpkin 81%, 39%, 34%, and 35%, respectively, at 14 d after planting (DAP); crop injury at the second site was 40%, 8%, 19%, and 33%, respectively. Differences in injury between the two sites may have been due to the amount and timing of rainfall after herbicides were applied. Fluridone provided 91% control of ivyleaf morningglory and 100% control of common ragweed at 28 DAP. Acetochlor EC controlled redroot pigweed 100%. Pumpkin treated with S-metolachlor produced the greatest yield (10,764 fruits ha–1) despite the herbicide being broadcast over the planted row; the label requires a directed application to row-middles. A separate study specifically evaluated fluridone applied PRE at 42, 84, 126, 168, 252, 336, and 672 g ai ha–1. Fluridone caused pumpkin injury ≥95% when applied at rates ≥168 g ai ha–1; significant yield loss was noted when the herbicide was applied at rates >42 g ai ha–1. We concluded that fluridone and both acetochlor formulations are unacceptable candidates for use in pumpkin production.
Laser-based compact MeV X-ray sources are useful for a variety of applications such as radiography and active interrogation of nuclear materials. MeV X rays are typically generated by impinging an intense laser onto a ~mm-thick high-Z foil. Here, we have characterized such a MeV X-ray source from a 120 TW (80 J, 650 fs) laser interacting with a 1 mm-thick tantalum foil. Our measurements show an X-ray temperature of 2.5 MeV, a flux of 3 × 10¹² photons/sr/shot, a beam divergence of ~0.1 sr, a conversion efficiency of ~1% (that is, ~1 J of MeV X rays out of 80 J of incident laser energy), and a source size of 80 µm. Our measurements also show that the MeV X-ray yield and temperature are largely insensitive to nanosecond laser contrast up to 10⁻⁵. In addition, preliminary measurements of a similar MeV X-ray source using a double-foil scheme, in which laser-driven hot electrons from a thin foil undergoing relativistic transparency impinge onto a second high-Z converter foil separated by 50–400 µm, show a MeV X-ray yield more than an order of magnitude lower than the single-foil results.
To achieve their conservation goals, individuals, communities and organizations need to acquire a diversity of skills, knowledge and information (i.e. capacity). Despite current efforts to build and maintain appropriate levels of conservation capacity, it has been recognized that these activities will need to be scaled up significantly in sub-Saharan Africa, because of the rapid increase in the number and extent of environmental problems in the region. We present a range of socio-economic contexts relevant to four key areas of African conservation capacity building: protected area management, community engagement, effective leadership, and professional e-learning. Under these core themes, 39 specific recommendations are presented. These were derived from multi-stakeholder workshop discussions at an international conference held in Nairobi, Kenya, in 2015. At the meeting 185 delegates (practitioners, scientists, community groups and government agencies) represented 105 organizations from 24 African nations and eight non-African nations. The 39 recommendations constituted six broad types of suggested action: (1) the development of new methods, (2) the provision of capacity building resources (e.g. information or data), (3) the communication of ideas or examples of successful initiatives, (4) the implementation of new research or gap analyses, (5) the establishment of new structures within and between organizations, and (6) the development of new partnerships. A number of cross-cutting issues also emerged from the discussions: the need for a greater sense of urgency in developing capacity building activities; the need to develop novel capacity building methodologies; and the need to move away from one-size-fits-all approaches.
The importance of PRE herbicide applications in cotton has increased since the evolution of glyphosate-resistant (GR) Palmer amaranth. Cotton producers are relying on residual herbicides for control of Palmer amaranth, as POST options are limited or ineffective. S-Metolachlor, acetochlor, fomesafen, and dicamba all provide PRE control of Palmer amaranth; however, little is known about the effect of irrigation rate on their incorporation and herbicidal efficacy. In 2015, an experiment was conducted on fine sand and loamy sand soils to evaluate the influence of irrigation volume (0.0 to 12.7 mm) on Palmer amaranth control with PRE herbicides. Irrigation volume after herbicide application was significant for both S-metolachlor and acetochlor. Efficacy of S-metolachlor was greatest in plots receiving 6.4 and 12.7 mm of irrigation, where Palmer amaranth biomass was reduced to 4 and 2% of a nontreated control (NTC), respectively, compared with 61% in plots with the 0-mm irrigation treatment. Palmer amaranth control by acetochlor incorporated with 3.2 to 12.7 mm of irrigation did not differ, but these rates did reduce Palmer amaranth biomass compared with the 1.6-mm irrigation rate. Irrigation volume was not significant for the soil incorporation of fomesafen or dicamba. Across all herbicides, fomesafen-treated plots provided the most consistent control of Palmer amaranth, reducing its biomass to < 3% of the NTC at all irrigation rates. Dicamba provided the least effective and most inconsistent control of Palmer amaranth, with biomass at 17 to 51% of the NTC.
Understanding stellar birth requires observations of the clouds in which stars form. These clouds are dense and self-gravitating, and in all existing observations they are molecular, with H2 the dominant species and CO the best available tracer. When the abundances of carbon and oxygen are low compared to hydrogen, and the opacity from dust is also low, as in primeval galaxies and local dwarf irregular galaxies, CO forms slowly and is easily destroyed, so it cannot accumulate inside dense clouds. We then lose our ability to trace the gas in regions of star formation, and with it critical information on the temperatures, densities, and velocities of the collapsing material. I will report on high-resolution ALMA observations of CO clouds in the Local Group dwarf irregular galaxy WLM, which has a metallicity that is 13% of the solar value, 50% lower than the previous CO detection threshold, and on the properties derived for the very small, dense CO clouds that were mapped.
In this study, the putative protective seroprevalence (PPS) of IgG antibodies to the 27-kDa and 15/17-kDa Cryptosporidium antigens was compared in sera of healthy participants who were and were not exposed to Cryptosporidium oocysts via surface-water-derived drinking water. The participants completed a questionnaire on risk factors that have been shown to be associated with infection. The PPS was significantly greater (49–61%) in settlements where the drinking water originated from surface water than in the control city, where riverbank filtration was used (21% and 23%). Logistic regression analysis of the risk factors showed an association between bathing/swimming in outdoor pools and antibody responses to the 15/17-kDa antigen complex; hence the elevated responses were most likely due to the use of contaminated water. The results indicate that waterborne Cryptosporidium infections occur more frequently than reported but may derive from multiple sources.
Dietary pattern (DP) analysis allows examination of the combined effects of nutrients and foods on markers of CVD. Very few studies have examined these relationships during adolescence or young adulthood. Traditional CVD risk biomarkers were analysed in 12–15-year-olds (n 487; Young Hearts (YH)1) and again in the same individuals at 20–25 years of age (n 487; YH3). Based on 7 d diet histories, DP analysis was performed using a posteriori principal component analysis for the YH3 cohort, and the a priori Mediterranean Diet Score (MDS) was calculated for both the YH1 and YH3 cohorts. In the a posteriori DP analysis, YH3 participants adhering most closely to the ‘healthy’ DP had lower pulse wave velocity (PWV) and homocysteine concentrations; those adhering to the ‘sweet tooth’ DP had increased LDL concentrations, systolic blood pressure and diastolic blood pressure, and decreased HDL concentrations; those adhering to the ‘drinker/social’ DP had lower LDL and homocysteine concentrations, but exhibited a trend towards a higher TAG concentration; and those adhering to the ‘Western’ DP had elevated homocysteine and HDL concentrations. In the a priori dietary score analysis, YH3 participants adhering most closely to the Mediterranean diet exhibited a trend towards a lower PWV. The MDS did not track between YH1 and YH3, nor was there a longitudinal relationship between the change in the MDS and the change in CVD risk biomarkers. In conclusion, cross-sectional analysis revealed that some associations between DPs and CVD risk biomarkers were already evident in this young adult population, namely the association between the healthy DP (and the MDS) and PWV; however, no longitudinal associations were observed over these relatively short time periods.
I used to pore over the latest offerings from various highly reputable academic or scholarly quarters, and find nothing of any real practical help. (Tony Blair, cited in Powell, 2011)
During the 2000s there was a great deal of rhetoric about evidence-based policy and evidence-based policy-making (Davies et al, 2000; Perkins et al, 2010). However, policy and policy-making often appear to be based more on the existing ideas (or even prejudices or ideologies) of those in positions of power than on research evidence. There are several reasons for this.
Policy-makers may believe they already know what needs to be done, and so do not need to examine what research says. Equally, those in positions of power may find research inaccessible in terms of its place of publication, or find that it is written in dense, academic language they find difficult to understand. They may also find research too equivocal, too concerned with trying to consider both sides of a problem rather than coming to a conclusion or solution that they can get on with turning into a workable policy. Policy-makers may also have strong views about what needs to be done by government, regardless of what researchers are telling them, often seeming to put their own political goals ahead of research, and their ideology ahead of evidence.
When looking back at NHS reorganisations, it does seem to be the case that since the 1980s policy-makers have been unable to resist changing organisational structures, not even waiting to see whether the last changes they attempted to put into place had worked. Secretaries of State for Health have sometimes seemed intent on leaving their own impression on the NHS organisation without considering whether what they are planning to change has any real chance of working.
From the perspective of academics and researchers, on the other hand, policy-makers and politicians often appear to have short attention spans and do not want to engage with the complexities of the area they are trying to change. Politicians can sometimes look as if they have decided what needs to be done without looking at lessons from the past or from other countries.
Reforming Healthcare: What's the Evidence? is the first major critical overview of the research published on healthcare reform in England from 1990 onwards by a team of leading UK health policy academics.
Chapter Two explored the Conservative government's attempts to reorganise healthcare in the 1980s, taking this account up to the introduction of the internal market at the end of that decade.
Having outlined the political and ideational context into which the internal market was being introduced at the end of Chapter Two, we now consider the programme theory for it. How was the internal market meant to work?
This chapter first considers the programme theory of the effects of the 1990s internal market reorganisation, before taking the story on to the change in government in 1997 and New Labour's various attempts to reorganise healthcare in the 2000s. Chapters Four and Five then consider the evidence from Labour's healthcare reorganisations, before turning to the coalition government's 2010 Health and Social Care Bill.
Purchaser–provider split and the internal market
The logic underlying the programme theory of the purchaser–provider split was that it would allow purchasers to use their funding decisions to reward good providers of care with contracts, giving all providers a funding incentive to improve the quality of their service, and creating the opportunity for successful services to expand (Day and Klein, 1991). The internal market was also meant to incentivise purchasers to find the best value and best quality care for the people they were serving. The government believed that the introduction of market-like governance into the NHS would improve its performance by increasing efficiency and productivity, while at the same time raising quality and reducing the wasted ‘resources on excessive administration’ (Le Grand, 1991, p 1262) that they regarded as coming from a traditional public sector bureaucracy. The internal market represented an internal or wholesale market (in contrast to New Labour's later external, retail market) in that NHS managers were supposed to be working on behalf of patients as their agents, rather than patients being responsible for driving the process of choosing care for themselves. Patients, however, had limited choice or say in their healthcare apart from through GP fundholding and a very limited number of ‘extra contractual referrals’ (ECRs), which were quasi-individual contracts rather than making use of the more usual block contracting process.