Most oviposition by Helicoverpa zea (Boddie) occurs near the top of the canopy in soybean, Glycine max (L.) Merr., and larval abundance is influenced by plant growth habit. However, the vertical distribution of larvae within the canopy is less well known. We evaluated the vertical distribution of H. zea larvae in determinate and indeterminate varieties, hypothesizing that larval distribution in the canopy would vary between these two growth habits and over time. We tested this hypothesis in a naturally infested replicated field experiment and two experimentally manipulated cage experiments. In the field experiment, flowering time was synchronized between the varieties by manipulating planting date, while infestation timing was manipulated in the cage experiments. Larvae were recovered using destructive sampling of individual soybean plants, and their vertical distribution by instar was recorded at three sampling points over time in each experiment. While larval population growth and development varied between the determinate and indeterminate varieties within and among experiments, we found little evidence that larvae prefer particular vertical locations in the canopy. This study lends support to the hypothesis that larval movement and location within soybean canopies do not result entirely from oviposition location and nutritional requirements.
Helicoverpa zea (Boddie) is a damaging pest of many crops including soybean, Glycine max (L.), especially in the southern United States. Previous studies have concluded that oviposition and development of H. zea larvae mirror the phenology of soybean, with oviposition occurring during full bloom, younger larvae developing on blooms and leaves, intermediate-aged larvae developing on varying tissue types, and older larvae developing on flowers and pods. In a field trial, we investigated natural infestations of H. zea larvae by instar in determinate and indeterminate soybean varieties. In complementary experiments, we introduced H. zea into replicated cages and allowed them to oviposit on plants (one cage experiment with a determinate variety and two with an indeterminate variety). Plants were sampled weekly while larvae were present. In the natural infestation experiment, most larvae were found on blooms during R3 and were early to middle instars; by R4, most larvae were found on leaves and were middle to late instars. In contrast, in the cage study, most larvae were found on leaves regardless of soybean growth stage or larval stage. Determinate and indeterminate growth habit did not affect larval preference for different soybean tissue types. Our studies suggest H. zea larvae prefer specific tissue types, but also provide evidence that experimental design can influence the results. Finally, our finding of larval preference for leaves contrasts with findings from previous studies.
Populations of native North American parasitoids attacking Agrilus Curtis (Coleoptera: Buprestidae) species have recently been considered as part of an augmentative biological control programme in an attempt to manage emerald ash borer, Agrilus planipennis Fairmaire, a destructive wood-boring beetle discovered in North America in 2002. We evaluate trapping methods to detect and monitor populations of two important native larval parasitoids, Phasgonophora sulcata Westwood (Hymenoptera: Chalcididae) and Atanycolus Förster (Hymenoptera: Braconidae) species, attacking emerald ash borer in its introduced range. We found that purple prism traps captured more P. sulcata than green prism traps, yellow pan traps, and log samples and thus were considered better for detecting and monitoring P. sulcata populations. Trap type did not affect the number of captures of Atanycolus species. Surprisingly, baiting prism traps with a green leaf volatile or manuka oil did not significantly increase captures of P. sulcata or Atanycolus species. Based on these results, unbaited purple prism traps would be optimal for sampling these native emerald ash borer parasitoids in long-term management programmes.
Dishion and Patterson's work on the unique role of fathers in the coercive family process showed that fathers' coercion explained twice as much variance as mothers' in predicting children's antisocial behavior, and that treating and preventing coercion while promoting prosocial parenting can mitigate children's problem behaviors. Building on these ideas, we employed a sample of 426 divorced or separated fathers randomly assigned to Fathering Through Change (FTC), an interactive online behavioral parent training program, or to a waitlist control. Participating fathers had been separated or divorced within the past 24 months and had children aged 4 to 12 years. We tested an intent-to-treat (ITT) mediation hypothesis positing that intervention-induced changes in child problem behaviors would be mediated by changes in fathers' coercive parenting. We also tested complier average causal effects (CACE) models to estimate intervention effects, accounting for compliers and noncompliers in the treatment group and would-be compliers in the controls. Mediation was supported. ITT analyses showed that the FTC obtained a small direct effect on father-reported pre–post changes in child adjustment problems (d = .20), a medium effect on pre–post changes in fathers' coercive parenting (d = .61), and a moderate indirect effect on changes in child adjustment (d = .30). Larger effects were observed in CACE analyses.
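The ITT mediation result follows the standard product-of-coefficients logic: an a-path from intervention to fathers' coercive parenting, a b-path from coercive parenting to child adjustment, and an indirect effect a × b, with bootstrap resampling for inference. Below is a minimal sketch of that logic on simulated data; the variable names, effect sizes, and single-mediator linear models are illustrative assumptions, not the FTC study's actual measures or analyses.

```python
# Hedged sketch: product-of-coefficients mediation with a percentile bootstrap.
# All data are simulated; names are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n = 426                                   # sample size reported in the abstract
tx = rng.integers(0, 2, n)                # 1 = FTC program, 0 = waitlist control
# Simulate: intervention reduces coercive parenting, which reduces child problems.
coercion_change = -0.6 * tx + rng.normal(0, 1, n)                        # a-path
child_change = 0.5 * coercion_change - 0.1 * tx + rng.normal(0, 1, n)    # b-path + direct

def ols_slopes(y, predictors):
    """OLS coefficients (intercept dropped) for a list of predictor arrays."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols_slopes(coercion_change, [tx])[0]                   # treatment -> mediator
b = ols_slopes(child_change, [coercion_change, tx])[0]     # mediator -> outcome
indirect = a * b

# Percentile bootstrap for the indirect effect.
boots = []
for _ in range(2000):
    i = rng.integers(0, n, n)
    a_s = ols_slopes(coercion_change[i], [tx[i]])[0]
    b_s = ols_slopes(child_change[i], [coercion_change[i], tx[i]])[0]
    boots.append(a_s * b_s)
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"indirect effect a*b = {indirect:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```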
Recruitment of participants and their retention in randomized controlled trials (RCTs) are key to research efficiency. However, for many trials, recruiting and retaining participants who meet the eligibility criteria is extremely challenging. Digital tools are increasingly being used to identify, recruit and retain participants, yet there is a lack of quality evidence on their value in trial recruitment.
The aim of the main study was to identify the benefits and characteristics of innovative digital recruitment and retention tools for more efficient conduct of RCTs. Here we report on the qualitative data collected on the characteristics that trialists, research participants, primary care staff, research funders and Clinical Trials Units (CTUs) require of digital tools in order to judge them useful. A purposive sampling strategy was used to identify 16 participants from five stakeholder groups. A theoretical framework was informed by the results of a survey of UKCRC-registered CTUs. Semi-structured interviews were conducted and analysed using an inductive approach. Content and thematic analyses were used to explore the stakeholders' viewpoints and the value of digital tools.
The content analysis revealed that ‘barriers/challenges’ and ‘awareness of evidence’ were the most commonly discussed areas. Three key emergent themes were present across all groups: ‘security and legitimacy of information’, ‘inclusivity’, and ‘availability of human interaction’. Other themes focused on the engagement of stakeholders in their use and adoption of digital technology to enhance the recruitment/retention process. We also noted some interesting similarities and differences between practitioner and participant groups.
The key emergent themes clearly demonstrate the role digital technology now plays in the recruitment and retention of trial participants. The challenge, however, is that these existing tools are being used without sufficient evidence of their usefulness compared with traditional techniques. This raises important questions about their potential value that future research should address.
Recruitment of participants to, and their retention in, Randomized Controlled Trials (RCTs) is a key determinant of research efficiency, but is challenging. Digital tools and media are increasingly used to reduce costs, waste and delays in the conduct and delivery of research. The aim of this UK Clinical Trials Unit (CTU) survey was to identify which digital recruitment and retention tools are being used to support RCTs, their benefits and success characteristics.
A survey was sent to all UK Clinical Research Collaboration (UKCRC)-registered CTUs, with a webinar to help increase completion. A logic model and a definition of a “digital tool” were developed through iterative refinement by project team members, the Advisory Board (NIHR Research Design Service, NHS Trust, NIHR Clinical Research Networks and patient input) and CTUs.
A total of 24/52 (46%) CTUs responded, 6 (25%) of which reported no prior use of digital tools. Database screening tools (e.g. CPRD, EMIS) were the most widely used recruitment tool (45%) and were considered very effective (67%). The most frequently mentioned success criteria were saving GP time and reaching more patients. Social media was second (27%), but estimated effectiveness varied considerably, with only 17% rating it very effective. Fewer retention tools were used, with SMS/email reminders reported most often (10/15; 67%), but certainty about their effectiveness varied. A detailed definition of what constitutes a digital tool, with examples, and a logic model showing the relationships between resources, activities, outputs and outcomes for digital tools were developed.
Database screening tools are the most commonly used digital tool for recruitment, with clear success criteria and certainty about effectiveness. Our detailed definition of what constitutes a digital tool, with examples, will inform the NIHR research community about choices and help them identify potential tools to support recruitment and retention.
The endemic Mauritian flying fox Pteropus niger is perceived to be a major fruit pest. Lobbying of the Government of Mauritius by fruit growers to control the flying fox population resulted in national culls in 2015 and 2016, with a further cull scheduled for 2018. A loss of c. 38,318 individuals has been reported and the species is now categorized as Endangered on the IUCN Red List. However, until now there were no robust data available on damage to orchards caused by bats. During October 2015–February 2016 we monitored four major lychee Litchi chinensis and one mango (Mangifera spp.) orchard, and also assessed 10 individual longan Dimocarpus longan trees. Bats and introduced birds caused major damage to fruit, with 7–76% fruit loss (including natural fall and losses from fungal damage) per tree. Bats caused more damage to taller lychee trees (> 6 m high) than to smaller ones, whereas bird damage was independent of tree height. Bats damaged more fruit than birds in tall lychee trees, although this trend was reversed in small trees. Use of nets on fruiting trees can result in as much as a 23-fold reduction in the damage caused by bats if nets are applied correctly. There is still a need to monitor orchards over several seasons and to test non-lethal bat deterrence methods more widely.
Ischemic stroke treatment is time-sensitive, and barriers to providing prehospital care encountered by Emergency Medical Services (EMS) providers have been under-studied.
This study described barriers to providing prehospital care, identified predictors of these barriers, and assessed the impact of these barriers on EMS on-scene time and administration of tissue plasminogen activator (tPA) in the emergency department (ED).
A retrospective cohort study was performed using the Get With The Guidelines-Stroke (GWTG-S; American Heart Association [AHA]; Dallas, Texas USA) registry at two hospitals to identify ischemic stroke patients arriving by EMS. Variables were abstracted from prehospital and hospital medical records and merged with registry data. Barriers to care were grouped into themes. Logistic regression was used to identify predictors of barriers to care, and bivariate tests were used to assess differences in EMS on-scene time and the proportion of patients receiving tPA between patients with and without barriers.
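For readers unfamiliar with how predictor estimates of the kind reported below are typically obtained, here is a minimal sketch of a logistic regression yielding odds ratios with 95% confidence intervals. The data are simulated and the predictor names are illustrative stand-ins, not the registry's actual fields.

```python
# Hedged sketch: logistic regression for predictors of a documented barrier,
# reported as odds ratios with 95% CIs. Simulated data; illustrative names.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 900
df = pd.DataFrame({
    "lives_alone": rng.integers(0, 2, n),     # hypothetical binary predictor
    "race_other": rng.binomial(1, 0.1, n),    # hypothetical binary predictor
})
true_logit = -2.0 + 0.4 * df["lives_alone"] + 1.3 * df["race_other"]
df["barrier"] = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

X = sm.add_constant(df[["lives_alone", "race_other"]])
fit = sm.Logit(df["barrier"], X).fit(disp=0)

table = np.exp(fit.conf_int())                # CI bounds on the odds-ratio scale
table["OR"] = np.exp(fit.params)
print(table.drop(index="const"))              # one row per predictor: lower, upper, OR
```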
Barriers to providing prehospital care were documented for 15.5% of patients: 29.6% related to access, 26.7% communication, 23.0% extrication and transportation, 20.0% refusal, and 14.1% assessment/management. Non-white and non-black race (OR: 3.69; 95% CI, 1.63-8.36) and living alone (OR: 1.53; 95% CI, 1.05-2.23) were associated with greater odds of barriers to providing care. The EMS on-scene time was ≥15 minutes for 70.4% of patients who had a barrier to care, compared with 49.0% of patients who did not (P<.001). There was no significant difference in the proportion of patients who were administered tPA between those with and without barriers to care (14.1% vs 19.2%; P=.159).
Barriers to providing prehospital care were documented for a sizable proportion of ischemic stroke patients, with the majority related to patient access and communication, and occurred more frequently among non-white and non-black patients and those living alone. Although EMS on-scene time was longer for patients with barriers to care, the proportion of patients receiving tPA in the ED did not differ.
Li T, Cushman JT, Shah MN, Kelly AG, Rich DQ, Jones CMC. Barriers to Providing Prehospital Care to Ischemic Stroke Patients: Predictors and Impact on Care. Prehosp Disaster Med. 2018;33(5):501–507.
The Taipan galaxy survey (hereafter simply ‘Taipan’) is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≳ 10°), and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative ‘Starbugs’ positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H0) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo and stellar mass and environment. The final survey, which will be completed within 5 yrs, will consist of a complete magnitude-limited sample (i ⩽ 17) of about 1.2 × 106 galaxies supplemented by an extension to higher redshifts and fainter magnitudes (i ⩽ 18.1) of a luminous red galaxy sample of about 0.8 × 106 galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using a purpose-built automated ‘virtual observer’ software and an automated data reduction pipeline. The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.
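Among the survey's science goals, the peculiar-velocity measurement rests on a simple low-redshift comparison: a galaxy's peculiar velocity is approximately cz minus the Hubble velocity implied by a redshift-independent distance, such as one derived from the Fundamental Plane. A minimal sketch, with an assumed H0 and invented numbers purely for illustration:

```python
# Hedged sketch: low-redshift peculiar velocity from a redshift-independent
# distance (e.g. a Fundamental Plane estimate). All numbers are illustrative.
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # assumed Hubble constant, km/s/Mpc

def peculiar_velocity(z_obs, distance_mpc):
    """v_pec ~ c*z_obs - H0*D; valid only in the low-z linear regime."""
    return C_KM_S * z_obs - H0 * distance_mpc

# A galaxy at z = 0.02 whose FP distance is 82 Mpc (cz/H0 would be ~85.7 Mpc):
print(f"v_pec ~ {peculiar_velocity(0.02, 82.0):.0f} km/s")   # positive: outflowing
```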
Crop losses to foraging elephants are one of the primary obstacles to the coexistence of elephants and people. Understanding whether some individuals in a population are more likely to forage on crops, and the temporal patterns of elephant visits to farms, is key to mitigating the negative impacts of elephants on farmers’ livelihoods. We used camera traps to study the crop foraging behaviour of African elephants Loxodonta africana in farmland adjacent to the Udzungwa Mountains National Park in southern Tanzania during October 2010–August 2014. Camera traps placed on elephant trails into farmland detected elephants on 336 occasions during the study period. We identified individual elephants for 126 camera-trap detections. All were independent males, and we identified 48 unique bulls aged 10–29 years. Two-thirds of the bulls identified were detected only once by camera traps during the study period. Our findings are consistent with previous studies that found that adult males are more likely to adopt high-risk feeding behaviours such as crop foraging, although young males dispersing from maternal family units also consume crops in Udzungwa. We found a large number of occasional crop-users (32 of the 48 bulls identified) and a smaller number of repeat crop-users (16 of 48), suggesting that lethal control of crop-using elephants is unlikely to be an effective long-term strategy for reducing crop losses to elephants.
Determining the most appropriate level of care for patients in the prehospital setting during medical emergencies is essential. A large body of literature suggests that, compared with Basic Life Support (BLS) care, Advanced Life Support (ALS) care is not associated with increased patient survival or decreased mortality. The purpose of this special report is to synthesize the literature to identify common study design and analytic challenges in research studies that examine the effect of ALS, compared to BLS, on patient outcomes. The challenges discussed in this report include: (1) choice of outcome measure; (2) logistic regression modeling of common outcomes; (3) baseline differences between study groups (confounding); (4) inappropriate statistical adjustment; and (5) inclusion of patients who are no longer at risk for the outcome. These challenges may affect the results of studies, and thus, conclusions of studies regarding the effect of level of prehospital care on patient outcomes should require cautious interpretation. Specific alternatives for avoiding these challenges are presented.
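Challenge (2) merits a brief illustration: when an outcome such as survival is common, the odds ratio produced by logistic regression can substantially overstate the relative risk. A minimal worked example with invented 2 × 2 counts:

```python
# Hedged illustration: odds ratios diverge from relative risks when the
# outcome is common. Counts below are invented for demonstration only.
def or_and_rr(a, b, c, d):
    """2x2 table: a, b = events/non-events (exposed); c, d = (unexposed)."""
    rr = (a / (a + b)) / (c / (c + d))
    odds_ratio = (a / b) / (c / d)
    return odds_ratio, rr

# Rare outcome (~2%): OR approximates RR (2.02 vs 2.00).
print("rare:   OR=%.2f RR=%.2f" % or_and_rr(4, 196, 2, 198))
# Common outcome (~50%): OR overstates RR (2.25 vs 1.50).
print("common: OR=%.2f RR=%.2f" % or_and_rr(120, 80, 80, 120))
```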
Li T, Jones CMC, Shah MN, Cushman JT, Jusko TA. Methodological Challenges in Studies Comparing Prehospital Advanced Life Support with Basic Life Support. Prehosp Disaster Med. 2017;32(4):444–450.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, δ = .011 (95 percent confidence interval [CI]: .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted δ = .029 (95 percent CI: .026, .031). Both effects were consistent across risk groups.
Primary care activity also increased in the intervention phase overall (δ = .011; 95 percent CI: .007, .014), except for the two highest risk groups, which showed a decrease in the number of days with recorded activity.
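For readers interested in how adjusted effects of this kind are typically estimated in a stepped wedge design, the sketch below fits a cluster-robust Poisson GEE with period effects on simulated data. The structure (eleven clusters crossing over at staggered steps) mirrors the trial's design, but all values and variable names are invented, and the sketch estimates a log rate ratio rather than the absolute differences reported above.

```python
# Hedged sketch: stepped wedge analysis via Poisson GEE with an offset for
# time at risk and fixed period effects. Entirely simulated, illustrative data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for cluster in range(11):                  # 11 practice clusters, as in the trial
    step = rng.integers(1, 12)             # period at which this cluster crosses over
    for period in range(12):
        phase = int(period >= step)        # 0 = control phase, 1 = intervention phase
        n_pat = 200                        # patients contributing in this cluster-period
        rate = 0.10 * np.exp(0.05 * phase + rng.normal(0, 0.05))
        rows.append({"cluster": cluster, "period": period, "phase": phase,
                     "admissions": rng.poisson(rate * n_pat),
                     "years_at_risk": float(n_pat)})
df = pd.DataFrame(rows)

model = smf.gee("admissions ~ phase + C(period)", groups="cluster", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["years_at_risk"]))
fit = model.fit()
print("log rate ratio for intervention phase:", fit.params["phase"])
print("95% CI:", fit.conf_int().loc["phase"].values)
```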
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted δ = GBP76; 95 percent confidence interval [CI]: GBP46, GBP106), an effect that was consistent and generally increased with risk level.
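A brief note on how these two differences combine in a cost-effectiveness analysis: because the intervention phase had both higher costs and more emergency admissions, the intervention is dominated, so an incremental cost-effectiveness ratio cannot be read as a cost per admission avoided. The sketch below makes that sign logic explicit, reusing the abstract's point estimates purely for illustration.

```python
# Hedged arithmetic sketch: combining the cost difference with the primary
# outcome difference. Point estimates are taken from the abstract; the
# dominance check is generic cost-effectiveness logic, not the trial's code.
delta_cost = 76.0          # GBP per patient per year (intervention minus control)
delta_admissions = 0.011   # extra emergency admissions per participant-year

if delta_cost > 0 and delta_admissions > 0:
    # More costly AND worse on the primary outcome: intervention is dominated.
    print(f"Dominated: GBP {delta_cost:.0f} more and "
          f"{delta_admissions:.3f} extra admissions per patient-year.")
else:
    icer = delta_cost / -delta_admissions
    print(f"ICER = GBP {icer:.0f} per admission avoided")
```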
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
We determine the ages of seven stars in the Ursa Major moving group using a novel method that models the fundamental parameters of rapidly rotating A-stars from interferometric observations and literature photometry, and compares these parameters (namely, radius, luminosity, and rotation velocity) with evolution models that account for rotation. We find these stars to be coeval, providing an age estimate for the moving group and validating the technique, which we then apply to determine the age of the rapidly rotating, directly imaged planet host star κ Andromedae.
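The comparison with rotating evolution models amounts to locating the observed (radius, luminosity, rotation velocity) triple within a model grid. Below is a minimal chi-square grid-matching sketch; the grid is randomly generated filler with invented trends and uncertainties, not a real evolution-model grid or the paper's actual procedure.

```python
# Hedged sketch: chi-square matching of observed fundamental parameters to a
# grid of (rotating) stellar evolution models. The grid here is fake filler.
import numpy as np

rng = np.random.default_rng(3)
m = 5000
age_myr = rng.uniform(100, 800, m)
grid = np.column_stack([
    age_myr,
    1.5 + 0.001 * age_myr + rng.normal(0, 0.05, m),   # radius [Rsun], grows with age
    8.0 + 0.010 * age_myr + rng.normal(0, 0.50, m),   # luminosity [Lsun], grows with age
    rng.uniform(50, 250, m),                          # equatorial velocity [km/s]
])

obs = np.array([1.9, 12.0, 180.0])     # "observed" R, L, v_eq (illustrative)
sigma = np.array([0.05, 0.6, 15.0])    # measurement uncertainties (illustrative)

chi2 = np.sum(((grid[:, 1:] - obs) / sigma) ** 2, axis=1)
best = np.argmin(chi2)
print(f"best-fit age ~ {grid[best, 0]:.0f} Myr (chi2 = {chi2[best]:.2f})")
```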
The task of concluding a history with a consideration of the immediate past carries certain risks. Among them are the temptation to offer predictions that will later prove to have been wrong and the danger that in offering an account of the past ten years or so the writers will miss what later turns out to have been significant and focus instead on events that were not. In the case of Oman in the twenty-first century, however, it is reasonable to imagine that Oman's participation in events associated with the ‘war on terror’, the ‘Arab Spring’ (both of them terms that we will wish to question and perhaps replace) and the crisis over Iran and the threat of regional nuclear proliferation will still be of interest to readers for some time to come. We therefore turn to these events as the principal topics for discussion in this final chapter, before concluding it with a retrospective consideration of how the history we have set out across the book as a whole might help us understand the situation of Oman in the second decade of the twenty-first century.
One additional and crucial issue facing Oman at this juncture is, of course, the future direction of political leadership. The Basic Statute of the State outlines a process for the appointment of a new Sultan on the death of the incumbent. The Ruling Council meets to agree on a successor, and if they cannot reach agreement, the Defence Council then consults instructions prepared by the present Sultan. This arrangement does not address, however, any questions as to the scope and nature of future political leadership. Most Omanis have known no ruler other than Sultan Qaboos. His retention of formal responsibility for multiple aspects of government has almost certainly been possible only because of an exceptional degree of popular and political consensus over his legitimacy. It is unlikely that any successor will automatically inherit such strong support. Some significant changes in the roles and responsibilities of the Sultan and other leading political figures may therefore take place following the death of Sultan Qaboos.
Historical overviews of Oman since the 1970s almost invariably remark upon the speed of social and economic change. Many Omanis, particularly those older than about forty who have clear memories of what life was like in their own childhoods, also comment on how rapid and comprehensive the transformation of their country has been. A familiar trope is to contrast an Oman of the past, in which there were only three schools, few roads and a supposedly isolated tribal society, with the modern metropolis of Muscat in the late twentieth and early twenty-first century, with its shopping malls and ubiquitous mobile phones. Such accounts tend to lend credence to versions of Omani history that make 1970 a unique turning point and designate the recent past as a ‘renaissance’. They also repeat the problematic logic of the ‘modernisation thesis’, which holds that all ‘developing nations’ are making their way towards a predestined state of modern development. As we have already seen, however, more discriminating histories tend to identify much stronger continuities between past and present, even as they recognise the extent of the social and economic transformation wrought with oil in recent decades. It is also noted, from time to time, that Oman has a more old-fashioned air than its other oil-rich neighbours. Muscat is not a high-rise city: municipal planning regulations have been used to shape a very different urban landscape from that of Abu Dhabi, Dubai or Doha. A sort of modern Omani vernacular has been developed in both public and private buildings, featuring predominantly white or pale walls, arched windows and crenellations apparently taken from the architecture of the country's forts. Most Omani citizens still observe the convention established during Sultan Qaboos's reign that national dress should be worn in public: this is particularly noticeable in government offices, but the white dishdasha with coloured cap or turban is still the outfit most frequently seen in the streets, malls, cafés and other public spaces in Muscat.
In the period between the death of Sayyid Said in 1856 and the accession of his namesake, Sultan Said bin Taimur, in 1932, Oman experienced a complex process of change, often described in terms of decline and marginalisation. As British imperial consolidation in the second half of the nineteenth century played a major role in the development of a global economy dominated by the industrialised nations of the North, and driven by their colonial appropriations, both Oman and Zanzibar, which had been very much part of this process in the first half of the century, found themselves increasingly excluded from it. The separation of the two main parts of Sayyid Said's ‘thalassocracy’ – Oman and Zanzibar – from one another was a major factor in this process of decline. The marginalisation of Oman itself involved the decline of local and regional trade in the Gulf, as a result of the development of steamer traffic in the Indian Ocean and the gradual replacement of a regional economy dominated by Indian manufactures and trade networks by a system dominated by the British. This had substantial political consequences for an Omani elite whose power had been built on its participation in the earlier interregional trading networks. In effect, Oman was simply left out of the action, as trade no longer passed through its formerly thriving port cities. In Zanzibar, a slightly different process ensued, in which trade continued apace, but the commercial and political alliance between the Omani ruling elite and the Indian merchant class gave way to increasing British domination. In Zanzibar, the action was seized by the British from the Omanis. However, although these changes did significantly limit Omani participation in the emerging networks of the globalising economy, there were related developments that served to incorporate Oman and Omanis into new transnational networks, including, most significantly, those that involved intellectual and political opposition to the emergent colonial world order. In this chapter we show how these new movements, shaped by a renewal of Muslim religious and political thought – the Arab nahda – interacted with long-standing aspects of Omani religious and political culture to offer an alternative to Al Bu Said rule, identified by many during this period as excessively dependent upon British power.