Herbicides have been a primary means of managing undesirable brush on grazing lands across the southwestern United States for decades. Continued encroachment of honey mesquite and huisache on grazing lands warrants evaluation of treatment life and economics of current and experimental treatments. Treatment life is defined as the time between treatment application and when canopy cover of undesirable brush returns to a competitive level with native forage grasses (i.e., 25% canopy cover for mesquite and 30% canopy cover for huisache). Treatment life of industry-standard herbicides was compared with that of aminocyclopyrachlor plus triclopyr amine (ACP+T) from 10 broadcast-applied honey mesquite and five broadcast-applied huisache trials established from 2007 through 2013 across Texas. On average, the treatment life of industry standard treatments (IST) for huisache was 3 yr. In comparison, huisache canopy cover was only 2.5% in plots treated with ACP+T 3 yr after treatment. The average treatment life of IST for honey mesquite was 8.6 yr, whereas plots treated with ACP+T had just 2% mesquite canopy cover at that time. Improved treatment life of ACP+T compared with IST life was due to higher mortality resulting in more consistent brush canopy reduction. The net present values (NPVs) of ACP+T and IST for both huisache and mesquite were similar until the treatment life of the IST application was reached (3 yr for huisache and 8.6 yr for honey mesquite). At that point, NPVs of the programs diverged as a result of brush competition with desirable forage grasses and additional input costs associated with theoretical follow-up IST necessary to maintain optimum livestock forage production. The ACP+T treatments did not warrant a sequential application over the 12-yr analysis for huisache or 20-yr analysis for honey mesquite that this research covered. These results indicate ACP+T provides cost-effective, long-term control of honey mesquite and huisache.
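The NPV comparison above turns on discounting a retreatment cost that the longer-lived treatment avoids. A minimal sketch, assuming hypothetical per-hectare costs, grazing returns, and a 7% discount rate (none of these figures come from the study; only the structure, with IST re-treated at the end of its treatment life over a 12-yr horizon and ACP+T applied once, mirrors the abstract):

```python
# Hedged sketch of a brush-control NPV comparison. All dollar values and the
# 7% discount rate are hypothetical placeholders; only the structure (IST
# re-treated every 3 yr over a 12-yr horizon, ACP+T applied once) follows
# the huisache scenario described in the abstract.

def npv(cash_flows, rate):
    """Net present value of year-indexed cash flows (year 0 = application)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

YEARS, RATE = 12, 0.07
RETURN = 30.0                      # assumed annual grazing return, $/ha
IST_COST, ACPT_COST = 60.0, 90.0   # assumed per-ha treatment costs

# IST: 3-yr treatment life, so re-treat in years 0, 3, 6, and 9.
ist = [(-IST_COST if t % 3 == 0 else 0.0) + (RETURN if t > 0 else 0.0)
       for t in range(YEARS)]

# ACP+T: a single application carries through the whole horizon.
acpt = [-ACPT_COST] + [RETURN] * (YEARS - 1)

print(round(npv(ist, RATE), 2), round(npv(acpt, RATE), 2))
```

Under these placeholder numbers the programs' NPVs diverge in ACP+T's favour once the repeated IST costs accumulate, which is the qualitative pattern the abstract reports.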
Huisache is a major brush problem on native rangelands and pastures in South Texas. Although herbicide applications to foliage provide very high plant-kill levels, the same herbicides have not proven reliable when applied as broadcast ground or aerial foliar treatments. Aerial and ground broadcast herbicide foliar treatments were applied to 31 huisache sites. Soil temperature and soil moisture were measured at a depth of 30 cm at the time of herbicide application. Cumulative rainfall before herbicide application was recorded. Across all aerial treatments, plant mortality was 69% for plants shorter than 2 m versus 40% for plants taller than 2 m. Across all aerial- and ground-treated sites, plants shorter than 2 m had an average 89% mortality when cumulative 2-wk rainfall was at least 50 mm, versus 72% mortality with cumulative rainfall less than 50 mm. Average plant mortality was 84% when 4-wk cumulative rainfall was at least 76 mm, versus 71% with rainfall less than 76 mm; and 85% when, on a dry-to-wet scale of 0 to 10, soil moisture measured at least 8, versus 71% when soil moisture measured less than 8. In a separate aerial trial, plant-mortality effects of spray droplet size (417, 630, and 800 µm) and spray volume (37.4 L ha−1 and 93.5 L ha−1) were replicated and tested at a single study site in 2014. Plant mortality was lowest for the 93.5 L ha−1 and 800 µm treatment. Plant mortality rates for other treatments were similar, demonstrating a greater importance of droplet size than spray volume. Targeting huisache trees shorter than 2 m, when cumulative rainfall has reached at least 50 mm or at least 76 mm 2 or 4 wk before application, respectively, as well as maintaining spray droplet sizes no larger than 630 µm can increase herbicide efficacy with foliar broadcast applications.
Species distribution models (SDMs) are statistical tools used to develop continuous predictions of species occurrence. ‘Integrated SDMs’ (ISDMs) are an elaboration of this approach with potential advantages that allow for the dual use of opportunistically collected presence-only data and site-occupancy data from planned surveys. These models also account for survey bias and imperfect detection through the use of a hierarchical modelling framework that separately estimates the species–environment response and detection process. This is particularly helpful for conservation applications and predictions for rare species, where data are often limited and prediction errors may have significant management consequences. Despite this potential importance, ISDMs remain largely untested under a variety of scenarios. We performed an exploration of key modelling decisions and assumptions on an ISDM using the endangered Baird’s tapir (Tapirus bairdii) as a test species. We found that site area had the strongest effect on the magnitude of population estimates and underlying intensity surface and was driven by estimates of model intercepts. Selecting a site area that accounted for the individual movements of the species within an average home range led to population estimates that coincided with expert estimates. ISDMs that do not account for the individual movements of species will likely lead to less accurate estimates of species intensity (number of individuals per unit area) and thus overall population estimates. This bias could be severe and highly detrimental to conservation actions if uninformed ISDMs are used to estimate global populations of threatened and data-deficient species, particularly those that lack natural history and movement information. 
However, the ISDM was consistently the most accurate model compared to other approaches, which demonstrates the importance of this new modelling framework and the ability to combine opportunistic data with systematic survey data. Thus, we recommend researchers use ISDMs with conservative movement information when estimating population sizes of rare and data-deficient species. ISDMs could be improved by using a similar parameterization to spatial capture–recapture models that explicitly incorporate animal movement as a model parameter, which would further remove the need for spatial subsampling prior to implementation.
Introduction: The Canadian population is aging and an increasing proportion of emergency department (ED) patients are seniors. ED visits among seniors are frequently instigated by a fall at home. Some of these patients develop intracranial hemorrhage (ICH) because of falling. There has been little research on the frequency of ICH in elderly patients who fall, and on which clinical factors are associated with ICH in these patients. The aim of this study was to identify the incidence of ICH, and the clinical features which are associated with ICH, in seniors who present to the ED having fallen. Methods: This was a prospective cohort study conducted in three EDs. Patients were included if they were age >65 years, and presented to the ED within 48 hours of a fall on level ground, off a bed/chair/toilet or down one step. Patients were excluded if they fell from a height, were knocked over by a vehicle or were assaulted. ED physicians recorded predefined clinical findings (yes/no) before any head imaging was done. Head imaging was done at the ED physician's discretion. All patients were followed for 6 weeks (both by telephone call and chart review at 6 weeks) for evidence of ICH. Associations between baseline clinical findings and the presence of ICH were assessed with multivariable logistic regression. Results: In total, 1753 patients were enrolled. The prevalence of ICH was 5.0% (88 patients), of whom 74 patients had ICH on the ED CT scan and 14 had ICH diagnosed during follow-up. 61% were female and the median age was 82 (interquartile range 75-88). History included hypertension in 76%, diabetes in 29%, dementia in 27%, stroke/TIA in 19%, major bleeding in 11% and chronic kidney disease in 11%. 35% were on antiplatelet therapy and 25% were on an anticoagulant. 
Only 4 clinical variables were independently associated with ICH: bruise/laceration on the head (odds ratio (OR): 4.3; 95% CI 2.7-7.0), new abnormalities on neurological examination (OR: 4.4; 2.4-8.1), chronic kidney disease (OR: 2.4; 1.3-4.6) and reduced GCS from baseline (OR: 1.9; 1.0-3.4). Neither anticoagulation (OR: 0.9; 0.5-1.6) nor antiplatelet use (OR: 1.1; 0.6-1.8) appeared to be associated with ICH. Conclusion: This prospective study found a prevalence of ICH of 5.0% in seniors after a fall, and that bruising on the head, abnormal neurological examination, abnormal GCS and chronic kidney disease were predictive of ICH.
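The odds ratios and confidence intervals above are obtained by exponentiating logistic-regression coefficients. A minimal sketch with a hypothetical coefficient and standard error (not taken from the study, but chosen to land near the head-injury estimate of OR 4.3, 95% CI 2.7–7.0):

```python
import math

# Hypothetical logistic-regression coefficient and standard error, used only
# to illustrate how an odds ratio and its 95% CI are derived from model output.
beta, se = 1.459, 0.243

odds_ratio = math.exp(beta)            # OR = e^beta
ci_low = math.exp(beta - 1.96 * se)    # lower 95% bound
ci_high = math.exp(beta + 1.96 * se)   # upper 95% bound

print(round(odds_ratio, 1), round(ci_low, 1), round(ci_high, 1))
```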
Throughout its range in Latin America the jaguar Panthera onca is threatened by habitat loss and fragmentation, and conflict with humans. Protected areas in Panama harbour some of the last remaining suitable habitat for jaguars and are vital to conservation. However, no previous studies had assessed which factors in particular affect the tolerance of rural Panamanians towards jaguars and National Park conservation, which is important to jaguar persistence. Whether these factors are consistent with previous research on human–carnivore coexistence is unclear. To address this we estimated the number of instances of depredation of cattle by jaguars, and assessed attitudes and perceptions of rural Panamanians. We conducted semi-structured interviews in two disparate study areas: Cerro Hoya National Park and Darién National Park. Depredation events were more frequent in the latter, but only residents of the former reported conflict between people and coyotes Canis latrans. Positive perceptions of jaguars and National Parks, and criticism of park management, increased with level of education and land ownership. Men were more open to receiving help on their farms to mitigate impacts of jaguars, and more tolerant of the presence of jaguars, than women. Residents from both study areas indicated high appreciation for their respective National Parks. We provide recommendations to improve community outreach and education initiatives, and suggest priority areas for future mitigation efforts concerning human–jaguar interactions in Panama.
Large carnivores are recolonizing parts of North America and Europe as a result of modern management and conservation policy. In the midwestern USA, black bears Ursus americanus, cougars Puma concolor and grey wolves Canis lupus have the potential to recolonize provided there is suitable habitat. Understanding where large carnivores may become re-established will prepare resource professionals for the inevitable ecosystem effects and potential human–carnivore conflicts associated with these species. We developed individual and combined models of suitable habitat for black bears, cougars and wolves in 18 midwestern states, using geospatial data, expert-opinion surveys, and multi-criteria evaluation. Large, contiguous areas of suitable habitat comprised 35, 21 and 13% of the study region for wolves, bears and cougars, respectively. Approximately 12% of the region was considered suitable for all three species. Arkansas, Minnesota, Texas and Wisconsin had the highest proportions (> 40%) of suitable habitat for black bears; Arkansas, Michigan, Missouri, Texas and Wisconsin had the highest proportions (≥ 20%) of suitable habitat for cougars; and only in four states in the study region was < 29% of land suitable wolf habitat. Models performed well when validated by comparing suitability values of independent sets of known carnivore locations to those of random locations. Contiguous areas of suitable habitat typically spanned multiple states, thus coordination across boundaries and among agencies will be vital to successful conservation of these species. Our models highlight differences in habitat requirements and geographical distribution of potential habitat among these carnivores, as well as areas vital to their persistence in the Midwest.
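The multi-criteria evaluation step can be pictured as a weighted overlay of standardized suitability layers followed by thresholding into suitable/unsuitable habitat. A toy sketch with made-up 2×2 "rasters", weights, and threshold (the actual study derived these from geospatial data and expert-opinion surveys):

```python
# Toy weighted multi-criteria habitat overlay. Layers, weights, and the
# 0-1 scores are hypothetical, not the study's inputs.

# Each "raster" is a tiny grid of suitability scores (0 = unsuitable, 1 = ideal).
forest_cover = [[0.9, 0.7], [0.2, 0.8]]
road_density = [[0.8, 0.6], [0.3, 0.1]]   # already inverted: high = few roads
prey_index   = [[0.7, 0.9], [0.4, 0.5]]

weights = {"forest": 0.5, "roads": 0.3, "prey": 0.2}  # expert-style, sums to 1

suitability = [
    [weights["forest"] * forest_cover[i][j]
     + weights["roads"] * road_density[i][j]
     + weights["prey"] * prey_index[i][j]
     for j in range(2)]
    for i in range(2)
]

# Threshold into suitable / unsuitable habitat cells.
suitable = [[cell >= 0.6 for cell in row] for row in suitability]
print(suitability, suitable)
```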
Climate change is predicted to be a major threat to biodiversity and, from a conservation perspective, it is important to understand how ecosystems may respond to that change. Predicted climate change effects on the distribution of meadows in the arid and semi-arid Argentinean Patagonia by 2050 were assessed for change trends and areas of desertification vulnerability using species distribution models (SDM) and climate-change models. Four modelling techniques composed an ensemble-forecasting approach. Suitable areas for meadows will decrease by 7.85% by 2050 given predicted changes in climate. However, there were two contrasting trends: severe reduction of suitable areas for meadows in north-west Patagonia and Tierra del Fuego Island, and an expansion of suitable areas for meadows in the south and a small section in the north-west. Meadows in Patagonia will likely be impacted by climate change, probably due to changes in precipitation regimes, and consequently many species that rely on meadows in an arid environment will also be impacted. Given the low level of protection of meadows in Patagonia, such information on meadow distribution and vulnerability to climate change will be important for increasing and improving the network of conservation areas through conservation planning.
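Ensemble forecasting of this kind typically averages per-cell predictions across the individual modelling techniques. A toy sketch with hypothetical model names and suitability values (the study's four techniques and data are not reproduced here):

```python
# Toy ensemble of SDM predictions: per-cell habitat suitability from several
# hypothetical techniques, averaged into a single consensus surface.
predictions = {
    "GLM":    [0.8, 0.4, 0.6],
    "GAM":    [0.7, 0.5, 0.5],
    "RF":     [0.9, 0.3, 0.7],
    "MaxEnt": [0.6, 0.4, 0.6],
}

n_cells = len(next(iter(predictions.values())))
ensemble = [
    sum(model[i] for model in predictions.values()) / len(predictions)
    for i in range(n_cells)
]
print([round(v, 2) for v in ensemble])
```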
A 2-yr (2009 to 2010), no-till (direct-seeded) “follow-up” study was conducted at five western Canada sites to determine weed interference impacts and barley and canola yield recovery after 4 yr of variable crop inputs (seed, fertilizer, herbicide). During the initial period of the study (2005 to 2008), applying fertilizer in the absence of herbicides was often worse than applying no optimal inputs; in the former case, weed biomass was highest (2,788 to 4,294 kg ha−1), possibly due to better utilization of nutrients by the weeds than by the crops. After optimal inputs were restored (standard treatment), most barley and canola plots recovered to optimal yield levels after 1 yr. However, 4 yr with all optimal inputs except herbicides led to only 77% yield recovery for both crops. At most sites, when all inputs were restored for 2 yr, all plots yielded similarly to the standard treatment combination. Yield “recovery” occurred despite high weed biomass levels (> 4,000 kg ha−1) prior to the first recovery year and despite high wild oat seedbank levels (> 7,000 seeds m−2) at the end of the second recovery year. In relatively competitive narrow-row crops such as barley and canola, the negative effects of high soil weed seedbanks can be mitigated if growers facilitate healthy crop canopies with appropriate seed and fertilizer rates in combination with judicious herbicide applications to adequately manage recruited weeds.
A study was initiated in 2001 at four locations in western Canada to
investigate an integrated approach to managing wild oat, the region's worst
weed. The study examined the effects of combining semidwarf or tall barley
cultivars with normal or twice-normal barley seeding rates in either
continuous barley or a barley–canola–barley–field pea–barley rotation.
Herbicides were applied at 25, 50, and 100% of recommended rates. The first
phase of the study was completed in 2005. This paper reports on the second
phase, which was continued for four more years at two of the locations,
Beaverlodge and Fort Vermilion, AB, Canada. The objective was to determine
the long-term impact of the treatments on wild oat seed in the soil seed
bank. In 2009 (final year), the diverse rotation combined with the higher
barley seeding rate (optimal cultural practice) resulted in higher barley
yields and reduced wild oat biomass compared to continuous barley and lower
barley seeding rate (suboptimal cultural practice). In contrast to the first
phase, barley yield was higher with the semidwarf cultivar, and cultivar had
no effect on wild oat management. Wild oat seed in the soil seed bank
decreased with increasing herbicide rate, but amounts were often lower with
the optimal cultural practice. For example, at the recommended herbicide
rate at Beaverlodge, an approximate 40-fold reduction in wild oat seed
occurred with the optimal compared to the suboptimal cultural practice. The
results indicate that combining optimal cultural practices with herbicides
will reduce the amount of wild oat seed in the soil seed bank, and result in
higher barley yields. Optimal cultural practices may also compensate for
reduced herbicidal effects in terms of reducing wild oat seed accumulation
in the soil seed bank and increasing barley yield. The results have
implications for mitigating the evolution of herbicide resistance in wild oat.
We examined human and ecological attributes of attacks by tigers Panthera tigris and leopards Panthera pardus on humans in and around the Tadoba-Andhari Tiger Reserve in the Chandrapur District of central India to provide recommendations to prevent or mitigate conflicts between people and large carnivores. During 2005–2011, 132 carnivore attacks on humans occurred, 71 (54%) of which were fatal. Tigers and leopards were responsible for 78% and 22% of attacks, respectively. Significantly more victims were attacked while collecting minor forest products than during other activities. Probability of attack significantly decreased with increasing distance from forests and villages, and attacks occurred most frequently in the forested north-eastern corridor of the study area. Human activities near the Reserve need to be regulated and limited as much as possible to reduce human mortality and other conflicts. Increasing access to alternative fuel sources (e.g. biogas, solar) may reduce the pressure of timber harvesting on protected areas. Residents should be trained in identifying carnivore sign and in ways to reduce their vulnerability when working outdoors.
Growing crops that exhibit a high level of competition with weeds increases opportunities to practice integrated weed management and reduce herbicide inputs. The recent development and market dominance of hybrid canola cultivars provides an opportunity to reassess the relative competitive ability of canola cultivars with small-grain cereals. Direct-seeded (no-till) experiments were conducted at five western Canada locations from 2006 to 2008 to compare the competitive ability of canola cultivars vs. small-grain cereals. The relative competitive ability of the species and cultivars was determined by assessing monocot and dicot weed biomass at different times throughout the growing season as well as oat (simulated weed) seed production. Under most conditions, but especially under warm and relatively dry environments, barley cultivars had the greatest relative competitive ability. Rye and triticale were also highly competitive species under most environmental conditions. Canada Prairie Spring Red wheat and Canada Western Red Spring wheat cultivars usually were the least competitive cereal crops, but there were exceptions in some environments. Canola hybrids were more competitive than open-pollinated canola cultivars. More importantly, under cool, low growing degree day conditions, canola hybrids were as competitive as barley, especially with dicot weeds. Under most conditions, hybrid canola growers on the Canadian Prairies are well advised to avoid the additional selection pressure inherent with a second in-crop herbicide application. Combining competitive cultivars of any species with optimal agronomic practices that facilitate crop health will enhance cropping system sustainability and allow growers to extend the life of their valuable herbicide tools.
The inclusion of winter cereals in spring-annual rotations in the northern Great Plains may reduce weed populations and herbicide requirements. A broad range of spring and winter cereals were compared for ability to suppress weeds and maximize grain yield at Lacombe (2002 to 2005) and Lethbridge (2003 to 2005), Alberta, Canada. High seeding rates (≥ 400 seeds/m²) were used in all years to maximize crop competitive ability. Spring cereals achieved high crop-plant densities (> 250 plants/m²) at most sites, but winter cereals had lower plant densities due to winterkill, particularly at Lethbridge in 2004. All winter cereals and spring barley were highly effective at reducing weed biomass at Lacombe for the first 3 yr of the study. Weed suppression was less consistently affected by winter cereals in the last year at Lacombe and at Lethbridge, primarily due to poor winter survival. Grain yields were highest for spring triticale and lowest for spring wheat at Lacombe, with winter cereals intermediate. At Lethbridge, winter cereals had higher grain yields in 2003 whereas spring cereals had higher yields in 2004 and 2005. Winter cereals were generally more effective at suppressing weed growth than spring cereals if a good crop stand was established, but overlap in weed-competitive ability among cultivars was considerable. This information will be used to enhance the sustainable production of winter and spring cereals in traditional and nontraditional agro-ecological zones.
Wild oat causes more crop yield losses and accounts for more herbicide expenditures than any other weed species on the Canadian Prairies. A study was conducted from 2001 to 2005 at four Canadian Prairie locations to determine the influence of repeated cultural and herbicidal management practices on wild oat population density, biomass, and seed production, and on barley biomass and seed yield. Short or tall cultivars of barley were combined with normal or double barley seeding rates in continuous barley or a barley–canola–barley–field-pea rotation under three herbicide rate regimes. The same herbicide rate regime was applied to the same plots in all crops each year. In barley, cultivar type and seeding rate were also repeated on the same plots year after year. Optimal cultural practices (tall cultivars, double seeding rates, and crop rotation) reduced wild oat emergence, biomass, and seed production, and increased barley biomass and seed yield, especially at low herbicide rates. Wild oat seed production at the quarter herbicide rate was reduced by 91, 95, and 97% in 2001, 2003, and 2005, respectively, when tall barley cultivars at double seeding rates were rotated with canola and field pea (high management) compared to short barley cultivars at normal seeding rates continuously planted to barley (low management). Combinations of favorable cultural practices interacted synergistically to reduce wild oat emergence, biomass and seed production, and to increase barley yield. For example, at the quarter herbicide rate, wild oat biomass was reduced 2- to 3-, 6- to 7-, or 19-fold when optimal single, double, or triple treatments were combined, respectively. Barley yield reductions in the low-management scenario were somewhat compensated for by full herbicide rates. However, high management at low herbicide rates often produced more barley than low management in higher herbicide rate regimes.
Field-scale experiments were conducted at several western Canada locations to determine the importance of early weed removal over variable landscapes. In eight of 10 cases, imidazolinone-resistant (IR) canola yield decreased linearly as herbicide application (15/15 g/ha imazamox/imazethapyr or 15/15 g/ha imazamox/imazethapyr plus 150 g/ha clopyralid) was delayed beyond the one- to two-leaf stage. In two of 10 cases, canola oil content also decreased as herbicide treatment was delayed. Canola yields at all environments (location by year combinations) averaged 2,073, 1,872, or 1,650 kg/ha when treated at the one- to two-, three- to five-, or six- to seven-leaf stage, respectively. Assuming canola prices from a low of $250/t to a high of $650/t, growers could lose $50 to $131/ha, respectively, by delaying herbicide application from the one- to two- to the three- to five-leaf stage, or $106 to $275/ha, respectively, by delaying herbicide application from the one- to two- to the six- to seven-leaf stage.
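The dollar-loss figures above follow directly from the stage-wise average yields and the price range; a quick arithmetic check (prices converted from $/t to $/kg):

```python
# Quick check of the yield-delay economics reported above, using the
# abstract's average canola yields and its $250-$650/t price range.
yields = {"1-2 leaf": 2073, "3-5 leaf": 1872, "6-7 leaf": 1650}  # kg/ha

for stage in ("3-5 leaf", "6-7 leaf"):
    loss_kg = yields["1-2 leaf"] - yields[stage]          # kg/ha forgone
    low, high = (loss_kg * p / 1000 for p in (250, 650))  # $/ha lost
    print(f"{stage}: ${low:.0f} to ${high:.0f} per ha")
```

Rounding to whole dollars reproduces the reported ranges of $50 to $131/ha and $106 to $275/ha.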
Early results from the SAGE-SMC (Surveying the Agents of Galaxy Evolution in the tidally-disrupted, low-metallicity Small Magellanic Cloud) Spitzer legacy program are presented. These early results concentrate on the SAGE-SMC MIPS observations of the SMC Tail region. This region is the high H I column density portion of the Magellanic Bridge adjacent to the SMC Wing. We detect infrared dust emission and measure the gas-to-dust ratio in the SMC Tail and find it similar to that of the SMC Body. In addition, we find two embedded cluster regions that are resolved into multiple sources at all MIPS wavelengths.
It is both surprising and exciting to find that young galaxies at high redshift contain large dust masses. For galaxies at z > 5, after only 1 Gyr, there has not been time for low-mass stars to have evolved to the AGB phase and produce dust. In such galaxies, Type II SNe and red supergiants (RSGs) may even dominate the dust production rate. It has long been known that RSG atmospheres produce dust, but little is known about it. We are pursuing three parallel studies to better understand RSG dust. First, we are using optical spectra and JHK photometry to characterize the optical and near-IR extinction curves of the RSGs. Second, we are using the optical spectra combined with 2MASS, IRAC and MIPS photometry to estimate the dust mass loss rates from Local Group RSGs. In addition, we will use our Monte Carlo radiative transfer models to analyze the emission from dust in the circumstellar shells. Third, the final piece of the puzzle is being provided by obtaining new IRS spectra of LMC and SMC RSGs. We plan to use the IRS to make a systematic study of the dust properties in RSG shells in the LMC and SMC so that we can probe how they may vary with a large range of galactic metallicities. The derived stellar SEDs and extinction curves will be combined with Spitzer IRAC and MIPS photometry and IRS spectra for use as inputs to our Monte Carlo codes which will be used to study the composition, size distributions and clumpiness of the dust.
Weed management strategies can influence insect infestations in field crops, yet no attempts have been made previously to manipulate weed populations in canola for integrated weed and insect management. Field studies were conducted during 2003 to 2005 at Lacombe and Beaverlodge, Alberta, Canada to manipulate weed and root maggot, Delia spp. (Diptera: Anthomyiidae), interactions in canola. Densities of monocot weeds were varied by altering herbicide applications, with rates ranging from 0 to 100% of the rate recommended. Weed populations declined, and yields were variable, with increased herbicide rates. Root maggot damage decreased with increases in monocot weed dry weight for both canola species at both study sites. Results support the hypothesis that heterogeneous environments, arising from mixed populations of monocot weeds with canola, minimize opportunities for females of Delia spp. to complete the behavioral sequence required for oviposition, leading to reduced infestation levels in weedy systems. However, effects of dicot weeds on root maggot infestations varied between sites as a result of site-related differences in weed species complexes. When wild mustard was common, crop damage increased, because this weed can serve as an alternate host for root maggots. The study emphasizes the importance of adopting crop management practices that are compatible with both weed and root maggot control.