Agronomic crops engineered with resistance to 2,4-D or dicamba have been commercialized and widely adopted throughout the United States. Consequently, increased use of these herbicides in time and space has led to greater damage to sensitive crops. From 2014 to 2016, cucumber and cantaloupe studies were conducted in Tifton, GA, to determine how auxinic herbicide (2,4-D or dicamba), herbicide rate (1/75 or 1/250 of the field use rate), and application timing (26, 16, and 7 d before harvest (DBH) for cucumber; 54, 31, and 18 DBH for cantaloupe) influenced crop injury, growth, yield, and herbicide residue accumulation in marketable fruit. Greater visual injury, reductions in vine growth, and yield loss were observed at the higher rate when herbicides were applied during early-season vegetative growth compared with late-season applications during fruit development. Dicamba was more injurious in cucumber, while cantaloupe responded similarly to both herbicides. For cucumber, total fruit number and relative weights were reduced (16% to 19%) when either herbicide was applied at the 1/75 rate 26 DBH. Cantaloupe fruit weight was reduced by 21% and 10% when either herbicide was applied at the 1/75 rate 54 or 31 DBH, respectively. Residue analysis indicated that applications closer to harvest were more likely to leave detectable residues in fruit than earlier applications. In cucumber, dicamba was detected at both rates when applied 7 DBH, while in cantaloupe it was detected at both rates when applied 18 or 31 DBH in 2016 and at the 1/75 rate applied 18 or 31 DBH in 2014. Detectable amounts of 2,4-D were not observed in cucumber but were detected in cantaloupe when applied at either rate 18 or 31 DBH. While early-season injury is more likely to reduce cucumber or cantaloupe yields, the quantity of herbicide residue detected will be most influenced by the time interval between the off-target incident and sampling.
A distinct feature of many of the earliest archaeological sites (13,000-11,200 cal yr BP) at the core of the Atacama Desert is that they lie at or just below the surface, often encased in desert pavements. In this study, we compare these sites and undisturbed desert pavements to understand archaeological site formation and pavement development and recovery. Our results indicate these pavements and their soils are poorly developed regardless of their age. We propose that this is because of a sustained lack of rain and extreme physical breakdown of clasts by salt expansion. Thus, the core of the Atacama provides an example of the lower limit of rainfall (<50 mm/yr) needed to form desert pavements. At site Quebrada Maní 12 (QM12), humans destroyed the pavement. After abandonment, human-made depressions were filled with eolian sands, incorporating artifacts in shallow deposits. Small and medium-sized artifacts preferentially migrated upwards, perhaps due to earthquakes and the action of salts. These artifacts, which now form palimpsests at the surface, helped, along with older clasts, to restore surface clast cover. Larger archaeological features remained undisturbed on top of a deeper Byzm horizon. The vesicular A horizons (Av horizons) have not regenerated on the archaeological sites because of the extreme scarcity of rainfall during the Holocene.
Archaeologists have long subjected Clovis megafauna kill/scavenge sites to the highest level of scrutiny. In 1987, a Columbian mammoth (Mammuthus columbi) was found in spatial association with a small artifact assemblage in Converse County, Wyoming. However, due to the small tool assemblage, limited nature of the excavations, and questions about the security of the association between the artifacts and mammoth remains, the site was never included in summaries of human-killed/scavenged megafauna in North America. Here we present the results of four field seasons of new excavations at the La Prele Mammoth site that confirm the presence of an associated cultural occupation based on geologic context, artifact attributes, spatial distributions, protein residue analysis, and lithic microwear analysis. This new work identified a more extensive cultural occupation including the presence of multiple discrete artifact clusters in close proximity to the mammoth bone bed. This study confirms the presence of a second Clovis mammoth kill/scavenge site in Wyoming and shows the value in revisiting proposed terminal Pleistocene kill/scavenge sites.
Dicamba and 2,4-D systems control many problematic weeds; however, drift to susceptible crops can be a concern in diverse production areas. Glufosinate-based systems are an alternative, but currently recommended rates of glufosinate can result in variable control. Research was conducted in 2017 and 2018 to investigate the optimum time interval between sequential glufosinate applications and to determine whether the addition of glyphosate to glufosinate is beneficial for controlling Palmer amaranth and annual grasses in cotton. The interval between sequential applications (1, 3, 5, 7, 10, or 14 d, or no second spray) was the whole plot, and herbicide option (glufosinate or glufosinate plus glyphosate) was the subplot. Combined over herbicides, Palmer amaranth 15 to 20 cm tall (at four locations) was controlled 98% to 99% with sequential intervals of 1 to 7 d compared with 70% to 88% with intervals of 10 or 14 d. The lowest biomass weights and population densities were noted with 1- to 7-d intervals. Large crabgrass 15 to 20 cm tall (at five locations) was controlled 93% to 98% with glufosinate applications 3 to 7 d apart compared with 76% to 81% with applications 10 to 14 d apart. The lowest biomass weights were observed with 1- to 7-d intervals. When glufosinate controlled grass less than 93%, adding glyphosate was beneficial. Neither the interval between sequential applications nor the herbicide option influenced cotton yield. Shorter time intervals between sequential applications and the inclusion of glyphosate can improve the effectiveness of a glufosinate-based system in managing Palmer amaranth and large crabgrass.
Nutsedge species are problematic in plastic-mulched vegetable production because of the weeds' rapid reproduction and ability to penetrate the mulch. Vegetable growers rely heavily on halosulfuron to manage nutsedge species; however, the herbicide cannot be applied over mulch before vegetable transplanting because of potential crop injury. This can be problematic when multiple crops are produced on a single mulch installation. Field experiments were conducted to determine the response of broccoli, cabbage, squash, and watermelon to halosulfuron applied on top of mulch prior to transplanting. Halosulfuron at 80 g ai ha−1 was applied 21, 14, 7, and 1 d before planting (DBP), and 160 g ai ha−1 was applied 21 DBP. In all experiments, extending the interval between halosulfuron application and planting reduced crop injury. For squash and watermelon, visual injury, plant diameters/vine runner lengths, marketable fruit weights, and postharvest plant biomass were similar between the 80 g ha−1 21 DBP treatment and the nontreated weed-free control. Reducing this interval increased injury for both crops: visual crop injury and yield reductions up to 40% occurred with halosulfuron applied 14, 7, or 1 DBP in squash and 1 DBP in watermelon. Broccoli and cabbage showed greater sensitivity, with injury and plant diameter reductions greater than 15% even with halosulfuron applied at 80 g ha−1 21 DBP. Experimental results confirm that halosulfuron binds to plastic mulch, remains active, and is slowly released from the mulch during rainfall or overhead irrigation events over a substantial period. Extending the plant-back interval to at least 21 d before transplanting overcame squash and watermelon injury concerns with halosulfuron at 80 g ha−1, but not those for broccoli and cabbage. Applying halosulfuron over mulch to control emerged nutsedge before planting squash and watermelon would be beneficial provided adequate rainfall or irrigation and appropriate intervals between application and planting are implemented.
Introduction: Although oral rehydration therapy is recommended for children with acute gastroenteritis (AGE) with none to some dehydration, intravenous (IV) rehydration is still commonly administered to these children in high-income countries. IV rehydration is associated with pain, anxiety, and emergency department (ED) revisits in children with AGE. A better understanding of the factors associated with IV rehydration is needed to inform knowledge translation strategies. Methods: This was a planned secondary analysis of the Pediatric Emergency Research Canada (PERC) and Pediatric Emergency Care Applied Research Network (PECARN) randomized controlled trials of oral probiotics in children with AGE-associated diarrhea. Eligible children were aged 3-48 months and reported >3 watery stools in a 24-hour period. The primary outcome was administration of IV rehydration at the index ED visit. We used a mixed-effects logistic regression model to explore univariable and multivariable relationships between IV rehydration and a priori risk factors. Results: Of the parent study sample of 1848 participants, 1846 had data available for analysis: mean ± SD age 19.1 ± 11.4 months; 45.4% female. Overall, 70.2% (1292/1840) vomited within 24 hours of the index ED visit, and 34.1% (629/1846) received ondansetron in the ED; 13.0% (240/1846) were administered IV rehydration at the index ED visit, and 3.6% (67/1842) were hospitalized. Multivariable predictors of IV rehydration were Clinical Dehydration Scale (CDS) score [compared with none: mild to moderate (OR: 8.1, 95% CI: 5.5-11.8); severe (OR: 45.9, 95% CI: 20.1-104.7); P < 0.001], ondansetron in the ED (OR: 1.8, 95% CI: 1.2-2.6, P = 0.003), previous healthcare visit for the same illness [compared with no prior visit: prior visit without IV (OR: 1.9, 95% CI: 1.3-2.9); prior visit with IV (OR: 10.5, 95% CI: 3.2-34.8); P < 0.001], and country [compared with Canada: US (OR: 4.1, 95% CI: 2.3-7.4), P < 0.001]. Significantly more participants returned to the ED with symptoms of AGE within 3 days when IV fluids were administered at the index visit [30/224 (13.4%) versus 88/1453 (6.1%), P < 0.001]. Conclusion: Higher CDS scores, antiemetic use, previous healthcare visits, and country were independent predictors of IV rehydration, which was in turn associated with increased ED revisits. Knowledge translation focused on optimizing the use of antiemetics (i.e., for those with dehydration) and reducing geographic variation in IV rehydration use may improve the ED experience and reduce ED revisits.
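The odds ratios quoted above come from the fitted logistic model: each OR is the exponential of a predictor's regression coefficient. A minimal sketch of that relationship (the coefficient below is a hypothetical illustration, not a value from the study data):

```python
import math

# In logistic regression, a predictor's odds ratio is exp(beta),
# where beta is its fitted log-odds coefficient.
# Hypothetical coefficient for ED ondansetron, for illustration only.
beta_ondansetron = 0.588

odds_ratio = math.exp(beta_ondansetron)
print(round(odds_ratio, 1))  # -> 1.8, on the scale of the reported OR
```

The 95% confidence interval follows the same transformation, exp(beta ± 1.96 × SE of beta).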
Zirconolite glass-ceramics are being developed as potential wasteforms for the disposition of Pu wastes in the UK. Previous studies utilised a variety of surrogates, whereas this work uses both cold-press-and-sinter and hot isostatic pressing methods to validate the wasteform with PuO2. A cold-press-and-sinter sample was fabricated as part of a validation study for plutonium incorporation in hot isostatically pressed (HIPed) wasteforms. The results confirmed that the cold-press-and-sinter sample achieved successful waste incorporation, with a microstructure and phase assemblage in agreement with those expected of a HIPed equivalent. A HIPed sample of the same composition was fabricated and characterised by SEM and XRD. The results were in agreement with those for the sintered sample and showed complete waste incorporation into the glass-ceramic wasteform. Together, these samples demonstrate successful incorporation of PuO2 into the glass-ceramic HIPed wasteforms proposed for processing Pu-based waste streams in the UK.
Shortages of hired labour are leading to greater interest in mechanisation of crop establishment in smallholder agriculture. Because of small field sizes, mechanised planters mounted on four-wheel tractors are not a suitable technology. The Versatile Multi-crop Planter (VMP) was developed for zero tillage (ZT), strip planting (SP) or single pass shallow tillage (SPST) on flat land, and for forming and planting on the tops of beds, each in a single pass operation, when mounted on a two-wheel tractor (2WT). The aim of the present study was to evaluate the field performance of the VMP in comparison with conventional broadcast seeding and full rotary tillage (2 to 4 passes; called CT) for establishing chickpea (Cicer arietinum L.), jute (Corchorus olitorius L.), lentil (Lens culinaris Medikus), maize (Zea mays L.), mung bean (Vigna radiata (L.) R. Wilczek), rice (Oryza sativa L.) and wheat (Triticum aestivum L.) at 15 locations in Bangladesh. Plant populations emerging from all single pass operations, viz. SP, ZT, and bed planting (BP), were generally satisfactory, and in 12 out of 15 experiments plant populations after SP were similar to or greater than those after CT. In addition, SP gave comparable or greater plant populations than the SPST and BP planting methods. Overall, SP achieved yields comparable to, and establishment costs lower than, CT. We conclude that effective and reliable planters are now available for sowing a range of crop species in small fields with minimum soil disturbance. This opens up realistic options for the development of mechanised conservation agriculture suited to small field sizes.
Background: Various transmission routes contribute to spread of carbapenem-resistant Klebsiella pneumoniae (CRKP) in hospitalized patients. Patients with readmissions during which CRKP is again isolated ("CRKP readmission") potentially contribute to transmission of CRKP.
Objective: To evaluate CRKP readmissions in the Consortium on Resistance against Carbapenems in K. pneumoniae (CRaCKle).
Design: Cohort study conducted from December 24, 2011, through July 1, 2013.
Setting: Multicenter consortium of acute care hospitals in the Great Lakes region.
Patients: All patients who were discharged alive during the study period were included. Each patient was included only once, at the time of the first CRKP-positive culture.
Methods: All readmissions within 90 days of discharge from the index hospitalization during which CRKP was again found were analyzed. Risk factors for CRKP readmission were evaluated in multivariable models.
Results: Fifty-six (20%) of 287 patients who were discharged alive had a CRKP readmission. A history of malignancy was associated with CRKP readmission (adjusted odds ratio [adjusted OR], 3.00 [95% CI, 1.32–6.65]; P<.01). During the index hospitalization, 160 patients (56%) received antibiotic treatment against CRKP; the choice of regimen was associated with CRKP readmission (P=.02). Receipt of tigecycline-based therapy (adjusted OR, 5.13 [95% CI, 1.72–17.44], with aminoglycoside-based therapy as the reference among those treated with anti-CRKP antibiotics) was associated with CRKP readmission.
Conclusions: Hospitalized patients with CRKP, specifically those with a history of malignancy, are at high risk of readmission with recurrent CRKP infection or colonization. Treatment during the index hospitalization with a tigecycline-based regimen increases this risk.
Infect. Control Hosp. Epidemiol. 2016;37(3):281–288
Objective: To determine the rates of and risk factors for tigecycline nonsusceptibility among carbapenem-resistant Klebsiella pneumoniae (CRKPs) isolated from hospitalized patients.
Design: Multicenter prospective observational study.
Setting: Acute care hospitals participating in the Consortium on Resistance against Carbapenems in Klebsiella pneumoniae (CRaCKle).
Patients: A cohort of 287 patients who had CRKPs isolated from clinical cultures during hospitalization.
Methods: For the period from December 24, 2011, to October 1, 2013, the first hospitalization of each patient with a CRKP during which tigecycline susceptibility for the CRKP isolate was determined was included. Clinical data, including data regarding pre-hospital origin, were entered into a centralized database. Breakpoints established by the European Committee on Antimicrobial Susceptibility Testing (EUCAST) were used to interpret tigecycline susceptibility testing.
Results: Of the 287 patients included in the final cohort, 155 (54%) had tigecycline-susceptible CRKPs. Of all index isolates, 81 (28%) were tigecycline-intermediate and 51 (18%) were tigecycline-resistant. In multivariate modeling, independent risk factors for tigecycline nonsusceptibility were (1) admission from a skilled nursing facility (OR, 2.51; 95% CI, 1.51–4.21; P=.0004), (2) positive culture within 2 days of admission (OR, 1.82; 95% CI, 1.06–3.15; P=.03), and (3) receipt of tigecycline within 14 days (OR, 4.38; 95% CI, 1.37–17.01; P=.02).
Conclusions: In hospitalized patients with CRKPs, tigecycline nonsusceptibility was observed more frequently in those admitted from skilled nursing facilities and occurred earlier during hospitalization. Skilled nursing facilities are an important target for interventions to decrease resistance to antibiotics of last resort for the treatment of CRKPs.
This article discusses the magnitude and rate of change of radiocarbon reservoir ages of the surface ocean in the South Pacific during the Holocene. 14C reservoir ages are calculated from paired U/Th and 14C measurements. Seventeen pairs of coral dates were determined from samples collected on Rendova and Tetepare Islands, in the Solomon Islands, and on Espiritu Santo Island, Vanuatu. The samples are all Holocene in age, with 230Th ages ranging from about 400 to 9400 BP. Samples were collected as drill cores or from surface outcrops. About half of the surface samples appear to have incorporated modern carbon through postdepositional recrystallization; two of the core samples were also affected by carbon exchange. The Holocene 14C reservoir ages observed in this data set show stable values for the last 3000 yr and substantial variability from 5000–6000 BP (~100 to ~950 14C yr). Persistently low values (<200 14C yr) were observed for samples from 7000–8000 BP. We attribute these variations to temporal changes in lateral advection and vertical mixing, and possibly to local environmental conditions related to the interplay between sea-level rise and episodic uplift characteristic of all the coral localities.
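A marine reservoir age of this kind is the difference between a sample's measured 14C age and the atmospheric 14C age at the same calendar time, with the calendar time fixed here by the U/Th date. A minimal sketch of the arithmetic, using made-up atmospheric calibration points rather than real IntCal data:

```python
# Sketch of a marine 14C reservoir-age calculation:
# R = measured marine 14C age - contemporaneous atmospheric 14C age.
# The atmospheric values below are illustrative placeholders only.

# Hypothetical calibration points: {cal yr BP: atmospheric 14C yr BP}
ATM_14C = {3000: 2850, 3100: 2940, 3200: 3030}

def atmospheric_age(cal_bp):
    """Linearly interpolate the atmospheric 14C age at a calendar age."""
    xs = sorted(ATM_14C)
    if cal_bp <= xs[0]:
        return float(ATM_14C[xs[0]])
    if cal_bp >= xs[-1]:
        return float(ATM_14C[xs[-1]])
    for lo, hi in zip(xs, xs[1:]):
        if lo <= cal_bp <= hi:
            frac = (cal_bp - lo) / (hi - lo)
            return ATM_14C[lo] + frac * (ATM_14C[hi] - ATM_14C[lo])

def reservoir_age(marine_14c_bp, uth_cal_bp):
    """R = marine 14C age minus atmospheric 14C age at the U/Th date."""
    return marine_14c_bp - atmospheric_age(uth_cal_bp)

# A coral dated to 3100 cal yr BP (U/Th) with a measured 14C age of 3340 BP:
print(reservoir_age(3340, 3100))  # -> 400.0
```

In practice the interpolation would use a published calibration curve and propagate the measurement uncertainties of both dates.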
Alkaline earth aluminosilicate (AeAS) glasses with different MoO3 additions have been produced and assessed. MoO3 solubility increases with the equimolar substitution of larger alkaline earths by smaller ones, reaching 5.34 mol% in magnesium aluminosilicate (MAS) glass. All visibly homogeneous glasses are X-ray amorphous, while the partially crystallised glasses exhibit small X-ray diffraction peaks, probably attributable to the corresponding molybdates. The addition of MoO3 decreases the glass transition and crystallisation temperatures and creates two broad Raman bands assigned to vibrations of MoO42‒ tetrahedra. The intensities of these bands increase with MoO3 incorporation until the maximum solubility is reached. Electron microscopy shows that the separated particles are spherical, with sub-micron diameters, and are randomly dispersed within the glass. The separated phases form through liquid-liquid phase separation followed by crystallisation. Overall, AeAS glasses appear promising for molybdate immobilisation, with MAS glasses being particularly attractive.
Deep borehole disposal (DBD) is now seen as a viable alternative to the (comparatively shallow) geological repository concept for the disposal of high-level waste and spent nuclear fuel. Building on existing oil and geothermal well technologies, we report details of investigations into cementitious grouts as sealing/support matrices (SSMs) for waste disposal scenarios in the DBD process where temperatures at the waste package surface do not exceed ~190°C. Grouts based on Class G oil well cements, partially replaced with silica flour, are being developed, and the use of retarding admixtures is being investigated experimentally. Sodium gluconate appears to provide sufficient retardation and setting characteristics for this application and also increases grout fluidity. The quantity of sodium gluconate required in the grout to ensure fluidity for 4 hours at 90, 120 and 140°C is 0.05, 0.25 and 0.25% by weight of cement, respectively. A phosphonate admixture appears to provide desirable retardation properties only at 90°C. The presence of either retarder does not affect the composition of the hardened cement paste over 14 days of curing, and the phases formed are durable under conditions of high temperature and pressure.
This study investigates the dissolution of CeO2, an isostructural analogue for UO2 and ThO2, synthesized to closely approximate the microstructure of a spent nuclear fuel matrix. Dissolution of CeO2 particles was performed in simplified solutions representative of the saline, near-neutral and alkaline groundwaters that may be encountered in geological disposal scenarios, and in an acidic medium for comparison. The normalized mass loss of cerium was found to be significantly influenced by the formation of colloidal particles, especially in the near-neutral and alkaline solutions investigated. The normalized dissolution rate, RL(Ce) (g m-2 d-1), in these two solutions was found to be similar, but significantly lower than in the nitric acid medium. The activation energies, based on the normalized release rate of cerium at 40°C, 70°C and 90°C in each solution, were in the range of 24 ± 3 kJ mol-1 to 27 ± 7 kJ mol-1, indicative of a surface-mediated dissolution mechanism. The mechanism of dissolution was postulated to be similar in each of the solutions investigated, and further work is proposed to investigate the role of carbonate in the CeO2 dissolution mechanism.
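Activation energies of the kind quoted above are typically obtained from an Arrhenius analysis of the rate data, ln k = ln A − Ea/(RT), with Ea taken from the slope of ln k versus 1/T. A minimal sketch with placeholder rates (illustrative values, not the measured ones):

```python
import math

# Arrhenius estimate of activation energy from dissolution rates at
# three temperatures: ln(k) = ln(A) - Ea/(R*T), so the slope of
# ln(k) vs 1/T equals -Ea/R. The rates below are placeholders.

R_GAS = 8.314  # gas constant, J mol^-1 K^-1

rates = {          # temperature (K): rate k (g m^-2 d^-1), hypothetical
    313.15: 1.0e-6,   # 40 °C
    343.15: 2.4e-6,   # 70 °C
    363.15: 4.0e-6,   # 90 °C
}

# Least-squares slope of ln(k) against 1/T.
xs = [1.0 / T for T in rates]
ys = [math.log(k) for k in rates.values()]
n = len(xs)
x_mean = sum(xs) / n
y_mean = sum(ys) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))

ea_kj_mol = -slope * R_GAS / 1000.0  # convert J/mol to kJ/mol
print(f"Ea ≈ {ea_kj_mol:.0f} kJ/mol")
```

With these placeholder rates the fit gives roughly 26 kJ/mol, i.e. within the 24-27 kJ/mol range the study associates with surface-mediated dissolution.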
(See the commentary by Pfeiffer and Beldavs, on pages 984–986.)
Objective: To describe the epidemiology of carbapenem-resistant Enterobacteriaceae (CRE) and to examine the effect of lower carbapenem breakpoints on CRE detection.
Setting: Inpatient care at community hospitals.
Patients: All patients with CRE-positive cultures were included.
Methods: CRE isolated from 25 community hospitals were prospectively entered into a centralized database from January 2008 through December 2012. Microbiology laboratory practices were assessed using questionnaires.
Results: A total of 305 CRE isolates were detected at 16 hospitals (64%). Patients with CRE had symptomatic infection in 180 cases (59%) and asymptomatic colonization in the remainder (125 cases; 41%). Klebsiella pneumoniae (277 isolates; 91%) was the most prevalent species. The majority of cases were healthcare associated (288 cases; 94%). The rate of CRE detection increased more than fivefold from 2008 (0.26 cases per 100,000 patient-days) to 2012 (1.4 cases per 100,000 patient-days; incidence rate ratio [IRR], 5.3 [95% confidence interval (CI), 1.22–22.7]; P = .01). Only 5 hospitals (20%) had adopted the 2010 Clinical and Laboratory Standards Institute (CLSI) carbapenem breakpoints. The 5 hospitals that adopted the lower carbapenem breakpoints were more likely to detect CRE after implementation of the breakpoints than before (4.1 vs 0.5 cases per 100,000 patient-days; P < .001; IRR, 8.1 [95% CI, 2.7–24.6]). Hospitals that implemented the lower carbapenem breakpoints were more likely to detect CRE than were hospitals that did not (3.3 vs 1.1 cases per 100,000 patient-days; P = .01).
Conclusions: The rate of CRE detection increased more than fivefold in community hospitals in the southeastern United States from 2008 to 2012. Even so, our figures likely underestimate the true rate of CRE detection, given the low adoption of the carbapenem breakpoints recommended in the 2010 CLSI guidelines.
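The more-than-fivefold increase reported above is simply the ratio of the 2012 and 2008 detection rates. As a quick check using the rounded rates given in the text (the published IRR of 5.3 was presumably computed from exact case counts):

```python
# Incidence rate ratio (IRR) as the ratio of two detection rates,
# in cases per 100,000 patient-days, using the rounded rates from
# the abstract; exact case counts would give the reported 5.3.
rate_2008 = 0.26
rate_2012 = 1.4

irr = rate_2012 / rate_2008
print(round(irr, 1))  # -> 5.4
```

The confidence interval around an IRR additionally requires the underlying case counts and person-time, which the abstract does not give.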
Sandia Cave generated much interest when, in the 1940s, extinct Pleistocene megafauna were reported in association with what appeared to be a pre-Folsom Paleoindian component. By the 1950s, a series of controversies regarding the stratigraphy and dating began to push the site into obscurity. The human occupation at the site has never been directly dated beyond 2250 ± 50 BP, and nonartifactual associated bone will not provide reliable age estimates because of extensive bioturbation, poor provenience, and the fact that the majority of fossils were accumulated by carnivores and rodents rather than humans. However, a small number of mineralized fragments display human modification, suggesting occasional human activity of some antiquity at the site. One bone tool, one burned bone, and four bones bearing butchery marks were subjected to direct accelerator mass spectrometry (AMS) 14C dating. Unfortunately, the mineralized bones did not preserve sufficient collagen to be dated. Two unmineralized specimens (the burned bone and the bone tool) push the direct chronometric ages for the human occupation at Sandia Cave back to 3447 ± 96 BP. An older Folsom occupation is suggested by associated dates on breccia, but all lines of evidence taken together provide no support for a pre-Folsom human occupation.
Teflon amorphous fluoropolymer (TAF) multi-walled carbon nanotube (MWCNT) suspensions have the potential to create conductive coatings on insulating films for numerous applications. However, there are few studies of polymer-MWCNT suspension properties, and even fewer that use Teflon. To define mechanical and electrical property relationships, bilayer TAF-MWCNT films were created with differing concentrations of MWCNTs. Nanoindentation revealed that the addition of 8 wt% MWCNTs to TAF increased the elastic modulus by about 25% and the hardness by about 15%. Conducting indentation showed that 8 wt% MWCNT films exhibit uniform, stable conductance once the indentation depth exceeds several hundred nanometers. Films with lower concentrations of CNTs were insulating. Together, the two techniques provide a unique description of structure-property relationships in this suspension film system.