OBJECTIVE: Efforts to reduce Clostridioides difficile infection (CDI) have targeted transmission from patients with symptomatic C. difficile infection. However, many patients harbor C. difficile without symptoms; these asymptomatic carriers may serve as reservoirs for the spread of infection and may themselves be at risk of progression to symptomatic CDI. To estimate the prevalence of C. difficile carriage and to determine the risk and speed of progression to symptomatic CDI among carriers, we established a pilot screening program in a large urban hospital.
DESIGN: Prospective cohort study.
SETTING: An 800-bed, tertiary-care, academic medical center in the Bronx, New York.
PARTICIPANTS: A sample of admitted adults without diarrhea, with oversampling of nursing facility patients.
METHODS: Perirectal swabs were tested by polymerase chain reaction (PCR) for C. difficile within 24 hours of admission, and patients were followed for progression to symptomatic CDI. Development of symptomatic CDI was compared between C. difficile carriers and noncarriers using a Cox proportional hazards model.
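As an illustration, a minimal sketch of how such a Cox model could be fit in Python with the lifelines package; the data frame and its column names (follow-up time, progression indicator, carrier status) are invented for the example, not taken from the study:

```python
import pandas as pd
from lifelines import CoxPHFitter  # pip install lifelines

# Hypothetical toy data: days of follow-up, whether symptomatic CDI developed,
# and admission carrier status from the perirectal PCR swab.
df = pd.DataFrame({
    "days_to_event": [30, 12, 45, 7, 60, 21, 15, 50],
    "progressed":    [1,  1,  0,  1, 0,  0,  1,  0],
    "carrier":       [0,  1,  0,  1, 0,  1,  1,  0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_event", event_col="progressed")
cph.print_summary()  # exp(coef) for "carrier" is the hazard ratio of progression
```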
RESULTS: Of the 220 subjects, 21 (9.5%) were C. difficile carriers, including 10.2% of the nursing facility residents and 7.7% of the community residents (P = .60). Among the 21 C. difficile carriers, 8 (38.1%) progressed to symptomatic CDI, whereas only 4 (2.0%) of the 199 noncarriers progressed to symptomatic CDI (hazard ratio, 23.9; 95% CI, 7.2–79.6; P < .0001).
CONCLUSIONS: Asymptomatic carriage of C. difficile is prevalent among admitted patients and confers a significant risk of progression to symptomatic CDI. Screening for asymptomatic carriers may represent an opportunity to reduce CDI.
Residual herbicides applied to summer cash crops have the potential to injure subsequent winter annual cover crops, yet little information is available to guide growers’ choices. Field studies were conducted in 2016 and 2017 in Blacksburg and Suffolk, Virginia, to determine carryover of 30 herbicides commonly used in corn, soybean, or cotton onto wheat, barley, cereal rye, oats, annual ryegrass, forage radish, Austrian winter pea, crimson clover, hairy vetch, and rapeseed cover crops. Herbicides were applied to bare ground either 14 wk before cover crop planting, to represent a PRE timing, or 10 wk before planting, to represent a POST timing. Visible injury was recorded 3 and 6 wk after planting (WAP), and cover crop biomass was collected 6 WAP. No differences in cover crop biomass were observed among herbicide treatments, despite visible injury suggesting that some residual herbicides have the potential to affect cover crop establishment. Visible injury on grass cover crop species did not exceed 20% from any herbicide. Fomesafen caused the greatest injury recorded on forage radish, with greater than 50% injury in 1 site-year. Trifloxysulfuron and atrazine caused greater than 20% visible injury on forage radish. Trifloxysulfuron caused the greatest injury (30%) observed on crimson clover in 1 site-year. Prosulfuron and isoxaflutole significantly injured rapeseed (17% to 21%). Results indicate that commonly used residual herbicides applied in the previous cash crop growing season cause little injury to grass cover crop species, and only a few residual herbicides could potentially affect the establishment of forage radish, crimson clover, or rapeseed cover crops.
Cardiovascular disease is a leading cause of morbidity and mortality in childhood cancer survivors. Cardiologists must be aware of risk factors and long-term follow-up guidelines, which have historically been the purview of oncologists. Little is known about paediatric cardiologists’ knowledge regarding the cardiotoxicity of cancer treatment and how to improve this knowledge.
An anonymous, 21-question, web-based survey was distributed to 58 paediatric cardiologists; it focused on four cardio-oncology themes: cancer treatment-related risk factors (n = 6), patient-related risk factors (n = 6), recommended surveillance (n = 3), and cardiac-specific considerations (n = 6). Following the baseline survey, a multi-disciplinary team of paediatric cardiologists and cancer survivor providers developed an in-person and web-based educational intervention. A post-intervention survey was conducted 5 months later.
The response rate was 41/58 (70.7%) pre-intervention and 30/58 (51.7%) post-intervention. On the baseline survey, the percentage of correct answers was 68.8 ± 10.3%, which improved to 79.2 ± 16.2% after the intervention (p = 0.009). The theme with the most profound knowledge deficit was surveillance; however, it also had the greatest improvement after the intervention (49.6 ± 26.7 versus 66.7 ± 27.7% correct, p = 0.025). Individual questions with the largest per cent improvement pertained to risk of cardiac dysfunction with time since treatment (52.4 versus 93.1%, p = 0.002) and the role of dexrazoxane (48.8 versus 82.8%, p = 0.020).
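As an illustration, one way pre/post percentages of this kind could be compared is a two-proportion z-test; a minimal sketch using statsmodels, with invented counts (the abstract reports percentages, not the underlying per-question counts):

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: 21 of 41 correct pre-intervention vs. 25 of 30 post-intervention.
# These numbers are invented for illustration only.
stat, pval = proportions_ztest(count=[21, 25], nobs=[41, 30])
print(f"z = {stat:.2f}, p = {pval:.3f}")
```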
Specific knowledge deficits about the care of paediatric cancer survivors were identified amongst cardiologists using a web-based survey. Knowledge of surveillance was initially lowest but improved the most after an educational intervention. This highlights the need for cardio-oncology-based educational initiatives among paediatric cardiologists.
A 2018 workshop on the White Mountain Apache Tribe lands in Arizona examined ways to enhance investigations into cultural property crime (CPC) through applications of rapidly evolving methods from archaeological science. CPC (also looting, graverobbing) refers to unauthorized damage, removal, or trafficking in materials possessing blends of communal, aesthetic, and scientific values. The Fort Apache workshop integrated four generally partitioned domains of CPC expertise: (1) theories of perpetrators’ motivations and methods; (2) recommended practice in sustaining public and community opposition to CPC; (3) tactics and strategies for documenting, investigating, and prosecuting CPC; and (4) forensic sedimentology—uses of biophysical sciences to link sediments from implicated persons and objects to crime scenes. Forensic sedimentology served as the touchstone for dialogues among experts in criminology, archaeological sciences, law enforcement, and heritage stewardship. Field visits to CPC crime scenes and workshop deliberations identified pathways toward integrating CPC theory and practice with forensic sedimentology’s potent battery of analytic methods.
Cryo-planing SEM provides a powerful 3D view of internal and external tissue and cell features. However, specimen preparation is tedious because it requires custom apparatus and difficult cryogenic manipulations. We present an easy method using mPrep/s capsules, which provide a “handle” to hold and orient specimens throughout cryo-preparation and imaging. Cryo-facing is done with a cryo-ultramicrotome, which requires no custom equipment while providing accurate control of the image plane. We demonstrate the approach with fresh Christmas cactus leaves and with aldehyde-fixed kidney tissue imaged by room-temperature SEM after freeze-drying, showing that the method can operate without an expensive cryo-SEM.
Electron microprobe trace element analysis is a significant challenge. Due to the low net intensity of peak measurements, the accuracy and precision of such analyses rely critically on background measurements and on the accuracy of any pertinent peak interference corrections. A linear regression between two points selected at appropriate background positions is a classical approach for electron probe microanalysis (EPMA). However, this approach neglects the accurate assessment of background curvature (exponential or polynomial), and the presence of background interferences, a hole in the background, or an absorption edge can dramatically affect the results if underestimated or ignored. The acquisition of a quantitative wavelength-dispersive spectrometry (WDS) scan over the spectral region of interest remains a reasonable option for determining the background intensity and curvature from a fitted regression of background portions of the scan, but this technique can be time consuming and retains an element of subjectivity, as the analyst has to select areas of the scan that appear to represent background. This paper presents a new multi-point background (MPB) method whereby the background intensity is determined from up to 24 background measurements at wavelength positions on either side of the analytical lines. This method improves the accuracy and precision of trace element analysis in a complex matrix through careful regression of the background shape, and it can be used to characterize the background over a large spectral region covering several elements to be analyzed. The overall efficiency improves because systematic WDS scanning is not required to assess background interferences. The method is less subjective than methods that rely on WDS scanning, including the selection of two interpolation points based on WDS scans, because “true” backgrounds are selected by excluding possibly erroneous background measurements. The first validation of the MPB method involves blank testing to ensure that the method can accurately measure the absence of an element. The second involves the analysis of U-Th-Pb in several monazite reference materials of known isotopic age. The impetus for the MPB method came from efforts to refine EPMA monazite U-Th-Pb dating, where it was recognized that background errors resulting from interference or strong background curvature could produce errors of several tens of millions of years in the calculated date. Results obtained on monazite reference materials using two different microprobes, a Cameca SX-100 Ultrachron and a JEOL JXA-8230, yield excellent agreement with ages obtained by isotopic methods (Thermal Ionization Mass Spectrometry [TIMS], Sensitive High-Resolution Ion MicroProbe [SHRIMP], or Secondary Ion Mass Spectrometry [SIMS]). Finally, the MPB method can be used to model the background over a large spectrometer range to improve the accuracy of background measurements for minor and trace elements acquired on the same spectrometer, an approach called the shared background measurement. This approach significantly improves the accuracy of minor and trace element analysis in complex matrices, as demonstrated by the analysis of Rare Earth Elements (REE) in REE-silicates and phosphates and of trace elements in scheelite.
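For illustration, a minimal sketch of the general idea of regressing a curved background from many off-peak points while excluding suspect ones; the positions, count rates, polynomial model, and exclusion rule are invented for the example and are only loosely in the spirit of the MPB method described above:

```python
import numpy as np

# Hypothetical WDS background measurements (spectrometer position, count rate)
# on both sides of an analytical line; the value at 0.250 is deliberately
# inflated to mimic an unrecognized background interference.
pos = np.array([0.210, 0.215, 0.220, 0.245, 0.250, 0.255])
cts = np.array([14.8, 14.1, 13.9, 12.4, 16.8, 11.9])

def fit_background(pos, cts, degree=2, n_iter=3, n_sigma=2.0):
    """Regress the background shape from many background points, iteratively
    excluding anomalously high points (candidate interferences)."""
    keep = np.ones(pos.size, dtype=bool)
    coeffs = np.polyfit(pos, cts, degree)
    for _ in range(n_iter):
        coeffs = np.polyfit(pos[keep], cts[keep], degree)
        resid = cts - np.polyval(coeffs, pos)
        keep = resid <= n_sigma * resid[keep].std()
    return coeffs

coeffs = fit_background(pos, cts)
print("interpolated background at the peak position:", np.polyval(coeffs, 0.232))
```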
Pyrolyzed carbon in biochar can sequester atmospheric CO2 into soil to reduce impacts of anthropogenic CO2 emissions. When estimating the stability of biochar, degradation of biochar carbon, mobility of degradation products, and ingress of carbon from other sources must all be considered. In a previous study, we tracked degradation in biochars produced from radiocarbon-free wood and subjected to different physico-chemical treatments over three years in a rainforest soil. Following completion of the field trial, we report here a series of in vitro incubations of the degraded biochars in which we determined CO2 efflux rates and the 14C concentration and δ13C values of the evolved CO2, in order to quantify the contributions of biochar carbon and other sources of carbon to the CO2 efflux. The 14C concentration in CO2 showed that microbial degradation led to respiration of CO2 sourced from indigenous biochar carbon (≈0.5–1.4 μmol CO2/g biochar C/day) along with a component of carbon closely associated with the biochars but derived from the local environment. Correlations between 14C concentration, δ13C values, and Ca abundance indicated that Ca2+ availability was an important determinant of the loss of biochar carbon.
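Because the biochars were made from radiocarbon-free wood, partitioning the CO2 efflux reduces to a two-endmember mixing calculation. A minimal sketch; the endmember F14C values here are illustrative assumptions, not the study's measurements:

```python
def biochar_fraction(f14c_co2, f14c_soil=1.0, f14c_biochar=0.0):
    """Fraction of respired CO2 derived from radiocarbon-free biochar carbon,
    given the F14C of the evolved CO2 and of the two carbon endmembers.
    Defaults are illustrative: roughly modern soil carbon (F14C ~ 1.0) and
    radiocarbon-free biochar (F14C = 0)."""
    return (f14c_soil - f14c_co2) / (f14c_soil - f14c_biochar)

# Example: evolved CO2 with F14C = 0.35 implies ~65% of the efflux is biochar-derived.
print(biochar_fraction(0.35))
```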
Seven half-day regional listening sessions were held between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The objective of the listening sessions was to connect with stakeholders and hear their challenges and recommendations for addressing herbicide resistance. The coordinating team hired Strategic Conservation Solutions, LLC, to facilitate all the sessions. The facilitators and the coordinating team used in-person meetings, teleconferences, and email to communicate and coordinate the activities leading up to each regional listening session. The agenda was the same across all sessions and included small-group discussions followed by reporting to the full group for discussion. The planning process was the same across all the sessions, although the selection of venue, time of day, and stakeholder participants differed to accommodate differences among regions. The listening-session format required a great deal of work and flexibility on the part of the coordinating team and regional coordinators. Overall, the participant evaluations from the sessions were positive, with participants expressing appreciation that they were asked for their thoughts on the subject of herbicide resistance. This paper details the methods and processes used to conduct these regional listening sessions and provides an assessment of the strengths and limitations of those processes.
Herbicide resistance is ‘wicked’ in nature; therefore, results of the many educational efforts to encourage diversification of weed control practices in the United States have been mixed. It is clear that we do not sufficiently understand the totality of the grassroots obstacles, concerns, challenges, and specific solutions needed for varied crop production systems. Weed management issues and solutions vary with such variables as management styles, regions, cropping systems, and available or affordable technologies. Therefore, to help the weed science community better understand the needs and ideas of those directly dealing with herbicide resistance, seven half-day regional listening sessions were held across the United States between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide resistance management. The major goals of the sessions were to gain an understanding of stakeholders and their goals and concerns related to herbicide resistance management, to become familiar with regional differences, and to identify what decision makers need in order to address herbicide resistance. The messages shared by listening-session participants could be summarized by six themes: we need new herbicides; there is no need for more regulation; there is a need for more education, especially for others who were not present; diversity is hard; the agricultural economy makes it difficult to make changes; and we are aware of herbicide resistance but are managing it. The authors concluded that more work is needed to bring a community-wide, interdisciplinary approach to understanding the complexity of managing weeds within the context of the whole farm operation and for communicating the need to address herbicide resistance.
OBJECTIVES/SPECIFIC AIMS: Rodent models can be used to study neonatal abstinence syndrome (NAS), but the applicability of findings from these models to NAS in humans is not well understood. The objective of this study was to develop a rat model of norbuprenorphine-induced NAS and validate its translational value by comparing blood concentrations in the norbuprenorphine-treated pregnant rat to those previously reported in pregnant women undergoing buprenorphine treatment. METHODS/STUDY POPULATION: Pregnant Long-Evans rats were implanted on gestation day 9 with 14-day osmotic minipumps containing vehicle, morphine (positive control), or norbuprenorphine (0.3–3 mg/kg/d). Within 12 hours of delivery, pups were tested for spontaneous or precipitated opioid withdrawal by injecting them with saline (10 mL/kg, i.p.) or naltrexone (1 or 10 mg/kg, i.p.), respectively, and observing them for well-validated neonatal withdrawal signs. Blood was sampled via indwelling jugular catheters from a subset of norbuprenorphine-treated dams on gestation days 8, 10, 13, 17, and 20. Norbuprenorphine concentrations in whole blood samples were quantified using LC/MS/MS. RESULTS/ANTICIPATED RESULTS: Blood concentrations of norbuprenorphine in rats exposed to 1–3 mg/kg/d of norbuprenorphine were similar to levels previously reported in pregnant women undergoing buprenorphine treatment. Pups born to dams treated with these doses exhibited robust withdrawal signs. Blood concentrations of norbuprenorphine decreased across gestation, which is similar to previous reports in humans. DISCUSSION/SIGNIFICANCE OF IMPACT: These results suggest that dosing dams with 1–3 mg/kg/d norbuprenorphine produces maternal blood concentrations and withdrawal severity similar to those previously reported in humans. This provides evidence that, at these doses, this model is useful for testing hypotheses about norbuprenorphine that are applicable to NAS in humans.
OBJECTIVES/SPECIFIC AIMS: To build a multisite de-identified database of female adolescents, aged 12–21 years (January 2011–December 2012), and their subsequent offspring through 24 months of age, from electronic health records (EHRs) provided by participating community health centers and hospitals. METHODS/STUDY POPULATION: We created a community-academic partnership that included New York City community health centers (n=4) and hospitals (n=4), The Rockefeller University, The Sackler Institute for Nutrition Science, and Clinical Directors Network (CDN). We used the Community-Engaged Research Navigation model to establish a multisite de-identified database extracted from EHRs of female adolescents aged 12–21 years (January 2011–December 2012) and their offspring through 24 months of age. These patients received their primary care between 2011 and 2015. Clinical data were used to explore possible associations among specific measures. We focused on the preconception, prenatal, and postnatal periods, including pediatric visits up to 24 months of age. RESULTS/ANTICIPATED RESULTS: The analysis included all female adolescents (n=122,556) and a subset of pregnant adolescents with offspring data available (n=2917). Patients were mostly from the Bronx; 43% of all adolescent females were overweight (22%) or obese (21%) and showed higher systolic and diastolic blood pressure, blood glucose, hemoglobin A1c, total cholesterol, and triglyceride levels compared with normal-weight adolescent females (p<0.05). This analysis was also performed for the nonpregnant and pregnant females separately. Overall, the pregnant females were older (mean age=18.3) than the nonpregnant females (mean age=16.5), and the percentage of Hispanics was higher among the pregnant females (58%) than among the nonpregnant females (43.9%). There was a statistically significant association between maternal BMI status and infant birth weight, with underweight/normal-weight mothers having more low birth weight (LBW) babies and overweight/obese mothers having more large babies. The odds of having a LBW baby were lower in obese than in normal-weight adolescent mothers (OR=0.61; 95% CI: 0.41, 0.89). The risk of preterm birth before 37 weeks did not differ significantly between obese and normal-weight adolescent mothers (OR=0.81; 95% CI: 0.53, 1.25). Preliminary associations are similar to those reported in the published literature. DISCUSSION/SIGNIFICANCE OF IMPACT: This EHR database uses available measures from routine clinical care as a “rapid assay” to explore potential associations, and may be more useful for detecting the presence and direction of associations than the magnitude of effects. This partnership has engaged community clinicians, laboratory and clinical investigators, and funders in study design and analysis, as demonstrated by the collaborative development and testing of hypotheses relevant to service delivery. Furthermore, this research and learning collaborative is examining strategies to enhance clinical workflow and data quality, as well as underlying biological mechanisms. The feasibility of scaling up these methods facilitates studying similar populations in different health systems, advancing point-of-care studies of natural history and comparative effectiveness research to identify service gaps, evaluate effective interventions, and enhance clinical and data quality improvement.
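As an illustration of the kind of odds ratio reported above, a minimal sketch computing an OR and its 95% CI from a 2x2 table with statsmodels; the cell counts are invented, since the abstract reports only the OR and CI:

```python
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

# Hypothetical 2x2 table (rows: obese vs. normal-weight mothers;
# columns: LBW vs. non-LBW infants). Counts are invented for illustration.
table = np.array([[30, 470],
                  [55, 445]])
t22 = Table2x2(table)
print("odds ratio:", t22.oddsratio)
print("95% CI:", t22.oddsratio_confint())
```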
OBJECTIVES/SPECIFIC AIMS: To study the role of OSA as an independent predictor of perioperative outcomes. METHODS/STUDY POPULATION: For this single-institution cohort study, we included data from patients who were enrolled in 1 of 3 prospective parent studies. All participants underwent inpatient surgeries, excluding neurosurgeries, that required general anesthesia and a postoperative stay of at least 1 day. Patients included in this study were assessed daily for postoperative delirium and pain severity as part of the parent studies. In the current study, delirium diagnosis was determined using the 3-minute Diagnostic Confusion Assessment Method (3D-CAM), and the Visual Analogue Pain Scale (VAS) was used for pain severity. Data on OSA diagnosis (determined by sleep study); OSA risk (determined by the STOP-Bang tool: snoring, tiredness, observed apnea, high blood pressure, body mass index >35 kg/m2, age >50, neck circumference, male gender); and compliance with treatment were obtained from the preoperative assessment record. Participants were grouped into 1 of 3 categories: high risk of OSA (HR-OSA), including patients with a previous positive sleep study or a STOP-Bang score ≥5; intermediate risk of OSA (IR-OSA), including patients with a STOP-Bang score of 3 or 4; and low risk of OSA (LR-OSA), including patients with a previous negative sleep study or a STOP-Bang score <3. Candidate risk factors for delirium and pain were also extracted from this record. RESULTS/ANTICIPATED RESULTS: Logistic regression will be used to test whether OSA independently predicts postoperative delirium, and linear regression will be used to assess OSA’s relationship to acute pain severity. We hypothesize that patients in the HR-OSA category will experience a higher incidence of postoperative delirium and greater postoperative pain severity. We also predict a step-wise increase in the risk of these adverse outcomes when analyzing patients stratified by OSA risk (HR-OSA vs. IR-OSA vs. LR-OSA). For our secondary analyses, we anticipate that these outcomes are modified by compliance with CPAP treatment. We believe patients with OSA who do not use prescribed CPAP will experience a higher incidence of postoperative delirium as well as increased pain severity. DISCUSSION/SIGNIFICANCE OF IMPACT: OSA is a common and frequently undiagnosed perioperative problem associated with altered pain processing and a high incidence of postoperative delirium. While likely providing stronger evidence of OSA’s reported impact on postoperative delirium and pain, our findings might also help discern points of intervention for treatment and prevention. Since OSA’s presumed impact poses challenges to clinicians and patients, prospective, randomized trials testing preventative or mitigating interventions are necessary. We hope to use these results to design such trials and clinical plans, with the goal of reducing postoperative delirium and acute postsurgical pain severity for the large number of patients at risk due to OSA.
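The three-group triage described above is a simple decision rule; a minimal sketch in Python. The precedence when a prior sleep study result conflicts with the STOP-Bang score is my assumption, as the abstract does not specify it:

```python
def osa_risk_category(stop_bang_score, sleep_study=None):
    """Assign the study's OSA risk group from a prior sleep study result
    (True = positive, False = negative, None = none) and the STOP-Bang score.
    Precedence of a conflicting sleep study over the score is assumed here."""
    if sleep_study is True or stop_bang_score >= 5:
        return "HR-OSA"
    if sleep_study is False or stop_bang_score < 3:
        return "LR-OSA"
    return "IR-OSA"  # STOP-Bang score of 3 or 4, no prior sleep study

print(osa_risk_category(4))                    # IR-OSA
print(osa_risk_category(2, sleep_study=True))  # HR-OSA
```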
Early life exposures affect health and disease across the life course and potentially across multiple generations. The Clinical and Translational Science Institutes (CTSIs) offer an opportunity to utilize and link existing databases to conduct lifespan research.
A survey was created with expert input from the Lifespan Domain Taskforce and distributed to lead lifespan researchers at each of the 64 CTSIs. The survey requested information regarding institutional databases related to early life exposures, child-maternal health, or lifespan research.
Of the 64 CTSIs, 88% provided information on a total of 130 databases. Approximately 59% (n=76/130) had an associated biorepository. Longitudinal data were available for 72% (n=93/130) of reported databases. Many of the biorepositories (n=44/76; 58%) have standard operating procedures that can be shared with other researchers.
The majority of CTSI databases and biorepositories focusing on child-maternal health and lifespan research could be leveraged to expand lifespan research, increase generalizability, and enhance multi-institutional research in the United States.
OBJECTIVE: To evaluate probiotics for the primary prevention of Clostridium difficile infection (CDI) among hospital inpatients.
DESIGN: A before-and-after quality improvement intervention comparing 12-month baseline and intervention periods.
SETTING: A 694-bed teaching hospital.
METHODS: We administered a multispecies probiotic comprising Lactobacillus acidophilus (CL1285), L. casei (LBC80R), and L. rhamnosus (CLR2) to eligible antibiotic recipients, beginning within 12 hours of initial antibiotic receipt and continuing through 5 days after the final dose. We excluded (1) all patients on neonatal, pediatric, and oncology wards; (2) all patients receiving only perioperative prophylactic antibiotics; (3) all those restricted from oral intake; and (4) those with pancreatitis, leukopenia, or posttransplant status. We defined CDI by symptoms plus C. difficile toxin detection by polymerase chain reaction (PCR). Our primary outcome was hospital-onset CDI incidence on eligible hospital units, analyzed using segmented regression.
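A rough sketch of the stated eligibility screen as a Python predicate; the patient-record field names are hypothetical, and the 12-hour rule is simplified to a single elapsed-time check:

```python
def probiotic_eligible(pt):
    """Screen mirroring the stated inclusion/exclusion criteria (field names invented)."""
    if pt["ward"] in {"neonatal", "pediatric", "oncology"}:
        return False                                        # exclusion (1)
    if pt["periop_prophylaxis_only"]:
        return False                                        # exclusion (2)
    if pt["nothing_by_mouth"]:
        return False                                        # exclusion (3)
    if pt["pancreatitis"] or pt["leukopenia"] or pt["posttransplant"]:
        return False                                        # exclusion (4)
    return pt["hours_since_first_antibiotic"] <= 12         # start within 12 h

print(probiotic_eligible({
    "ward": "medicine", "periop_prophylaxis_only": False, "nothing_by_mouth": False,
    "pancreatitis": False, "leukopenia": False, "posttransplant": False,
    "hours_since_first_antibiotic": 6,
}))  # True
```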
RESULTS: The study included 251 CDI episodes among 360,016 patient days during the baseline and intervention periods combined, for an overall incidence of 7.0 per 10,000 patient days. The incidence rate was similar during the baseline and intervention periods (6.9 vs 7.0 per 10,000 patient days; P=.95). However, compared with the first 6 months of the intervention, we detected a significant decrease in CDI during the final 6 months (incidence rate ratio, 0.6; 95% confidence interval, 0.4–0.9; P=.009). Test positivity remained stable between the baseline and intervention periods (19% vs 20% of stools tested were C. difficile positive by PCR, respectively). On medical record review, only 26% of eligible patients received the probiotic per protocol.
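For illustration, a minimal sketch of a segmented (interrupted time series) Poisson regression of the kind named above, using statsmodels with a log(patient-days) offset; the monthly series, column names, and rates are invented for the example:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical monthly series: 12 baseline + 12 intervention months, with a
# decline appearing partway through the intervention (values are illustrative).
rng = np.random.default_rng(0)
df = pd.DataFrame({"month": np.arange(24)})
df["intervention"] = (df["month"] >= 12).astype(int)      # level change
df["months_since"] = np.maximum(df["month"] - 12, 0)      # slope change
df["patient_days"] = 15000
df["cases"] = rng.poisson(np.where(df["month"] < 18, 10, 7))

model = smf.glm(
    "cases ~ month + intervention + months_since",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),
).fit()
print(model.summary())  # exp(coef) gives incidence rate ratios
```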
CONCLUSIONS: Despite poor adherence to the protocol, CDI incidence declined during the intervention, although the decline was delayed until approximately 6 months after the probiotic was introduced for primary prevention.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9,200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
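A hedged sketch of programmatic access to the application programming interface mentioned above; the v2.0 base URL, endpoint path, parameter names, and response shape follow my reading of Neotoma's public API documentation and should be verified before use:

```python
import requests

# Query Neotoma's web API for sites by name (endpoint and parameters assumed).
resp = requests.get(
    "https://api.neotomadb.org/v2.0/data/sites",
    params={"sitename": "%lake%", "limit": 5},
    timeout=30,
)
resp.raise_for_status()
for site in resp.json().get("data", []):
    print(site.get("siteid"), site.get("sitename"))
```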
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal, and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins, respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and in 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
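As an illustration of the kind of linear regression model described, a minimal sketch regressing education years on zygosity with a birth-year term; the data frame, column names, and values are invented for the example:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-twin records: education years, zygosity, and birth year.
df = pd.DataFrame({
    "edu_years":  [12, 14, 16, 11, 13, 15, 12, 17],
    "zygosity":   ["MZ", "DZ", "MZ", "DZ", "MZ", "DZ", "DZ", "MZ"],
    "birth_year": [1950, 1955, 1972, 1968, 1981, 1990, 1949, 1993],
})

# Treatment coding with DZ as the reference level, so the MZ coefficient
# estimates the MZ-DZ difference in education years.
fit = smf.ols("edu_years ~ C(zygosity, Treatment('DZ')) + birth_year", data=df).fit()
print(fit.params)
```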
Two of the five species of European blackberry (Rubus fruticosus L. aggregate) along the West Coast of the United States are considered invasive. They are also similar in appearance. Biological control of invasive blackberry by Phragmidium violaceum, the causal agent of a rust disease, had been under consideration when rust-diseased blackberry was discovered in Oregon in 2005. An investigation was initiated to determine whether this disease would be an important factor affecting the population density of these blackberries. Surveys were made over a 5-yr period at more than 30 field sites in the Willamette Valley and along the Pacific coast of Oregon. Diseased and nondiseased blackberry specimens were collected for artificial greenhouse inoculations and for identification. The two blackberry species, Rubus armeniacus and R. praecox, were identified as the most invasive. They were readily distinguished morphologically on the basis of inflorescence and flower characteristics and, to a certain extent, by differences in primocane leaf and leaflet shape. Artificial greenhouse inoculation studies revealed that R. praecox was susceptible to the rust disease and that R. armeniacus was not. These results were confirmed during a field survey. Results of this investigation revealed that the rust disease will not be effective for biological control of R. armeniacus, and other approaches to managing this species will be required.
Severe leaf blight of Japanese stiltgrass (JSG) caused by Bipolaris species, which produces significant declines in population density at some locations, has been reported sporadically in the field. Even so, much of the JSG in the mid-Atlantic is not diseased. Six field populations of JSG, one severely diseased by Bipolaris microstegii and the others “healthy,” were tested by artificial inoculation for susceptibility to both B. microstegii (five isolates) and B. drechsleri (three isolates). Populations of JSG in this study differed in their response to the two Bipolaris species, but within each species of Bipolaris the plant responses were consistent. Plants from the diseased population of JSG from Frederick, MD, were very susceptible to B. microstegii, whereas plants from the other populations, from Maryland (three locations), Delaware, and Indiana, were not. In contrast, B. drechsleri caused moderate disease on plants from all accessions but one, and it was significantly less aggressive than B. microstegii on the susceptible accession of JSG. A limited host-range determination with B. microstegii alone revealed hypersensitive responses, and therefore high levels of resistance, in corn (four cultivars) and sorghum (three accessions). The native, sympatric grass deertongue was not diseased in these tests. Results reveal a distinct differential response among populations of JSG to disease caused by B. microstegii, whereas B. drechsleri is capable of causing disease on a broader range of JSG populations.
The threshold for the onset of breaking proposed by Barthelemy et al. (arXiv:1508.06002v1, 2015) has been investigated in the laboratory for unidirectional wave groups in deep water and extended to include different classes of wave groups and moderate wind forcing. Thermal image velocimetry was used to compare measurements of the surface water particle velocity at the wave crest point (the point of maximum elevation and of maximum surface water particle velocity) with the wave crest point speed determined by an array of closely spaced wave gauges. The crest point surface energy flux ratio, the ratio of these two speeds, that distinguishes maximum recurrence from marginal breaking was determined. Increasing the wind forcing from zero to a moderate level systematically increased this threshold by 2%, and increasing the spectral bandwidth (decreasing the Benjamin–Feir index from 0.39 to 0.31) systematically reduced the threshold by 1.5%.
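For context, a sketch of the breaking-onset criterion of Barthelemy et al. as I understand their formulation; the symbols U_s, C, and B_th are notation assumed for this note, not taken from the abstract, and the numeric threshold measured in the study is not reproduced here:

```latex
% Breaking-onset parameter (notation assumed): B compares the surface water
% particle speed at the crest point, U_s, with the crest point speed, C.
\[
  B = \frac{U_s}{C},
  \qquad
  \begin{cases}
    B < B_{\mathrm{th}} & \text{maximum recurrence (no breaking)} \\
    B > B_{\mathrm{th}} & \text{onset of breaking}
  \end{cases}
\]
% Per the abstract, moderate wind forcing raised the measured threshold by 2%
% and broader spectral bandwidth lowered it by 1.5%.
```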