The advancement of lead-free piezoelectric nanogenerators (PENGs) for flexible electronics necessitates designing more efficient systems for improved energy storage capacity. In this light, the effects of patterning BaTiO3 nanotubes within a PENG on the electromechanical properties of the device were investigated. The PENGs comprised a sandwich structure of Ti–BaTiO3–graphite–Ti encapsulated in polydimethylsiloxane. Four patterns of vertically aligned BaTiO3 nanotubes were synthesized via the hydrothermal conversion of selectively anodized TiO2 nanotubes. The highest output voltage reached 1.9 V. Decreasing the nanotube array spacing and pattern diameter increased the lateral displacement of BaTiO3, thereby increasing the output voltage of the device.
The co-occurrence of the 2020 Atlantic hurricane season and the ongoing coronavirus disease 2019 (COVID-19) pandemic creates complex dilemmas for protecting populations from these intersecting threats. Climate change is likely contributing to stronger, wetter, slower-moving, and more dangerous hurricanes. Climate-driven hazards underscore the imperative for timely warning, evacuation, and sheltering of storm-threatened populations – proven life-saving protective measures that gather evacuees together inside durable, enclosed spaces when a hurricane approaches. Meanwhile, the rapid acquisition of scientific knowledge regarding how COVID-19 spreads has guided mass anti-contagion strategies, including lockdowns, sheltering at home, physical distancing, donning personal protective equipment, conscientious handwashing, and hygiene practices. These life-saving strategies, credited with preventing millions of COVID-19 cases, separate people and keep them apart. Enforcement, coupled with fear of contracting COVID-19, has motivated high levels of adherence to these stringent regulations. How will populations react when warned to shelter from an oncoming Atlantic hurricane while COVID-19 is actively circulating in the community? Emergency managers, health care providers, and public health preparedness professionals must create viable solutions to confront these potential scenarios: elevated rates of hurricane-related injury and mortality among persons who refuse to evacuate due to fear of COVID-19, and the resurgence of COVID-19 cases among hurricane evacuees who shelter together.
Lead halide perovskite nanocrystals (NCs) are promising for applications in light-emitting devices owing to a strong emission spectrum that is tunable throughout the visible region by altering halide composition. However, in mixed-halide perovskite systems, photoinduced migration drives the formation of halide-segregated domains, altering the emission spectrum. The mechanism by which this segregation occurs is currently the subject of intense investigation. Processes involving the perovskite surface are expected to be more prevalent in NCs because of their large surface-area-to-volume ratio. In this work, we use transient absorption spectroscopy to probe the excited-state dynamics of NCs before and after halide segregation. Comparison of global fit spectra of the measured signals suggests the accumulation of iodide at the surface, resulting in a redshifted emission spectrum.
We evaluated the impact of reflex urine culture screen results on antibiotic initiation. More patients with a positive urine screen but negative culture received antibiotics than those with a negative screen (30.5% vs 7.1%). Urine screen results may inappropriately influence antibiotic initiation in patients with a low likelihood of infection.
It is not clear to what extent associations between schizophrenia, cannabis use and cigarette use are due to a shared genetic etiology. We therefore examined whether schizophrenia genetic risk is associated with longitudinal patterns of cigarette and cannabis use in adolescence, and the pathways mediating any association, to inform potential reduction strategies.
Associations between schizophrenia polygenic scores and longitudinal latent classes of cigarette and cannabis use from ages 14 to 19 years were investigated in up to 3925 individuals in the Avon Longitudinal Study of Parents and Children. Mediation models were estimated to assess the potential mediating effects of a range of cognitive, emotional, and behavioral phenotypes.
The schizophrenia polygenic score, based on single nucleotide polymorphisms meeting a training-set p threshold of 0.05, was associated with late-onset cannabis use (OR = 1.23; 95% CI 1.08–1.41), but not with cigarette or early-onset cannabis use classes. This association was not mediated through lower IQ, victimization, emotional difficulties, antisocial behavior, impulsivity, or poorer social relationships during childhood. Sensitivity analyses adjusting for genetic liability to cannabis or cigarette use, using polygenic scores excluding the CHRNA5-A3-B4 gene cluster, or basing scores on a 0.5 training-set p threshold, provided results consistent with our main analyses.
Our study provides evidence that genetic risk for schizophrenia is associated with patterns of cannabis use during adolescence. Investigation of pathways other than the cognitive, emotional, and behavioral phenotypes examined here is required to identify modifiable targets to reduce the public health burden of cannabis use in the population.
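The odds ratio and confidence interval reported for the cannabis-use association follow from exponentiating a logistic regression coefficient and its standard error. A minimal sketch in Python; the beta and SE values below are back-calculated assumptions chosen so the output reproduces the reported OR = 1.23 (95% CI 1.08–1.41), not figures taken from the study:

```python
import math

def or_with_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical beta and SE, chosen to match the reported association.
odds_ratio, lo, hi = or_with_ci(beta=0.207, se=0.068)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
# prints: OR = 1.23, 95% CI 1.08-1.41
```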
Commercialization of 2,4-D–tolerant crops is a major concern for sweetpotato producers because of potential 2,4-D drift that can cause severe crop injury and yield reduction. A field study was initiated in 2014 and repeated in 2015 to assess the impacts of reduced rates of 2,4-D, glyphosate, or a combination of 2,4-D with glyphosate on sweetpotato. In one study, 2,4-D and glyphosate were applied alone and in combination at 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of anticipated field use rates (1.05 kg ha−1 for 2,4-D and 1.12 kg ha−1 for glyphosate) to ‘Beauregard’ sweetpotato at storage root formation (10 days after transplanting [DAP]). In a separate study, all these treatments were applied to ‘Beauregard’ sweetpotato at storage root development (30 DAP). Injury with 2,4-D alone or in combination with glyphosate was generally equal to or greater than that with glyphosate applied alone at equivalent herbicide rates, indicating that injury is attributable mostly to 2,4-D in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (for most yield grades) with increasing rate of 2,4-D applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate on crop injury or sweetpotato yield was observed when herbicide application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of 2,4-D applied alone or in combination with glyphosate, although injury observed at lower rates would also be of concern to sweetpotato producers. However, in some cases, yield reduction of U.S. No. 1 and marketable grades was also observed after application of 1/250×, 1/100×, or 1/10× rates of 2,4-D alone or with glyphosate at storage root development.
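The reduced-rate treatments are simple fractions of the anticipated 1× field use rates given above (1.05 kg ha−1 for 2,4-D; 1.12 kg ha−1 for glyphosate). A small sketch of the resulting applied rates:

```python
# Fractions and 1x use rates are from the abstract; units are kg/ha.
fractions = [1/10, 1/100, 1/250, 1/500, 1/750, 1/1000]
use_rates = {"2,4-D": 1.05, "glyphosate": 1.12}

# Applied rate for each herbicide at each reduced-rate treatment.
applied = {herb: [round(rate * f, 5) for f in fractions]
           for herb, rate in use_rates.items()}
print(applied["2,4-D"][0])  # 0.105 kg/ha at the 1/10x rate
```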
OBJECTIVES/GOALS: We compared the validity of an International Classification of Diseases, Clinical Modification (ICD) algorithm for identifying high-grade cervical intraepithelial neoplasia and adenocarcinoma in situ (together referred to as CIN2+) from ICD 9th revision (ICD-9) and 10th revision (ICD-10) codes. METHODS/STUDY POPULATION: Using Tennessee Medicaid data, we identified cervical diagnostic procedures in 2008-2017 among females aged 18-39 years in Davidson County, TN. Gold-standard cases were pathology-confirmed CIN2+ diagnoses validated by HPV-IMPACT, a population-based surveillance project in catchment areas of five US states. Procedures in the ICD transition year (2015) were excluded to account for implementation lag. We pre-grouped diagnosis and procedure codes by theme. We performed feature selection using least absolute shrinkage and selection operator (LASSO) logistic regression with 10-fold cross-validation and validated models by ICD-9 era (2008-2014, N = 6594) and ICD-10 era (2016-2017, N = 1270). RESULTS/ANTICIPATED RESULTS: Of 7864 cervical diagnostic procedures, 880 (11%) were true CIN2+ cases. LASSO logistic regression selected the strongest features of case status: having codes for a CIN2+ tissue diagnosis, a non-specific CIN tissue diagnosis, or a high-grade squamous intraepithelial lesion; receiving a cervical treatment procedure; and receiving a cervical/vaginal biopsy. Features of non-case status were codes for a CIN1 tissue diagnosis, a Pap test, and an HPV DNA test. The ICD-9 vs ICD-10 algorithms predicted case status with 68% vs 63% sensitivity, 95% vs 94% specificity, 63% vs 64% positive predictive value, 96% vs 94% negative predictive value, 92% vs 89% accuracy, and C-indices of 0.95 vs 0.92, respectively. DISCUSSION/SIGNIFICANCE OF IMPACT: Overall, the algorithm’s validity for identifying CIN2+ case status was similar between coding versions; ICD-9 had slightly better discriminative ability.
Results support a prior study concluding that ICD-10 implementation has not substantially improved the quality of administrative data from ICD-9.
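The validity statistics quoted above all derive from a 2×2 confusion matrix of algorithm-predicted versus gold-standard case status. The sketch below shows the standard formulas; the counts are illustrative only (the abstract does not report per-era confusion matrices) and were chosen so the outputs happen to reproduce the reported ICD-9 figures:

```python
def validity_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard diagnostic-validity metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),          # true cases detected
        "specificity": tn / (tn + fp),          # non-cases correctly ruled out
        "ppv": tp / (tp + fp),                  # positive predictive value
        "npv": tn / (tn + fn),                  # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts, not from the study.
m = validity_metrics(tp=68, fp=40, fn=32, tn=760)
print({k: round(v, 2) for k, v in m.items()})
# sensitivity 0.68, specificity 0.95, ppv 0.63, npv 0.96, accuracy 0.92
```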
Herbicide resistance has for decades been a growing problem in agronomic crops such as corn and soybean. Several weed species have evolved herbicide resistance in turfgrass systems such as golf courses, sports fields, and sod production—particularly biotypes of annual bluegrass and goosegrass. The consequences of herbicide resistance in agronomic cropping systems indicate what could happen in turfgrass if herbicide resistance becomes broader in terms of species, distribution, and mechanisms of action. The turfgrass industry must take action to develop effective resistance management programs while this problem is still relatively small in scope. We propose that lessons learned from a series of national listening sessions, conducted by the Herbicide Resistance Education Committee of the Weed Science Society of America to better understand the human dimensions affecting herbicide resistance in crop production, provide tremendous insight into the themes to address when developing effective resistance management programs for the turfgrass industry.
Eurypterids are generally considered to comprise a mixture of active nektonic to nektobenthic predators and benthic scavenger-predators exhibiting a mode of life similar to that of modern horseshoe crabs. However, two groups of benthic stylonurine eurypterids, the Stylonuroidea and Mycteropoidea, independently evolved modifications to the armature of their anterior appendages that have been considered adaptations toward a sweep-feeding life habit. It has been suggested that the evolution of sweep-feeding may have permitted stylonurines to capture smaller prey species and may have been critical for the survival of mycteropoids during the Late Devonian mass extinction. There is a linear correlation between the average spacing of feeding structures and prey size among extant suspension feeders. Here, we extrapolate this relationship to sweep-feeding eurypterids in order to estimate the range of prey sizes they could capture, and examine prey size in a phylogenetic context to determine what role prey size played in determining survivorship during the Late Devonian. The mycteropoid Cyrtoctenus was the most specialized sweep-feeder, with comblike appendage armature capable of capturing mesoplankton out of suspension, whereas the majority of stylonurines possess armature corresponding to a prey size range of 1.6–52 mm, suggesting they were suited to capturing small benthic macroinvertebrates such as crustaceans, mollusks, and wormlike organisms. There is no clear phylogenetic signal in the prey size distribution and no evolutionary trend toward decreasing prey size among Stylonurina. Rather than prey size, species survivorship during the Late Devonian was likely mediated by geographic distribution and the ability to capitalize on the expanding freshwater benthos.
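The extrapolation step described above amounts to fitting a line relating feeding-structure spacing to captured prey size in extant suspension feeders, then evaluating that line at a eurypterid's measured armature spacing. A minimal sketch with entirely synthetic numbers; the study's actual regression data are not reproduced here:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical extant suspension feeders: structure spacing vs prey size (mm).
spacing_mm = [0.1, 0.5, 1.0, 2.0]
prey_mm    = [0.2, 1.1, 2.0, 4.1]
slope, intercept = fit_line(spacing_mm, prey_mm)

# Predicted prey size for a hypothetical eurypterid armature spacing of 0.8 mm.
predicted = slope * 0.8 + intercept
```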
A major concern of sweetpotato producers is the potential for negative effects from herbicide drift or sprayer contamination when dicamba is applied to nearby dicamba-resistant crops. A field study was initiated in 2014 and repeated in 2015 to assess the effects of reduced rates of the N,N-bis(3-aminopropyl)methylamine (BAPMA) or diglycolamine (DGA) salt of dicamba, glyphosate, or a combination of each dicamba salt with glyphosate on sweetpotato. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of the 1× use rate of each dicamba formulation at 0.56 kg ha−1, glyphosate at 1.12 kg ha−1, and combinations of the two at the aforementioned rates were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial and at storage root development (30 d after transplanting) in a separate trial. Injury with each salt of dicamba (BAPMA or DGA) applied alone or with glyphosate was generally equal to or greater than that with glyphosate applied alone at equivalent rates, indicating that injury is attributable mostly to the dicamba in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (for most yield grades) with increasing rate of dicamba applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate on crop injury or sweetpotato yield was observed when herbicide application occurred at the storage root formation stage. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would be cause for concern to sweetpotato producers. However, in some cases, yield reduction of U.S. No. 1 and marketable grades was observed following 1/250×, 1/100×, or 1/10× rates of dicamba alone or with glyphosate applied at storage root development.
Although lignin has been negatively correlated with neutral-detergent fibre (NDF) digestibility (NDFD) in ruminants and used to predict the potential extent of NDF digestion of forages, the choice of analysis, Klason lignin (KL) or acid-detergent lignin (ADL), to describe the nutritionally relevant lignin has not been resolved. The difference between KL and ADL (ΔL) has been dismissed as an artifact. An open question is whether ΔL influences NDFD. We evaluated the relationships of ΔL, KL and ADL with NDFD in order to determine whether KL is a nutritionally homogeneous or heterogeneous fraction. Data sets from two laboratories (DS1 and DS2) were used that included ADL, KL and in vitro NDFD at 48 h (NDFD48). DS1 contained seven C3 grasses, seventeen C4 maize forages and nineteen alfalfas, and DS2 had fifteen C3 grasses, eight C4 forages and six alfalfas. Mean ΔL was greater than ADL in C3 and C4 samples and less in alfalfas. Within forage type and laboratory, ΔL was not correlated with NDFD48 (r −0·34–0·49; all P > 0·17). ADL was more consistently correlated with NDFD48 (r −0·47–−0·95; P < 0·01–0·21) than was KL (r 0·03–−0·91; P < 0·01–0·94). ΔL as a proportion of KL was correlated with NDFD48 in C3 and C4 samples (r 0·44–0·76; P < 0·01–0·08). The differing behaviours of ΔL and ADL relative to NDFD48 indicate that KL is a nutritionally heterogeneous fraction, the behaviour of which may vary by forage type and by the ratio of ADL to ΔL present.
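The comparison above rests on two quantities: ΔL computed as KL − ADL, and Pearson correlations of each lignin fraction with NDFD48. A sketch of that calculation; all sample values are illustrative placeholders (not data from DS1 or DS2), constructed so that ADL correlates strongly with NDFD48 while ΔL does not, mirroring the pattern the abstract reports:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Hypothetical forage samples (g/kg-scale values chosen for illustration).
adl     = [3.1, 2.4, 3.0, 2.0, 3.6]        # acid-detergent lignin
delta_l = [4.7, 4.5, 4.0, 3.9, 3.9]        # ΔL
kl      = [a + d for a, d in zip(adl, delta_l)]  # Klason lignin = ADL + ΔL
ndfd48  = [42.0, 55.0, 45.0, 60.0, 38.0]   # in vitro NDFD at 48 h (%)

r_adl   = pearson(adl, ndfd48)      # strongly negative in this sketch
r_delta = pearson(delta_l, ndfd48)  # near zero in this sketch
```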
The occupation of new environments by evolutionary lineages is frequently associated with morphological changes. This covariation of ecotype and phenotype is expected due to the process of natural selection, whereby environmental pressures lead to the proliferation of morphological variants that are a better fit for the prevailing abiotic conditions. One primary mechanism by which phenotypic variants are known to arise is through changes in the timing or duration of organismal development resulting in alterations to adult morphology, a process known as heterochrony. While numerous studies have demonstrated heterochronic trends in association with environmental gradients, few have done so within a phylogenetic context. Understanding species interrelationships is necessary to determine whether morphological change is due to heterochronic processes; however, research is hampered by the lack of a quantitative metric with which to assess the degree to which heterochronic traits are expressed within and among species. Here I present a new metric for quantifying heterochronic change, expressed as a heterochronic weighting, and apply it to xiphosuran chelicerates within a phylogenetic context to reveal concerted independent heterochronic trends. These trends correlate with shifts in environmental occupation from marine to nonmarine habitats, resulting in a macroevolutionary ratchet. Critically, the distribution of heterochronic weightings among species shows evidence of being influenced by both historical, phylogenetic processes and external ecological pressures. Heterochronic weighting proves to be an effective method to quantify heterochronic trends within a phylogenetic framework and is readily applicable to any group of organisms that have well-defined morphological characteristics, ontogenetic information, and resolved internal relationships.
Our objective was to examine the performance characteristics of a bladder stimulation technique for urine collection among infants presenting to the emergency department (ED).
This prospective cohort study enrolled a convenience sample of infants aged ≤ 90 days requiring urine testing in the ED. Infants were excluded if they were critically ill or moderately to severely dehydrated, or if they had significant feeding issues. Bladder stimulation consisted of finger tapping on the lower abdomen with or without lower back massage while holding the child upright. The primary outcome was successful midstream urine collection within 5 minutes of stimulation. Secondary outcomes included sample contamination, bladder stimulation time for successful urine collection, and perceived patient distress on a 100-mm visual analog scale (VAS).
We enrolled 151 infants and included 147 in the analysis. Median age was 53 days (interquartile range [IQR] 27–68 days). Midstream urine sample collection using bladder stimulation was successful in 78 infants (53.1%; 95% confidence interval [CI] 45.0–60.9). Thirty-nine of these samples (50%) were contaminated. Most contaminated samples (n = 31; 79.5%) were reported as “no significant growth” or “growth of 3 or more organisms”. Median bladder stimulation time required for midstream urine collection was 45 seconds (IQR 20–120 seconds). Mean VAS score for infant distress was 22 mm (standard deviation 23 mm).
The success rate of this bladder stimulation technique was lower than previously reported. The contamination rate was high; however, most contaminated specimens were easily identified and had no clinical impact.
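The success-rate confidence interval reported above is consistent with a Wilson score interval for 78 successes out of 147 infants, though the abstract does not state which interval method was used. A sketch using the reported counts:

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96):
    """Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, center - half, center + half

# Counts from the abstract: 78 successful collections in 147 infants.
p, lo, hi = wilson_ci(78, 147)
print(f"{100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f})")
# prints: 53.1% (95% CI 45.0-60.9)
```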
The ‘jumping to conclusions’ (JTC) bias is associated with both psychosis and general cognition, but the relationships among them are unclear. In this study, we set out to clarify the relationships between the JTC bias, IQ, psychosis, and polygenic liability to schizophrenia and IQ.
A total of 817 first-episode psychosis patients and 1294 population-based controls completed assessments of general intelligence (IQ) and JTC, and provided blood or saliva samples from which we extracted DNA and computed polygenic risk scores (PRS) for IQ and schizophrenia.
The estimated proportion of the total effect of case/control differences on JTC mediated by IQ was 79%. The schizophrenia PRS was non-significantly associated with a higher number of beads drawn (B = 0.47, 95% CI −0.21 to 1.16, p = 0.17), whereas the IQ PRS significantly predicted the number of beads drawn (B = 0.51, 95% CI 0.25–0.76, p < 0.001) and was thus associated with a reduced JTC bias. The JTC bias was more strongly associated with higher levels of psychotic-like experiences (PLEs) in controls, including after controlling for IQ (B = −1.7, 95% CI −2.8 to −0.5, p = 0.006), but did not relate to delusions in patients.
Our findings suggest that the JTC reasoning bias in psychosis might not be a specific cognitive deficit but rather a manifestation or consequence of general cognitive impairment. In the general population, by contrast, the JTC bias is related to PLEs independent of IQ. This work has the potential to inform interventions targeting cognitive biases in early psychosis.
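The mediation estimate reported above can be read through the standard product-of-coefficients decomposition: the indirect effect is the product of the path from case status to the mediator (IQ) and the path from the mediator to the outcome (beads drawn), and the proportion mediated is the indirect effect divided by the total effect. The coefficients below are hypothetical placeholders chosen to yield the reported 79%; only that percentage comes from the study:

```python
# Hypothetical path coefficients (standardized, signs chosen for coherence:
# patients score lower on IQ; higher IQ predicts more beads drawn).
a = -0.60        # case status -> IQ
b = 0.79         # IQ -> number of beads drawn
c_prime = -0.126 # direct effect of case status on beads drawn

indirect = a * b                 # effect transmitted through IQ
total = c_prime + indirect       # total effect of case status
proportion_mediated = indirect / total
print(round(proportion_mediated, 2))  # 0.79, i.e. 79% mediated by IQ
```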