The present study aims to investigate the effect of wholegrain and legume consumption on the incidence of age-related cataract in an older Australian population-based cohort. The Blue Mountains Eye Study (BMES) is a population-based cohort study of eye diseases among adults aged 49 years or older (1992–1994, n 3654). Of 2334 participants in the second examination of the BMES (BMES 2, 1997–2000), 1541 (78·3 % of survivors) who had wholegrain and legume consumption estimated from the FFQ at BMES 2 were re-examined 5 years later (BMES 3). Cataract was assessed from photographs taken during examinations, graded according to the Wisconsin cataract grading system. Multivariable-adjusted logistic regression models were used to assess associations with the 5-year incidence of cataract from BMES 2 (baseline) to BMES 3. The 5-year incidence of cortical, nuclear and posterior subcapsular (PSC) cataract was 18·2, 16·5 and 5·9 %, respectively. After adjustment for age, sex and other factors, total wholegrain consumption at baseline was not associated with the incidence of any type of cataract. High consumption of legumes showed a protective association with incident PSC cataract (5th quintile: adjusted OR 0·37; 95 % CI 0·15, 0·92), although there was no significant trend across quintiles (P = 0·08). In this older Australian population, we found no associations between wholegrain intake at baseline and the 5-year incidence of the three cataract types. However, intake of legumes in the highest quintile, compared with the lowest quintile, may protect against PSC formation, a finding needing replication in other studies.
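The quintile results above are adjusted odds ratios from logistic regression. As a minimal sketch, an OR and its Wald 95 % CI follow directly from a fitted coefficient and its standard error; the β and SE below are illustrative values chosen to be roughly consistent with the reported legume result, not the study's actual fit:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression coefficient."""
    or_ = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return or_, lo, hi

# Illustrative beta and SE, picked to roughly match the reported
# 5th-quintile legume result (OR 0.37; 95% CI 0.15, 0.92).
or_, lo, hi = odds_ratio_ci(beta=-0.994, se=0.461)
print(f"OR {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

The same exponentiation converts any adjusted logistic coefficient in these abstracts into the reported OR scale.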
Triage at mass gatherings in Australia is commonly performed by staff members with first aid training. There have been no evaluations of the performance of first aid staff with respect to diagnostic accuracy or identification of presentations requiring ambulance transport to hospital.
It was hypothesized that triage decisions by first aid staff would be considered correct in at least 61% of presentations.
A retrospective audit of 1,048 presentations to a single supplier of event health care services in Australia was conducted. The presentations were assessed based on the first measured set of physiological parameters, and the primary triage decision was classified as “expected” if the primary and secondary triage classifications were the same or “not expected” if they differed. The performance of the two triage systems was compared using area under the receiver operating characteristic curve (AUROC) analysis.
The expected decision was made by first aid staff in 674 (71%) of the presentations. Under-triage occurred in 131 (14%) presentations and over-triage in 142 (15%). The primary triage strategy had an AUROC of 0.7644, while the secondary triage strategy had an AUROC of 0.6280, a statistically significant difference (P = .0199).
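The AUROC figures above can be read as concordance probabilities: the chance that a randomly chosen presentation requiring transport is scored higher than one that does not. A hedged sketch using the Mann–Whitney form, with invented labels and triage scores rather than the audit data:

```python
def auroc(truth, scores):
    """Probability a randomly chosen positive outranks a negative (ties count 0.5)."""
    pos = [s for t, s in zip(truth, scores) if t == 1]
    neg = [s for t, s in zip(truth, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# 1 = presentation truly required ambulance transport (synthetic example)
truth     = [1,   1,   1,   0,   0,   0,    0,    1,   0,   0]
primary   = [0.9, 0.8, 0.6, 0.4, 0.3, 0.2,  0.65, 0.7, 0.1, 0.35]
secondary = [0.8, 0.4, 0.6, 0.5, 0.3, 0.45, 0.55, 0.7, 0.2, 0.6]

print(auroc(truth, primary), auroc(truth, secondary))
```

On these toy numbers the simpler "primary" scores discriminate better, mirroring the pattern the audit reports.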
The results support the continued use of first aid trained staff members in triage roles at Australian mass gatherings. Triage tools should be simple, and the addition of physiological variables to improve the sensitivity of triage tools is not recommended because such an approach does not improve the discriminatory capacity of the tools.
OBJECTIVES/SPECIFIC AIMS: Clostridium difficile infection (CDI) is the most common cause of antibiotic-associated diarrhea and an increasingly common infection in children in both hospital and community settings. Between 20% and 30% of pediatric patients will have a recurrence of symptoms in the days to weeks following an initial infection. Multiple recurrences have been successfully treated with fecal microbiota transplantation (FMT), though the body of evidence in pediatric patients is limited primarily to case reports and case series. The goal of our study was to better understand practices, success, and safety of FMT in children as well as identify risk factors associated with a failed FMT in our pediatric patients. METHODS/STUDY POPULATION: This multicenter retrospective analysis included 373 patients who underwent FMT for CDI between January 1, 2006 and January 1, 2017 from 18 pediatric centers. Demographics, baseline characteristics, FMT practices, C. difficile outcomes, and post-FMT complications were collected through chart abstraction. Successful FMT was defined as no recurrence of CDI within 60 days after FMT. Of the 373 patients in the cohort, 342 had known outcome data at two months post-FMT and were included in the primary analysis evaluating risk factors for recurrence post-FMT. An additional six patients who underwent FMT for refractory CDI were excluded from the primary analysis. Unadjusted analysis was performed using the Wilcoxon rank-sum test, Pearson χ² test, or Fisher exact test where appropriate. Stepwise logistic regression was utilized to determine independent predictors of success. RESULTS/ANTICIPATED RESULTS: The median age of included patients was 10 years (IQR 3.0–15.0) and 50% of patients were female. The majority of the cohort was White (89.0%). Comorbidities included 120 patients with inflammatory bowel disease (IBD) and 14 patients who had undergone a solid organ or stem cell transplantation.
Of the 336 patients with known outcomes at two months, 272 (81%) had a successful outcome. Of the 64 (19%) patients who had a recurrence, 35 underwent repeat FMT, which was successful in 20 of the 35 (57%). The overall success rate of FMT in preventing further episodes of CDI in the cohort with known outcome data was 87%. Unadjusted predictors of a primary FMT response are summarized. Based on stepwise logistic regression modeling, the use of fresh stool, FMT delivery via colonoscopy, the lack of a feeding tube, and a lower number of CDI episodes before undergoing FMT were independently associated with a successful outcome. There were 20 adverse events in the cohort assessed to be related to FMT, 6 of which were felt to be severe. There were no deaths assessed to be related to FMT in the cohort. DISCUSSION/SIGNIFICANCE OF IMPACT: The overall success of FMT in pediatric patients with recurrent or severe CDI is 81% after a single FMT. Children without a feeding tube, and those who receive an early FMT, FMT with fresh stool, or FMT via colonoscopy, are less likely to have a recurrence of CDI in the 2 months following FMT. This is the first large study of FMT for CDI in a pediatric cohort. These findings, if confirmed by additional prospective studies, will support alterations in the practice of FMT in children.
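As a quick check, the success percentages above follow directly from the reported counts; this sketch just reproduces that arithmetic:

```python
# Cohort arithmetic from the abstract: 336 patients with known outcomes,
# 272 successes after a single FMT, plus 20 successes among repeat FMTs.
known_outcomes = 336
single_fmt_success = 272          # no recurrence within 60 days of first FMT
repeat_fmt_success = 20           # successes among the 35 repeat FMTs

primary_rate = single_fmt_success / known_outcomes
overall_rate = (single_fmt_success + repeat_fmt_success) / known_outcomes
print(round(primary_rate * 100), round(overall_rate * 100))  # 81, 87
```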
The long-term stability of mechanically exfoliated MoS2 flakes was compared for storage in air and storage under vacuum. Significant changes were observed in MoS2 flakes on samples stored in air, whereas similar flakes on samples stored under vacuum underwent no change. Small speckles appeared on the surface of flakes stored in air, followed by thinning and eventual decomposition of the MoS2 flakes. The speckles are suspected to form through oxidation of MoS2 in the presence of atmospheric oxygen and water molecules, resulting in the formation of hydrated MoO3.
The English spelling system has a variety of rules and exceptions, but both theoretical and empirical accounts have generally concluded that by about age 9 or 10, children master the morphological rule that regular plural nouns (e.g., socks) and third-person singular present verbs (e.g., lacks) are spelled with the inflectional ending –s. In three experiments, however, we found that when forced to rely exclusively on morphological cues, only a minority of primary school children, secondary school children, and even adults performed significantly above chance at choosing the appropriate spelling for novel words presented as inflected or uninflected nouns and verbs. Further, significantly above-chance performance was more common in adults who had attended school until age 18, compared to age 16. We conclude that many spellers, especially those who do not go on to tertiary education, never learn some simple morphological spelling rules, and instead rely on a store of individual word-specific spellings.
Poor physiological self-regulation has been proposed as a potential biological vulnerability for adolescent suicidality. This study tested this hypothesis by examining the effect of parasympathetic stress responses on future suicide ideation. In addition, drawing from multilevel developmental psychopathology theories, the interplay between parasympathetic regulation and friendship support, conceptualized as an external source of regulation, was examined. At baseline, 132 adolescent females (M age = 14.59, SD = 1.39) with a history of mental health concerns participated in an in vivo interpersonal stressor (a laboratory speech task) and completed self-report measures of depressive symptoms and perceived support within a close same-age female friendship. Respiratory sinus arrhythmia (RSA) was measured before and during the speech task. Suicide ideation was assessed at baseline and at 3, 6, and 9 months follow-up. The results revealed that females with greater relative RSA decreases to the laboratory stressor were at higher risk for reporting suicide ideation over the subsequent 9 months. Moreover, parasympathetic responses moderated the effect of friendship support on suicide ideation; among females with mild changes or higher relative increases in RSA, but not more pronounced RSA decreases, friendship support reduced risk for future suicide ideation. Findings highlight the crucial role of physiological and external regulation sources as protective factors for youth suicidality.
We aimed to examine the relationship between dietary glycaemic index (GI) and glycaemic load of foods consumed, intakes of carbohydrates, sugars and fibre, and the prevalence of depressive symptoms in older adults. Data collected from 2334 participants aged 55+ years and 1952 participants aged 60+ years were analysed. Dietary information was collected using a semi-quantitative FFQ. Depressive symptoms were defined by antidepressant use or by scores on either the Mental Health Index (MHI) of the 36-Item Short-Form Survey or the Center for Epidemiologic Studies Depression-10 Scale. Participants in the highest v. lowest tertile of dietary GI intake had increased odds of depressive symptoms (assessed by the MHI scale), multivariable-adjusted OR 1·55 (95 % CI 1·12, 2·14). Participants in the highest compared with lowest tertile of fruit consumption had reduced odds of prevalent depressive symptoms, multivariable-adjusted OR 0·66 (95 % CI 0·46, 0·95). Total fibre, vegetable fibre and breads/cereal fibre intakes were all inversely associated with the prevalence of depressive symptoms, with global P values of 0·03, 0·01 and 0·03, respectively. Participants in the second v. first tertile of vegetable consumption had 41 % reduced odds of prevalent depressive symptoms, multivariable-adjusted OR 0·59 (95 % CI 0·40, 0·88). We show that dietary GI and fibre intakes, as well as consumption of fruits and vegetables, are associated with the prevalence of depressive symptoms.
We prospectively assessed the (1) frequency and socio-economic correlates of takeaway food consumption during adolescence; and (2) association of frequent takeaway food consumption with intakes of major food groups, anthropometric measures and blood pressure (BP). In total, 699 Sydney schoolchildren (380 girls and 319 boys) who had dietary data at both 12 and 17 years of age were included for analyses. Takeaway food consumption was self-reported and based on a single question. Anthropometric measures and BP were collected. The proportion of participants who ate takeaway foods once per week or more increased significantly over the 5 years from age 12 to age 17, from 35·5 to 44·1 % (P<0·0001). At age 12, girls compared with boys had reduced odds of eating takeaway foods once per week or more at age 17 (P=0·01), multivariable-adjusted OR 0·63 (95 % CI 0·44, 0·90). Children who ate takeaway foods once per week or more at age 12 had significantly lower mean fruit (220·3 v. 253·0 g/d; P=0·03) and vegetable consumption (213·2 v. 247·7 g/d; P=0·004) 5 years later, at age 17. Frequent takeaway food consumption at age 12 was not associated with anthropometric indices or BP at age 17. Consumption of takeaway foods became more frequent during adolescence, particularly among boys, and it was associated with reduced intake of fruits and vegetables.
A key issue in two-dimensional structures composed of atom-thick sheets of electronic materials is the dependence of the properties of the combined system on the features of its parts. Here, we introduce a simple framework for the study of the electronic structure of layered assemblies based on perturbation theory. Within this framework, we calculate the band structure of commensurate and twisted bilayers of graphene (Gr) and hexagonal boron nitride (h-BN), and of a Gr/h-BN heterostructure, which we compare with reference full-scale density functional theory calculations. This study presents a general methodology for computationally efficient calculations of two-dimensional materials and also demonstrates that for relatively large twist in the graphene bilayer, the perturbation of electronic states near the Fermi level is negligible.
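For context, the unperturbed starting point for such calculations on graphene is the standard nearest-neighbour tight-binding model, with bands E(k) = ±t|Σ_δ e^{ik·δ}| that touch at the Dirac points. A minimal sketch (carbon–carbon distance set to 1; t = 2.7 eV is a commonly quoted hopping value, not a parameter from this paper, and this is not the paper's perturbative bilayer scheme):

```python
import cmath
import math

t = 2.7  # eV; commonly quoted nearest-neighbour hopping for graphene
# Three carbon-carbon bond vectors, with bond length a_cc = 1
deltas = [(1.0, 0.0), (-0.5, math.sqrt(3) / 2), (-0.5, -math.sqrt(3) / 2)]

def band_magnitude(kx, ky):
    """|E(k)| = t|f(k)| for the +/- bands of nearest-neighbour graphene."""
    f = sum(cmath.exp(1j * (kx * dx + ky * dy)) for dx, dy in deltas)
    return t * abs(f)

e_gamma = band_magnitude(0.0, 0.0)                          # Gamma point: 3t
e_dirac = band_magnitude(0.0, 4 * math.pi / (3 * math.sqrt(3)))  # K point: 0
print(e_gamma, e_dirac)
```

The vanishing gap at K is the massless Dirac dispersion whose fate under interlayer perturbation the paper tracks.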
Current systems for safely manipulating values containing names only support simple binding structures for those names. As a result, few tools exist to safely manipulate code in those languages for which name problems are the most challenging. We address this problem with Romeo, a language that respects α-equivalence on its values, and which has access to a rich specification language for binding, inspired by attribute grammars. Our work has the complex-binding support of David Herman's λm, but is a full-fledged binding-safe language like Pure FreshML.
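The α-equivalence that Romeo respects can be illustrated on a toy lambda calculus: two terms are α-equivalent when they coincide after names are erased into de Bruijn indices. A minimal sketch (the term encoding here is invented for illustration; Romeo's attribute-grammar-style binding specifications handle far richer structures):

```python
def to_debruijn(term, env=()):
    """term: ('var', name) | ('lam', name, body) | ('app', f, a)."""
    kind = term[0]
    if kind == 'var':
        return ('var', env.index(term[1]))  # distance to the enclosing binder
    if kind == 'lam':
        return ('lam', to_debruijn(term[2], (term[1],) + env))
    return ('app', to_debruijn(term[1], env), to_debruijn(term[2], env))

def alpha_eq(t1, t2):
    """Alpha-equivalence: equality of the nameless (de Bruijn) forms."""
    return to_debruijn(t1) == to_debruijn(t2)

id_x = ('lam', 'x', ('var', 'x'))
id_y = ('lam', 'y', ('var', 'y'))
print(alpha_eq(id_x, id_y))  # \x. x and \y. y differ only in bound names
```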
Documentation of on-farm sustainability in agricultural sectors is becoming an essential element to ensure market access. An assessment process was developed to help soybean farmers document practices and verifiable advances in community, environmental and economic sustainability. Technical difficulties in analyzing and summarizing such assessment data include a large number of practices, correlation in variables, and use of discrete measures. By combining non-negative principal components analysis and common-weight data envelopment analysis, we overcame these difficulties to calculate a composite sustainability index for each individual farm and for the farm group as a whole. Applying this method to assessment data from 410 US Midwestern soybean farmers gave average sustainability scores of 0.846 and 0.842 for the soybean-specific and whole-farm assessments, respectively. Scenario analysis examined the impact if the bottom 10% of growers adopted the top ten sustainability drivers identified by the analysis. The average sustainability score only increased by 2%, but the minimum score increased from 0.515 to 0.647 for the soybean-specific assessment, and from 0.624 to 0.685 for the whole-farm assessment, while the lowest 10th percentile increased from 0.635 to 0.819 for the soybean-specific assessment, and from 0.634 to 0.920 for the whole-farm assessment. These results suggest that significant advancements could be made through focused efforts to improve adoption of sustainable practices by soybean farmers at the lower end of the spectrum.
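A hedged sketch of the composite-index idea: weight each practice indicator by the leading principal component (clipped to non-negative weights) and scale each farm's weighted sum so the best-performing farm scores 1. The data are synthetic, and this simplification omits the paper's common-weight DEA step:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((20, 5))  # 20 hypothetical farms x 5 practice indicators

# Leading eigenvector of the indicator covariance matrix
cov = np.cov(X, rowvar=False)
_, vecs = np.linalg.eigh(cov)      # eigenvalues ascending
w = vecs[:, -1]
if w.sum() < 0:                    # eigenvector sign is arbitrary
    w = -w
w = np.clip(w, 0.0, None)          # non-negative weights only
w = w / w.sum()

scores = X @ w                     # common weights across all farms
scores = scores / scores.max()     # best farm anchors the index at 1
print(scores.min(), scores.max())
```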
Atrazine has been used for control of many weeds, primarily broadleaf weeds, in U.S. corn fields since 1957. Recently, the adoption of glyphosate-resistant corn hybrids has led to glyphosate eclipsing atrazine as the most commonly used herbicide in corn production. However, the evolution and spread of glyphosate-resistant weeds is a major concern. Atrazine use in Wisconsin is prohibited in 102 areas encompassing 0.49 million ha where total chlorinated residues were found in drinking water wells at concentrations > 3 μg L−1. Atrazine has been prohibited in many of those areas for > 10 yr, providing an opportunity to evaluate weed community composition differences due to herbicide regulation. The question arises: has the abundance of broadleaf weeds increased, coupled with an increased reliance on glyphosate, where atrazine use has been discontinued? To answer this, an online questionnaire was distributed to Wisconsin growers in June, and weeds present in 343 fields were counted from late July through mid-September in 2012 and 2013. Data were summarized for frequency, uniformity, density, and relative abundance to compare weed community composition in fields with discontinued vs. recent atrazine use. Growers used glyphosate in 70 vs. 54% of fields with discontinued vs. recent atrazine use, respectively (P = 0.021). Moreover, broadleaf weeds were found more frequently (73 vs. 61%; P = 0.03), had 50% greater in-field uniformity (P = 0.002), and occurred at twofold greater density (0.4 vs. 0.19 plants m−2; P < 0.0001) in discontinued vs. recent atrazine-use fields. Changes were most evident with troublesome glyphosate-resistant broadleaf weeds such as Amaranthus species and giant ragweed. In conclusion, weed community composition included more broadleaf weeds in fields where atrazine had not been used in the recent decade, coupled with greater glyphosate use.
These results provide evidence of negative long-term implications for glyphosate-resistance management where growers have increased their reliance on glyphosate in place of atrazine.
Atrazine is an important herbicide for broadleaf weed control in corn. Use rates have declined in many corn production systems due to environmental concerns and the availability of other effective herbicides, especially glyphosate in glyphosate-resistant hybrids. However, using multiple effective herbicide modes of action is ever more important because the occurrence of herbicide-resistant weeds is increasing. An experiment comparing application timings of reduced atrazine rates, intended to benefit resistance management in broadleaf weeds while protecting corn yield, was conducted in Wisconsin across four site-years in 2012 and 2013. Herbicide treatments consisted of five atrazine rate and timing combinations and three POST base herbicides: glyphosate, glufosinate, and tembotrione. Metolachlor was applied PRE at 2.1 kg ai ha−1 for grass control in all treatments. A linear regression model estimated that atrazine rates ≥ 1.0 kg ai ha−1 applied PRE would prevent exposure of common lambsquarters plants to POST herbicides, but giant ragweed and velvetleaf exposure was not influenced by timing. Corn yield was also not influenced by atrazine rate and timing combinations at the α = 0.05 level; however, at P = 0.06, corn yield was greater for atrazine applied PRE at 1.1 kg ha−1 than for atrazine applied PRE at 0.5 kg ha−1, POST at 1.1 kg ha−1, or not at all. In summary, higher rates of atrazine applied PRE may improve yield, as reported by others, but this study concludes that reduced rates of atrazine (i.e., ≤ 1.1 kg ha−1) applied to corn in a POST tank mixture provided more consistent control of giant ragweed, velvetleaf, and common lambsquarters compared with atrazine applied PRE. This information should help direct the timing of low-rate atrazine applications to improve proactive herbicide resistance management.
It is unclear whether lifestyle modifications, such as dietary changes, should be advocated to prevent olfactory dysfunction. We investigated the association between dietary intakes of fats (saturated, mono-unsaturated and polyunsaturated fats, and cholesterol) and related food groups (nuts, fish, butter, margarine) with olfactory impairment. There were 1331 and 667 participants (older than 60 years) at baseline and 5-year follow-up, respectively, with complete olfaction and dietary data. Dietary data were collected using a validated semi-quantitative FFQ. Olfaction was measured using the San Diego Odor Identification Test. In a cross-sectional analysis of baseline data, those in the highest v. lowest quartile of n-6 PUFA intake had reduced odds of having any olfactory impairment, multivariable-adjusted OR 0·66 (95 % CI 0·44, 0·97), P for trend = 0·06. Participants in the highest v. lowest quartile of margarine consumption had a 65 % reduced odds of having moderate/severe olfactory impairment (P for trend = 0·02). Participants in the highest quartile compared to the lowest quartile (reference) of nut consumption had a 46 % (P for trend = 0·01) and 58 % (P for trend = 0·001) reduced odds of having any or mild olfactory impairment, respectively. Older adults in the highest v. lowest quartile of fish consumption had 35 % (P for trend = 0·03) and 50 % (P for trend = 0·01) reduced likelihood of having any or mild olfactory impairment, respectively. In longitudinal analyses, a marginally significant association was observed between nut consumption and incidence of any olfactory impairment, highest v. lowest quartile of nut consumption: OR 0·61 (95 % CI 0·37, 1·00). Older adults with the highest consumption of nuts and fish had reduced odds of olfactory impairment, independent of potential confounding variables.
Endothelial dysfunction and arterial stiffness are early predictors of CVD. Intervention studies have suggested that diet is related to vascular health, but most prior studies have tested individual foods or nutrients and relied on small samples of younger adults. The purpose of the present study was to examine the relationships between adherence to the 2010 Dietary Guidelines for Americans and vascular health in a large cross-sectional analysis. In 5887 adults in the Framingham Heart Study Offspring and Third Generation cohorts, diet quality was quantified with the 2010 Dietary Guidelines Adherence Index (DGAI-2010). Endothelial function was assessed via brachial artery ultrasound and arterial stiffness via arterial tonometry. In age-, sex- and cohort-adjusted analyses, a higher DGAI-2010 score (greater adherence) was modestly associated with a lower resting flow velocity, hyperaemic response, mean arterial pressure, carotid–femoral pulse wave velocity (PWV), and augmentation index, but not associated with resting arterial diameter or flow-mediated dilation (FMD). In multivariable models adjusting for cardiovascular risk factors, only the association of a higher DGAI-2010 score with a lower baseline flow velocity and augmentation index persisted (β = −0·002, P = 0·003 and β = −0·05 ± 0·02, P < 0·001, respectively). Age-stratified multivariate-adjusted analyses suggested that the relationship of higher DGAI-2010 scores with lower mean arterial pressure, PWV and augmentation index was more pronounced among adults younger than 50 years. Better adherence to the 2010 Dietary Guidelines for Americans, particularly in younger adults, is associated with a lower peripheral blood flow velocity and arterial wave reflection, but not FMD. The present results suggest a link between adherence to the Dietary Guidelines and favourable vascular health.
The Murchison Widefield Array (MWA) is a new low-frequency interferometric radio telescope built in Western Australia at one of the locations of the future Square Kilometre Array (SKA). We describe the automated radio-frequency interference (RFI) detection strategy implemented for the MWA, which is based on the aoflagger platform, and present 72–231 MHz RFI statistics from 10 observing nights. RFI detection removes 1.1% of the data. RFI from digital TV is observed 3% of the time, owing to occasional ionospheric or atmospheric propagation. After RFI detection and excision, almost all data can be calibrated and imaged without further RFI mitigation efforts, including observations within the FM and digital TV bands. The results are compared to a previously published Low-Frequency Array (LOFAR) RFI survey. The remote location of the MWA results in a substantially cleaner RFI environment than LOFAR's, but adequate RFI detection is still required before data can be analysed. We include specific recommendations designed to make the SKA more robust to RFI, including: the availability of sufficient computing power for RFI detection; accounting for RFI in the receiver design; a smooth band-pass response; and the capability of RFI detection at high time and frequency resolution (second- and kHz-scale, respectively).
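The flagging step can be illustrated with a much simpler robust-threshold scheme than aoflagger's SumThreshold: estimate the band level and scatter with the median and MAD, then flag outliers. The spectrum and thresholds below are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
spectrum = rng.normal(1.0, 0.05, 1024)  # smooth band with thermal-like noise
spectrum[100] += 5.0                     # injected narrow-band interferer
spectrum[600:603] += 2.0                 # injected broader interferer

median = np.median(spectrum)
mad = np.median(np.abs(spectrum - median))   # robust scale estimate
sigma = 1.4826 * mad                         # MAD -> Gaussian-equivalent sigma
flags = np.abs(spectrum - median) > 5 * sigma  # flag > 5-sigma outliers

print(flags.sum(), bool(flags[100]), bool(flags[600]))
```

A robust estimator matters here: the interferers themselves would inflate a plain standard deviation and hide weaker RFI, whereas the median/MAD pair is barely affected by them.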