Ergothioneine (ERG) is an unusual thio-histidine betaine amino acid that has potent antioxidant activities. It is synthesised by a variety of microbes, especially fungi (including in mushroom fruiting bodies) and actinobacteria, but is not synthesised by plants and animals, which acquire it via the soil and their diet, respectively. Animals have evolved a highly selective transporter for it, known as solute carrier family 22, member 4 (SLC22A4) in humans, signifying its importance, and ERG may even have the status of a vitamin. ERG accumulates differentially in various tissues, according to their expression of SLC22A4, favouring those such as erythrocytes that may be subject to oxidative stress. Mushroom or ERG consumption seems to provide significant protection against oxidative stress in a large variety of systems. ERG seems to have strong cytoprotective status, and its concentration is lowered in a number of chronic inflammatory diseases. It has been passed as safe by regulatory agencies, and may have value as a nutraceutical and antioxidant more generally.
This chapter discusses and reviews research on the relationship between two closely aligned concepts: intelligence and reasoning. We begin by defining reasoning in a general sense. Next, we review prominent theories and models of intelligence and reasoning in both the psychometric and cognitive psychological traditions, highlighting how the two constructs are both intertwined yet nonetheless conceptually discriminable. We follow by discussing issues involved in validly measuring reasoning, touching on considerations, concerns, and evidence informed by the cognitive and psychometric perspectives. Then, we review the relationship between reasoning and allied constructs and domains, including expertise, practical outcomes (e.g., educational and workplace achievement), working memory, and critical thinking. We conclude by sketching multiple avenues for future research.
The rising need for crop diversification to mitigate the impacts of climate change on food security urges the exploration of crop wild relatives (CWR) as potential genetic resources for crop improvement. This study aimed to assess the diversity of CWR of the Indian Ocean islands of Mauritius and Rodrigues and to propose cost-effective conservation measures for their sustainable use. A comprehensive list of the native species was collated from The Mauritius Herbarium and published literature. Each species was assessed for the economic value of its related crop, utilization potential for crop improvement, relative distribution, occurrence status and Red List conservation status, using a standard scoring method for prioritization. The occurrence data of the priority species were collected, verified, geo-referenced and mapped. A total of 43 crop-related species were identified for both islands and 21 species were prioritized for active conservation. The CWR diversity hotspots in Mauritius included Mondrain, followed by Florin and Le Pouce Mountain. Although a wide diversity of CWR has been recorded on both islands, most do not relate to major economic crops in use; therefore, only a few species may be gene donors to economic crops at the regional and global level. For example, coffee, a major global beverage crop, has three wild relatives on Mauritius, which could potentially be of interest for future predictive characterization.
Successful conservation strategies require that taxa are prioritized because resources for planning and implementation are always limited. In this study, we created a partial checklist of crop wild relatives (CWR) that occur in the Southern African Development Community (SADC) region and identified the taxa of highest priority for regional conservation planning based on their importance for food and economic security. We found that the region contains over 1900 wild relatives of species cultivated for food, beverages, ornamental, forage/fodder, forestry, medicinal, environmental and other uses. Prioritization of these species was based on two criteria: (i) the value of the related crop for human food and economic security in the region and/or globally, and (ii) the potential or known value of the wild relatives of those crops for crop improvement. The region contains 745 CWR species related to 64 human food and beverage crops that are of high socioeconomic importance and 100 of these are of immediate priority for conservation action. The results of this study show that the SADC region contains a wealth of CWR diversity that is not only of value for food and economic security within the region but also globally. Furthermore, this study represents the first step in developing a CWR conservation and sustainable use strategy for the region, where its implementation would contribute to food security and well-being.
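The two-criteria prioritization described above can be sketched as a simple additive scoring scheme. The species names, scores, weighting, and cut-off below are all hypothetical illustrations; the study's actual scoring protocol is not detailed in the abstract.

```python
# Hypothetical sketch of a two-criteria CWR prioritization scheme.
# All species, scores, and the threshold are illustrative assumptions.
from typing import NamedTuple


class CWR(NamedTuple):
    species: str
    crop_value: int      # socio-economic value of the related crop (1 = low, 5 = high)
    breeding_value: int  # known/potential value for crop improvement (1 = low, 5 = high)


def priority_score(t: CWR) -> int:
    # Equal weighting of the two criteria, as a simple illustration.
    return t.crop_value + t.breeding_value


taxa = [
    CWR("Vigna sp. A", 5, 4),       # hypothetical wild relative of a staple food crop
    CWR("Coffea sp. B", 4, 2),      # hypothetical wild coffee relative
    CWR("Ornamental sp. C", 1, 1),  # related crop has low food/economic value
]

THRESHOLD = 6  # illustrative cut-off for "immediate priority for conservation action"
priority = [t.species for t in taxa if priority_score(t) >= THRESHOLD]
print(priority)  # -> ['Vigna sp. A', 'Coffea sp. B']
```

In practice, national or regional checklists contain hundreds of taxa, so an explicit, reproducible score like this is what makes prioritization across ~1900 wild relatives tractable.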
Crop wild relatives (CWR) are a vital source of traits for crop improvement – therefore, conserving CWR diversity is critical to ensure food, nutrition and economic security. Efficient CWR conservation planning is a critical first step to maintain this natural resource for future use. The development of National Strategic Action Plans (NSAPs) for the conservation and sustainable use of CWR is an effective means of conservation planning and also plays an important role in sensitizing policy makers and other stakeholders to the importance of CWR. Tools to guide and facilitate countries in CWR national conservation planning and NSAP development have been prepared, namely: an ‘Interactive Toolkit for CWR Conservation Planning’, a ‘Template for the Preparation of a NSAP for the Conservation and Sustainable Use of CWR’, a ‘Template for the Preparation of a Technical Background Document for a NSAP for the Conservation and Sustainable Use of CWR’, a ‘CWR Checklist and Inventory Data Template’ and an ‘Occurrence Data Collation Template’. In this short communication, we briefly explain what these tools are, how they were developed, how they can be used and where they can be found.
Borderline personality disorder (BPD) is characterised by recurring crises, hospitalisations, self-harm, suicide attempts, addictions, episodes of depression, anxiety and aggression and lost productivity. The objective of this study is to determine the use of direct health care resources by persons with BPD in Ireland and the corresponding costs.
This prevalence-based micro-costing study was undertaken on a sample of 196 individuals with BPD attending publicly funded mental health services in Ireland. All health care costs were assessed using a resource utilisation questionnaire completed by mental health practitioners. A probabilistic sensitivity analysis, using a Monte Carlo simulation, was performed to examine uncertainty.
Total direct healthcare cost per individual was €10 844 annually (ranging from €5228 to €20 609). Based on a prevalence of 1% and an adult population (18–65 years) of 2.87 million, we derived that there were 28 725 individuals with BPD in Ireland. The total yearly cost of illness was calculated to be up to €311.5 million.
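The population-level estimate follows from simple prevalence-based arithmetic, which the sketch below reproduces from the abstract's figures. The uniform resampling at the end is only an illustration of a probabilistic sensitivity analysis; the study's actual Monte Carlo cost distributions are not given in the abstract.

```python
import random

# Prevalence-based cost-of-illness arithmetic, using figures from the abstract.
mean_cost = 10_844            # EUR, direct healthcare cost per person per year
adult_population = 2_872_500  # adults aged 18-65 (~2.87 million)
prevalence = 0.01             # 1% BPD prevalence

n_cases = adult_population * prevalence
total_millions = n_cases * mean_cost / 1e6
print(round(n_cases), round(total_millions, 1))  # -> 28725 311.5

# Illustrative probabilistic sensitivity analysis: resample the per-person cost
# within the reported range. A uniform draw is an assumption made here for the
# sketch; the study's actual sampling distributions are not stated.
random.seed(1)
draws = sorted(random.uniform(5_228, 20_609) * n_cases / 1e6
               for _ in range(10_000))
low, high = draws[250], draws[-251]  # empirical 95% interval of total cost
```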
There is a dearth of data on health care resource use and costs of community mental health services in Ireland. The absence of this data is a considerable constraint to research and decision-making in the area of community mental health services. This paper contributes to the limited literature on resource use and costs in community mental health services in Ireland. The absence of productivity loss data (e.g. absenteeism and presenteeism), non-health care costs (e.g. addiction treatment), and indirect costs (e.g. informal care) from study participants is a limitation of this study.
Tail docking of pigs is commonly performed to reduce the incidence of unwanted tail-biting behaviour. Two docking methods are commonly used: blunt trauma cutting (i.e. using side clippers), or cutting and concurrent cauterisation using a hot cautery iron. A potential consequence of tail amputation is the development of neuromas at the docking site. Neuromas have been linked to neuropathic pain, which can influence the longer-term welfare of affected individuals. To determine whether method of tail docking influences the extent of neuroma formation, 75 pigs were allocated to one of three treatments at birth: tail docked using clippers; tail docked using cautery iron; tail left intact. Tail docking was performed at 2 days of age and pigs were kept under conventional conditions until slaughter at 21 weeks of age. Tails were removed following slaughter and subjected to histological examination. Nerve histomorphology was scored according to the following scale: 1=discrete well-organised nerve bundles; 2=moderate neural proliferation and disorganisation affecting more than half of the circumference of the tail; 3=marked neural proliferation to form almost continuous disorganised bundles or non-continuous enlarged bundles compressing the surrounding connective tissue. Scores of 2 or 3 indicated neuroma formation. Scores were higher in docked pigs than undocked pigs (P<0.001), but did not differ between pigs docked using clippers and those docked using cautery (P=0.23). The results indicate that tail docking using either clippers or cautery results in neuroma formation, thus having the potential to affect long-term pig welfare.
We describe in this work the BASS survey for brown dwarfs in young moving groups of the solar neighborhood, and summarize the results that it generated. These include the discovery of the 2MASS J01033563–5515561 (AB)b and 2MASS J02192210–3925225 B young companions near the deuterium-burning limit, as well as 44 new low-mass stars and 69 new brown dwarfs with a spectroscopically confirmed low gravity. Among those, ~20 have estimated masses within the planetary regime, one is a new L4 γ bona fide member of AB Doradus, three are TW Hydrae candidates with later spectral types (L1–L4) than all of its previously known members and six are among the first contenders for low-gravity ≥ L5 β/γ brown dwarfs, reminiscent of WISEP J004701.06+680352.1, PSO J318.5338–22.8603 and VHS J125601.92–125723.9 b. Finally, we describe a future version of this survey, BASS-Ultracool, that will specifically target ≥ L5 candidate members of young moving groups. First experiments in designing the survey have already led to the discovery of a new T dwarf bona fide member of AB Doradus, as well as the serendipitous discoveries of an L9 subdwarf and an L5 + T5 brown dwarf binary.
Socio-economic status (SES) has been associated with measures of diet quality; however, such measures have not directly captured overall eating practices in individuals. Based on the factor analysis of fifty-six food groups from a FFQ, associations between patterns of food consumption and SES were examined in a nationwide sample of 17 062 black (34·6 %) and white participants (age >45 years) from the REasons for Geographic And Racial Differences in Stroke (REGARDS) study. Logistic regression models adjusted for age, sex, racial group and geographic region were used to examine adherence to five emergent dietary patterns (convenience, plant-based, sweets/fats, southern and alcohol/salads) according to four levels each of individual education, household income and community-level SES. Further models assessed adherence to these dietary patterns by racial group, and an overall model including both racial groups examined whether the relationships between SES and adherence to these dietary patterns differed among black and white participants. For all three measures of SES, higher SES was associated with greater adherence to plant-based and alcohol/salads patterns, but lower adherence to sweets/fats and southern patterns. Statistically significant differences between black and white participants were observed in the associations between household income and adherence to alcohol/salads, individual education and adherence to plant-based and sweets/fats, and community SES and adherence to convenience patterns. As adherence to dietary patterns has been shown to be associated with health outcomes in this population (e.g. stroke), the present study offers valuable insight into behavioural and environmental factors that may contribute to health disparities in the diverse US population.
Field studies were conducted from 2007 to 2009 in East Lansing, MI to evaluate three residual herbicide programs, three POST herbicide application timings, and two POST herbicides in glyphosate- and glufosinate-resistant corn. Herbicide programs included a residual PRE-applied herbicide followed by (fb) POST application (residual fb POST), a residual herbicide tank-mixed with a POST herbicide (residual + POST), and a nonresidual POST. Three POST herbicide application timings included early POST (EP), mid-POST (MP), and late POST (LP) at an average corn growth stage of V3/V4, V4/V5, and V5/V6, respectively. The two POST herbicides evaluated were glyphosate and glufosinate. Control of common lambsquarters and giant foxtail was evaluated 28 d after the LP application. Glyphosate often provided greater weed control than glufosinate. The LP application resulted in greater giant foxtail control compared with the EP application timing, which may be attributed to control of late-emerging weeds. The EP application timing improved common lambsquarters control compared with the LP application timing. The residual + POST program resulted in greater weed control compared with the residual fb POST program in all years. The effect of residual herbicide program, POST herbicide, and POST application timing on corn grain yield varied by year. In 2007, the use of glyphosate resulted in higher grain yield compared with glufosinate. In 2008, corn grain yield was the highest in the PRE fb POST program and with POST applications at EP and MP. To provide the most consistent weed control and minimize the likelihood of grain yield reductions, a PRE fb POST program applied at EP or MP is recommended.
Glyphosate-resistant (GR) alfalfa offers growers new options for weed control in alfalfa. One potential benefit of using GR alfalfa is increased longevity of an alfalfa stand under frequent harvests. It was hypothesized that GR alfalfa would have a greater longevity because of removal of weed interference with minimal crop injury. To study GR alfalfa yield, weed invasion, alfalfa stand persistence, and relative forage quality (RFQ), a field experiment with three weed control methods (no herbicide, glyphosate, and hexazinone) under two harvest frequencies (high and moderate) was established in August 2003 at the Michigan State University Agronomy Farm in East Lansing, MI. Forage yield of established alfalfa was not adversely affected by herbicide treatments. There were no differences in weed biomass between alfalfa treated with glyphosate and that treated with hexazinone, except in 2007. Average GR alfalfa stand density decreased approximately 90% (from 236 to 27 plant m−2), and yield decreased approximately 30% (from 11.04 to 7.87 Mg ha−1) during the 7-yr period (2004 to 2010) of the experiment. Stand density of GR alfalfa showed natural thinning during the 7-yr period regardless of harvest intensity or herbicide treatment. In most production years (4 out of 5 yr), relative forage quality of GR alfalfa was higher under a high-intensity harvesting system (4 to 5 harvests yr−1) than it was with a moderate intensity harvesting system (3 to 4 harvests yr−1). Relative forage quality was not affected by weed removal with herbicides in most years. Weed removal and harvest intensity in established GR alfalfa had no effect on stand persistence.
It is widely believed that planets form in accretion discs through the growth of small dust and ice grains. To verify scenarios of protoplanetary disc processes, including the transport of material in both the vertical and radial directions, it is crucial to understand the interaction of small dust and ice particles with their surroundings, i.e., with the gas, starlight, and other ice and dust particles. In first laboratory experiments, we observe trapped, irregularly shaped water-ice particles that levitate for up to half an hour in a vacuum chamber at a pressure of about 2 mbar due to photophoresis and thermophoresis. While stably levitating, they rotate preferentially about their vertical axis. The physics leading to the levitation is explained and the results of an analysis of the particle rotation are presented.
The introduction of glyphosate-resistant (GR) alfalfa offers a new weed management system for alfalfa establishment; however, alfalfa seeding rates are based on conventional cultivars. Determining optimum seeding rates allows forage producers to maximize yield, quality, and profitability with GR alfalfa. Field experiments were established in 2005 and 2006 to determine the effect of seeding rate and weed control on GR alfalfa yield, forage quality, and persistence up to 3 yr after establishment. Seeding rates of 4.5, 9.0, and 18 kg ha−1 were evaluated. Weed control methods during the seeding year included no herbicide, glyphosate applied once before the first harvest, and glyphosate applied once before the first harvest and then 7 to 10 d following subsequent harvests. Alfalfa yield was greater at higher seeding rates and when weeds were removed with glyphosate. Season forage yields were the greatest with the 18 kg ha−1 seeding rate and where no herbicide was applied. Weed biomass often was lower at the higher seeding rates and was 91 to 98% lower in the glyphosate treatments compared to the nontreated. Forage quality was not affected by seeding rate but varied by herbicide treatment depending on establishment year. Plant density increased with seeding rate and treatment effects persisted for three growing seasons. Herbicide treatment did not affect stand density as greatly as seeding rate and did not influence stand longevity.
A small-angle neutron scattering (SANS) investigation of saturated aqueous proton exchange membranes is presented. Our membranes were synthesized by radiation-induced grafting of poly(ethylene-alt-tetrafluoroethylene) (ETFE) with styrene in the presence of crosslinker (divinylbenzene, DVB) and the polystyrene was sulfonated subsequently. The contrast variation method was used to understand the relationship between morphology, water uptake, and proton conductivity. We find that the membranes are separated into two phases, mostly following the morphology already defined in the semi-crystalline ETFE base film. The amorphous phase hosts the water and swells upon hydration, swelling being inversely proportional to the degree of crosslinking. Proton conductivity and volumetric fraction of water are related by a power law, indicating a percolated and most likely random network of finely dispersed aqueous pores in the hydrophilic domains.
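The reported power-law relation between proton conductivity and water volume fraction, σ = A·φⁿ, is linear in log-log space, so the exponent can be recovered by a least-squares line fit. The data and exponent below are synthetic illustrations; the paper's fitted parameters are not quoted in the abstract.

```python
import math

# Synthetic data obeying sigma = A * phi**n with an assumed exponent n = 1.8
# (illustrative only; the study's fitted values are not given in the abstract).
A_true, n_true = 0.12, 1.8
phi = [0.10, 0.15, 0.20, 0.25, 0.30]          # water volume fraction
sigma = [A_true * p**n_true for p in phi]     # proton conductivity, S/cm

# A power law is a straight line in log-log space:
#   log(sigma) = n * log(phi) + log(A)
x = [math.log(p) for p in phi]
y = [math.log(s) for s in sigma]
xm, ym = sum(x) / len(x), sum(y) / len(y)
n_fit = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
         / sum((xi - xm) ** 2 for xi in x))
logA_fit = ym - n_fit * xm

print(round(n_fit, 3), round(math.exp(logA_fit), 3))  # -> 1.8 0.12
```

A percolated pore network is typically inferred when such a fit holds over the whole hydration range with a single exponent, as the abstract describes.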
Field studies were conducted in 2004 and 2005 in Michigan to determine the effect of seeding establishment method and weed control on forage quality of glyphosate-resistant alfalfa in the establishment year. Seeding methods included alfalfa only (clear-seeding) and alfalfa with a companion crop of oat (companion-seeding). Herbicide treatments included an untreated control and glyphosate treatment for both establishment systems, and either imazamox in the clear-seeding system or imazamox + clethodim in the companion-seeding system. The greatest differences among treatments in forage quality were observed at the first harvest in both establishment years. Results suggest high quality, productive alfalfa stands can be established utilizing glyphosate-resistant alfalfa in a clear seeding system.
Glyphosate-resistant alfalfa offers new weed control options for alfalfa establishment. Field studies were conducted in 2004 and 2005 to determine the effect of establishment method and weed control method on forage production and alfalfa stand establishment. Seeding methods included clear seeding and companion seeding with oats. Herbicide treatments included glyphosate, imazamox, imazamox + clethodim, and no herbicide. Temporary stunting from the glyphosate treatments was observed (< 7%); however, injury did not reduce forage yield or stand density in 2004. No glyphosate injury was observed in 2005. Weed control with glyphosate was more consistent than with imazamox or imazamox + clethodim. In 2004, total seasonal forage yield, which consisted of alfalfa, weeds, and oats (in some treatments), was the highest where no herbicide was applied in the oat companion crop and was reduced where herbicides were applied in both establishment systems. In 2005, seeding method or weed control method did not affect total seasonal forage production. Alfalfa established with the clear-seeded method and treated with glyphosate yielded the highest alfalfa dry matter in both years. Imazamox injury reduced first-harvest alfalfa yield in the clear-seeded system in both years. When no herbicide was applied, alfalfa yield was higher in the clear-seeded system. The oat companion crop suppressed alfalfa yield significantly in both years. Alfalfa established with an oat companion crop had a lower weed biomass than the clear-seeded system where no herbicide was applied in both years.
Common dandelion has developed into a troublesome agronomic weed for no-tillage corn and soybean producers in Michigan and throughout the north central region of the United States. Field experiments were conducted on established populations of common dandelion in 2001 to 2002 and 2002 to 2003 to evaluate the effect of preplant and sequential herbicide applications on established populations of common dandelion. Preplant treatments of glyphosate or 2,4-D ester were applied early fall, late fall, early spring, and late spring. For both glyphosate and 2,4-D ester, the fall applications were more effective than the spring applications. Glyphosate at 840 g ae/ha was more effective than 2,4-D ester at 1,120 g ae/ha at each application timing. A single application of glyphosate or 2,4-D ester applied either in the fall or spring did not provide season-long control of common dandelion. Sequential treatments of glyphosate following preplant applications of either glyphosate or 2,4-D ester provided season-long control of common dandelion.
Common dandelion has developed into a troublesome agronomic weed for no-tillage corn producers. A postemergence herbicide application is often required to reduce common dandelion competition. Field experiments were conducted in 2002 and 2003 to evaluate 22 postemergence herbicide treatments for efficacy on established populations of common dandelion in no-tillage corn. All herbicides were applied to five- to six-collar corn at registered rates with typical adjuvants. At 28 d after treatment (DAT), the most effective treatments included glufosinate and mesotrione, providing at least 76% control of common dandelion. All other herbicide treatments provided less than 40% common dandelion control 28 DAT. Common dandelion control was evaluated 56 DAT, when regrowth of treated plants was observed for some herbicide treatments. At 56 DAT, dicamba + diflufenzopyr was the most effective treatment, providing 83% control of common dandelion. In 2002, all herbicide treatments, with the exception of flumiclorac, resulted in corn yields greater than the nontreated.