The net benefit from investing in any technology is a function of the cost of implementation and the expected return in revenue. The objective of the present study was to quantify, using deterministic equations, the net monetary benefit from investing in genotyping of commercial females. Three case studies were presented reflecting dairy cows, beef cows and ewes based on Irish population parameters; sensitivity analyses were also performed. Parameters considered in the sensitivity analyses included the accuracy of genomic evaluations, replacement rate, proportion of female selection candidates retained as replacements, the cost of genotyping, the sire parentage error rate and the age of the female when it first gave birth. Results were presented as an annualised monetary net benefit over the lifetime of an individual, after discounting for the timing of expressions. In the base scenarios, the net benefit was greatest for dairy, followed by beef and then sheep. The net benefit improved as the reliability of the genomic evaluations improved and, in fact, a negative net benefit of genotyping was less frequent when the reliability of the genomic evaluations was high. The impact of a 10% point increase in genomic reliability was, however, greatest in sheep, followed by beef and then dairy. The net benefit of genotyping female selection candidates reduced as replacement rate increased. As genotyping costs increased, the net benefit reduced irrespective of the percentage of selection candidates kept, the replacement rate or even the population considered. Nonetheless, the association between the genotyping cost and the net benefit of genotyping differed by the percentage of selection candidates kept. Across all replacement rates evaluated, retaining 25% of the selection candidates resulted in the greatest net benefit when genotyping cost was low but the lowest net benefit when genotyping cost was high. 
The genotyping breakeven cost was non-linearly associated with the percentage of selection candidates retained, reaching a maximum when 50% of selection candidates were retained, irrespective of replacement rate, genomic reliability or the population. The breakeven cost was also non-linearly associated with replacement rate. The approaches outlined here provide the back-end framework for a decision support tool to quantify the net benefit of genotyping, once parameterised by the relevant population metrics.
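The deterministic benefit-minus-cost logic described above can be sketched as follows. This is an illustrative reconstruction, not the authors' exact equations: the selection-index form of the extra genetic gain, the parameter names and every numeric value are assumptions for the purpose of illustration.

```python
from statistics import NormalDist

def discounted_annuity(annual_value, years, rate):
    """Sum a constant annual value over `years`, discounted at `rate`."""
    return sum(annual_value / (1 + rate) ** t for t in range(1, years + 1))

def net_benefit_per_replacement(rel_genomic, rel_parent_avg, sigma_a,
                                prop_retained, cost_per_genotype,
                                productive_years, discount_rate):
    """Discounted benefit minus cost of genotyping all female candidates.

    rel_genomic / rel_parent_avg: reliability of the genomic vs the
    parent-average evaluation (0-1); sigma_a: genetic standard deviation
    of the breeding goal in currency units; prop_retained: proportion of
    candidates kept as replacements.
    """
    # Selection intensity for truncation selection at `prop_retained`.
    x = NormalDist().inv_cdf(1 - prop_retained)
    intensity = NormalDist().pdf(x) / prop_retained

    # Extra genetic superiority from selecting on the more accurate
    # (genomic) evaluation: i * (acc_genomic - acc_parent_avg) * sigma_a.
    extra_gain = intensity * (rel_genomic ** 0.5 - rel_parent_avg ** 0.5) * sigma_a

    # Benefit accrues over the replacement's productive life, discounted
    # for the timing of expressions.
    benefit = discounted_annuity(extra_gain, productive_years, discount_rate)

    # All candidates are genotyped but only `prop_retained` are kept, so
    # each retained replacement carries the cost of 1/prop_retained tests.
    cost = cost_per_genotype / prop_retained
    return benefit - cost
```

Under this sketch the breakeven genotyping cost is the benefit scaled by the proportion retained, which is proportional to the normal density at the truncation point and therefore peaks when 50% of candidates are kept, consistent with the non-linearity reported above.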
Accurately dating when people first colonized new areas is vital for understanding the pace of past cultural and environmental changes, including questions of mobility, human impacts and human responses to climate change. Establishing effective chronologies of these events requires the synthesis of multiple radiocarbon (14C) dates. Various “chronometric hygiene” protocols have been used to refine 14C dating of island colonization, but they can discard up to 95% of available 14C dates leaving very small datasets for further analysis. Despite their foundation in sound theory, without independent tests we cannot know if these protocols are apt, too strict or too lax. In Iceland, an ice core-dated tephrochronology of the archaeology of first settlement enables us to evaluate the accuracy of 14C chronologies. This approach demonstrated that the inclusion of a wider range of 14C samples in Bayesian models improves the precision, but does not affect the model outcome. Therefore, based on our assessments, we advocate a new protocol that works with a much wider range of samples and where outlying 14C dates are systematically disqualified using Bayesian Outlier Models. We show that this approach can produce robust termini ante quos for colonization events and may be usefully applied elsewhere.
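In practice the Bayesian Outlier Models advocated above are fitted in dedicated calibration software (e.g. OxCal). As a toy illustration of the underlying idea only — dates stay in the model and are downweighted by a posterior outlier probability rather than discarded a priori — the sketch below fits a two-component normal mixture by EM. The mixture form, the variance-inflation factor and all numbers are assumptions, not the protocol itself.

```python
import math

def norm_pdf(x, mu, s):
    """Density of a normal distribution with mean mu and s.d. s."""
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def outlier_probs(dates, sigma=25.0, prior_outlier=0.05, inflate=10.0,
                  iters=50):
    """dates: point age estimates in yr BP; returns (event_age, probs).

    Each date is modelled as a mixture of an "inlier" component around
    the true event age and an "outlier" component with inflated variance.
    """
    mu = sum(dates) / len(dates)
    probs = [prior_outlier] * len(dates)
    for _ in range(iters):
        # E-step: posterior probability that each date is an outlier.
        probs = []
        for d in dates:
            p_in = (1 - prior_outlier) * norm_pdf(d, mu, sigma)
            p_out = prior_outlier * norm_pdf(d, mu, sigma * inflate)
            probs.append(p_out / (p_in + p_out))
        # M-step: re-estimate the event age, downweighting likely outliers.
        weights = [1 - p for p in probs]
        mu = sum(w * d for w, d in zip(weights, dates)) / sum(weights)
    return mu, probs
```

A date far from the consensus receives an outlier probability near one and contributes almost nothing to the event-age estimate, so aberrant samples are disqualified systematically rather than by a priori culling.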
No existing models of alcohol prevention concurrently adopt universal and selective approaches. This study aims to evaluate the first combined universal and selective approach to alcohol prevention.
A total of 26 Australian schools with 2190 students (mean age: 13.3 years) were randomized to receive: universal prevention (Climate Schools); selective prevention (Preventure); combined prevention (Climate Schools and Preventure; CAP); or health education as usual (control). Primary outcomes were alcohol use, binge drinking and alcohol-related harms at 6, 12 and 24 months.
Climate, Preventure and CAP students demonstrated significantly lower growth in their likelihood to drink and binge drink, relative to controls over 24 months. Preventure students displayed significantly lower growth in their likelihood to experience alcohol harms, relative to controls. While adolescents in both the CAP and Climate groups demonstrated slower growth in drinking compared with adolescents in the control group over the 2-year study period, CAP adolescents demonstrated faster growth in drinking compared with Climate adolescents.
Findings support universal, selective and combined approaches to alcohol prevention. Particularly novel are the findings of no advantage of the combined approach over universal or selective prevention alone.
Vibrio alginolyticus causes soft tissue and bloodstream infection; little systematically collected clinical and epidemiological information is available. In the USA, V. alginolyticus infections are reported to the Cholera and Other Vibrio Illness Surveillance system. Using data from 1988 to 2012, we categorised infections using specimen source and exposure history, analysed case characteristics, and calculated incidence rates using US Census Bureau data. Most (96%) of the 1331 V. alginolyticus infections were from coastal states. Infections of the skin and ear were most frequent (87%); ear infections occurred more commonly in children, lower extremity infections more commonly in older adults. Most (86%) infections involved water activity. Reported incidence of infections increased 12-fold over the study period, although the extent of diagnostic or surveillance bias is unclear. Prevention efforts should target waterborne transmission in coastal areas and provider education to promote more rapid diagnosis and prevent complications.
Epstein–Barr virus (EBV) infects 95% of the global population and is associated with up to 2% of cancers globally. Immunoglobulin G (IgG) antibody levels to EBV have been shown to be heritable and associated with developing malignancies. We therefore performed a pilot genome-wide association analysis of anti-EBV IgG traits in an African population, using a combined approach including array genotyping, whole-genome sequencing and imputation to a panel with African sequence data. In 1562 Ugandans, we identify a variant in human leukocyte antigen (HLA)-DQA1, rs9272371 (p = 2.6 × 10⁻¹⁷), associated with anti-EBV nuclear antigen-1 responses. Trans-ancestry meta-analysis and fine-mapping with European-ancestry individuals suggest the presence of distinct HLA class II variants driving associations in Uganda. In addition, we identify four putative, novel, very rare African-specific loci with preliminary evidence for association with anti-viral capsid antigen IgG responses, which will require replication for validation. These findings reinforce the need for the expansion of such studies in African populations with relevant datasets to capture genetic diversity.
Globally, the Series 2 – Series 3 boundary of the Cambrian System coincides with a major carbon isotope excursion, sea-level changes and trilobite extinctions. Here we examine the sedimentology, sequence stratigraphy and carbon isotope record of this interval in the Cambrian strata (Durness Group) of NW Scotland. Carbonate carbon isotope data from the lower part of the Durness Group (Ghrudaidh Formation) show that the shallow-marine, Laurentian margin carbonates record two linked sea-level and carbon isotopic events. Whilst the carbon isotope excursions are not as pronounced as those expressed elsewhere, correlation with global records (Sauk I – Sauk II boundary and Olenellus biostratigraphic constraint) identifies them as representing the local expression of the ROECE and DICE. The upper part of the ROECE is recorded in the basal Ghrudaidh Formation whilst the DICE is seen around 30 m above the base of this unit. Both carbon isotope excursions co-occur with surfaces interpreted to record regressive–transgressive events that produced amalgamated sequence boundaries and ravinement/flooding surfaces overlain by conglomerates of reworked intraclasts. The ROECE has been linked with redlichiid and olenellid trilobite extinctions, but in NW Scotland, Olenellus is found after the negative peak of the carbon isotope excursion but before sequence boundary formation.
Background: The present study explored the reliability, validity, and factor structure of a modified version of the Moral Disengagement Scale (MDS), which comprehensively assesses proneness to disengage from different forms of conduct specific to Australian adolescents. Methods: A sample of 452 students (mean age = 12.79 years; SD = 1.93) completed the modified MDS and the Australian Self-Report Delinquency Scale. A multistep approach was used to evaluate the factor structure of the MDS. The sample was divided into exploratory (n = 221) and cross-validation samples (n = 231). Principal component analysis was conducted with the exploratory sample and multiple factor solutions compared to determine the optimal factor structure of the modified MDS. The final factor solution was confirmed in the cross-validation sample using confirmatory factor analysis. Internal consistency of the final scale and convergent validity with the delinquency questionnaire were also assessed. Results: Analyses resulted in a 22-item MDS for use in Australia, with four factors mapping onto the four conceptual categories of moral disengagement. The individual subscales demonstrated adequate to good internal consistency, and the total scale also demonstrated high internal consistency (α = 0.87). Convergent validity of the scale was established. Conclusions: The 22-item Australian MDS is a reliable and valid instrument for use within an Australian population.
Toxigenic strains of Vibrio cholerae serogroups O1 and O139 have caused cholera epidemics, but other serogroups – such as O75 or O141 – can also produce cholera toxin and cause severe watery diarrhoea similar to cholera. We describe 31 years of surveillance for toxigenic non-O1, non-O139 infections in the United States and map these infections to the state where the exposure probably originated. While serogroups O75 and O141 are closely related pathogens, they differ in how and where they infect people. Oysters were the main vehicle for O75 infection. The vehicles for O141 infection include oysters, clams, and freshwater in lakes and rivers. The patients infected with serogroup O75 who had food traceback information available ate raw oysters from Florida. Patients infected with O141 ate oysters from Florida and clams from New Jersey, and those who only reported being exposed to freshwater were exposed in Arizona, Michigan, Missouri, and Texas. Improving the safety of oysters, specifically, should help prevent future illnesses from these toxigenic strains and similar pathogenic Vibrio species. Post-harvest processing of raw oysters, such as individual quick freezing, heat-cool pasteurization, and high hydrostatic pressurization, should be considered.
The yields of spring barley during a medium-term (7 years) compost and slurry addition experiment and the soil carbon (C) and nitrogen (N) contents, bacterial community structure, soil microbial biomass and soil respiration rates have been determined to assess the effects of repeated, and in some cases very large, organic amendments on soil and crop parameters. For compost, total additions were equivalent to up to 119 t C/ha and 1.7 t N/ha and for slurry they were 25 t C/ha and 0.35 t N/ha over 7 years, which represented very large additions compared to control soil C and N contents (69 t C/ha and 0.3 t N/ha in the 0–30 cm soil depth). There was an initial positive response to compost and slurry addition on barley yield, but over the experiment the yield differential between the amounts of compost addition declined, indicating that repeated addition of compost at a lower rate over several years had the same cumulative effect as a large single compost application. By the end of the experiment it was clear that the addition of compost and slurry increased soil C and N contents, especially towards the top of the soil profile, as well as soil respiration rates. However, the increases in soil C and N contents were not proportional to the amount of C and N added, suggesting either that: (i) a portion of the added C and N was more vulnerable to loss; (ii) that its addition rendered another C or N pool in the soil more susceptible to loss; or (iii) that the C inputs from additional crop productivity did not increase in line with the organic amendments. Soil microbial biomass was depressed at the highest rate of organic amendment, and whilst this may have been due to genuine toxic or inhibitory effects of large amounts of compost, it could also be due to the inaccuracy of the substrate-induced respiration approach used for determining soil biomass when there is a large supply of organic matter.
At the highest compost addition, the bacterial community structure was significantly altered, suggesting that the amendments significantly altered soil community dynamics.
In western Canada, more money is spent on herbicides to control wild oat
than any other weed, and herbicide resistance in wild oat is the most
widespread resistance issue. A direct-seeded field experiment was conducted
from 2010 to 2014 at eight Canadian sites to determine crop life cycle, crop
species, crop seeding rate, crop usage, and herbicide rate combination
effects on wild oat management and canola yield. Combining 2× seeding rates
of early-cut barley silage with 2× seeding rates of winter cereals and
excluding wild oat herbicides for 3 of 5 yr (2011 to 2013) often led to
similar wild oat density, aboveground wild oat biomass, wild oat seed
density in the soil, and canola yield as a repeated canola–wheat rotation
under a full wild oat herbicide rate regime. Wild oat was similarly well
managed after 3 yr of perennial alfalfa without wild oat herbicides.
Forgoing wild oat herbicides in only 2 of 5 yr from exclusively summer
annual crop rotations resulted in higher wild oat density, biomass, and seed
banks. Management systems that effectively combine diverse and optimal
cultural practices against weeds, and limit herbicide use, reduce selection
pressure for weed resistance to herbicides and prolong the utility of
threatened herbicide tools.
Few studies have explored therapists’ views on computerized cognitive behavioural therapy (cCBT), and this study aimed to provide an in-depth understanding of accredited therapists’ views on cCBT's role in treating depression. Twelve therapists constituted this self-selected sample (eight female, four male). Mean age was 52 years (range 46–61). The data obtained from a semi-structured questionnaire were analysed using thematic analysis. Three themes were identified and discussed: (1) the standardized nature of cCBT for depression, (2) the importance of the therapeutic relationship in cCBT, and (3) the pros and cons of cCBT as an alternative to CBT. The therapists in this study emphasized that innovations in CBT delivery formats (e.g. internet-based, computerized) show promise. However, participants expressed some views that clash with the evidence-based viewpoint. More work is needed to improve the implementation of evidence-based practice and policy.
With the changing distribution of infectious diseases and an increase in the burden of non-communicable diseases, low- and middle-income countries, including those in Africa, will need to expand their healthcare capacities to respond effectively to these epidemiological transitions. The interrelated risk factors for chronic infectious and non-communicable diseases and the need for long-term disease management argue for combined strategies to understand their underlying causes and to design strategies for effective prevention and long-term care. Through multidisciplinary research and implementation partnerships, we advocate an integrated approach to research and healthcare for chronic diseases in Africa.
The main objective of our target article was to sketch the empirical case for the importance of selection at the level of groups on cultural variation. Such variation is massive in humans, but modest or absent in other species. Group selection processes acting on this variation provide a framework for developing explanations of the unusual level of cooperation between non-relatives found in our species. Our case for cultural group selection (CGS) followed Darwin's classic syllogism regarding natural selection: If variation exists at the level of groups, if this variation is heritable, and if it plays a role in the success or failure of competing groups, then selection will operate at the level of groups. We outlined the relevant domains where such evidence can be sought and characterized the main conclusions of work in those domains. Most commentators agree that CGS plays some role in human evolution, although some were considerably more skeptical. Some contributed additional empirical cases. Some raised issues of the scope of CGS explanations versus competing ones.
Most empirical studies into the covariance structure of psychopathology have been confined to adults. This work is not developmentally informed as the meaning, age-of-onset, persistence and expression of disorders differ across the lifespan. This study investigates the underlying structure of adolescent psychopathology and associations between the psychopathological dimensions and sex and personality risk profiles for substance misuse and mental health problems.
This study analyzed data from 2175 adolescents (mean age 13.3 years). Five dimensional models were tested using confirmatory factor analysis, and external validity was examined using a multiple-indicators multiple-causes model.
A modified bifactor model, with three correlated specific factors (internalizing, externalizing, thought disorder) and one general psychopathology factor, provided the best fit to the data. Females reported higher mean levels of internalizing, and males reported higher mean levels of externalizing. No significant sex differences emerged in liability to thought disorder or general psychopathology. Liability to internalizing, externalizing, thought disorder and general psychopathology was characterized by a number of differences in personality profiles.
This study is the first to identify a bifactor model including a specific thought disorder factor. The findings highlight the utility of transdiagnostic treatment approaches and the importance of restructuring psychopathology in an empirically based manner.
Potato, dry bean, and sugar beet production have increased markedly in recent years on irrigated cropland in Alberta, Canada. Concerns exist about declining soil quality and increased soil erosion when these low-residue crops are grown in sequence in short-duration rotations. A 12-yr rotation study was conducted to determine the merits of adopting various conservation practices (reduced tillage, cover crops, composted manure) and longer-duration rotations to develop a more sustainable production system for these row crops. This article reports on weed density and weed seedbank data collected in the study. Weed densities recorded prior to applying postemergence herbicides indicated that conservation compared with conventional management treatments had greater weed densities in 30 to 45% of the cases in 3-, 4-, and 5-yr rotations. In contrast, a 6-yr conservation rotation that included 2 yr of timothy forage resulted in similar or lower weed densities than rotations with conventional management practices. Residual weed densities recorded 4 wk after applying postemergence herbicides were only greater in conservation than conventional rotations in 2 of 12 yr, regardless of rotation length. Weed seedbank densities at the conclusion of the 12-yr study were similar for 3- to 6-yr rotations under either conservation or conventional management. These findings indicate that implementing a suite of conservation practices poses little risk of increased weed populations in the long term. This knowledge will facilitate grower adoption of more sustainable agronomic practices for irrigated row crops in this region.
We have mapped cold atomic gas in 21 cm H I self-absorption (HISA) at arcminute resolution over more than 90% of the Milky Way's disk. To probe the formation of H₂ clouds, we have compared our HISA distribution with CO J = 1–0 line emission. Few HISA features in the outer Galaxy have CO at the same position and velocity, while most inner-Galaxy HISA has overlapping CO. But many apparent inner-Galaxy HISA–CO associations can be explained as chance superpositions, so most inner-Galaxy HISA may also be CO-free. Since standard equilibrium cloud models cannot explain the very cold H I in many HISA features without molecules being present, these clouds may instead contain significant CO-dark H₂.
We explored caregiver perspectives on their children’s pain management in both a pediatric (PED) and general emergency department (GED). Study objectives were to: (1) measure caregiver estimates of children’s pain scores and treatment; (2) determine caregiver level of satisfaction; and (3) determine factors associated with caregiver satisfaction.
This prospective survey examined a convenience sample of 97 caregivers (n=51 PED, n=46 GED) with children aged <17 years. A paper-based survey was distributed by research assistants, from 2009–2011.
Most caregivers were female (n=77, 79%) and were the child’s mother (n=69, 71%). Children were treated primarily for musculoskeletal pain (n=41, 42%), headache (n=16, 16%) and abdominal pain (n=7, 7%). Using a 100 mm Visual Analog Scale, the maximum mean reported pain score was 75 mm (95% CI: 70–80) and mean score at discharge was 39 mm (95% CI: 32–46). Ninety percent of caregiver respondents were satisfied (80/89, 90%); three (3/50, 6%) were dissatisfied in the PED and six (6/39, 15%) in the GED. Caregivers who rated their child’s pain at ED discharge as severe were less likely to be satisfied than those who rated their child’s pain as mild or moderate (p=0.034).
Despite continued pain upon discharge, most caregivers report being satisfied with their child’s pain management. Caregiver satisfaction is likely multifactorial, and physicians should be careful not to interpret satisfaction as equivalent to adequate provision of analgesia. The relationship between satisfaction and pain merits further exploration.
We appreciate and endorse Kline's ethological taxonomy and its application. However, the definition of teaching she presents is problematic, as it replaces mentalistic intent with intention on the part of natural selection. We discuss problems with the strict adaptationist view and suggest instead that the five forms of teaching presented in the taxonomy may constitute exaptations rather than adaptations.
Although rare, typhoid fever cases acquired in the United States continue to be reported. Detection and investigation of outbreaks in these domestically acquired cases offer opportunities to identify chronic carriers. We searched surveillance and laboratory databases for domestically acquired typhoid fever cases, used a space–time scan statistic to identify clusters, and classified clusters as outbreaks or non-outbreaks. From 1999 to 2010, domestically acquired cases accounted for 18% of 3373 reported typhoid fever cases; their isolates were less often multidrug-resistant (2% vs. 15%) compared to isolates from travel-associated cases. We identified 28 outbreaks and two possible outbreaks within 45 space–time clusters of ⩾2 domestically acquired cases, including three outbreaks involving ⩾2 molecular subtypes. The approach detected seven of the ten outbreaks published in the literature or reported to CDC. Although this approach did not definitively identify any previously unrecognized outbreaks, it showed the potential to detect outbreaks of typhoid fever that may escape detection by routine analysis of surveillance data. Sixteen outbreaks had been linked to a carrier. Every case of typhoid fever acquired in a non-endemic country warrants thorough investigation. Space–time scan statistics, together with shoe-leather epidemiology and molecular subtyping, may improve outbreak detection.
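The space–time scan idea can be illustrated with a toy Poisson scan in the spirit of Kulldorff's statistic. This is a sketch, not the surveillance tooling used in the study: the candidate radii, time windows and coordinates are arbitrary choices, and the Monte Carlo significance testing used in real scans is omitted.

```python
import math

def scan_llr(observed, expected, total):
    """Log-likelihood ratio that a candidate cylinder is a cluster."""
    if observed <= expected or observed == 0 or expected <= 0:
        return 0.0
    rest_obs, rest_exp = total - observed, total - expected
    if rest_exp <= 0:
        return 0.0
    llr = observed * math.log(observed / expected)
    if rest_obs > 0:
        llr += rest_obs * math.log(rest_obs / rest_exp)
    return llr

def best_cluster(cases, baseline_rate, radii=(0.25, 0.5, 1.0),
                 windows=(14, 30, 60)):
    """cases: list of (x, y, day) records; scan cylinders centred on cases.

    baseline_rate: expected cases per unit area per day under the null.
    Returns (best LLR, (x, y, start_day, radius, window_days)).
    """
    total = len(cases)
    best_llr, best_cyl = 0.0, None
    for cx, cy, ct in cases:
        for r in radii:
            for w in windows:
                obs = sum(
                    1 for x, y, t in cases
                    if (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2
                    and ct <= t <= ct + w
                )
                # Null expectation: uniform rate over the cylinder volume.
                exp = baseline_rate * math.pi * r ** 2 * w
                llr = scan_llr(obs, exp, total)
                if llr > best_llr:
                    best_llr, best_cyl = llr, (cx, cy, ct, r, w)
    return best_llr, best_cyl
```

A tight knot of cases in space and time yields a cylinder whose observed count far exceeds its baseline expectation, flagging a candidate outbreak for the shoe-leather and molecular follow-up described above.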