Archival charcoal tree-ring segments from the Mississippian center of Kincaid Mounds provide chronometric information for the history of this important site. However, charcoal recovered from Kincaid was originally treated with a paraffin consolidant, a once common practice in American archaeology. This paper presents data on the efficacy of a solvent pretreatment protocol and new wiggle-matched 14C dates from the largest mound (Mound 10) at Kincaid. FTIR and 14C analyses of known-age charcoal intentionally contaminated with paraffin, as well as of archaeological material, show that a chloroform pretreatment is effective at removing paraffin contamination. Wiggle-matched cutting dates from the final construction episodes on Mound 10 at Kincaid indicate that the mound was used in the late 1300s, with the construction of a unique structure on the apex occurring around 1390. This study demonstrates the potential for museum collections of archaeological charcoal to contribute high-resolution chronological information despite past conservation practices that complicate 14C dating.
Plans for allocation of scarce life-sustaining resources during the coronavirus disease 2019 (COVID-19) pandemic often include triage teams, but operational details are lacking, including what patient information is needed to make triage decisions.
Methods:
A Delphi study among Washington state disaster preparedness experts was performed to develop a list of patient information items needed for triage team decision-making during the COVID-19 pandemic. Experts proposed and rated their agreement with candidate information items during asynchronous Delphi rounds. Consensus was defined as ≥80% agreement. Qualitative analysis was used to describe considerations arising in this deliberation. A timed simulation was performed to evaluate feasibility of data collection from the electronic health record.
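The consensus rule used in the Delphi rounds (≥80% agreement) amounts to a simple threshold check over expert ratings. The sketch below is illustrative only: the function name is hypothetical, and it assumes each expert's rating is recorded as a simple agree/disagree value, which the abstract does not specify.

```python
def reached_consensus(ratings, threshold=0.80):
    """Return True if the share of experts agreeing meets the threshold.

    ratings -- list of booleans, True where an expert endorsed the item.
    """
    return sum(ratings) / len(ratings) >= threshold

# Example: 41 of 50 experts agree -> 82%, which meets the 80% threshold.
print(reached_consensus([True] * 41 + [False] * 9))
```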
Results:
Over 3 asynchronous Delphi rounds, 50 experts reached consensus on 24 patient information items, including patients’ age, severe or end-stage comorbidities, the reason for and timing of admission, measures of acute respiratory failure, and clinical trajectory. Experts weighed complex considerations around how information items could support effective prognostication, consistency, accuracy, minimizing bias, and operationalizability of the triage process. Data collection took a median of 227 seconds (interquartile range = 205, 298) per patient.
Conclusions:
Experts achieved consensus on patient information items that were necessary and appropriate for informing triage teams during the COVID-19 pandemic.
The COVID-19 pandemic led to changes in how healthcare was accessed and delivered. It was suggested that COVID-19 will lead to an increased delirium burden in its acute phase, with variable effect on mental health in the longer term. Despite this, there are limited data on the direct effects of the pandemic on psychiatric care.
Objectives
1) describe the mental health presentations of a diverse acute inpatient population, 2) compare findings with the same period in 2019, 3) characterise the SARS-CoV-2 positive cohort of patients.
Methods
We present a descriptive summary of the referrals to a UK psychiatric liaison department during the exponential phase of the pandemic, and compare this to the same period in 2019.
Results
There was a 40.3% reduction in the number of referrals in 2020, with an increase in the proportion of referrals for delirium and psychosis. Just over a quarter (28%) of referred patients tested positive for SARS-CoV-2 during their admission, with 39.7% of these presenting with delirium as a consequence of their COVID-19 illness. Our data indicate decreased clinical activity for our service during the pandemic’s peak. There was a marked increase in delirium, though no increase in other psychiatric presentations.
Conclusions
In preparation for further exponential rises in COVID-19 cases, we would expect seamless integration of liaison psychiatry teams in general hospital wards to optimise delirium management in patients with COVID-19. Further consideration should be given to adequate staffing of community and crisis mental health teams to safely manage the potentially increasing number of people reluctant to visit the emergency department.
In 2018, the Neurodevelopmental and Psychosocial Interventions Working Group of the Cardiac Neurodevelopmental Outcome Collaborative convened through support from an R13 grant from the National Heart, Lung, and Blood Institute to survey the state of neurodevelopmental and psychosocial intervention research in CHD and to propose a slate of critical questions and investigations required to improve outcomes for this growing population of survivors and their families. Prior research, although limited, suggests that individualised developmental care interventions delivered early in life are beneficial for improving a range of outcomes including feeding, motor and cognitive development, and physiological regulation. Interventions to address self-regulatory, cognitive, and social-emotional challenges have shown promise in other medical populations, yet their applicability and effectiveness for use in individuals with CHD have not been examined. To move this field of research forward, we must strive to better understand the impact of neurodevelopmental and psychosocial intervention within the CHD population including adapting existing interventions for individuals with CHD. We must examine the ways in which dedicated cardiac neurodevelopmental follow-up programmes bolster resilience and support children and families through the myriad transitions inherent to the experience of living with CHD. And, we must ensure that interventions are person-/family-centred, inclusive of individuals from diverse cultural backgrounds as well as those with genetic/medical comorbidities, and proactive in their efforts to include individuals who are at highest risk but who may be traditionally less likely to participate in intervention trials.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Efforts to address health disparities and achieve health equity are critically dependent on the development of a diverse research workforce. However, many researchers from underrepresented backgrounds face challenges in advancing their careers, securing independent funding, and finding the mentorship needed to expand their research.
Methods
Faculty from the University of Maryland at College Park and the University of Wisconsin-Madison developed and evaluated an intensive week-long research and career-development institute—the Health Equity Leadership Institute (HELI)—with the goal of increasing the number of underrepresented scholars who can sustain their ongoing commitment to health equity research.
Results
In 2010-2016, HELI brought 145 diverse scholars (78% from an underrepresented background; 81% female) together to engage with each other and learn from supportive faculty. Overall, scholar feedback was highly positive on all survey items, with average agreement ratings of 4.45-4.84 based on a 5-point Likert scale. Eighty-five percent of scholars remain in academic positions. In the first three cohorts, 73% of HELI participants have been promoted and 23% have secured independent federal funding.
Conclusions
HELI includes an evidence-based curriculum to develop a diverse workforce for health equity research. For those institutions interested in implementing such an institute to develop and support underrepresented early stage investigators, a resource toolbox is provided.
Introduction: Pharyngitis is a common presenting complaint at the emergency department (ED). Historically, acute pharyngitis has been overdiagnosed as the result of a bacterial etiology, leading to over-prescription of antibiotics and overuse of throat culturing. This study attempts to quantify the current management of acute pharyngitis in the ED and to compare it to theoretical management using a modified Centor score. Methods: This was a retrospective chart review of 1640 patients who presented to four EDs in the central zone of the Nova Scotia Health Authority and received a diagnosis of pharyngitis, bacterial pharyngitis, or tonsillitis. The primary outcomes were the observed rate of each diagnosis in the study population, the rate of antibiotic prescription, and the rate of throat swab cultures performed. The secondary outcomes were the rates of antibiotics and throat swabs that would have been ordered using a modified Centor score. Antibiotics as first-line treatment were indicated if the Centor score was three or greater, and throat cultures were indicated if the Centor score was two or greater. Results: A total of 1596 patients were included in the analysis. Antibiotics were given to 893 patients (0.559; 95% CI: 0.535–0.584). Cultures were sent for 863 patients (0.541; 95% CI: 0.516–0.565). Using the modified Centor thresholds, we would have prescribed antibiotics as first-line treatment in 77 cases (0.048; 95% CI: 0.038–0.060), potentially saving 786 prescriptions, and ordered throat swabs on 502 patients (0.315; 95% CI: 0.292–0.338), saving 361 cultures. The most commonly prescribed antibiotic was penicillin, and the least prescribed was metronidazole. Conclusion: Over half of patients who present with acute pharyngitis receive an antibiotic, and over half have a throat swab culture performed. Utilizing a modified Centor score would result in a decreased antibiotic prescription rate and a diminished rate of throat cultures.
Incorporation of these Centor criteria could result in diminished antibiotic prescription rates for acute pharyngitis in the ED.
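As a rough illustration of the decision thresholds described above (antibiotics at a modified Centor score of three or greater, throat culture at two or greater), the sketch below encodes one common modification of the Centor criteria, the McIsaac age adjustment. The function names are hypothetical, and the exact scoring items used in the study are an assumption.

```python
def modified_centor_score(age, fever, no_cough, tender_nodes, exudate):
    """Modified (McIsaac) Centor score: one point per clinical criterion,
    plus an age adjustment (+1 for ages 3-14, -1 for ages 45 and over)."""
    score = sum([fever, no_cough, tender_nodes, exudate])
    if 3 <= age <= 14:
        score += 1
    elif age >= 45:
        score -= 1
    return score

def suggested_management(score):
    """Apply the study's thresholds: culture at >=2, antibiotics at >=3."""
    return {"throat_culture": score >= 2, "antibiotics": score >= 3}

# A 10-year-old with fever, no cough, tender nodes, and exudate scores 5,
# which would indicate both a throat culture and first-line antibiotics.
print(suggested_management(modified_centor_score(10, True, True, True, True)))
```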
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins presented up to 1.7% greater height and 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to analyze systematically (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) to study the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
The extent and severity of wheat take-all (caused by Gaeumannomyces graminis var. tritici (Ggt)) can vary considerably between growing seasons. The current study aimed to identify climatic factors associated with differing concentrations of Ggt DNA in soil and take-all disease at different stages of a sequence of wheat crops. Pre-sowing soil Ggt DNA concentrations and subsequent take-all disease in consecutive wheat crop sequences were compared across six seasons in 90 commercial cropping fields in Canterbury and Southland, New Zealand, between 2003 and 2009. Disease progress was assessed in additional fields in 2004/05 and 2005/06. While a general pattern in inoculum and disease fluctuations was evident, there were exceptions among wheat crop sequences that commenced in different years, especially for first wheat crops. In three consecutive growing seasons, there was very low inoculum increase in the first wheat crop, while increases in first wheat crops during the following three seasons were much greater. Low spring–summer rainfall was associated with low build-up of inoculum in first wheat crops. The inoculum derived from the first wheat then determined the amount of primary inoculum for the subsequent second wheat, thereby influencing the severity of take-all in that crop. Differing combinations of weather conditions during one wheat crop in a sequence and the conditions experienced by the next crop provided explanations of the severity of take-all at grain fill and the resulting post-harvest soil Ggt DNA concentrations in second wheat crops.
Examples of contrasting combinations were: (a) a moderate take-all epidemic and high post-harvest inoculum that followed high rainfall during grain fill, despite low pre-sowing soil Ggt DNA concentrations; (b) severe take-all and moderate to high inoculum build-up following high pre-sowing soil Ggt DNA concentrations and non-limiting rainfall; and (c) low spring and early summer rainfall slowing epidemic development in second wheat crops, even where there were high pre-sowing soil Ggt DNA concentrations. The importance of the environmental conditions experienced during a particular growing season was also illustrated by differences between growing seasons in take-all progress in fields in the same take-all risk categories based on pre-sowing soil Ggt DNA concentrations.
Francis Gastrell (1662–1725) served as Bishop of Chester from 1714 until his death. During this time, he compiled historical notes on his diocese from a range of medieval and contemporary sources. His survey contains detailed information on parishes, including their sizes, populations and economies. The notes also provide invaluable data on administrative matters such as the development of the towns within the diocese, notably including records of acts of charity and records of the grammar schools and their governors, finances and statutes. This 1990 publication, prepared by L. A. S. Butler, is the first printed edition of the notes relating to the Yorkshire parishes that had been transferred within the archdeaconry of Richmond to the bishopric of Chester. With full editorial apparatus, and thorough indexes of persons, clergy and places, this work stands as an important resource for church, social and local historians.
Sexed semen technology is now commercially available in many countries around the world, and is primarily used in dairy cattle breeding. Sperm are sorted by flow cytometry on the basis of a 4% difference in DNA content between sperm containing X and Y chromosomes. Despite reliably producing a 90% gender bias, the fertility of the sexed semen product is compromised compared with conventional semen. The negative implications of the reduced fertility of sexed semen are amplified in seasonal systems of dairy production, as the importance of fertility is greater in these systems compared with year-round calving systems. A review of the literature indicates that conception rates (CR) to first service with frozen-thawed sexed semen are ~75% to 80% of those achieved with conventional frozen-thawed semen. Preliminary results from a large-scale field trial carried out in Ireland in 2013 suggest that significant improvements in the performance of sexed semen have been made, with CR of 87% of those achieved with conventional semen. The improved fertility of a sexed semen product that delivers a 90% gender bias has considerable implications for the future of breeding management in pasture-based dairy production systems. Sexed semen may facilitate faster, more profitable dairy herd expansion by increasing the number of dairy heifer replacements born. Biosecurity can be improved by maintaining a closed herd during the period of herd expansion. In a non-expansion scenario, sexed semen may be used to increase the value of beef output from the dairy herd. The replacement heifer requirements for a herd could be met by using sexed semen in the first 3 weeks of the breeding season, with the remaining animals bred to beef sires, increasing the sale value over that of a dairy bull calf. Alternatively, very short gestation sires could be used to shorten the calving interval.
Market prices have a considerable effect on the economics of sexed semen use, and widespread use of sexed semen should be restricted to well managed herds that already achieve acceptable herd fertility performance.
Although usually thought of as external environmental stressors, a significant heritable component has been reported for measures of stressful life events (SLEs) in twin studies.
Method
We examined the variance in SLEs captured by common genetic variants from a genome-wide association study (GWAS) of 2578 individuals. Genome-wide complex trait analysis (GCTA) was used to estimate the phenotypic variance tagged by single nucleotide polymorphisms (SNPs). We also performed a GWAS on the number of SLEs, and looked at correlations between siblings.
Results
A significant proportion of variance in SLEs was captured by SNPs (30%, p = 0.04). When events were divided into those considered to be dependent or independent, an equal amount of variance was explained for both. This ‘heritability’ was in part confounded by personality measures of neuroticism and psychoticism. A GWAS for the total number of SLEs revealed one SNP that reached genome-wide significance (p = 4 × 10−8), although this association was not replicated in separate samples. Using available sibling data for 744 individuals, we also found a significant positive correlation of R2 = 0.08 in SLEs (p = 0.03).
Conclusions
These results provide independent validation from molecular data for the heritability of reporting environmental measures, and show that this heritability is in part due to both common variants and the confounding effect of personality.
Continental shelf ecosystems have high importance for the continental countries of the Wider Caribbean Region. They support important shrimp and groundfish fisheries (Phillips et al. Chapter 15) and snapper fisheries on their outer slopes (Heileman Chapter 13). There are also important linkages between the former fisheries and the many coastal and estuarine lagoons and wetlands that occur in these countries (Yáñez-Arancibia et al. Chapter 17). They support livelihoods (McConney and Salas Chapter 7) and provide critical ecosystem services (Schuhmann et al. Chapter 8). Continental shelf ecosystems have been degraded by many human impacts of both marine and land-based origin (Sweeney and Corbin Chapter 4; Gil and Wells Chapter 5).
This synthesis chapter presents the outputs of a group process aimed at developing a vision and way ahead for ecosystem based management (EBM) for continental shelf ecosystems in the Wider Caribbean, using the methods described earlier (Fanning et al. Chapter 1). In terms of structure, the chapter first describes a vision for continental shelf EBM and reports on the priorities assigned to the identified vision elements. It then discusses how the vision might be achieved by taking into account assisting factors (those that facilitate achievement) and resisting factors (those that inhibit achievement). The chapter concludes with guidance on the strategic direction needed to implement the vision, identifying specific actions to be undertaken for each of the vision elements.
The vision
The occupational breakdown of members of the Continental Shelf Ecosystems Working Group reflected the diversity of affiliations present at the EBM Symposium and included governmental, intergovernmental, academic, non-governmental and private sector (fishers and fishing industry and consulting) representatives. With guidance provided by the facilitator, this diverse group of participants was asked to first address the question of “What do you see in place in 10 years time when EBM/EAF has become a reality in the Caribbean?” This diversity provided for a fruitful and comprehensive discussion which is summarized in Table 24.1, in terms of the key vision elements and their subcomponents, and in Figure 24.1, which illustrates the level of priority assigned to each of the vision elements.
EU milk quota deregulation has forced many farmers to reconsider the factors that will limit milk production into the future. Factors other than milk quota such as land, labour, capital, stock, etc. will become the limiting factor for many in a post-EU milk quota scenario. While it can be postulated what the limits to production will be in a post-quota scenario, how farmers react will determine the future direction of the industry. In order to determine the future attitudes and intentions and to identify the key factors influencing farmers who intend to expand, exit, remain static or contract their businesses in the future, a survey of a large group of Irish commercial dairy farmers was carried out. The telephone survey sample was chosen randomly, based on a proportional representation of suppliers to the largest milk processor in Ireland. The sample (780 suppliers) was broken down by quota size (five quota categories, Q1–Q5), supplier region and system of production. The sample was analysed to determine the effect of key survey variables on the future intentions of dairy farmers. The survey was completed by 659 suppliers (0·82 of the sample). The proportions of farmers intending to expand were 0·28, 0·47, 0·61, 0·61 and 0·56, respectively, for Q1–Q5, while the proportions intending to exit were 0·27, 0·18, 0·08, 0·09 and 0·08, respectively. Farmers who were intent on expanding had larger total farm areas, larger milk tank capacity per litre of milk quota, more modern milking facilities, more available cow housing and more housing that could be converted at a relatively low cost and were more likely to have a successor. Of those expanding, 0·60 wanted milk quotas abolished, while 0·36 of those planning to exit wanted milk quotas abolished. The level of expansion was affected by business scale, dairy stocking rate, the additional labour required with expansion and total and milking platform farm size.