Between 2001 and 2017, the Royal Botanic Garden Edinburgh conducted training and research in Belize built around an annual two-week field course, part of the Edinburgh M.Sc. programme in Biodiversity and Taxonomy of Plants, focused on tropical plant identification, botanical collecting and tropical fieldwork skills. This long-term collaboration in one country has led to additional benefits, most notably capacity building, the acquisition of new country records, the completion of M.Sc. thesis projects and publication of the findings in journal articles, and continued cooperation. Detailed summaries are provided for the specimens collected by students during the field course or on return visits to Belize for M.Sc. thesis projects. Additionally, 15 species not recorded in the national checklist for Belize are reported. The information in this paper highlights the benefits of collaborations between institutions and countries that last longer than the typical funding cycles of three to five years.
A classic example of microbiome function is its role in nutrient assimilation in both plants and animals, but other less obvious roles are becoming more apparent, particularly in driving infectious and non-infectious disease outcomes and influencing host behaviour. However, numerous biotic and abiotic factors influence the composition of these communities, and host microbiomes can be susceptible to environmental change. How microbial communities will be altered by, and mitigate, the rapid environmental change we can expect in the next few decades remains to be seen. That said, given the enormous range of functional diversity conferred by microbes, there is currently something of a revolution in microbial bioengineering and biotechnology to address real-world problems, including human and wildlife disease and crop and biofuel production. All of these concepts are explored in further detail throughout the book.
Shared patient–clinician decision-making is central to choosing between medical treatments. Decision support tools can have an important role to play in these decisions. We developed a decision support tool for deciding between nonsurgical treatment and surgical total knee replacement for patients with severe knee osteoarthritis. The tool aims to provide likely outcomes of alternative treatments based on predictive models using patient-specific characteristics. To make those models relevant to patients with knee osteoarthritis and their clinicians, we involved patients, family members, patient advocates, clinicians, and researchers as stakeholders in creating the models.
Stakeholders were recruited through local arthritis research, advocacy, and clinical organizations. After being provided with brief methodological education sessions, stakeholder views were solicited through quarterly patient or clinician stakeholder panel meetings and incorporated into all aspects of the project.
Participating in each aspect of the research, from determining the outcomes of interest to providing input on the design of the user interface displaying outcome predictions, 86% (12/14) of stakeholders remained engaged throughout the project. Stakeholder engagement ensured that the prediction models that form the basis of the Knee Osteoarthritis Mathematical Equipoise Tool and its user interface were relevant for patient–clinician shared decision-making.
Methodological research has the opportunity to benefit from stakeholder engagement by ensuring that the perspectives of those most impacted by the results are involved in study design and conduct. While additional planning and investments in maintaining stakeholder knowledge and trust may be needed, they are offset by the valuable insights gained.
Iron deficiency is common in pregnant and lactating women and is associated with reduced cognitive development of the offspring. Since iron affects lipid metabolism, the availability of fatty acids, particularly the polyunsaturated fatty acids required for early neural development, was investigated in the offspring of female rats fed iron-deficient diets during gestation and lactation. After the dams gave birth, one group of iron-deficient dams was recuperated by feeding an iron-replete diet. Dams and neonates were killed on postnatal days 1, 3 and 10, and the fatty acid composition of brain and stomach contents was assessed by gas chromatography. Changes in the fatty acid profile on day 3 became more pronounced on day 10, with a decrease in the proportion of saturated fatty acids and a compensatory increase in monounsaturated fatty acids. Long-chain polyunsaturated fatty acids in the n-6 family were reduced, but there was no change in the n-3 family. The fatty acid profiles of neonatal brain and stomach contents were similar, suggesting that the change in milk composition may be related to the changes in the neonatal brain. When the dams were fed an iron-sufficient diet at birth, the effects of iron deficiency on the fatty acid composition of lipids in both the dams’ milk and the neonates’ brains were reduced. This study showed an interaction between maternal iron status and fatty acid composition of the offspring’s brain and suggests that these effects can be reduced by iron repletion of the dams’ diet at birth.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Less than half of stool samples from people symptomatic with infectious intestinal disease (IID) will identify a causative organism. A secondary data analysis was undertaken to explore whether symptomology alone could be used to make inferences about causative organisms. Data were utilised from the Second Study of Infectious Intestinal Disease in the Community. A total of 844 cases were analysed. Few symptoms differentiated individual pathogens, but grouping pathogens together showed that viral IID was more likely when symptom onset was in winter (odds ratio (OR) 2.08, 95% confidence interval (CI) 1.16–3.75) or spring (OR 1.92, 95% CI 1.11–3.33), the patient was aged under 5 years (OR 3.63, 95% CI 2.24–6.03) and there was loss of appetite (OR 2.19, 95% CI 1.29–3.72). The odds of bacterial IID were higher with diarrhoea in the absence of vomiting (OR 3.54, 95% CI 2.37–5.32), diarrhoea which persisted for >3 days (OR 2.69, 95% CI 1.82–3.99), bloody diarrhoea (OR 4.17, 95% CI 1.63–11.83) and fever (OR 1.67, 95% CI 1.11–2.53). Symptom profiles could be of value to help guide clinicians and public health professionals in the management of IID, in the absence of microbiological confirmation.
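The grouped odds ratios reported above come from the study's regression analysis, but the underlying calculation can be illustrated from a single 2×2 table. The counts below are hypothetical (the actual analysis was multivariable), and the Wald interval shown is only a sketch of how an OR and its 95% CI are derived:

```python
import math

def odds_ratio_ci(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio and 95% Wald confidence interval from a 2x2 table."""
    or_ = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
    # SE of log(OR) is the square root of the sum of reciprocal cell counts
    se = math.sqrt(1 / exposed_cases + 1 / exposed_controls
                   + 1 / unexposed_cases + 1 / unexposed_controls)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, lower, upper

# Hypothetical counts: viral IID cases with winter onset vs. other onsets
or_, lower, upper = odds_ratio_ci(40, 60, 80, 250)
```

A CI whose lower bound exceeds 1 (as for the winter-onset OR reported in the abstract) indicates an association unlikely to be due to chance at the 5% level.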
Englerophytum and Synsepalum are two closely related genera of trees and shrubs from the African tropics. Previous molecular studies have shown that these genera collectively form a clade within the subfamily Chrysophylloideae (Sapotaceae). However, little is known about the inter-relationships of the taxa within the Englerophytum–Synsepalum clade. In this study, nuclear ribosomal DNA and plastid trnH–psbA sequences were used to estimate the phylogeny within the clade. Results indicate that the clade consists of six major lineages, two composed solely of taxa from the genus Englerophytum and four composed of taxa from the genus Synsepalum. Each lineage can be distinguished by suites of vegetative and floral characters. Leaf venation patterns, calyx fusion, style length and staminodal structure were among the most useful characters for distinguishing clades. Some of the subclades within the Englerophytum–Synsepalum clade were also found to closely fit descriptions of former genera, most of which were described by Aubréville, that have since been placed in synonymy with Englerophytum and Synsepalum. The clade with the type species of Englerophytum also contains the type species of the genera Wildemaniodoxa and Zeyherella, which are confirmed as synonyms.
To enhance enrollment into randomized clinical trials (RCTs), we proposed electronic health record-based clinical decision support for patient–clinician shared decision-making about care and RCT enrollment, based on “mathematical equipoise.”
As an example, we created the Knee Osteoarthritis Mathematical Equipoise Tool (KOMET) to determine the presence of patient-specific equipoise between treatments for the choice between total knee replacement (TKR) and nonsurgical treatment of advanced knee osteoarthritis.
With input from patients and clinicians about important pain and physical function treatment outcomes, we created a database from non-RCT sources of knee osteoarthritis outcomes. We then developed multivariable linear regression models that predict 1-year individual-patient knee pain and physical function outcomes for TKR and for nonsurgical treatment. These predictions allowed detecting mathematical equipoise between these two options for patients eligible for TKR. Decision support software was developed to graphically illustrate, for a given patient, the degree of overlap of pain and functional outcomes between the treatments and was pilot tested for usability, responsiveness, and as support for shared decision-making.
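The idea of "mathematical equipoise" as overlap between the two treatments' predicted outcomes can be sketched roughly as follows. The interval construction, standard errors and point predictions here are illustrative assumptions, not KOMET's actual models or overlap criterion:

```python
def prediction_interval(point, se, z=1.96):
    """Approximate 95% prediction interval around a model's point prediction."""
    return (point - z * se, point + z * se)

def intervals_overlap(a, b):
    """True if two (low, high) intervals share any range."""
    return a[0] <= b[1] and b[0] <= a[1]

# Hypothetical 1-year knee-pain predictions (lower score = less pain)
tkr = prediction_interval(point=25.0, se=6.0)          # total knee replacement
nonsurgical = prediction_interval(point=38.0, se=7.0)  # nonsurgical treatment

# Overlapping predicted outcomes suggest patient-specific equipoise
equipoise = intervals_overlap(tkr, nonsurgical)
```

When the intervals overlap, neither treatment is clearly predicted to be better for that patient, which is the situation in which RCT enrollment can be offered without disadvantaging the patient.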
The KOMET predictive regression model for knee pain included four patient-specific variables and had an r² value of 0.32; the model for physical functioning included six patient-specific variables and had an r² of 0.34. These models were incorporated into prototype KOMET decision support software, which was pilot tested in clinics and generally well received.
Use of predictive models and mathematical equipoise may help discern patient-specific equipoise to support shared decision-making for selecting between alternative treatments and considering enrollment into an RCT.
Household surveys are one of the most commonly used tools for generating insight into rural communities. Despite their prevalence, few studies comprehensively evaluate the quality of data derived from farm household surveys. We critically evaluated a series of standard reported values and indicators that are captured in multiple farm household surveys, and then quantified their credibility, consistency and, thus, their reliability. Surprisingly, even variables which might be considered ‘easy to estimate’ had instances of non-credible observations. In addition, measurements of maize yields and land owned were found to be less reliable than other stationary variables. This lack of reliability has implications for monitoring food security status, poverty status and the land productivity of households. Despite this rather bleak picture, our analysis also shows that if the same farm households are followed over time, the sample sizes needed to detect substantial changes are in the order of hundreds of surveys, and not in the thousands. Our research highlights the value of targeted and systematised household surveys and the importance of ongoing efforts to improve data quality. Improvements must be based on the foundations of robust survey design, transparency of experimental design and effective training. The quality and usability of such data can be further enhanced by improving coordination between agencies, incorporating mixed modes of data collection and continuing systematic validation programmes.
To identify potential participants for clinical trials, electronic health records (EHRs) are searched at potential sites. As an alternative, we investigated using medical devices used for real-time diagnostic decisions for trial enrollment.
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST-elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph, creating the e-ACI-TIPI, using the same data as the original ACI-TIPI (development set n = 3,453; test set n = 2,315). We also tested both instruments on data from emergency department electrocardiographs from across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared their performance with that of cohorts identified from EHR data at the hospitals.
Receiver-operating characteristic (ROC) curve areas on the test set were excellent, 0.89 for the ACI-TIPI and 0.84 for the e-ACI-TIPI, as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, with very good calibration. When tested for detection of patients with > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than did EHRs.
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
Fe deficiency is relatively common in pregnancy and has both short- and long-term consequences. However, little is known about the effect on the metabolism of other micronutrients. A total of fifty-four female rats were fed control (50 mg Fe/kg) or Fe-deficient diets (7·5 mg/kg) before and during pregnancy. Maternal liver, placenta and fetal liver were collected at day 21 of pregnancy for Cu and Zn analysis and to measure expression of the major genes of Cu and Zn metabolism. Cu levels increased in the maternal liver (P=0·002) and placenta (P=0·018) of Fe-deficient rats. Zn increased (P<0·0001) and Cu decreased (P=0·006) in the fetal liver. Hepatic expression of the Cu chaperones antioxidant 1 Cu chaperone (P=0·042) and cytochrome c oxidase Cu chaperone (COX17, P=0·020) decreased in the Fe-deficient dams, while the expression of the genes of Zn metabolism was unaltered. In the placenta, Fe deficiency reduced the expression of the chaperone for superoxide dismutase 1, Cu chaperone for superoxide dismutase (P=0·030), ceruloplasmin (P=0·042) and Zn transport genes, ZRT/IRT-like protein 4 (ZIP4, P=0·047) and Zn transporter 1 (ZnT1, P=0·012). In fetal liver, Fe deficiency increased COX17 (P=0·020), ZRT/IRT-like protein 14 (P=0·036) and ZnT1 (P=0·0003) and decreased ZIP4 (P=0·004). The results demonstrate that Fe deficiency during pregnancy has opposite effects on Cu and Zn levels in the fetal liver. This may, in turn, alter metabolism of these nutrients, with consequences for development in the fetus and the neonate.
Grommet insertion is a common surgical procedure in children. Long waiting times for grommet insertion are not unusual. This project aimed to streamline the process by introducing a pathway for audiologists to directly schedule children meeting National Institute for Health and Care Excellence Clinical Guideline 60 (‘CG60’) for grommet insertion.
Method and results
A period from June to November 2014 was retrospectively audited. Mean duration between the first audiology appointment and grommet insertion was 294.5 days (median = 310 days). Implementing the direct-listing pathway reduced the duration between first audiology appointment and grommet insertion (mean = 232 days; median = 231 days). There has been a reduction in the time between the first audiology appointment and surgery (mean difference of 62.5 days; p = 0.024), and a reduction in the time between second audiology appointment and surgery (28 days; p = 0.009).
Direct-listing pathways for grommet insertion can reduce waiting times and expedite surgery. Implementation involves a simple alteration of current practice, adhering to National Institute for Health and Care Excellence Clinical Guideline 60. The ultimate decision regarding surgery still rests with ENT specialists.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the first second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
Planning mental health carer services requires information about the number of carers, their characteristics, service use and unmet support needs. Available Australian estimates vary widely due to different definitions of mental illness and the types of carers included. This study aimed to provide a detailed profile of Australian mental health carers using a nationally representative household survey.
The number of mental health carers, characteristics of carers and their care recipients, caring hours and tasks provided, service use and unmet service needs were derived from the national 2012 Survey of Disability, Ageing and Carers. Co-resident carers of adults with a mental illness were compared with those caring for people with physical health and other cognitive/behavioural conditions (e.g., autism, intellectual disability, dementia) on measures of service use, service needs and aspects of their caring role.
In 2012, there were 225 421 co-resident carers of adults with mental illness in Australia, representing 1.0% of the population, and an estimated further 103 813 mental health carers not living with their care recipient. The majority of co-resident carers supported one person with mental illness, usually their partner or adult child. Mental health carers were more likely than physical health carers to provide emotional support (68.1% v. 19.7% of carers) and less likely to assist with practical tasks (64.1% v. 86.6%) and activities of daily living (31.9% v. 48.9%). Of co-resident mental health carers, 22.5% or 50 828 people were confirmed primary carers – the person providing the most support to their care recipient. Many primary mental health carers (37.8%) provided more than 40 h of care per week. Only 23.8% of primary mental health carers received government income support for carers and only 34.4% received formal service assistance in their caring role, while 49.0% wanted more support. Significantly more primary mental health than primary physical health carers were dissatisfied with received services (20.0% v. 3.2%), and 35.0% did not know what services were available to them.
Results reveal a sizable number of mental health carers with unmet needs in the Australian community, particularly with respect to financial assistance and respite care, and that these carers are poorly informed about available supports. The prominence of emotional support and their greater dissatisfaction with services indicate a need to better tailor carer services. If implemented carefully, recent Australian reforms including the Carer Gateway and National Disability Insurance Scheme hold promise for improving mental health carer supports.
Studies have consistently shown that subthreshold depression is associated with an increased risk of developing major depression. However, no study has yet calculated a pooled estimate that quantifies the magnitude of this risk across multiple studies.
We conducted a systematic review to identify longitudinal cohort studies containing data on the association between subthreshold depression and future major depression. A baseline meta-analysis was conducted using the inverse variance heterogeneity method to calculate the incidence rate ratio (IRR) of major depression among people with subthreshold depression relative to non-depressed controls. Subgroup analyses were conducted to investigate whether IRR estimates differed between studies categorised by age group or sample type. Sensitivity analyses were also conducted to test the robustness of baseline results to several sources of study heterogeneity, such as the case definition for subthreshold depression.
Data from 16 studies (n = 67 318) revealed that people with subthreshold depression had an increased risk of developing major depression (IRR = 1.95, 95% confidence interval 1.28–2.97). Subgroup analyses estimated similar IRRs for different age groups (youth, adults and the elderly) and sample types (community-based and primary care). Sensitivity analyses demonstrated that baseline results were robust to different sources of study heterogeneity.
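The pooling step can be illustrated with a simplified fixed-effect inverse-variance sketch. Note that the study used the inverse variance heterogeneity (IVhet) estimator, a variant that handles between-study heterogeneity differently; the per-study IRRs and CIs below are hypothetical, not the 16 included studies:

```python
import math

# Hypothetical per-study incidence rate ratios with 95% CIs (lower, upper)
studies = [(1.6, 1.1, 2.3), (2.4, 1.5, 3.8), (1.9, 1.2, 3.0)]

num = den = 0.0
for irr, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # recover SE from CI width
    w = 1.0 / se ** 2                                # inverse-variance weight
    num += w * math.log(irr)
    den += w

pooled = math.exp(num / den)                         # pooled IRR
pooled_ci = (math.exp(num / den - 1.96 / math.sqrt(den)),
             math.exp(num / den + 1.96 / math.sqrt(den)))
```

Working on the log scale keeps the ratio estimates approximately normal, and more precise studies (narrower CIs) receive proportionally more weight.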
The results of this study support the scaling up of effective indicated prevention interventions for people with subthreshold depression, regardless of age group or setting.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Risk adjustment is needed to fairly compare central-line–associated bloodstream infection (CLABSI) rates between hospitals. Until 2017, the Centers for Disease Control and Prevention (CDC) methodology adjusted CLABSI rates only by type of intensive care unit (ICU). The 2017 CDC models also adjust for hospital size and medical school affiliation. We hypothesized that risk adjustment would be improved by including patient demographics and comorbidities from electronically available hospital discharge codes.
Using a cohort design across 22 hospitals, we analyzed data from ICU patients admitted between January 2012 and December 2013. Demographics and International Classification of Diseases, Ninth Edition, Clinical Modification (ICD-9-CM) discharge codes were obtained for each patient, and CLABSIs were identified by trained infection preventionists. Models adjusting only for ICU type and for ICU type plus patient case mix were built and compared using discrimination and standardized infection ratio (SIR). Hospitals were ranked by SIR for each model to examine and compare the changes in rank.
Overall, 85,849 ICU patients were analyzed and 162 (0.2%) developed CLABSI. The significant variables added to the ICU model were coagulopathy, paralysis, renal failure, malnutrition, and age. The C statistics were 0.55 (95% CI, 0.51–0.59) for the ICU-type model and 0.64 (95% CI, 0.60–0.69) for the ICU-type plus patient case-mix model. When the hospitals were ranked by adjusted SIRs, 10 hospitals (45%) changed rank when comorbidity was added to the ICU-type model.
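The standardized infection ratio used to rank hospitals is simply the observed infection count divided by the model-expected count (the sum of each patient's predicted CLABSI probability). A minimal sketch with hypothetical hospitals and counts:

```python
# Hypothetical hospitals: "expected" is the sum of model-predicted CLABSI
# probabilities over a hospital's ICU patients; "observed" is the count
# confirmed by infection-preventionist review.
hospitals = {
    "A": {"observed": 12, "expected": 8.0},
    "B": {"observed": 5, "expected": 9.5},
    "C": {"observed": 7, "expected": 7.0},
}

# SIR = observed / expected; SIR < 1 means fewer infections than predicted
sirs = {name: h["observed"] / h["expected"] for name, h in hospitals.items()}
ranking = sorted(sirs, key=sirs.get)  # best (lowest SIR) first
```

Because the expected count depends on the risk-adjustment model, adding patient case-mix variables changes the denominators and can therefore reorder hospitals, as the 45% rank change above demonstrates.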
Our risk-adjustment model for CLABSI using electronically available comorbidities demonstrated better discrimination than did the CDC model. The CDC should strongly consider comorbidity-based risk adjustment to more accurately compare CLABSI rates across hospitals.
Cognitive deficits are a core feature of schizophrenia, and impairments in most domains are thought to be stable over the course of the illness. However, cross-sectional evidence indicates that some areas of cognition, such as visuospatial associative memory, may be preserved in the early stages of psychosis, but become impaired in later established illness stages. This longitudinal study investigated change in visuospatial and verbal associative memory following psychosis onset.
In total, 95 first-episode psychosis (FEP) patients and 63 healthy controls (HC) were assessed on neuropsychological tests at baseline, with 38 FEP patients and 22 HCs returning for follow-up assessment at 5–11 years. Visuospatial associative memory was assessed using the Cambridge Neuropsychological Test Automated Battery Visuospatial Paired-Associate Learning task, and verbal associative memory was assessed using the Verbal Paired Associates subtest of the Wechsler Memory Scale – Revised.
Visuospatial and verbal associative memory at baseline did not differ significantly between FEP patients and HCs. However, over follow-up, visuospatial associative memory deteriorated significantly for the FEP group relative to healthy individuals. Conversely, verbal associative memory improved to a degree similar to that observed in HCs. In the FEP cohort, visuospatial (but not verbal) associative memory ability at baseline was associated with functional outcome at follow-up.
Areas of cognition that develop prior to psychosis onset, such as visuospatial and verbal associative memory, may be preserved early in the illness. Later deterioration in visuospatial memory ability may relate to progressive structural and functional brain abnormalities that occur following psychosis onset.