Approximately 70% of the 30 000 known bee (Hymenoptera) species and most flower-visiting, solitary wasps (Hymenoptera) nest in the ground. However, the nesting behaviours of most ground-nesting bees and wasps are poorly understood. Habitat loss, including loss of nesting habitat, threatens populations of ground-nesting bees and wasps. Most studies of ground-nesting bees and wasps implement trapping methods that capture foraging individuals but provide little insight into the nesting preferences of these taxa. Some researchers have suggested that emergence traps may provide a suitable means of determining ground-nesting bee and wasp abundance. We sought to evaluate nest-site selection of ground-nesting bees and wasps using emergence traps in two study systems: (1) planted wildflower enhancement plots and fallow control plots in agricultural land; and (2) upland pine and hammock habitat in forests. Over the course of three years (2015–2017), we collected 306 ground-nesting bees and wasps across all study sites from emergence traps. In one study, we compared captures per trap between coloured pan traps and emergence traps and found that coloured pan traps captured far more ground-nesting bees and wasps than did emergence traps. Our emergence trap data also suggest that ground-nesting bees and wasps are more apt to nest within wildflower enhancement plots than in fallow control plots, and in upland pine habitats than in hammock forests. In conclusion, emergence traps have the potential to be a unique tool for understanding the habitat requirements of ground-nesting bees and wasps.
We implemented a cross-sectional study in Tana River County, Kenya, a Rift Valley fever (RVF)-endemic area, to quantify the strength of association between RVF virus (RVFv) seroprevalences in livestock and humans, and their respective intra-cluster correlation coefficients (ICCs). The study involved 1932 livestock from 152 households and 552 humans from 170 households. Serum samples were collected and screened for anti-RVFv immunoglobulin G (IgG) antibodies using an inhibition IgG enzyme-linked immunosorbent assay (ELISA). The data collected were analysed using generalised linear mixed effects models, with herd/household and village fitted as random variables. The overall RVFv seroprevalences in livestock and humans were 25.41% (95% confidence interval (CI) 23.49–27.42%) and 21.20% (17.86–24.85%), respectively. The presence of at least one seropositive animal in a household was associated with 2.23-fold higher odds of RVFv exposure in people (95% CI 1.03–4.84). The ICCs associated with RVFv seroprevalence in livestock were 0.30 (95% CI 0.19–0.44) and 0.22 (95% CI 0.12–0.38) within and between herds, respectively. These findings suggest that there is greater variability of RVFv exposure between than within herds. We discuss ways of using these ICC estimates in observational surveys for RVF in endemic areas and postulate that the design of sentinel herd surveillance should consider patterns of RVF clustering to enhance its effectiveness as an early warning system for RVF epidemics.
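The within- and between-herd ICCs reported above can be read through the latent-variable formulation for a two-level nested logistic mixed model, in which the logistic residual variance is fixed at π²/3. A minimal sketch; the variance components below are illustrative assumptions chosen only to roughly reproduce the reported ICCs, not values taken from the study:

```python
import math

def latent_scale_iccs(var_village, var_herd):
    """ICCs for a two-level nested logistic mixed model (latent-variable scale).

    Two animals in the same herd share the village and herd random effects;
    two animals in different herds of the same village share only the
    village effect. The logistic residual variance is fixed at pi^2 / 3.
    """
    resid = math.pi ** 2 / 3                              # ~3.29
    total = var_village + var_herd + resid
    icc_within_herd = (var_village + var_herd) / total    # same herd
    icc_between_herd = var_village / total                # same village, different herd
    return icc_within_herd, icc_between_herd

# Illustrative (assumed) variance components, not estimates from the study:
within, between = latent_scale_iccs(var_village=1.03, var_herd=0.38)
# within ~ 0.30, between ~ 0.22, matching the reported ICCs
```

The gap between the two ICCs on this scale is what the abstract summarizes as greater variability between than within herds.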
Adolescence is a critical time point in the lifecourse. LifeLab is an educational intervention engaging adolescents in understanding Developmental Origins of Health and Disease (DOHaD) concepts and the impact of the early life environment on future health, benefitting both their long-term health and that of the next generation. We aimed to assess whether engaging adolescents with DOHaD concepts improves scientific literacy and whether engagement alone improves health behaviours.
Six schools were randomized, three to intervention and three to control. Outcome measures were change in knowledge and in intended and actual behaviour in relation to diet and lifestyle. A total of 333 students completed baseline and follow-up questionnaires. At 12 months, intervention students showed greater understanding of DOHaD concepts. No sustained changes in behaviours were identified.
Adolescents’ engagement with DOHaD concepts can be improved and maintained over 12 months. Such engagement does not itself translate into behaviour change. The intervention has consequently been revised to include additional components beyond engagement alone.
Novel approaches to improving disaster response have begun to include the use of big data and information and communication technology (ICT). However, there remains a dearth of literature on the use of these technologies in disasters. We have conducted an integrative literature review on the role of ICT and big data in disasters. Included in the review were 113 studies that met our predetermined inclusion criteria. Most studies used qualitative methods (39.8%, n=45) over mixed methods (31%, n=35) or quantitative methods (29.2%, n=33). Nearly 80% (n=88) covered only the response phase of disasters and only 15% (n=17) of the studies addressed disasters in low- and middle-income countries. The 4 most frequently mentioned tools were geographic information systems, social media, patient information, and disaster modeling. We suggest testing ICT and big data tools more widely, especially outside of high-income countries, as well as in nonresponse phases of disasters (eg, disaster recovery), to increase an understanding of the utility of ICT and big data in disasters. Future studies should also include descriptions of the intended users of the tools, as well as implementation challenges, to assist other disaster response professionals in adapting or creating similar tools. (Disaster Med Public Health Preparedness. 2019;13:353–367)
BACKGROUND: Intracranial growing teratoma syndrome (IGTS) is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcome of IGTS in patients in 21 North American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumours. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and beta-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median time of 2 months (range 0.5-32) from diagnosis, during chemotherapy in 85%, during radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all patients, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in less than 5% of patients with GCT, most commonly after initiation of chemotherapy. IGTS was more common in patients with IT only on biopsy than in those with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favourable.
This paper discusses the sustainability of livestock systems, emphasising bidirectional relations with animal health. We review conventional and contrarian thinking on sustainability and argue that, in the most common approaches to understanding sustainability, health aspects have been under-examined. A literature review reveals deep concerns over the sustainability of livestock systems; we recognise that interventions are required to shift to more sustainable trajectories, and we explore approaches to prioritising interventions in different systems, focusing on those that lead to better health. A previously proposed three-tiered categorisation of ‘hot spots’, ‘cold spots’ and ‘worried well’ animal health trajectories provides a mental model that, by taking into consideration the different animal health status, animal health risks, service response needs and key drivers in each system, can help identify and implement interventions. Combining sustainability concepts with animal health trajectories allows for a richer analysis, and we apply this to three case studies drawn from North Africa and the Middle East; Bangladesh; and the Eastern Cape of South Africa. We conclude that the quest for sustainability of livestock production systems from the perspective of human and animal health is elusive and difficult to reconcile with the massive anticipated growth in demand for livestock products, mainly in low- and middle-income countries, as well as the aspirations of poor livestock keepers for better lives. Nevertheless, improving the health of livestock can contribute to sustainability both by reducing the negative health impacts of livestock and by increasing the efficiency of production. However, the choice of the most appropriate options must be underpinned by an understanding of agro-ecology, economy and values.
We argue that a new pillar of One Health should be added to the three traditional sustainability pillars of economics, society and environment when addressing livestock systems.
A clean hot-water drill was used to gain access to Subglacial Lake Whillans (SLW) in late January 2013 as part of the Whillans Ice Stream Subglacial Access Research Drilling (WISSARD) project. Over 3 days, we deployed an array of scientific tools through the SLW borehole: a downhole camera, a conductivity–temperature–depth (CTD) probe, a Niskin water sampler, an in situ filtration unit, three different sediment corers, a geothermal probe and a geophysical sensor string. Our observations confirm the existence of a subglacial water reservoir whose presence was previously inferred from satellite altimetry and surface geophysics. Subglacial water is about two orders of magnitude less saline than sea water (0.37–0.41 psu vs 35 psu) and two orders of magnitude more saline than pure drill meltwater (<0.002 psu). It reaches a minimum temperature of –0.55 °C, consistent with depression of the freezing point by 7.019 MPa of water pressure. Subglacial water was turbid and remained turbid following filtration through 0.45 µm filters. The recovered sediment cores, which sampled down to 0.8 m below the lake bottom, contained a macroscopically structureless diamicton with shear strength between 2 and 6 kPa. Our main operational recommendation for future subglacial access through water-filled boreholes is to supply enough heat to the top of the borehole to keep it from freezing.
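The quoted minimum temperature can be sanity-checked with a back-of-the-envelope calculation. The coefficients below are standard textbook approximations (the Clausius–Clapeyron pressure slope for pure water ice and the dilute-solution salinity depression), not values taken from the study:

```python
# Rough freezing-point estimate for Subglacial Lake Whillans conditions.
# Both coefficients are standard approximations (assumed, not from the study):
PRESSURE_COEFF = 0.0742   # K per MPa of water pressure (pure water ice)
SALINITY_COEFF = 0.057    # K per psu of salinity (dilute-solution limit)

pressure_mpa = 7.019      # water pressure reported in the abstract
salinity_psu = 0.40       # mid-range of the reported 0.37-0.41 psu

freezing_point_c = -(PRESSURE_COEFF * pressure_mpa
                     + SALINITY_COEFF * salinity_psu)
# ~ -0.54 degC, close to the observed minimum of -0.55 degC
```

Pressure dominates: the salinity term contributes only about 0.02 K of the depression.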
Macro-morphological features traditionally used to segregate genera in Parmeliaceae have been shown to be highly plastic, placing limits on their taxonomic value. Here we aim to elucidate the evolutionary relationships of the genera Relicina and Relicinopsis and reassess the phenotypic features traditionally used to separate these genera. To this end, we gathered ribosomal DNA sequences of ITS, nuLSU and mtSSU and analyzed them in a phylogenetic framework. Relicina was recovered as paraphyletic, with Relicinopsis nested within, and three different clades were identified within Relicina. Alternative hypothesis tests significantly rejected the monophyly of Relicina. Our results indicate that the presence or absence of bulbate cilia is of limited taxonomic value in this clade. Based on differences in conidia, however, we propose to accept Relicinopsis as a subgenus within Relicina as Relicina subgen. Relicinopsis (Elix & Verdon) Kirika, Divakar & Lumbsch. It is proposed that five new combinations of species previously classified in Relicinopsis be placed in Relicina.
Firestone & Scholl (F&S) rely on three problematic assumptions about the mind (modularity, reflexiveness, and context-insensitivity) to argue cognition does not fundamentally influence perception. We highlight evidence indicating that perception, cognition, and emotion are constructed through overlapping, distributed brain networks characterized by top-down activity and context-sensitivity. This evidence undermines F&S's ability to generalize from case studies to the nature of perception.
We describe a case of anomalous left coronary artery from the pulmonary artery in association with total anomalous pulmonary venous return. The infant was diagnosed with total anomalous pulmonary venous return at 6 weeks of age and underwent successful surgical repair. On routine follow-up, he was found to have an anomalous left coronary artery from the pulmonary artery without evidence of mitral regurgitation or left ventricular dysfunction. The presence of the left-to-right shunt and secondary elevation in pulmonary artery pressures likely masked the usual findings associated with this coronary anomaly.
Children from low-income countries consuming predominantly plant-based diets and few animal-source foods are considered to be at risk of Fe deficiency. The present study determined the Fe status of children from resource-limited rural households.
A cross-sectional study.
Twenty-six kebeles (the smallest administrative unit) from six zones of the Amhara region, Ethiopia.
Children aged 54–60 months (n 628).
Grain, roots or tubers were the main dietary components consumed by 100 % of the study participants, followed by pulses, legumes or nuts (66·6 %). Consumption of fruit and vegetables (19·3 %) and meat, poultry and fish (2·2 %) was low. Children had a mean dietary diversity score of 2·1 (sd 0·8). Most children (74·8 %, n 470) were in the lowest dietary diversity group (1–2 food groups). Rate of any morbidity in the preceding 14 d was 22·9 % (n 114). Infection or inflammation (α1-acid glycoprotein >1·2 g/l) was present in 30·2 % (n 184) of children. Children had a high rate of stunting (43·2 %). Of the total sample, 13·6 % (n 82) of children were anaemic, 9·1 % (n 57) were Fe deficient and 5·3 % (n 32) had Fe-deficiency anaemia. Fe-deficiency erythropoiesis was present in 14·2 % (n 60) of children.
Despite consuming a predominantly plant-based diet with few animal-source foods, the children showed a low prevalence of Fe-deficiency anaemia. This illustrates that dietary patterns do not necessarily mirror Fe biochemical status; thus, Fe-related interventions require biochemical screening.
In October 2008, Medicare ceased additional payment for hospital-acquired conditions not present on admission. We evaluated the policy’s differential impact in hospitals with high vs low operating margins. Medicare’s payment policy may have had an impact on reducing central line–associated bloodstream infections in hospitals with low operating margins.
Infect. Control Hosp. Epidemiol. 2015;37(1):100–103
Policymakers may wish to align healthcare payment and quality of care while minimizing unintended consequences, particularly for safety net hospitals.
To determine whether the 2008 Centers for Medicare and Medicaid Services Hospital-Acquired Conditions policy had a differential impact on targeted healthcare-associated infection rates in safety net compared with non–safety net hospitals.
Interrupted time-series design.
SETTING AND PARTICIPANTS
Nonfederal acute care hospitals that reported central line–associated bloodstream infection and ventilator-associated pneumonia rates to the Centers for Disease Control and Prevention’s National Health Safety Network from July 1, 2007, through December 31, 2013.
We did not observe changes in the slope of targeted infection rates in the postpolicy period compared with the prepolicy period for either safety net (postpolicy vs prepolicy ratio, 0.96 [95% CI, 0.84–1.09]) or non–safety net (0.99 [0.90–1.10]) hospitals. Controlling for prepolicy secular trends, we did not detect differences in an immediate change at the time of the policy between safety net and non–safety net hospitals (P for 2-way interaction, .87).
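The slope-change comparison reported above can be read through a standard segmented (interrupted time-series) regression. A generic sketch of such a model, with symbols of our own choosing rather than the authors':

```latex
\log E[Y_t] \;=\; \beta_0 \;+\; \beta_1 t \;+\; \beta_2\,\mathbb{1}[t \ge t_0]
\;+\; \beta_3\,(t - t_0)\,\mathbb{1}[t \ge t_0]
```

Here \(t_0\) is the policy date, \(\exp(\beta_3)\) is the post- vs prepolicy ratio of slopes (the quantity reported as 0.96 and 0.99 above), and \(\exp(\beta_2)\) is the immediate level change at the policy. For the safety net vs non–safety net comparison, each coefficient is additionally interacted with a hospital-group indicator, giving the 2-way interaction test reported.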
The Centers for Medicare and Medicaid Services Hospital-Acquired Conditions policy did not have an impact, either positive or negative, on already declining rates of central line–associated bloodstream infection in safety net or non–safety net hospitals. Continued evaluations of the broad impact of payment policies on safety net hospitals will remain important as the use of financial incentives and penalties continues to expand in the United States.
To identify factors associated with the development of surgical site infection (SSI) among adult patients undergoing renal transplantation.
A retrospective cohort study.
An urban tertiary care center in Baltimore, Maryland, with a well-established renal transplantation program that performs ~200–250 renal transplant procedures annually.
A total of 441 adult patients underwent renal transplantation between January 1, 2010, and December 31, 2011. Of these 441 patients, 66 (15%) developed an SSI; of these 66, 31 (47%) were superficial incisional infections and 35 (53%) were deep-incisional or organ-space infections. The average body mass index (BMI) among this patient cohort was 29.7; 84 (42%) were obese (BMI >30). Patients who developed an SSI had a greater mean BMI (31.7 vs 29.4; P=.004) and were more likely to have a history of peripheral vascular disease, rheumatologic disease, and narcotic abuse. History of cerebral vascular disease was protective. Multivariate analysis showed BMI (odds ratio [OR], 1.06; 95% confidence interval [CI], 1.02–1.11) and past history of narcotic use/abuse (OR, 4.86; 95% CI, 1.24–19.12) to be significantly associated with development of SSI after controlling for National Healthcare Surveillance Network (NHSN) score and presence of cerebrovascular, peripheral vascular, and rheumatologic disease.
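As a reading aid for the per-unit BMI odds ratio, the following generic arithmetic (our own illustration, not from the paper) shows how a per-unit OR compounds multiplicatively across larger BMI differences:

```python
OR_PER_UNIT = 1.06   # reported odds ratio per one BMI unit

# Odds ratios compound multiplicatively on the odds scale, so a k-unit
# difference in BMI corresponds to OR_PER_UNIT ** k:
or_5_units = OR_PER_UNIT ** 5     # ~1.34: ~34% higher odds for a 5-unit higher BMI
or_10_units = OR_PER_UNIT ** 10   # ~1.79: ~79% higher odds for a 10-unit higher BMI
```

A seemingly modest per-unit OR therefore implies a substantial difference in SSI odds between, say, a BMI of 25 and a BMI of 35.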
We identified higher BMI as a risk factor for the development of SSI following renal transplantation. Notably, neither aggregate comorbidity scores nor NHSN risk index were associated with SSI in this population. Additional risk adjustment measures and research in this area are needed to compare SSIs across transplant centers.
Children with a secundum atrial septal defect are usually asymptomatic and are referred for elective closure after 3–4 years of age; however, in premature infants with chronic lung disease, bronchopulmonary dysplasia, or pulmonary hypertension, increased pulmonary blood flow secondary to a left-to-right atrial shunt may exacerbate their condition. Closure of the atrial septal defect in these patients can result in significant clinical improvement. We report the cases of two premature infants with chronic lung disease who underwent atrial septal defect closure with the Gore HELEX Septal Occluder and discuss the technical aspects of using the device in these patients and their clinical outcomes.
Breast cancer (BrCa) is the second commonest cause of cancer-related deaths in women. Metastatic breast cancer exhibits a high affinity to bone, leading to debilitating skeletal complications associated with significant morbidity and poor prognosis. Traditional in vitro and in vivo BrCa bone metastasis models carry many inherent limitations with regard to controllability, reproducibility, and flexibility of design. The objective of this research is therefore to use a 3D bioprinting system and nanomaterials to recreate a biomimetic and tunable bone model suitable for the effective simulation and study of metastatic BrCa invading and colonizing a bone environment. For this purpose, we designed and 3D printed a series of scaffolds comprising a bone microstructure and nanohydroxyapatite (nHA, an inorganic nanocomponent of bone). The size and geometry of the bone microstructure were varied, with 250 and 150 µm pores in repeating square and hexagon patterns, for a total of four different pore geometries. The 3D bioprinted scaffolds were subsequently conjugated with nHA using an acetylation chemical functionalization process and then characterized by scanning electron microscopy (SEM). SEM imaging showed that our designed microfeatures were printable at the predesigned resolutions described above. Imaging further confirmed that acetylation effectively attached nHA to the surface of the scaffolds and induced a nanoroughness. Metastatic BrCa cell adhesion at 4 h and proliferation at 1, 3 and 5 days were investigated in the bone model in vitro. The cell adhesion and proliferation results showed that all scaffolds are cytocompatible for BrCa cell growth; in particular, the nHA scaffolds with small hexagonal pores had the highest cell density. Given these data, we postulate that our 3D printed nHA scaffolds may make effective biomimetic environments for studying BrCa bone metastasis.
With the proportion of older adults in Hong Kong projected to double over the next 30 years, it is important to develop measures for detecting individuals in the earliest stage of Alzheimer's disease (AD; Clinical Dementia Rating (CDR) 0.5). We tested the utility of a non-verbal prospective memory (PM; the ability to remember what one has to do when a specific event occurs in the future) task as an early marker for AD in Hong Kong Chinese.
A large community dwelling sample of older adults who are healthy controls (CDR 0, N = 125), in the earliest stage of AD (CDR 0.5, N = 125), or with mild AD (CDR 1, N = 30) participated in this study. Their reaction time/accuracy data were analyzed by mixed-factor analyses of variance to compare the performance of the three CDR groups. Logistic regression analyses were performed to test the discriminative power of these measures for CDR 0 versus 0.5 participants.
Prospective memory performance declined as a function of AD severity: CDR 0 > CDR 0.5 > CDR 1, suggesting effects of both early-stage AD and AD progression on PM. After partialling out the variance explained by psychometric measures (e.g., ADAS-Cog), reaction time/accuracy measures that reflected PM still significantly discriminated between CDR 0 and CDR 0.5 participants in most cases.
The effectiveness of PM measures in discriminating individuals in the earliest stage of AD from healthy older adults suggests that these measures should be further developed as tools for early-stage AD discrimination.