Vegetable consumption is driven by children's sensory acceptance, but little is known about the sensory characteristics of the vegetables children commonly eat. A greater understanding could help in designing more effective interventions to raise intakes and thus realise beneficial health effects. This study sought to: (1) understand vegetable consumption patterns in children, with and without potatoes, using the Australian and WHO definitions; (2) describe the sensory characteristics of vegetables consumed by children, by age group, level of intake and variety; and (3) determine the vegetable preferences of children, by age group, level of intake and variety.
Analysis of National Nutrition Survey data, combining reported vegetable intake with sensory characteristics described by a trained panel.
A nationally representative sample of Australian children and adolescents aged 2–17·9 years (n 2812).
While consumption increased in older age groups, variety remained constant. Greater variety, however, was associated with higher vegetable consumption. Potato intake increased with total consumption, contributing over one-third of total vegetable intake among the highest vegetable consumers and in older age groups. Children favoured relatively sweet vegetables and reported lower consumption of bitter vegetables. There were no differences in the sensory properties of vegetables consumed by children in different age groups. After potatoes, carrots, sweetcorn, vegetable mixtures, and fruiting and cruciferous types were the preferred vegetables.
Children tend to prefer vegetables with sensory characteristics consistent with innate taste preferences (sweet and low bitterness). Increasing exposure to a variety of vegetables may help increase the persistently low vegetable consumption patterns of children.
Due to shortages of N95 respirators during the coronavirus disease 2019 (COVID-19) pandemic, it is necessary to estimate the number of N95s required for healthcare workers (HCWs) to inform manufacturing targets and resource allocation.
We developed a model to determine the number of N95 respirators needed for HCWs both in a single acute-care hospital and the United States.
For an acute-care hospital with 400 all-cause monthly admissions, the number of N95 respirators needed to manage COVID-19 patients admitted during a month ranges from 113 (95% interpercentile range [IPR], 50–229) if 0.5% of admissions are COVID-19 patients to 22,101 (95% IPR, 5,904–25,881) if 100% of admissions are COVID-19 patients (assuming single use per respirator, and 10 encounters between HCWs and each COVID-19 patient per day). The number of N95s needed decreases to a range of 22 (95% IPR, 10–43) to 4,445 (95% IPR, 1,975–8,684) if each N95 is used for 5 patient encounters. Varying monthly all-cause admissions to 2,000 requires 6,645–13,404 respirators with a 60% COVID-19 admission prevalence, 10 HCW–patient encounters, and reusing N95s 5–10 times. Nationally, the number of N95 respirators needed over the course of the pandemic ranges from 86 million (95% IPR, 37.1–200.6 million) to 1.6 billion (95% IPR, 0.7–3.6 billion) as 5%–90% of the population is exposed (single-use). This number ranges from 17.4 million (95% IPR, 7.3–41 million) to 312.3 million (95% IPR, 131.5–737.3 million) using each respirator for 5 encounters.
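The published model is not specified in the abstract, but its core arithmetic (respirator demand scaling with admissions, COVID-19 prevalence, encounters per patient-day, and reuse) can be sketched as below. This is an illustrative point estimate only: the parameter names and the length-of-stay term are assumptions not stated in the abstract, and the actual model samples its inputs to produce the 95% interpercentile ranges quoted above.

```python
def n95s_needed(monthly_admissions, covid_fraction, encounters_per_day,
                length_of_stay_days, uses_per_respirator=1):
    """Point estimate of N95 respirators needed per month.

    Illustrative sketch only: length_of_stay_days is an assumed parameter
    (not given in the abstract), and the published model additionally
    samples inputs to produce interpercentile ranges.
    """
    covid_patients = monthly_admissions * covid_fraction
    # Each patient generates encounters_per_day HCW encounters over their stay.
    total_encounters = covid_patients * encounters_per_day * length_of_stay_days
    # Reusing a respirator across several encounters divides demand accordingly.
    return total_encounters / uses_per_respirator

# Reuse across 5 encounters cuts demand five-fold, mirroring the
# single-use vs. 5-encounter ranges reported in the abstract.
single_use = n95s_needed(400, 0.6, 10, 7, uses_per_respirator=1)
reused = n95s_needed(400, 0.6, 10, 7, uses_per_respirator=5)
```

The five-fold gap between `single_use` and `reused` is the same mechanism behind the abstract's drop from 22,101 to 4,445 respirators under 5-encounter reuse.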
We quantified the number of N95 respirators needed for a given acute-care hospital and nationally during the COVID-19 pandemic under varying conditions.
Optical tracking systems typically trade off astrometric precision against field of view. In this work, we showcase a networked approach to optical tracking, using very wide field-of-view imagers with relatively low astrometric precision, applied to the scheduled OSIRIS-REx slingshot manoeuvre around Earth on 22 Sep 2017. As part of a trajectory designed to take OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand to enable triangulation of the observations. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we extracted 2 090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109 262 km length of the OSIRIS-REx trajectory, demonstrating the capability of a networked approach to Space Surveillance and Tracking.
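The geometric idea behind combining line-of-sight observations from multiple sites can be illustrated with the textbook closest-point construction between two observation rays. This is a hedged sketch, not the paper's pipeline (which fits a full orbit to thousands of streak detections); the function name and inputs are hypothetical.

```python
def dot(u, v):
    """Dot product of two 3-vectors given as tuples."""
    return sum(a * b for a, b in zip(u, v))

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two observation rays.

    p1, p2: observer positions; d1, d2: line-of-sight direction vectors.
    Illustrative only -- a real pipeline fits an orbit to many such
    observations rather than intersecting rays pairwise.
    """
    w0 = tuple(a - b for a, b in zip(p1, p2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero when the rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = tuple(p + t1 * v for p, v in zip(p1, d1))  # closest point on ray 1
    q2 = tuple(p + t2 * v for p, v in zip(p2, d2))  # closest point on ray 2
    return tuple((x + y) / 2 for x, y in zip(q1, q2))
```

With perfect sightings the two rays intersect and the midpoint is the target position; with noisy wide-field imagery the residual gap between `q1` and `q2` indicates astrometric error, which a network of 13 sites can average down.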
We critically review the potential involvement of trimethylamine N-oxide (TMAO) as a link between diet, the gut microbiota and CVD. Generated primarily from dietary choline and carnitine by gut bacteria and hepatic flavin-containing mono-oxygenase (FMO) activity, TMAO could promote cardiometabolic disease when chronically elevated. However, control of circulating TMAO is poorly understood, and diet, age, body mass, sex hormones, renal clearance, FMO3 expression and genetic background may explain as little as 25 % of TMAO variance. The basis of elevations with obesity, diabetes, atherosclerosis or CHD is similarly ill-defined, although gut microbiota profiles/remodelling appear critical. Elevated TMAO could promote CVD via inflammation, oxidative stress, scavenger receptor up-regulation, reverse cholesterol transport (RCT) inhibition, and cardiovascular dysfunction. However, concentrations influencing inflammation, scavenger receptors and RCT (≥100 µm) are only achieved in advanced heart failure or chronic kidney disease (CKD), and greatly exceed the <1–5 µm levels implicated in some TMAO–CVD associations. There is also evidence that CVD risk is insensitive to TMAO variance beyond these levels in omnivores and vegetarians, and that major TMAO sources are cardioprotective. Assessment of the available evidence suggests that modest elevations in TMAO (≤10 µm) are a non-pathogenic consequence of diverse risk factors (ageing, obesity, dyslipidaemia, insulin resistance/diabetes, renal dysfunction), indirectly reflecting CVD risk without participating mechanistically. Nonetheless, TMAO may surpass a pathogenic threshold as a consequence of CVD/CKD, secondarily promoting disease progression. TMAO might thus reflect early CVD risk while providing a prognostic biomarker or secondary target in established disease, although mechanistic contributions to CVD await confirmation.
A classic example of microbiome function is its role in nutrient assimilation in both plants and animals, but other, less obvious roles are becoming more apparent, particularly in driving infectious and non-infectious disease outcomes and influencing host behaviour. However, numerous biotic and abiotic factors influence the composition of these communities, and host microbiomes can be susceptible to environmental change. How microbial communities will be altered by, and mitigate, the rapid environmental change we can expect in the next few decades remains to be seen. That said, given the enormous range of functional diversity conferred by microbes, there is currently something of a revolution in microbial bioengineering and biotechnology aimed at addressing real-world problems, including human and wildlife disease and crop and biofuel production. All of these concepts are explored in further detail throughout the book.
The RemoveDEBRIS mission was the first to successfully demonstrate, in orbit, a series of technologies that can be used for the active removal of space debris. The mission started in late 2014, sponsored by a grant from the EC, under which a consortium led by the Surrey Space Centre developed the mission from concept to in-orbit demonstrations, concluding in March 2019. Technologies for the capture of large space debris, such as a net and a harpoon, were successfully tested, together with hardware and software to retrieve data on the kinematics of non-cooperative target debris from observations carried out with on-board cameras. The final demonstration was the deployment of a drag-sail to increase the satellite's drag and accelerate its demise.
Introduction: Emergency Department (ED) health care professionals are responsible for providing team-based care to critically ill patients. Given this complex responsibility, simulation training is paramount. In situ simulation (ISS) has many cited benefits as a training strategy that targets on-duty staff and occurs in the actual patient environment. Several evidence-based frameworks identify staff buy-in as essential for successful ISS implementation; however, the attitudes of interdisciplinary front-line ED staff in this regard are unknown. The purpose of this study is to identify contextual trends in interdisciplinary opinions on routine ISS in the ED. Methods: A qualitative and quantitative review exploring the self-reported attitudes of interdisciplinary ED staff before, during and after the implementation of a routine ISS pilot program (5 sessions in 5 months) at the Charles V Keating Emergency and Trauma Center in Halifax from Feb–Nov 2018. Results: 149 surveys were received. Baseline support for ISS was high: 83% of respondents believed that the advantages of ISS outweigh the challenges, and 47% favoured simulation in the ED compared with 26% favouring the simulation bay; 28% were indifferent. The attitudes of direct participants in ISS were very positive, with 88% believing after participation that the benefits outweighed the challenges and 91% believing that they personally benefited from participating. A department-wide post-ISS pilot survey suggested a slight decrease in support: support for ISS dropped from 83% to 67%, a reduction that was not statistically significant (p = 0.098) but sizeable enough to warrant further investigation. Most notably, respondents reported increased support for simulation training in a simulation bay relative to ISS in the ED. Respondents still regarded simulation highly overall. Interestingly, when the results were stratified by position, staff physicians were the least positive.
Conclusion: Pre-pilot or baseline opinions of ISS were very positive, and participants all responded positively to the simulations. This study generates valuable insight into the perceptions of interdisciplinary ED staff regarding the implementation and perceived impact of routine ISS. This evidence can be used to inform future programming, though further investigation is warranted into why opinions post-intervention may have changed at the department level.
Recent evidence suggests that exercise plays a role in cognition and that the posterior cingulate cortex (PCC) can be divided into dorsal and ventral subregions based on distinct connectivity patterns.
To examine the effect of physical activity and the division of the PCC on brain functional connectivity measures in subjective memory complainers (SMC) carrying the ɛ4 allele of apolipoprotein E (APOE ɛ4).
Participants were 22 SMC carrying the APOE ɛ4 allele (ɛ4+; mean age 72.18 years) and 58 SMC non-carriers (ɛ4–; mean age 72.79 years). Connectivity of four dorsal and ventral seeds was examined. Relationships between PCC connectivity and physical activity measures were explored.
ɛ4+ individuals showed increased connectivity between the dorsal PCC and dorsolateral prefrontal cortex, and the ventral PCC and supplementary motor area (SMA). Greater levels of physical activity correlated with the magnitude of ventral PCC–SMA connectivity.
The results provide the first evidence that ɛ4+ individuals at increased risk of cognitive decline show distinct alterations in dorsal and ventral PCC functional connectivity.
Schistosomiasis in China has been substantially reduced by an effective control programme employing various measures, including bovine and human chemotherapy and the removal of bovines from endemic areas. To fulfil elimination targets, it will be necessary to identify other possible reservoir hosts for Schistosoma japonicum and include them in future control efforts. This study determined the infection prevalence of S. japonicum in rodents (0–9·21%), dogs (0–18·37%) and goats (6·9–46·4%) from the Dongting Lake area of Hunan province, using a combination of traditional coproparasitological techniques [the miracidial hatching technique (MHT) and the Kato-Katz (KK) thick smear technique] and molecular methods [quantitative real-time PCR (qPCR) and droplet digital PCR (ddPCR)]. We found a much higher prevalence in goats than previously recorded in this setting. Cattle and water buffalo were also examined using the same procedures and all were found to be infected, emphasising the occurrence of active transmission. qPCR and ddPCR were much more sensitive than the coproparasitological procedures, with both KK and MHT considerably underestimating the true prevalence in all animals surveyed. The high prevalence of S. japonicum in goats indicates that they are likely important reservoirs in schistosomiasis transmission, necessitating their inclusion as targets of control if the goal of elimination is to be achieved in China.
The buffalo has seasonal reproductive activity, with mating and non-mating periods occurring from late autumn to winter and from late spring to the beginning of autumn, respectively. The sperm glycocalyx plays an important role in reproduction as the first interface between sperm and environment. Semen quality is poorer during non-mating periods, so we aimed to evaluate whether there were also seasonal differences in the surface glycosylation pattern of mating period spermatozoa (MPS) compared with non-mating period spermatozoa (NMPS). The complexity of carbohydrate structures makes their analysis challenging, but the high-throughput microarray approach now provides a new tool for evaluating the glycosylation status of cells. We adopted a novel procedure in which spermatozoa were spotted on microarray slides, incubated with a panel of 12 biotinylated lectins and Cy3-conjugated streptavidin, and then signal intensity was detected using a microarray scanner. Both MPS and NMPS microarrays reacted with all the lectins and revealed that the expression of (i) O-glycans with NeuNAcα2-3Galβ1,3(±NeuNAcα2-6)GalNAc, Galβ1,3GalNAc and GalNAcα1,3(l-Fucα1,2)Galβ1,3/4GlcNAcβ1 was not season dependent; (ii) O-linked glycans terminating with GalNAc, asialo N-linked glycans terminating with Galβ1,4GlcNAc, GlcNAc, as well as α1,6- and α1,2-linked fucosylated oligosaccharides was predominant in MPS; and (iii) high-mannose- and biantennary complex-type N-glycans terminating with α2,6 sialic acids and terminal galactose was lower in MPS. Overall, this innovative cell microarray method was able to identify specific glycosylation changes that occur on the buffalo bull sperm surface during the mating and non-mating periods.
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupation stress widely considered responsible for the recent rise in the military suicide rate.
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently deployed and previously deployed (20.2–22.4/100 000 person-years) than never deployed (14.5/100 000 person-years), resulting in the adjusted suicide rate of infantrymen and combat engineers being most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status.
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
The Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS) has found that the proportional elevation in the US Army enlisted soldier suicide rate during deployment (compared with the never-deployed or previously deployed) is significantly higher among women than men, raising the possibility of gender differences in the adverse psychological effects of deployment.
Person-month survival models based on a consolidated administrative database for active duty enlisted Regular Army soldiers in 2004–2009 (n = 975 057) were used to characterize the gender × deployment interaction predicting suicide. Four explanatory hypotheses were explored involving the proportion of females in each soldier's occupation, the proportion of same-gender soldiers in each soldier's unit, whether the soldier reported sexual assault victimization in the previous 12 months, and the soldier's pre-deployment history of treated mental/behavioral disorders.
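In unadjusted form, the gender × deployment interaction these models estimate corresponds to a ratio of odds ratios: the odds ratio of suicide for currently deployed v. other soldiers among women, divided by the same quantity among men. As a hedged illustration only (the published analysis uses adjusted person-month survival models; the counts below are hypothetical), the quantity can be sketched as:

```python
def odds_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Unadjusted odds ratio of an event for exposed vs unexposed groups."""
    odds_exp = events_exposed / (n_exposed - events_exposed)
    odds_unexp = events_unexposed / (n_unexposed - events_unexposed)
    return odds_exp / odds_unexp

# Hypothetical person-month counts, for illustration only.
or_women = odds_ratio(14, 100_000, 5, 100_000)  # deployed vs other, women
or_men = odds_ratio(22, 100_000, 20, 100_000)   # deployed vs other, men
# The gender x deployment interaction: how much more deployment raises
# the odds of suicide for women than for men.
interaction = or_women / or_men
```

A survival model expresses the same contrast as a product of coefficients on gender, deployment status, and their interaction term, while adjusting for time trends and career covariates.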
The suicide rate of currently deployed women (14.0/100 000 person-years) was 3.1–3.5 times the rates of other (i.e. never-deployed/previously deployed) women. The suicide rate of currently deployed men (22.6/100 000 person-years) was 0.9–1.2 times the rates of other men. The adjusted (for time trends, sociodemographics, and Army career variables) female:male ratio of the odds ratios comparing the suicide rates of currently deployed v. other soldiers was 2.8 (95% confidence interval 1.1–6.8); it became 2.4 after excluding soldiers with Direct Combat Arms occupations, and remained elevated (in the range 1.9–2.8) after adjusting for the hypothesized explanatory variables.
These results are valuable in excluding otherwise plausible hypotheses for the elevated suicide rate of deployed women and point to the importance of expanding future research on the psychological challenges of deployment for women.