Objective
To simulate the impact of Ca supplementation on estimated total Ca intakes among women in a population with low dietary Ca intakes, based on the WHO recommendation of 1·5–2·0 g elemental Ca/d during pregnancy to prevent pre-eclampsia.
Design
Cross-sectional single 24 h dietary recall data were adjusted using IMAPP software to simulate the proportions of women who would meet the Estimated Average Requirement (EAR) or exceed the Tolerable Upper Intake Level (UL), assuming full or partial adherence to WHO guidelines.
Setting
Nationally and regionally representative data, Ethiopia’s ‘lean’ season 2011.
Subjects
Women aged 15–45 years (n 7908, of whom 492 were pregnant).
Results
National mean usual Ca intake was 501 (sd 244) mg/d. Approximately 89, 91 and 96 % of all women, pregnant women and women aged 15–18 years, respectively, had dietary Ca intakes below the EAR. Simulating 100 % adherence to supplementation of 1·0, 1·5 or 2·0 g/d indicated that nearly all women (>99 %) would meet the EAR, regardless of dosage. Nationally, supplementation with 1·5 and 2·0 g/d would result in intakes exceeding the UL in 3·7 and 43·2 % of women, respectively, while at 1·0 g/d those exceeding the UL would be <1 % (0·74 %) except in one region (4·95 %).
Conclusions
Most Ethiopian women consume insufficient Ca, increasing their risk of pre-eclampsia. Providing Ca supplements of 1·5–2·0 g/d could result in high proportions of women exceeding the UL, whereas universal consumption of 1·0 g/d would meet requirements with minimal risk of excess. Appropriately tested screening tools could identify high Ca consumers and reduce their risk of excessive intake. Research on the minimum effective Ca supplementation dose to prevent pre-eclampsia is also needed to determine whether lower doses could be recommended.
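As a rough illustration of the simulation logic described above (a hedged sketch, not the IMAPP procedure used in the study), the Python snippet below assumes usual Ca intake is approximately normally distributed with the reported national mean and SD, adds a fixed supplement dose under 100 % adherence, and computes the proportions falling below an assumed EAR of 800 mg/d or above an assumed UL of 2500 mg/d; the cut-offs and the normality assumption are illustrative, not values taken from the study.

# Illustrative sketch only: assumes a normal usual-intake distribution and
# generic EAR/UL cut-offs; the study itself used IMAPP-adjusted 24 h recall data.
import numpy as np

rng = np.random.default_rng(42)

MEAN_INTAKE_MG = 501      # reported national mean usual Ca intake (mg/d)
SD_INTAKE_MG = 244        # reported SD (mg/d)
EAR_MG = 800              # assumed illustrative EAR for non-pregnant adult women
UL_MG = 2500              # assumed illustrative UL

usual_intake = rng.normal(MEAN_INTAKE_MG, SD_INTAKE_MG, size=100_000)
usual_intake = np.clip(usual_intake, 0, None)   # intakes cannot be negative

for dose_mg in (0, 1000, 1500, 2000):           # supplement scenarios (mg/d)
    total = usual_intake + dose_mg              # assumes 100 % adherence
    below_ear = np.mean(total < EAR_MG) * 100
    above_ul = np.mean(total > UL_MG) * 100
    print(f"dose {dose_mg:4d} mg/d: {below_ear:5.1f} % below EAR, "
          f"{above_ul:5.1f} % above UL")

With these assumptions the sketch broadly reproduces the reported pattern: roughly nine in ten women fall below the EAR without supplementation, and the proportion above the UL becomes substantial only at the highest dose.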
Vitamin B12 is synthesised in the rumen from cobalt (Co) and has a major role in metabolism in the peri-parturient period, although few studies have evaluated the effect of the dietary inclusion of Co or vitamin B12, or of injected vitamin B12, on the metabolism, health and performance of high yielding dairy cows. A total of 56 Holstein-Friesian dairy cows received one of four treatments from 8 weeks before calving to 8 weeks post-calving: C, no added Co; DC, additional 0.2 mg Co/kg dry matter (DM); DB, additional 0.68 mg vitamin B12/kg DM; IB, intra-muscular injection of vitamin B12 to supply 0.71 mg/cow per day pre-partum and 1.42 mg/cow per day post-partum. The basal and lactation rations both contained 0.21 mg Co/kg DM. Cows were weighed and condition scored at drying off, 4 weeks before calving, within 24 h of calving and at 2, 4 and 8 weeks post-calving, with blood samples collected at drying off, 2 weeks pre-calving, calving and 2, 4 and 8 weeks post-calving. Liver biopsy samples were collected from all animals at drying off and 4 weeks post-calving. Live weight changed with time, but there was no effect of treatment (P>0.05), whereas cows receiving IB had the lowest mean body condition score and DB the highest (P<0.05). There was no effect of treatment on post-partum DM intake, milk yield or milk fat concentration (P>0.05), with mean values of 21.6 kg/day, 39.6 kg/day and 40.4 g/kg, respectively. Cows receiving IB had a higher plasma vitamin B12 concentration than those receiving any of the other treatments (P<0.001), but there was no effect (P>0.05) of treatment on homocysteine or succinate concentrations, although mean plasma methylmalonic acid concentrations were lower (P=0.019) for cows receiving IB than for cows receiving C. Plasma β-hydroxybutyrate concentrations increased sharply at calving followed by a decline, but there was no effect of treatment. Similarly, there was no effect (P>0.05) of treatment on plasma non-esterified fatty acids or glucose. Whole-tract digestibility of DM and fibre measured at week 7 of lactation was similar between treatments, and there was little effect of treatment on the milk fatty acid profile except for C15:0, which was lower in cows receiving DC than IB (P<0.05). It is concluded that a basal dietary concentration of 0.21 mg Co/kg DM is sufficient to meet the requirements of high yielding dairy cows during the transition period, and there is little benefit from additional Co or vitamin B12.
Indigenous women and children experience some of the most profound health disparities globally. These disparities are grounded in historical and contemporary trauma secondary to colonial atrocities perpetrated by settler society. The health disparities that exist for chronic diseases may have their origins in early-life exposures that Indigenous women and children face. Mechanistically, there is evidence that these adverse exposures epigenetically modify genes associated with cardiometabolic disease risk. Interventions designed to support a resilient pregnancy and first 1000 days of life should abrogate disparities in early-life socioeconomic status. Breastfeeding, prenatal care and early child education are key targets for governments and health care providers to start addressing current health disparities in cardiometabolic diseases among Indigenous youth. Programmes grounded in cultural safety and co-developed with communities have successfully reduced health disparities. More work of this kind is needed to reduce inequities in cardiometabolic diseases among Indigenous women and children worldwide.
The particle size of the forage has been proposed as a key factor in ensuring healthy rumen function and maintaining dairy cow performance, but little work has been conducted on ryegrass silage (GS). To determine the effect of chop length of GS and GS:maize silage (MS) ratio on the performance, reticular pH, metabolism and eating behaviour of dairy cows, 16 multiparous Holstein-Friesian cows were used in a 4×4 Latin square design with four periods, each of 28 days' duration. Ryegrass was harvested and ensiled at two mean chop lengths (short and long) and included at two ratios of GS:MS (100:0 or 40:60 dry matter (DM) basis). The forages were fed in mixed rations to produce four isonitrogenous and isoenergetic diets: long chop GS, short chop GS, long chop GS and MS, and short chop GS and MS. The DM intake (DMI) was 3.2 kg/day higher (P<0.001) when cows were fed the MS-based than the GS-based diets. The short chop length GS also resulted in a 0.9 kg DM/day higher (P<0.05) DMI compared with the long chop length. When fed the GS:MS-based diets, cows produced 2.4 kg/day more (P<0.001) milk than when fed diets containing GS only. There was an interaction (P<0.05) between chop length and forage ratio for milk yield, with a short chop length GS increasing yield in cows fed GS but not MS-based diets. An interaction for DM and organic matter digestibility was also observed (P<0.05), where a short chop length GS increased digestibility in cows when fed the GS-based diets but had little effect when fed the MS-based diets. When fed the MS-based diets, cows spent longer at reticular pH levels below pH 6.2 and pH 6.5 (P<0.01), but chop length had little effect. When fed the MS-based diets, cows also had a higher (P<0.05) milk fat concentration of C18:2n-6 and total polyunsaturated fatty acids than when fed the GS-only diets. In conclusion, GS chop length had little effect on reticular pH; a longer chop length reduced DMI and milk yield but had little effect on milk fat yield. Including MS reduced reticular pH but increased DMI and milk performance, irrespective of the GS chop length.
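For readers unfamiliar with the design, a minimal sketch of how a 4×4 Latin square allocation like the one described here can be laid out is given below; the cow numbering, block structure and diet labels are illustrative assumptions, and a real trial would also randomise rows, columns and treatment labels.

# Sketch of a 4x4 Latin square allocation: within each block of four cows,
# every cow receives each diet exactly once, and every diet appears exactly
# once in each 28-day period. Labels are illustrative.
diets = ["long GS", "short GS", "long GS+MS", "short GS+MS"]
n_periods = len(diets)

# Cyclic (standard) Latin square: row i is the diet list rotated by i places.
square = [[diets[(row + period) % n_periods] for period in range(n_periods)]
          for row in range(n_periods)]

# 16 cows = 4 blocks of 4; each block follows one row pattern of the square.
for cow in range(16):
    sequence = square[cow % n_periods]
    print(f"cow {cow + 1:2d}: " + " -> ".join(sequence))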
Improving access to tuberculosis (TB) care and ensuring early diagnosis are two major aims of the WHO End TB strategy and the Collaborative TB Strategy for England. This study describes risk factors associated with diagnostic delay among TB cases in England. We conducted a retrospective cohort study of TB cases notified to the Enhanced TB Surveillance System in England between 2012 and 2015. Diagnostic delay was defined as more than 4 months between symptom onset and treatment start date. Multivariable logistic regression was used to identify demographic and clinical factors associated with diagnostic delay. Between 2012 and 2015, 22 422 TB cases were notified in England and included in the study. A third (7612) of TB cases had a diagnostic delay of more than 4 months. Being female, aged 45 years and older, residing outside of London and having extra-pulmonary TB disease were significantly associated with a diagnostic delay in the multivariable model (aOR = 1.2, 1.2, 1.2, 1.3, 1.8, respectively). This study identifies demographic and clinical factors associated with diagnostic delay, which will inform targeted interventions to improve access to care and early diagnosis among these groups, with the ultimate aim of helping reduce transmission and improve treatment outcomes for TB cases in England.
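A minimal sketch of the kind of multivariable logistic regression described (delay of more than 4 months as a binary outcome, with demographic and clinical predictors) is shown below; the column names, coding and simulated data are hypothetical and do not reflect the Enhanced TB Surveillance System dataset.

# Minimal sketch of a multivariable logistic regression for diagnostic delay
# (> 4 months between symptom onset and treatment start). Column names and
# the simulated data are hypothetical; the real surveillance dataset differs.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per notified TB case, with a binary outcome 'delay'
# (1 = delay > 4 months) and categorical/binary predictors.
df = pd.DataFrame({
    "delay":      np.random.binomial(1, 0.34, 500),   # about one-third delayed, as reported
    "sex":        np.random.choice(["male", "female"], 500),
    "age_45plus": np.random.binomial(1, 0.4, 500),
    "region":     np.random.choice(["London", "outside London"], 500),
    "site":       np.random.choice(["pulmonary", "extra-pulmonary"], 500),
})

model = smf.logit("delay ~ C(sex) + age_45plus + C(region) + C(site)", data=df).fit()
odds_ratios = np.exp(model.params)      # adjusted odds ratios (aOR)
conf_int = np.exp(model.conf_int())     # 95 % confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("aOR"), conf_int], axis=1))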
Background: Temporal lobe epilepsy (TLE) accounts for approximately 20% of pediatric epilepsy cases. Of those, many are considered medically intractable and require surgical intervention. In this study, we hypothesized that mesial temporal sclerosis (MTS) was less common in patients who had undergone surgery for intractable pediatric TLE than in adult series. We further hypothesized that there was a radiological and pathological discordance in identifying the cause of pediatric TLE. Methods: We retrospectively reviewed the charts of pediatric patients with TLE who had undergone surgical treatment as part of the University of Alberta’s Comprehensive Epilepsy Program between 1988 and 2018. Along with preoperative magnetic resonance imaging (MRI) reports, post-surgical pathology results and seizure outcomes were studied. Results: Of the 83 pediatric patients who had undergone temporal lobe epilepsy surgery, 28% had tumors, 22% had dual pathologies, 18% had MTS, 11% had focal cortical dysplasia, and 22% had other pathologies. In addition, for 36% of these patients, discordance between their pre-surgical MRI reports and post-surgical pathology reports was found. Conclusions: This was one of the largest retrospective cohort studies of pediatric patients who had undergone surgery for intractable TLE. This study showed that tumors, and not MTS, were the most common pathology in surgical pediatric TLE.
Introduction: Early recognition of sepsis can improve patient outcomes, yet recognition by paramedics is poor and research evaluating the use of prehospital screening tools is limited. Our objective was to evaluate the predictive validity of the Regional Paramedic Program for Eastern Ontario (RPPEO) prehospital sepsis notification tool for identifying patients with sepsis, and to describe and compare the characteristics of patients with an emergency department (ED) diagnosis of sepsis who are transported by paramedics. The RPPEO prehospital sepsis notification tool comprises 3 criteria: current infection, fever and/or history of fever, and 2 or more signs of hypoperfusion (e.g. SBP <90, HR >100, RR >24, altered LOA). Methods: We performed a review of ambulance call records and in-hospital records over two 5-month periods between November 2014 and February 2016. We enrolled a convenience sample of patients, assessed by primary and advanced care paramedics (ACPs), with a documented history of fever and/or a documented fever of ≥38.3°C (101°F) who were transported to hospital. In-hospital management and outcomes were obtained, and descriptive statistics, t-tests and chi-square analyses were performed where appropriate. The RPPEO prehospital sepsis notification tool was compared with an ED diagnosis of sepsis, and its predictive validity was calculated (sensitivity, specificity, NPV, PPV). Results: 236 adult patients met the inclusion criteria, with the following characteristics: mean age 65.2 yrs [range 18-101], male 48.7%, history of sepsis 2.1%, on antibiotics 23.3%, lowest mean systolic BP 125.9, treated by ACP 58.9%, prehospital temperature documented 32.6%. 34 (14.4%) had an ED diagnosis of sepsis. Patients with an ED diagnosis of sepsis, compared with those without, had a lower prehospital systolic BP (114.9 vs 127.8, p=0.003) and were more likely to have a prehospital shock index >1 (50.0% vs 21.4%, p=0.001). 44 (18.6%) patients met the RPPEO sepsis notification tool criteria and, of these, 27.3% (12/44) had an ED diagnosis of sepsis. We calculated the following predictive values of the RPPEO tool: sensitivity 35.3%, specificity 84.2%, NPV 88.5%, PPV 27.3%. Conclusion: The RPPEO prehospital sepsis notification tool demonstrated modest diagnostic accuracy. Further research is needed to improve accuracy and evaluate the impact on patient outcomes.
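The reported predictive values follow directly from the 2×2 table implied by the counts in the abstract (236 patients, 34 with an ED diagnosis of sepsis, 44 tool-positive, 12 of whom had sepsis); the short check below reconstructs that table and recomputes sensitivity, specificity, PPV and NPV.

# Reconstruct the 2x2 table from the counts reported in the abstract and
# recompute the predictive values of the RPPEO tool.
n_total = 236          # adult patients meeting inclusion criteria
n_sepsis = 34          # ED diagnosis of sepsis
n_tool_pos = 44        # met the RPPEO notification tool criteria
tp = 12                # tool-positive AND ED sepsis (12/44 reported)

fp = n_tool_pos - tp                   # 32 tool-positive without sepsis
fn = n_sepsis - tp                     # 22 sepsis cases missed by the tool
tn = n_total - tp - fp - fn            # 170 correctly tool-negative

sensitivity = tp / (tp + fn)           # 12/34   ≈ 0.353
specificity = tn / (tn + fp)           # 170/202 ≈ 0.842
ppv = tp / (tp + fp)                   # 12/44   ≈ 0.273
npv = tn / (tn + fn)                   # 170/192 ≈ 0.885
print(f"sens {sensitivity:.1%}, spec {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")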
Introduction: Pulmonary embolism (PE) is a diagnostic challenge, since it shares symptoms with other conditions. Missed diagnosis puts patients at risk of a potentially fatal outcome, while false positive results leave them at risk of side effects (bleeding) from unnecessary treatment. Diagnosis involves a multi-step pathway consisting of clinical prediction rules (CPRs), laboratory testing, and diagnostic imaging, but the best strategy in the Canadian context is unclear. Methods: We carried out a systematic review of the diagnostic accuracy, clinical utility, and safety of diagnostic pathways, CPRs, and diagnostic imaging for the diagnosis of PE. Clinical prediction rules were studied by an overview of systematic reviews, and pathways and diagnostic imaging by a primary systematic review. Where feasible, a diagnostic test meta-analysis was conducted, with statistical adjustment for the use of variable and imperfect reference standards across studies. Results: The Wells CPR showed greater specificity than the Geneva, but the relative sensitivities were undetermined. Application of a CPR followed by D-dimer laboratory testing can safely rule out PE. In the diagnostic test accuracy meta-analysis, computed tomography (CT) (sensitivity 0.973, 95% CrI 0.921 to 1.00) and ventilation/perfusion single-photon emission CT (VQ-SPECT) (sensitivity 0.974, 95% CrI 0.898 to 1.00) had the highest sensitivities, and CT had the highest specificity (0.987, 95% CrI 0.958 to 1.00). VQ and VQ-SPECT had a higher proportion of indeterminate studies, but involved lower radiation exposure than CT. Conclusion: CPR and D-dimer testing can be used to avoid unnecessary imaging. CT is the most accurate single modality, but radiation risk must be assessed. These findings, in conjunction with a recent health technology assessment, may help to inform clinical practice and guidelines.
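As an illustration of how the reported accuracy figures translate into ruling PE in or out, the sketch below converts the CT point estimates (sensitivity 0.973, specificity 0.987) into likelihood ratios and post-test probabilities at an assumed pre-test probability of 15 %; the pre-test value is arbitrary and chosen only for illustration, not a figure from the review.

# Worked example: likelihood ratios and post-test probability for CT,
# using the point estimates reported above and an assumed pre-test probability.
sens, spec = 0.973, 0.987      # reported CT point estimates
pretest = 0.15                 # assumed pre-test probability (illustrative only)

lr_pos = sens / (1 - spec)     # positive likelihood ratio ≈ 74.8
lr_neg = (1 - sens) / spec     # negative likelihood ratio ≈ 0.027

pre_odds = pretest / (1 - pretest)
post_pos = (pre_odds * lr_pos) / (1 + pre_odds * lr_pos)   # prob. of PE after positive CT
post_neg = (pre_odds * lr_neg) / (1 + pre_odds * lr_neg)   # prob. of PE after negative CT
print(f"LR+ {lr_pos:.1f}, LR- {lr_neg:.3f}; "
      f"post-test probability: {post_pos:.1%} if positive, {post_neg:.2%} if negative")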
Although hospital emergency preparedness efforts have been recognized as important, there has been growing pressure to contain costs, as well as consolidation within the US health care system. There are few data on which health care emergency preparedness functions have been, could be, or should be centrally coordinated at the system level.
Methods
We developed a questionnaire for academic health systems that asked about program funding, resources provided, governance, and activities. The questionnaire also queried managers’ opinions regarding the appropriate role of system-level resources in emergency response, as well as what system-level support is most helpful for preparedness.
Results
Fifty-two of 97 systems (54%) responded. The most frequently occurring system-wide activities included: creating trainings or exercise templates (75%), promoting preparedness for employees in the system (75%), providing access to specific subject matter experts (73%), and developing specific plans for individual member entities within their system (73%). The top resources provided included a common mass notification system (71%), arranging for centralized contracts for goods and services (71%), and providing subject matter expertise (69%).
Conclusions
Currently, there is wide variation in the resources, capabilities, and programs used to support and coordinate system-level emergency preparedness among academic health systems. (Disaster Med Public Health Preparedness. 2018;12:574–577)
Considered a less hazardous piezoelectric material, potassium sodium niobate (KNN) has been at the forefront of the search for a replacement for lead (Pb) zirconate titanate in piezoelectric applications. Here, we challenge the environmental credentials of KNN due to the presence of ~60 wt% Nb2O5, a substance much less toxic to humans than Pb oxide, but whose mining and extraction cause significant environmental damage.
The extent to which coffee agroforestry systems provide ecosystem services depends on local context and management practices. There is a paucity of information about how and why farmers manage their coffee farms in the way that they do and the local knowledge that underpins this. The present research documents local agro-ecological knowledge from a coffee growing region within the vicinity of the Aberdare Forest Reserve in Central Kenya. Knowledge was acquired from over 60 coffee farmers in a purposive sample, using a knowledge-based systems approach, and tested with a stratified random sample of 125 farmers using an attribute ranking survey. Farmers had varying degrees of explanatory knowledge about how trees affected provisioning and regulating ecosystem services. Trees were described as suitable or unsuitable for growing with coffee according to tree attributes such as crown density and spread, root depth and spread, growth rate and their economic benefit. Farmers were concerned that too high a level of shade and competition for water and nutrients would decrease coffee yields, but they were also interested in diversifying production from their coffee farms to include fruits, timber, firewood and other tree products as a response to fluctuating coffee prices. A range of trees were maintained in coffee plots and along their boundaries but most were at very low abundances. Promoting tree diversity rather than focussing on one or two high value exotic species represents a change of approach for extension systems, the coffee industry and farmers alike, but is important if the coffee dominated landscapes of the region are to retain their tree species richness and the resilience this confers.
Several extragalactic HI surveys using a λ21 cm 13-beam focal plane array will begin in early 1997 using the Parkes 64 m telescope. These surveys are designed to efficiently detect nearby galaxies that have not been identified optically because of low optical surface brightness or high optical extinction. We discuss scientific and technical aspects of the multibeam receiver, including astronomical objectives, feed, receiver and correlator design, and data acquisition. A comparison with other telescopes shows that the Parkes multibeam receiver has significant speed advantages for any large-area λ21 cm galaxy survey in the velocity range 0–14000 km s−1.
On sub-Antarctic Marion Island, wandering albatross (Diomedea exulans) nests support high abundances of caterpillars of the tineid moth Pringleophaga marioni. Previous work proposed that the birds serve as thermal ecosystem engineers by elevating nest temperatures relative to ambient, thereby promoting growth and survival of the caterpillars. However, only 17 days of temperature data were presented previously, despite year-long nest occupation by the birds. Previous sampling was also restricted to old and recently failed nests, although nests from which chicks have recently fledged are key to understanding how the engineering effect is realized. Here we build on previous work by providing nest temperature data for a full year and by sampling all three nest types. For the full duration of nest occupancy, temperatures within occupied nests are significantly higher, consistently by c. 7°C, than those in surrounding soils and abandoned nests, declining noticeably when chicks fledge. Caterpillar abundance is significantly higher in new nests than in nests from which chicks have fledged, which in turn have higher caterpillar abundances than old nests. Combined with recent information on the life history of P. marioni, our data suggest that caterpillars are incidentally added to the nests during nest construction and subsequently benefit from an engineering effect.
In total, 20 multiparous Holstein-Friesian dairy cows received one of four diets in each of four 28-day periods in a Latin square design to test the hypothesis that the inclusion of lucerne in the ration of high-yielding dairy cows would improve animal performance and milk fatty acid (FA) composition. All dietary treatments contained 0.55 : 0.45 forage to concentrates (dry matter (DM) basis), and within the forage component the proportion of lucerne (Medicago sativa), grass (Lolium perenne) and maize silage (Zea mays) was varied (DM basis): control (C)=0.4 : 0.6 grass : maize silage; L20=0.2 : 0.2 : 0.6 lucerne : grass : maize silage; L40=0.4 : 0.6 lucerne : maize silage; and L60=0.6 : 0.4 lucerne : maize silage. Diets were formulated to contain a similar CP and metabolisable protein content, with soya bean meal and feed-grade urea reduced as the lucerne content increased. Intake averaged 24.3 kg DM/day and was lowest in cows when fed L60 (P<0.01), but there was no effect of treatment on milk yield, milk fat or protein content, or live weight change, which averaged 40.9 kg/day, 41.0 g/kg, 30.9 g/kg and 0.16 kg/day, respectively. Milk fat content of 18:2 c9 c12 and 18:3 c9 c12 c15 was increased (P<0.05) with increasing proportion of lucerne in the ration, and milk fat content of total polyunsaturated fatty acids was increased by 0.26 g/100 g in L60 compared with C. Plasma urea and β-hydroxybutyrate concentrations averaged 3.54 and 0.52 mmol/l, respectively, and were highest (P<0.001) in cows when fed L60 and lowest in C, but plasma glucose and total protein were not affected (P>0.05) by dietary treatment. Digestibility of DM, organic matter, CP and fibre decreased (P<0.01) with increasing content of lucerne in the diet, although fibre digestibility was similar in L40 and L60. It is concluded that first-cut grass silage can be replaced with first-cut lucerne silage without any detrimental effect on performance and with an improvement in the milk FA profile, although intake and digestibility were lowest, and plasma urea concentrations highest, in cows fed the highest level of lucerne inclusion.
Edited by
John Allan, Consultant Archaeologist to the Dean and Chapter of Exeter Cathedral; Nat Alcock, Emeritus Reader in the Department of Chemistry, University of Warwick; David Dawson, Independent archaeologist and museum and heritage consultant