Dietary Zn has significant impacts on the growth and development of breeding rams. The objectives of this study were to evaluate the effects of dietary Zn source and concentration on serum Zn concentration, growth performance, wool traits and reproductive performance in rams. Forty-four Targhee rams (14 months; 68 ± 18 kg BW) were used in an 84-day completely randomized design and were fed one of three pelleted dietary treatments: (1) a control without fortified Zn (CON; n = 15; ~1 × NRC); (2) a diet fortified with a Zn amino acid complex (ZnAA; n = 14; ~2 × NRC) and (3) a diet fortified with ZnSO4 (ZnSO4; n = 15; ~2 × NRC). Growth and wool characteristics measured throughout the study were BW, average daily gain (ADG), dry matter intake (DMI), feed efficiency (G : F), longissimus dorsi muscle depth (LMD), back fat (BF), wool staple length (SL) and average fibre diameter (AFD). Blood was collected from each ram at four time points to quantify serum Zn and testosterone concentrations. Semen was collected 1 to 2 days after the trial was completed. There were no differences in BW (P = 0.45), DMI (P = 0.18), LMD (P = 0.48), BF (P = 0.47) or AFD (P = 0.90) among treatment groups. Rams fed ZnSO4 had greater (P ≤ 0.03) serum Zn concentrations than the ZnAA and CON treatments. Rams consuming ZnAA had greater (P ≤ 0.03) ADG than those fed ZnSO4 and CON. There tended to be differences among groups in G : F (P = 0.06), with ZnAA being numerically greater than ZnSO4 and CON. Wool staple length regrowth was greater (P < 0.001) in the ZnSO4 group and tended to be longer (P = 0.06) in the ZnAA group compared with CON. No differences were observed among treatments in scrotal circumference, testosterone, spermatozoa concentration in semen, % motility, % live sperm or % sperm abnormalities (P ≥ 0.23).
Results indicated beneficial effects of feeding increased Zn concentrations to developing Targhee rams, although Zn source elicited differential responses in performance characteristics measured.
We have observed the G23 field of the Galaxy And Mass Assembly (GAMA) survey using the Australian Square Kilometre Array Pathfinder (ASKAP) in its commissioning phase to validate the performance of the telescope and to characterise the detected galaxy populations. This observation covers ~48 deg2 with a synthesised beam of 32.7 arcsec by 17.8 arcsec at 936 MHz, and ~39 deg2 with a synthesised beam of 15.8 arcsec by 12.0 arcsec at 1320 MHz. At both frequencies, the root-mean-square (r.m.s.) noise is ~0.1 mJy/beam. We combine these radio observations with the GAMA galaxy data, which include spectroscopy of galaxies that are i-band selected with a magnitude limit of 19.2. Wide-field Infrared Survey Explorer (WISE) infrared (IR) photometry is used to determine which galaxies host an active galactic nucleus (AGN). In properties including source counts, mass distributions, and the IR versus radio luminosity relation, the ASKAP-detected radio sources behave as expected. Radio galaxies have higher stellar mass and luminosity in IR, optical, and UV than other galaxies. We apply optical and IR AGN diagnostics and find that they disagree for ~30% of the galaxies in our sample. We suggest possible causes for the disagreement. Some cases can be explained by optical extinction of the AGN, but for more than half of the cases we do not find a clear explanation. Radio sources are more likely (~6%) to have an AGN than radio-quiet galaxies (~1%), but the majority of AGN are not detected in radio at this sensitivity.
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged group compared to the least, suggesting potentially lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13), suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
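The odds ratios above come from logistic regression, but the underlying arithmetic of an odds ratio and its Wald confidence interval can be sketched from a 2×2 table. The counts below are hypothetical placeholders for illustration, not data from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% CI on the log scale.

    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, purely illustrative:
or_, lo, hi = odds_ratio_ci(40, 60, 25, 60)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A CI whose lower bound exceeds 1 is what licenses phrases like "significantly more likely" in the abstract; the study's own analysis additionally adjusted for demographic covariates, which this sketch does not.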
An unexpected increase in gastroenteritis cases was reported by healthcare workers on the KwaZulu-Natal Coast, South Africa, in January 2017, with >600 cases seen over a 3-week period. A case–control study was conducted to identify the source and risk factors associated with the outbreak so as to recommend control and prevention measures. Record review identified cases and controls, and structured telephone interviews were conducted to obtain exposure histories. Stool specimens were collected from 20 cases along with environmental samples, and both were screened for enteric pathogens. A total of 126 cases and 62 controls were included in the analysis. The odds of developing gastroenteritis were 6.0 times greater among holidaymakers than residents (95% confidence interval (CI) 2.0–17.7). Swimming in the lagoon increased the odds of developing gastroenteritis by 3.3 times (95% CI 1.06–10.38). Lagoon water samples tested positive for norovirus (NoV) GI.6, GII.3 and GII.6, astrovirus and rotavirus. Eleven (55%) stool specimens were positive for NoV, with eight genotyped as GI.1 (n = 2), GI.5 (n = 3), GI.6 (n = 2) and GI.7 (n = 1). A reported sewage contamination event impacting the lagoon was the likely source, with person-to-person spread perpetuating the outbreak. Restrictions on swimming in the lagoon were apparently ineffective at preventing the outbreak, possibly due to inadequate enforcement, communication and signage strategies.
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and improve diagnostic certainty; however, evidence of benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy; other key outcomes, including mortality, are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 Southern African sites. Screening at triage identified patients (SBP<100 mmHg or shock index >1), who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's exact test (two-tailed). Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group, 20/127 (15.7%), vs. control, 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups: PoCUS 39/123 (31.7%) vs. control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis: PoCUS 90/127 (70%) vs. control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00).
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS did change physicians’ perceived shock category. PoCUS did not improve diagnostic accuracy for category of shock or diagnosis.
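The headline risk ratio in the abstract above (change in perceived shock category: 20/127 PoCUS vs. 7/125 control) can be reproduced with a few lines of arithmetic. This is a generic sketch of the standard log-scale Wald calculation, not the trial's own analysis code:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs. group B with a Wald 95% CI on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Katz log-scale standard error for a risk ratio
    se_log = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Perceived shock category changes: PoCUS 20/127 vs. control 7/125
rr, lo, hi = risk_ratio(20, 127, 7, 125)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 2.81 1.23 6.41
```

The result matches the reported RR 2.81 (95% CI 1.23 to 6.42) to within rounding of the upper bound.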
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP<100 mmHg or shock index >1), randomized to PoCUS or control (standard care and no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's exact test, and continuous data by Student's t-test and multi-level log-regression testing (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality: PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately.
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefits with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit effect.
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour in the treatment of adult patients presenting with undifferentiated hypotension (SBP<100 mmHg or a shock index >1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. 4 North American and 3 South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED, changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate, at one and four hours. Comparisons utilized a t-test as well as stratified binomial log-regression to assess for any significant improvement in resuscitation among the outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95%CI 1365-1950) and PoCUS groups (1609 ml; 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard of care in hypotensive ED patients.
No significant difference in fluid used, or markers of resuscitation was found when comparing the use of a PoCUS protocol to that of standard of care in the resuscitation of patients with undifferentiated hypotension.
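All three SHoC-ED abstracts use the same triage screening rule: SBP < 100 mmHg or shock index (heart rate divided by SBP) > 1.0. A minimal sketch of that eligibility check follows; the function name and form are ours, not the trial's:

```python
def eligible(sbp_mmhg, heart_rate):
    """SHoC-ED screening rule: SBP < 100 mmHg or shock index (HR/SBP) > 1.0."""
    shock_index = heart_rate / sbp_mmhg
    return sbp_mmhg < 100 or shock_index > 1.0

print(eligible(95, 80))    # SBP below threshold → True
print(eligible(110, 120))  # SI ≈ 1.09 → True
print(eligible(120, 90))   # neither criterion met → False
```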
Animal health surveillance enables the detection and control of animal diseases including zoonoses. Under the EU-FP7 project RISKSUR, a survey was conducted in 11 EU Member States and Switzerland to describe active surveillance components in 2011 managed by the public or private sector and identify gaps and opportunities. Information was collected about hazard, target population, geographical focus, legal obligation, management, surveillance design, risk-based sampling, and multi-hazard surveillance. Two countries were excluded due to incompleteness of data. Most of the 664 components targeted cattle (26·7%), pigs (17·5%) or poultry (16·0%). The most common surveillance objectives were demonstrating freedom from disease (43·8%) and case detection (26·8%). Over half of components applied risk-based sampling (57·1%), but mainly focused on a single population stratum (targeted risk-based) rather than differentiating between risk levels of different strata (stratified risk-based). About a third of components were multi-hazard (37·3%). Both risk-based sampling and multi-hazard surveillance were used more frequently in privately funded components. The study identified several gaps (e.g. lack of systematic documentation, inconsistent application of terminology) and opportunities (e.g. stratified risk-based sampling). The greater flexibility provided by the new EU Animal Health Law means that systematic evaluation of surveillance alternatives will be required to optimize cost-effectiveness.
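The distinction drawn above between targeted risk-based sampling (all effort in one high-risk stratum) and stratified risk-based sampling (effort spread across strata in proportion to their risk) can be illustrated with a toy allocation rule. The strata, sizes and relative risks below are hypothetical, and proportional-to-risk allocation is only one of several possible stratified designs:

```python
def stratified_risk_allocation(strata, total_samples):
    """Allocate a fixed sample across strata in proportion to
    (stratum size x relative risk). A targeted risk-based design
    would instead place the whole sample in the highest-risk stratum.

    strata: dict mapping stratum name -> (population_size, relative_risk)
    """
    weights = {name: n * rr for name, (n, rr) in strata.items()}
    total_w = sum(weights.values())
    return {name: round(total_samples * w / total_w)
            for name, w in weights.items()}

# Hypothetical example (not from the RISKSUR survey):
strata = {
    "backyard": (1000, 3.0),    # small stratum, elevated risk
    "commercial": (9000, 1.0),  # large stratum, baseline risk
}
alloc = stratified_risk_allocation(strata, 600)
print(alloc)  # → {'backyard': 150, 'commercial': 450}
```

The high-risk stratum is oversampled relative to its share of the population (15% of animals, 25% of samples) while the large baseline stratum is still monitored, which is the cost-effectiveness argument the study makes for stratified designs.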
Urban slum environments in the tropics are conducive to the proliferation and the spread of rodent-borne zoonotic pathogens to humans. Calodium hepaticum (Bancroft, 1893) is a zoonotic nematode known to infect a variety of mammalian hosts, including humans. Norway rats (Rattus norvegicus) are considered the most important mammalian host of C. hepaticum and are therefore a potentially useful species to inform estimates of the risk to humans living in urban slum environments. There is a lack of studies systematically evaluating the role of demographic and environmental factors that influence both carriage and intensity of infection of C. hepaticum in rodents from urban slum areas within tropical regions. Carriage and the intensity of infection of C. hepaticum were studied in 402 Norway rats over a 2-year period in an urban slum in Salvador, Brazil. Overall, prevalence in Norway rats was 83% (337/402). Independent risk factors for C. hepaticum carriage in R. norvegicus were age and valley of capture. Of those infected, the proportion with gross liver involvement (i.e. >75% of the liver affected, a proxy for a high intensity of infection) was low (8%, 26/337). Sixty soil samples were collected from ten locations to estimate levels of environmental contamination and provide information on the potential risk to humans of contracting C. hepaticum from the environment. Sixty percent (6/10) of the sites were contaminated with C. hepaticum. High carriage levels of C. hepaticum within Norway rats and sub-standard living conditions within slum areas may increase the risk to humans of exposure to the infective eggs of C. hepaticum. This study supports the need for further studies to assess whether humans are becoming infected within this community and whether C. hepaticum is posing a significant risk to human health.
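The 83% carriage figure (337/402) can be given a simple normal-approximation confidence interval to convey its precision; this is a generic sketch, not the analysis the study itself reports:

```python
import math

def prevalence_ci(k, n, z=1.96):
    """Point prevalence with a Wald (normal-approximation) 95% CI."""
    p = k / n
    se = math.sqrt(p * (1 - p) / n)  # binomial standard error
    return p, p - z * se, p + z * se

# C. hepaticum carriage in Norway rats: 337 positive of 402 trapped
p, lo, hi = prevalence_ci(337, 402)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # → 83.8% (95% CI 80.2%-87.4%)
```

With n = 402 the interval is only a few percentage points wide, which is why the abstract can state the prevalence without qualification.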
Burnt mounds, or fulachtaí fiadh as they are known in Ireland, are probably the most common prehistoric site type in Ireland and Britain. Typically Middle–Late Bronze Age in date (although both earlier and later examples are known), they are artefact-poor and rarely associated with settlements. The function of these sites has been much debated, with the most commonly cited uses being for cooking, as steam baths or saunas, for brewing, tanning, or textile processing. A number of major infrastructural development schemes in Ireland in the years 2002–2007 revealed remarkable numbers of these mounds, often associated with wood-lined troughs, many of which were extremely well-preserved. This afforded an opportunity to investigate them as landscape features using environmental techniques – specifically plant macrofossils and charcoal, pollen, beetles, and multi-element analyses. This paper summarises the results from eight sites in Ireland and compares them with burnt mound sites in Great Britain. The fulachtaí fiadh, which generally occur in clusters, are all groundwater-fed, by springs, along floodplains, and at the bases of slopes. The sites are associated with the clearance of wet woodland for fuel; most had evidence of nearby agriculture and all revealed low levels of grazing. Multi-element analysis at two sites revealed elevated heavy metal concentrations, suggesting that off-site soil, ash or urine had been used in the trough. Overall the evidence suggests that the most likely function for these sites is textile production, involving cleaning and/or dyeing of wool and/or natural plant fibres, together with the functionally related activities of hide cleaning and tanning.
Whilst further research is clearly needed to confirm if fulachtaí fiadh are part of the ‘textile revolution’ we should also recognise their important role in the rapid deforestation of the wetter parts of primary woodland and the expansion of agriculture into marginal areas during the Irish and British Bronze Ages.
We made preliminary AMS measurements of 41Ca/Ca ratios in bone and limestone specimens with the Argonne Tandem-Linac Accelerator System (ATLAS). A substantial increase in Ca-beam intensity allowed us to avoid the pre-enrichment of 41Ca used in previous experiments. Most of the measured ratios lie in the 10⁻¹⁴ range, with a few values below 10⁻¹⁴. In general, these values are higher than those observed by the AMS group at the University of Pennsylvania. We discuss possible implications of these results. We also present the current status of half-life measurements of 41Ca and discuss 41Ca production processes on Earth.
Distribution profiles of radiocarbon in dissolved inorganic carbonate have been measured along two transects in the southern Pacific, east of New Zealand. Use of accelerator mass spectrometry, with its small-sample-size capability, made it possible to sample near-surface waters with a depth resolution of a few tens of meters. Sampling of deeper water was guided by salinity and temperature data transmitted by a conductivity-temperature-depth probe. The measurements, taken over the Chatham Rise, show highly structured profiles that can be correlated with known circulation patterns in this region.
We calibrated portions of the radiocarbon time scale with combined 230Th, 231Pa, and 14C measurements of corals collected from Espiritu Santo, Vanuatu, and the Huon Peninsula, Papua New Guinea. The new data map 14C variations ranging from the current limit of the tree-ring calibration [11,900 calendar years before present (cal BP), Kromer and Spurk 1998, now updated to 12,400 cal BP, see Kromer et al., this issue] to the 14C-dating limit of 50,000 cal BP, with detailed structure between 14 and 16 cal kyr BP and 19 and 24 cal kyr BP. Samples older than 25,000 cal BP were analyzed with high-precision 231Pa dating methods (Pickett et al. 1994; Edwards et al. 1997) as a rigorous second check on the accuracy of the 230Th ages. These are the first coral calibration data to receive this additional check, adding confidence to the age data forming the older portion of the calibration. Our results show that the offset between calibrated and 14C ages generally increases with age until about 28,000 cal BP, when the recorded 14C age is nearly 6800 yr too young. The offset is smaller before this time; at 50,000 cal BP, the recorded 14C age is 4600 yr too young. Two major 14C-age plateaus result from a 130‰ drop in Δ14C between 14 and 15 cal kyr BP and a 700‰ drop in Δ14C between 22 and 25 cal kyr BP. In addition, a large atmospheric Δ14C excursion to values over 1000‰ occurs at 28 cal kyr BP. Between 20 and 10 cal kyr BP, a component of atmospheric Δ14C anti-correlates with Greenland ice δ18O, indicating that some portion of the variability in atmospheric Δ14C is related to climate change, most likely through climate-related changes in the carbon cycle. Furthermore, the 28-kyr excursion occurs at about the time of significant climate shifts. Taken as a whole, our data indicate that in addition to the Earth's magnetic field, factors related to climate change have affected the history of atmospheric 14C.
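The reported age offsets translate into Δ14C through the standard Stuiver–Polach convention: the conventional 14C age uses the Libby mean life (8033 yr), while decay of the calendar age uses the true mean life (8267 yr). A sketch of that conversion, taking the abstract's statement that the 14C age is ~6800 yr too young at 28,000 cal BP, reproduces an excursion above 1000‰:

```python
import math

def delta_14c(cal_age_bp, c14_age_bp):
    """Atmospheric Δ14C (per mil) from a calendar age and a conventional
    14C age, both in yr BP, via the Stuiver-Polach relation:
    Δ14C = 1000 * (exp(t_cal/8267 - t_14C/8033) - 1)."""
    return 1000.0 * (math.exp(cal_age_bp / 8267.0 - c14_age_bp / 8033.0) - 1.0)

# At ~28,000 cal BP the 14C age is reported as nearly 6800 yr too young:
d = delta_14c(28000, 28000 - 6800)
print(round(d))  # ≈ 1100 per mil, consistent with the >1000 per mil excursion
```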
Background: Planning for neurology training necessitated a reflection on the experience of graduates. We explored the practice characteristics and training experience of recent graduates. Methods: Graduates from 2010-2014 completed a survey. Results: The response rate was 37% of 211 graduates. 56% were female, 91% were adult neurologists, 65% practiced in an outpatient setting and 63% worked in academics. 85% completed subspecialty training (median 1 year). 36% work 3 days a week or less. 82% took general call (median 1 night weekly). Role preparation was considered very good or excellent for most roles; however, poor or fair ratings were given by 17% for advocacy and 8% for leadership. Training feedback was at least “good” for 87%. Burnout a few times a week or more was noted by 5% (6% during residency, particularly PGY1 and PGY5). 64% felt overly burdened by paperwork. Although most felt training was adequate, it was rated poor or fair at preparing for practice management (85%) and personal balance (55%). Most conditions were under-observed in the training environment. Many noted a need for more independent practice development and community neurology. Conclusions: Although our training was found to be very good, identified needs included advocacy training and more training in general neurology in longitudinal outpatient/community settings.
We present maps in the visible emission lines of [S II] and the infrared emission line of H2 at 2.12 μm for several bipolar outflow complexes which exhibit jet structures. A comparison of the morphology of this infrared emission with that seen in visible emission lines shows that both the visible and the H2 emission exhibit clumpy structure on similar scales. It appears that the brightest H2 emission occurs at the working surfaces of the jets. Virtually no H2 emission is associated with the jets themselves.
Velocity profiles are presented for several objects and possible emission mechanisms are discussed.
Interest in minor planets and comets continued to grow during the triennium, sparked in part by the highly successful 1983 mission of the Infrared Astronomy Satellite (IRAS) as well as by activities associated with the return of P/Halley and the first spacecraft missions to comets. A trial of the Astrometry Network of the International Halley Watch (IHW) on P/Crommelin was quite successful. Yet with an increasing need for precise ephemerides, there continued to be concern for acquisition and timely reporting of astrometric observations of even the best known comets.
The DJEHUTY project is an intensive effort at the Lawrence Livermore National Laboratory (LLNL) to produce a general purpose 3-D stellar structure and evolution code to study dynamic processes in whole stars.
Gene × Environment interaction contributes to externalizing disorders in childhood and adolescence, but little is known about whether such effects are long lasting or present in adulthood. We examined gene–environment interplay in the concurrent and prospective associations between antisocial peer affiliation and externalizing disorders (antisocial behavior and substance use disorders) at ages 17, 20, 24, and 29. The sample included 1,382 same-sex twin pairs participating in the Minnesota Twin Family Study. We detected a Gene × Environment interaction at age 17, such that additive genetic influences on antisocial behavior and substance use disorders were greater in the context of greater antisocial peer affiliation. This Gene × Environment interaction was not present for antisocial behavior symptoms after age 17, but it was for substance use disorder symptoms through age 29 (though effect sizes were largest at age 17). The results suggest adolescence is a critical period for the development of externalizing disorders wherein exposure to greater environmental adversity is associated with a greater expression of genetic risk. This form of Gene × Environment interaction may persist through young adulthood for substance use disorders, but it appears to be limited to adolescence for antisocial behavior.