In the mink industry, feed costs are the largest variable expense, and breeding for feed-efficient animals is therefore warranted. Implementation of selection for feed efficiency must consider the relationships between feed efficiency and the current selection traits, BW and litter size. Feed intake (FI) is often recorded on a cage housing a male and a female, and sexual dimorphism needs to be accounted for. The aims of this study were to (1) model group-recorded FI accounting for sexual dimorphism, (2) derive genetic residual feed intake (RFI) as a measure of feed efficiency, (3) examine the relationship between feed efficiency and BW in males (BWM) and females (BWF) and litter size at day 21 after whelping (LS21) in Danish brown mink and (4) investigate direct and correlated responses to selection on each trait of interest. Feed intake records from 9574 cages, BW records on 16 782 males and 16 875 females and LS21 records on 6446 yearling females were used for the analysis. Genetic parameters for FI, BWM, BWF and LS21 were obtained using a multivariate animal model, yielding sex-specific additive genetic variances for FI and BW to account for sexual dimorphism. The analysis was performed in a Bayesian setting using Gibbs sampling, and genetic RFI was obtained from the conditional distribution of FI given BW using genetic regression coefficients. Responses to single-trait selection were defined as the posterior distribution of the genetic superiority of the top 10% of animals after conditioning on the genetic trends. Heritabilities ranged from 0.13, for RFI in females and for LS21, to 0.59 for BWF. Genetic correlations of BW in both sexes with LS21 and with FI in both sexes were unfavorable: single-trait selection on BW in either sex increased FI in both sexes and reduced litter size. Owing to the definition of RFI and the high genetic correlation between BWM and BWF, selection on RFI did not significantly alter BW. In addition, selection on RFI in either sex did not affect LS21.
Genetic correlation between sexes for FI and BW was high but significantly lower than unity. The high correlations across sex allowed for selection on standardized averages of animals’ breeding values (BVs) for RFI, FI and BW, which yielded selection responses approximately equal to the responses obtained using the sex-specific BVs. The results illustrate the possibility of selecting against RFI in mink with no negative effects on BW and litter size.
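The genetic-regression construction of RFI described above can be illustrated with a small numerical sketch. All matrices and breeding values below are hypothetical placeholders, not estimates from the study; the point is only that RFI breeding values are FI breeding values minus their genetic regression on the BW breeding values, which makes RFI genetically independent of BW by construction:

```python
import numpy as np

# Hypothetical posterior-mean genetic (co)variance matrix for (FI, BWM, BWF).
# In the study, such quantities come from a Bayesian multivariate animal model.
G = np.array([
    [0.40, 0.25, 0.20],   # FI
    [0.25, 0.60, 0.50],   # BWM
    [0.20, 0.50, 0.55],   # BWF
])

# Genetic regression of FI on (BWM, BWF): solve G_bw @ b = g_fi_bw
G_bw = G[1:, 1:]
g_fi_bw = G[0, 1:]
b = np.linalg.solve(G_bw, g_fi_bw)

# Hypothetical breeding values for three animals: columns are FI, BWM, BWF.
bv = np.array([
    [ 0.30,  0.10,  0.05],
    [-0.20,  0.25,  0.30],
    [ 0.10, -0.15, -0.10],
])

# Genetic RFI = FI breeding value minus its genetic regression on BW breeding
# values, so RFI carries the part of FI not explained by BW.
rfi = bv[:, 0] - bv[:, 1:] @ b
print(np.round(rfi, 3))
```

Animals with negative RFI breeding values eat less than their BW predicts, so selecting for low RFI improves efficiency without (by construction) shifting BW.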
The number of immigrants using health services has increased across Europe. For assessing and improving the quality of care provided for immigrants, information is required on how many immigrants use services, what interpreting services are provided and whether staff members are from immigrant groups.
Structured interviews were conducted with 15 health services (9 primary care, 3 emergency departments, 3 mental health) located in areas with high immigrant populations in each of 16 European countries (n = 240). Responses were collected on the availability of data on service use by immigrant patients, the provision of interpreting services and immigrant staff members.
Data on service use by immigrants were recorded by only 15% of services. More than 40% of services did not provide any form of interpreting service and 54% of the services reported having no immigrant staff. Mental health services were more likely to use direct interpreting services, and both mental health and emergency services were more likely to have immigrant staff members.
For assessing and improving the quality of care provided for immigrants, there is a need to improve the availability of data on service use by immigrants in health services throughout Europe and to provide more consistent access to interpreting services.
Early changes in biomarker levels probably occur before bloodstream infection (BSI) is diagnosed, but this issue has not been fully addressed. We aimed to evaluate the kinetics of C-reactive protein (CRP) and plasma albumin (PA) in the 30 days before diagnosis of community-acquired (CA) BSI. From a population-based BSI database we identified 658 patients with at least one measurement of CRP or PA from day −30 (D–30) through day −1 (D–1) before the day of CA-BSI (D0) and a measurement of the same biomarker at D0 or D1. Of these, 502 had both CRP and PA measurements fitting these criteria. CRP and PA concentrations began to change inversely some days before CA-BSI diagnosis, CRP increasing from day −3.1 and PA decreasing from day −1.3. From D–30 to D–4, the CRP slope (rate of concentration change per day) was −1.5 mg/l/day; from D–3 to D1, it increased to 36.3 mg/l/day. For albumin, the slope was 0.1 g/l/day between D–30 and D–2 and changed to −1.8 g/l/day between D–1 and D1. Thus, biomarker levels begin to change some days before CA-BSI diagnosis: CRP 3.1 days and PA 1.3 days before.
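The slopes reported above are rates of concentration change per day estimated over fixed day windows. A minimal sketch of such a calculation, using hypothetical single-patient CRP values (the study pooled measurements across many patients):

```python
import numpy as np

# Hypothetical CRP measurements (mg/l), keyed by day relative to BSI
# diagnosis (day 0). Values are illustrative only.
days = np.array([-10, -7, -5, -3, -2, -1, 0, 1])
crp = np.array([12.0, 10.0, 9.0, 15.0, 45.0, 80.0, 120.0, 150.0])

def window_slope(d, values, d_from, d_to):
    """Least-squares slope (units per day) over the window [d_from, d_to]."""
    mask = (d >= d_from) & (d <= d_to)
    return np.polyfit(d[mask], values[mask], 1)[0]

early = window_slope(days, crp, -30, -4)  # near-flat baseline phase
late = window_slope(days, crp, -3, 1)     # steep rise just before diagnosis
print(round(early, 2), round(late, 2))
```

A small (here slightly negative) early slope followed by a sharply positive late slope reproduces the qualitative pattern the study describes for CRP.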
Norovirus (NoV) infections occur very frequently yet are rarely diagnosed. In Denmark, NoV infections are not under surveillance. We aimed to collect and describe existing laboratory-based NoV data. National NoV laboratory data were collected for 2011–2018, including information on patient identification number, age and sex, requesting physician, analysis date and result. We defined positive patient-episodes using a 30-day time window and performed descriptive and time series analyses. The diagnostic methods used were assessed through a survey. We identified 15 809 positive patient-episodes (11%) among 142 648 tested patients, with the number of patients tested increasing from 9366 in 2011 to 32 260 in 2018. This increase corresponded with the gradual introduction of polymerase chain reaction analysis in laboratories. The positivity rate was highest in patients aged <5 years (15%) and >85 years (17%). There was a large difference in testing across the five Danish regions and a marked seasonal variation with peaks from December to February. This is the first analysis of national NoV laboratory data in Denmark. A future laboratory-based surveillance system may benefit public health measures by describing the trend, burden and severity of seasons and possibly pinpointing hospital outbreaks.
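The 30-day time window used to define patient-episodes can be sketched as follows. This assumes one plausible reading of the rule, namely that a positive within 30 days of an episode's first positive belongs to the same episode (whether the window instead resets from the most recent positive is not specified here):

```python
from datetime import date, timedelta

def count_episodes(positive_dates, window_days=30):
    """Collapse one patient's positive test dates into episodes: a positive
    within `window_days` of the episode's first positive is the same episode;
    anything later starts a new episode."""
    episodes = 0
    episode_start = None
    for d in sorted(positive_dates):
        if episode_start is None or d - episode_start > timedelta(days=window_days):
            episodes += 1
            episode_start = d
    return episodes

# The second positive falls within 30 days of the first (same episode);
# the third is more than 30 days later and opens a new episode.
tests = [date(2018, 1, 5), date(2018, 1, 20), date(2018, 3, 10)]
print(count_episodes(tests))  # -> 2
```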
In Denmark, outbreaks of salmonella with more than 20 cases have become rare. In November 2018, an outbreak of monophasic Salmonella Typhimurium was detected and an investigation was initiated with the aim of identifying the source and controlling the outbreak. Outbreak cases were defined based on core genome multilocus sequence types. We conducted hypothesis-generating interviews, a matched case-control study, food sampling and trace-back investigations. We identified 49 cases distributed across Denmark. In univariable analyses, a traditional form of raw Danish pork sausage (medister sausage), pork chops and ground veal/pork showed matched odds ratios of 26 (95% CI 3–207), 4 (95% CI 1–13) and 4 (95% CI 1–10), respectively. In a multivariable analysis, only medister sausage remained significant. Several patients described tasting or eating the sausage raw or undercooked. Samples of medister sausage were negative for salmonella, and investigations at the production site did not reveal the mechanism of contamination. In conclusion, despite having eliminated salmonella from the egg and broiler industries, Denmark is still at risk of major salmonella outbreaks. We identified a raw pork sausage as a particular risk product that needs to be thoroughly cooked before consumption. Tasting raw meat or eating undercooked pork should be discouraged.
Tail biting is a welfare and economic concern in modern pig production. One common preventive measure used throughout the world is tail docking, which is generally considered one of the most effective methods for limiting tail biting. However, tail docking is a painful mutilation, and systematic tail docking is not allowed in the EU. The aim was therefore to compare pig behaviour and the prevalence of tail biting in finishing pigs with intact tails housed in two different pen designs under Danish commercial conditions. PEN1 was a traditional Danish pen, whereas PEN2 was inspired by Swedish finisher pen design and had a larger proportion of solid floor area (PEN1: 1/3 and PEN2: 2/3), reduced group size (PEN1: 15 and PEN2: 12), increased space allowance per head (PEN1: 0.7 m² and PEN2: 0.89 m²) and straw allocated on the floor, whereas straw was provided in a straw rack in PEN1. Tail damage observations were carried out daily by the stockperson, and every 2 weeks one trained research technician assessed tail damage according to a tail scoring system. Tail lesions were observed in 51% of PEN1 and in 11% of PEN2 (P < 0.001), and PEN1 had a higher prevalence of tail damage than PEN2 (23% v. 5%, P < 0.001). Behavioural observations were carried out using video recordings. Pigs in PEN2 tended to spend more time on tail-directed behaviour than pigs in PEN1 (P = 0.07), whereas pigs in PEN1 tended to spend more time on ear-directed behaviour (P = 0.08). Pigs in PEN2 spent more time on straw-directed behaviour than pigs in PEN1 (P < 0.001). Pen design did not affect time spent on other penmate-directed behaviour. In addition, the level of welfare in the two pen designs was compared using the Welfare Quality® protocol. PEN2 received an overall score of ‘excellent’ while PEN1 scored ‘enhanced’. PEN2 scored higher on all principles besides ‘good health’, where PEN1 scored better on lameness and wounds.
The main measurements accounting for the differences were water supply, huddling, tail biting, social behaviour and fear of humans. In conclusion, the combination of increased space allowance, a larger area of solid flooring, straw allocated onto the floor and reduced group size (PEN2) resulted in fewer tail-damaged pigs and a better overall welfare assessment, despite a tendency for more tail-directed behaviour.
Climate and weather conditions may have substantial effects on the ecology of both parasites and hosts in natural populations. The strength and shape of the effects of weather on parasites and hosts are likely to change as global warming affects local climate. These changes may in turn alter fundamental elements of parasite–host dynamics. We explored the influence of temperature and precipitation on parasite prevalence in a metapopulation of avian hosts in northern Norway. We also investigated whether annual change in parasite prevalence was related to winter climate, as described by the North Atlantic Oscillation (NAO). We found that parasite prevalence increased with temperature within years and decreased slightly with increasing precipitation. We also found that a mild winter (positive winter NAO index) was associated with higher mean parasite prevalence the following year. Our results indicate that both local and large-scale weather conditions may affect the proportion of hosts that become infected by parasites in natural populations. Understanding the effects of climate and weather on parasite–host relationships in natural populations is vital for predicting the full consequences of global warming.
Declining sea ice is expected to change the Arctic's physical and biological systems in ways that are difficult to predict. This study used stable isotope compositions (δ13C and δ15N) of archaeological, historic, and modern Pacific walrus (Odobenus rosmarus divergens) bone collagen to investigate the impacts of changing sea ice conditions on walrus diet during the last ~4000 yr. An index of past sea ice conditions was generated using dinocyst-based reconstructions from three locations in the northeastern Chukchi Sea. Archaeological walrus samples were assigned to intervals of high and low sea ice, and δ13C and δ15N were compared across ice states. Mean δ13C and δ15N values were similar for archaeological walruses from intervals of high and low sea ice; however, variability among walruses was greater during low-ice intervals, possibly indicating decreased availability of preferred prey. Overall, sea ice conditions were not a primary driver of changes in walrus diet. The diet of modern walruses was not consistent with archaeological low sea ice intervals. Rather, the low average trophic position of modern walruses (primarily driven by males), with little variability among individuals, suggests that trophic changes to this Arctic ecosystem are still underway or are unprecedented in the last ~4000 yr.
To assess the prevalence of prediabetes and metabolic abnormalities among overweight or obese clozapine- or olanzapine-treated schizophrenia patients, and to identify characteristics of the schizophrenia group with prediabetes.
A cross-sectional study assessing the presence of prediabetes and metabolic abnormalities in clozapine- or olanzapine-treated patients with schizophrenia and a body mass index (BMI) ≥27 kg/m². Procedures were part of the screening process for a randomized, placebo-controlled trial evaluating liraglutide vs placebo for improving glucose tolerance. For comparison, an age-, sex-, and BMI-matched healthy control group without psychiatric illness or prediabetes was included. Prediabetes was defined as elevated fasting plasma glucose and/or impaired glucose tolerance and/or elevated glycated hemoglobin A1c.
Among 145 schizophrenia patients (age = 42.1 years; males = 59.3%) on clozapine or olanzapine (clozapine/olanzapine/both: 73.8%/24.1%/2.1%), prediabetes was present in 69.7% (101 out of 145). While schizophrenia patients with and without prediabetes did not differ regarding demographic, illness, or antipsychotic treatment variables, metabolic abnormalities (waist circumference: 116.7±13.7 vs 110.1±13.6 cm, P = 0.007; triglycerides: 2.3±1.4 vs 1.6±0.9 mmol/L, P = 0.0004) and metabolic syndrome (76.2% vs 40.9%, P<0.0001) were significantly more pronounced in schizophrenia patients with vs without prediabetes. The age-, sex-, and BMI-matched healthy controls had significantly better glucose tolerance compared to both groups of patients with schizophrenia. The healthy controls also had higher levels of high-density lipoprotein compared to patients with schizophrenia and prediabetes.
Prediabetes and metabolic abnormalities were highly prevalent among the clozapine- and olanzapine-treated patients with schizophrenia, putting these patients at high risk of later type 2 diabetes and cardiovascular disease. These results stress the importance of identifying and adequately treating prediabetes and metabolic abnormalities among clozapine- and olanzapine-treated patients with schizophrenia.
The cold, wet climate of the Arctic has led to the extraordinary preservation of archaeological sites and materials that offer important contributions to the understanding of our common cultural and ecological history. This potential, however, is quickly disappearing due to climate-related variables, including the intensification of permafrost thaw and coastal erosion, which are damaging and destroying a wide range of cultural and environmental archives around the Arctic. In providing an overview of the most important effects of climate change in this region and on archaeological sites, the authors propose the next generation of research and response strategies, and suggest how to capitalise on existing successful connections among research communities and between researchers and the public.
Tail biting in domestic pigs relates to a range of risk factors, primarily in the pigs’ environment. Preventive tail docking is widely used, and various experimental approaches suggest that docking reduces the risk of tail biting. However, whether the docking length affects the prevalence of tail biting outbreaks is less well studied, as is how a shortened tail affects pigs’ social behaviour. The aim of this study was to investigate how three different tail docking lengths, measured at docking, as well as retained intact tails (Short: 2.9 cm; Medium: 5.7 cm; Long: 7.5 cm; and Undocked) affected tail biting risk and behaviour directed at other finisher pigs with the same docking length treatment. Tail lesions were scored weekly, as was behaviour at pen level, from introduction to the finisher pens until a potential outbreak of tail biting or slaughter. Pigs from four commercial herds (258 litters) entered the study. Before the pigs entered the finisher section and data collection started, some pigs were excluded, mainly due to tail biting outbreaks in the weaner section. The risk of a tail biting outbreak differed significantly between treatments (P=0.001), with a lower risk in Short pens than in Undocked (P<0.001) and Medium (P<0.05) pens, and was also affected by herd (P<0.001). Pens in the Long and Undocked treatments were pooled for the behavioural analysis due to low representation, especially in the Undocked treatment. The probability of tail contacts, where a pig interacted with a pen mate’s tail, differed between docking length treatments and was highest in the Long/Undocked treatment compared with the Short treatment (P<0.01), but docking length did not affect aggressive behaviour. Docking length affected the risk of a tail biting outbreak and the frequency of tail-directed behaviour in our participating herds, of which three reported a high prevalence of tail biting problems.
Only the shortest docking length treatment (Short) reduced the tail biting risk, but did not completely prevent tail biting outbreaks.
We studied community-acquired bacteraemia patients (n = 2472) in Denmark, 2000–2008, with albumin, C-reactive protein (CRP) and haemoglobin (Hb) measured during 2000–2010. We assessed daily mean levels of albumin, CRP and Hb from 30 days before to 30 days after bacteraemia, and the correlations of albumin with CRP and with Hb. In linear regression models, we evaluated the contributions of CRP, Hb, and chronic and acute variables to variation in albumin levels. The mean albumin level (33.6 g/l) was steady before day 1, declined to 29.3 g/l on day 1 and increased little afterward. Mean CRP increased from day −5, peaked on day 1 and declined thereafter. The mean Hb level was fairly constant during days −30 to 30. Albumin was inversely correlated with CRP (R = −0.18 to −0.47, P < 10⁻⁴) and positively correlated with Hb (R = 0.17–0.46, P < 10⁻⁴). In most models, CRP was the first variable contributing to the albumin variation, accounting for 34–70% of the full model. The sudden decrease in albumin levels, without sudden fluctuations in CRP or Hb, indicates that hypoalbuminaemia was a marker of trans-capillary leakage.
Introduction: Collaborative Emergency Centres (CECs) provide access to care in rural communities. After hours, registered nurses (RNs) and paramedics work together in the ED with telephone support from an emergency medical services (EMS) physician. The safety of such a model is unknown. Relapse visits are often used as a proxy measure for safety in emergency medicine. The primary outcome of this study was unscheduled relapses to emergency care. Methods: The electronic patient care record (ePCR) database was queried for all patients who visited two CECs from April 1, 2012 to April 1, 2013. Abstracted data included demographics, time, acuity score, clinical impression, chief complaint, and disposition. Records were searched for each discharged CEC patient to identify unscheduled relapses to emergency care, defined as presenting back to EMS, a CEC, or any other ED within the Health Authority within 48 hours of CEC discharge. Results: There were 894 CEC visits, of which 66 were excluded due to missing data. The dispositions from CEC were: 131/828 (15.8%) transferred to a regional ED; 264/828 (31.9%) discharged home; 488/828 (58.9%) discharged with a follow-up visit booked; and 11/828 (1.2%) left the CEC without being seen. There were 37/828 (4.5%) visits that relapsed back to emergency care, all of whom had been discharged from the CEC or left without being seen: 3/828 (0.4%) relapsed back to EMS (two taken to a regional ED and one to a CEC); 16/828 (1.9%) relapsed to a regional ED (by walking in); and 18/828 (2.2%) relapsed to the CEC (walk-in). 516/828 (62.3%) CEC visits were resolved in a single visit. Conclusion: This study was based on only two of the seven operating CECs due to the need to access paper-based charts across multiple health regions. We also acknowledge the limitations of using relapse as a proxy for safety, and that low volumes and acuity will make detection of adverse events challenging.
Albeit a proxy measure, the rate of patients who relapse to emergency care was under 5% in this case series of two CECs. Most patients had their concern resolved in a single visit to a CEC. Further research is underway to determine the effectiveness, optimal utilization and safety of this collaborative model of rural emergency care.
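The 48-hour relapse definition above (a return to EMS, a CEC, or any ED within 48 hours of CEC discharge) amounts to a simple interval check per discharged patient; a minimal sketch with hypothetical timestamps:

```python
from datetime import datetime, timedelta

def is_relapse(discharge, later_contacts, window_hours=48):
    """True if any subsequent emergency contact (EMS, CEC or ED) falls
    within `window_hours` of the CEC discharge time."""
    cutoff = discharge + timedelta(hours=window_hours)
    return any(discharge < c <= cutoff for c in later_contacts)

discharge = datetime(2012, 6, 1, 22, 0)
contacts = [datetime(2012, 6, 3, 9, 30)]   # ~35.5 h after discharge
print(is_relapse(discharge, contacts))      # -> True
```

Counting the discharged visits for which this check is true, divided by all non-transferred visits, yields a relapse rate comparable to the 4.5% reported.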
The Learning Health System Network clinical data research network includes academic medical centers, health-care systems, public health departments, and health plans, and is designed to facilitate outcomes research, pragmatic trials, comparative effectiveness research, and evaluation of population health interventions.
The Learning Health System Network is 1 of 13 clinical data research networks assembled to create, in partnership with 20 patient-powered research networks, a National Patient-Centered Clinical Research Network.
Results and Conclusions
Herein, we describe the Learning Health System Network as an emerging resource for translational research, providing details on the governance and organizational structure of the network, the key milestones of the current funding period, and challenges and opportunities for collaborative science leveraging the network.
In developing countries with limited access to ENT services, performing emergency cricothyroidotomy in patients with upper airway obstruction may be a life-saving last resort. An established Danish–Zimbabwean collaboration of otorhinolaryngologists enrolled Zimbabwean doctors into a video-guided simulation training programme on emergency cricothyroidotomy. This paper presents the positive effect of this training, illustrated by two case reports.
A 56-year-old female presented with upper airway obstruction due to a rapidly progressing infectious swelling of the head and neck progressing to cardiac arrest. Cardiopulmonary resuscitation was initiated and a secure surgical airway was established via an emergency cricothyroidotomy, saving the patient. A 70-year-old male presented with upper airway obstruction secondary to intubation for an elective procedure. When extubated, the patient exhibited severe stridor followed by respiratory arrest. Re-intubation attempts were unsuccessful and emergency cricothyroidotomy was performed to secure the airway, preserving the life of the patient.
Emergency cricothyroidotomy training should be considered for all surgeons, anaesthetists and, eventually, emergency and recovery room personnel in developing countries. A video-guided simulation training programme on emergency cricothyroidotomy in Zimbabwe proved its value in this regard.
Human milk decreases the risk of necrotising enterocolitis (NEC), a severe gastrointestinal disease that occurs in 5–10 % of preterm infants. The prebiotic and immune-modulatory effects of milk oligosaccharides may contribute to this protection. Preterm pigs were used to test whether infant formula enriched with α1,2-fucosyllactose (2'-FL, the most abundant oligosaccharide in human milk) would benefit gut microbial colonisation and NEC resistance after preterm birth. Caesarean-delivered preterm pigs were fed formula (Controls, n 17) or formula with 5 g/l 2'-FL (2'-FL, n 16) for 5 d. Eight 2'-FL pigs (50 %) and twelve Controls (71 %) developed NEC, with no difference in lesion scores (P=0·35). 2'-FL pigs tended to have fewer anaerobic bacteria in caecal contents (P=0·22), but no difference in gut microbiota between the groups was observed by fluorescence in situ hybridisation or 454 pyrosequencing. Abundant α1,2-fucose was detected in the intestine with no difference between groups, and intestinal structure (villus height, permeability) and digestive function (hexose absorption, brush border enzyme activities) were not affected by 2'-FL. Formula enrichment with 2'-FL does not affect gut microbiology, digestive function or NEC sensitivity in pigs within the first few days after preterm birth. Milk 2'-FL may not be critical in the immediate postnatal period of preterm neonates, when gut colonisation and intestinal immunity are still immature.
Healthcare provider hands are an important source of intraoperative bacterial transmission events associated with postoperative infection development.
To explore the efficacy of a novel hand hygiene improvement system leveraging provider proximity and individual and group performance feedback in reducing 30-day postoperative healthcare-associated infections via increased provider hourly hand decontamination events.
Randomized, prospective study.
Dartmouth-Hitchcock Medical Center in New Hampshire and UMass Memorial Medical Center in Massachusetts.
Patients undergoing surgery.
Operating room environments were randomly assigned to usual intraoperative hand hygiene or to a personalized, body-worn hand hygiene system. Anesthesia and circulating nurse provider hourly hand decontamination events were continuously monitored and reported. All patients were followed prospectively for the development of 30-day postoperative healthcare-associated infections.
A total of 3,256 operating room environments and patients (1,620 control and 1,636 treatment) were enrolled. The mean (SD) provider hand decontamination event rate achieved was 4.3 (2.9) events per hour, an approximate 8-fold increase in hand decontamination events above that of conventional wall-mounted devices (0.57 events/hour); P<.001. Use of the hand hygiene system was not associated with a reduction in healthcare-associated infections (odds ratio, 1.07 [95% CI, 0.82–1.40], P=.626).
The hand hygiene system evaluated in this study increased the frequency of hand decontamination events without reducing 30-day postoperative healthcare-associated infections. Future work is indicated to optimize the efficacy of this hand hygiene improvement strategy.
Introduction: Nova Scotia has a province-wide reperfusion strategy for the treatment of patients presenting with acute ST-elevation myocardial infarction (STEMI). Patients are referred for primary percutaneous coronary intervention (PPCI) if a first-medical-contact-to-device time of 90 to 120 minutes can be achieved; otherwise, fibrinolytic therapy is administered, as per guideline recommendations. Since 2011, Nova Scotian paramedics have been providing prehospital fibrinolysis (PHF) for STEMI patients outside the PPCI catchment area and prehospital catheterization (cath) lab activation for those within it. Patients who receive fibrinolysis are transferred to a PCI facility if rescue PCI is required or if there are other indications for urgent intervention. This province-wide approach is unique, and the objective of this retrospective cohort study was to compare the treatment pathways on the primary outcome of 30-day mortality. Methods: For the study period, July 2011 to July 2013, STEMI patients who were diagnosed prehospital or in the ED and subsequently underwent reperfusion therapy were identified in the Emergency Health Services (EHS), Cardiovascular Information Systems (CVIS) and Cardiovascular Health Nova Scotia (CVHNS) databases. Baseline demographics and outcomes were then compared according to the treatment received: 1) PHF; 2) ED fibrinolysis (EDF); 3) prehospital-activated PPCI (EHS PPCI); and 4) ED-activated PPCI (ED PPCI). Results: A total of 1107 STEMI patients were identified during the study period, of whom 742 received lytic therapy (146 PHF; 596 EDF) and 332 underwent PPCI (202 EHS PPCI; 130 ED PPCI). Demographic variables were similar across the groups. The primary outcome of 30-day mortality was not significantly different across groups: 5 (3%) in PHF, 26 (4%) in EDF, 8 (4%) in EHS PPCI and 2 (2%) in ED PPCI. The number of rescue PCIs was 28 (19%) in PHF and 102 (17%) in EDF. Other outcomes (key timestamps) are pending.
Conclusion: Our results show that 30-day mortality was lowest for patients undergoing PPCI and slightly lower for patients receiving prehospital fibrinolysis than for those receiving ED fibrinolysis, with no difference in the proportion requiring subsequent rescue PCI. The majority of patients in rural areas received EDF rather than PHF; pending results will show whether this represents a delay in patient presentation after symptom onset.