Assessing the relationship between antimicrobial usage (AMU) and antimicrobial resistance (AMR) requires the accurate and precise utilisation of register data. Therefore, validation of register-based data is essential for evaluating the quality and, subsequently, the internal validity of studies based on the data.
In this study, different smoothing methods for records in the Veterinary Medicine Statistic Program database (VetStat) were validated by comparing them with farm records. The comparison assessed accuracy (completeness and correctness) and precision (relative difference of the error, correlation with Fisher's z transformation, and a reliability coefficient). The most valid of the examined methods were then used in re-analyses of the abundance of AMR genes in 10 finisher batches from a previous study.
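As a concrete illustration of the precision measure mentioned above, a correlation between register-recorded and farm-recorded usage can be assessed on the Fisher z scale, on which correlation coefficients are approximately normally distributed. The following is a generic sketch of the standard transformation, not the study's own code:

```python
import math

def fisher_z_ci(r, n, alpha=0.05):
    """Approximate 95% confidence interval for a correlation r from a
    sample of size n, via Fisher's z transformation."""
    z = math.atanh(r)              # z = 0.5 * ln((1 + r) / (1 - r))
    se = 1.0 / math.sqrt(n - 3)    # standard error of z
    crit = 1.959963984540054       # ~97.5th percentile of N(0, 1)
    lo_z, hi_z = z - crit * se, z + crit * se
    # back-transform the interval endpoints to the correlation scale
    return math.tanh(lo_z), math.tanh(hi_z)
```

For example, a correlation of 0.8 between register and farm records over 50 farms yields an interval of roughly (0.67, 0.88), making the imprecision of the register-based estimate explicit.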
Improved accuracy was found when detailed smoothing methods were applied. Although precision also increased, the effect was less pronounced, as the usage estimates of all smoothing methods deviated moderately from the farm registrations. Applying the most valid methods to the 10 finisher batches increased estimates of statistical model fit for aminoglycosides, lincosamides and tetracyclines, decreased them for macrolides, and left them unchanged for sulfonamides and broad-spectrum penicillins.
Through refined data transformation, VetStat-records can be used to calculate a daily amount of AMU per pig reflecting the true usage accurately and moderately precisely, which is the foundation for calculating lifetime AMU.
Size, or its commonly used proxy live weight, is a necessary input when calculating the energy requirements of an animal. It is also a major factor in determining the intake capacity of an animal. The sole use of live weight as a determinant of size incorporates the implicit assumption that body fat and body protein mass are equivalent for the purposes of calculating energy requirements and intake capacity. Recent evidence indicates that this is not so in either case (Birnie et al., 2000; Friggens et al., 1998). The use of live weight may be acceptable where it can be reasonably assumed that there is a stable relationship between body fat and protein. However, when making breed or parity comparisons there is no reason to assume a stable relationship between body fat and protein. Meaningful comparisons can be made if live weights can be adjusted for differences in body fat content. In the applied context this means adjusting to a standard body condition score. This study provided the opportunity to examine the relationships between condition score and live weight in three breeds across three parities.
The objectives were to present three approaches for calculating antimicrobial (AM) use in pigs that take into account the rearing period and rearing site, and to study the association between these measurements and phenotypical resistance and abundance of resistance genes in faeces samples from 10 finisher batches. The AM use was calculated relative to the rearing period of the batches as (i) ‘Finisher Unit Exposure’ at unit level, (ii) ‘Lifetime Exposure’ at batch level and (iii) ‘Herd Exposure’ at herd level. For Lifetime Exposure, a significant effect of tetracycline use on the occurrence of tetracycline resistance measured by cultivation was identified. Furthermore, for Lifetime Exposure to the AM classes macrolide, broad-spectrum penicillin, sulfonamide and tetracycline, as well as Herd Exposure to the AM classes aminoglycoside, lincosamide and tetracycline, a significant effect was observed on the occurrence of genes coding for resistance to aminoglycosides, lincosamides, macrolides, β-lactams, sulfonamides and tetracyclines. No effect was observed for Finisher Unit Exposure. Overall, the study shows that Lifetime Exposure is an efficient measurement of AM use in finisher batches and has a significant effect on the occurrence of resistance, measured either by cultivation or metagenomics.
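The batch-level measurement can be sketched in spirit as usage accumulated per pig across every rearing period a batch passed through. The data layout, units (animal daily doses, ADD) and aggregation below are illustrative assumptions, not the paper's actual definition:

```python
def lifetime_exposure(periods):
    """Hypothetical batch-level 'Lifetime Exposure': antimicrobial use
    (here in animal daily doses, ADD) accumulated per pig over every
    rearing period and site a batch passed through before slaughter.

    `periods` is a list of (add_used, pigs_present) tuples, one per
    rearing period -- an illustrative layout, not VetStat's schema.
    """
    return sum(add_used / pigs for add_used, pigs in periods)

# e.g. a batch traced through sow, weaner and finisher units:
# lifetime_exposure([(120.0, 400), (90.0, 300), (60.0, 200)])
```

The point of such a measurement is that exposure accrued before the pigs entered the finisher unit still counts toward the batch, unlike a unit-level measurement.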
To exploit the potential 20–40% reduction in herbicide use documented for Decision Support Systems (DSS), for which the requirement for manual field inspection constitutes a major obstacle, large numbers of digital pictures of weed infestations were collected and analysed manually by crop advisors. The results were transferred to: 1) a DSS, which determined the need for control and returned optimised control options, and 2) convolutional neural networks, which were thereby trained to analyse future pictures automatically, supporting both field- and site-specific integrated weed management.
Introduction: Collaborative Emergency Centres (CECs) provide access to care in rural communities. After hours, registered nurses (RNs) and paramedics work together in the ED with telephone support from an emergency medical services (EMS) physician. The safety of such a model is unknown. Relapse visits are often used as a proxy measure for safety in emergency medicine. The primary outcome of this study was the rate of unscheduled relapses to emergency care. Methods: The electronic patient care record (ePCR) database was queried for all patients who visited two CECs from April 1, 2012 to April 1, 2013. Abstracted data included demographics, time, acuity score, clinical impression, chief complaint, and disposition. Records were searched for each discharged CEC patient to identify unscheduled relapses to emergency care, defined as presenting back to EMS, a CEC, or any other ED within the Health Authority within 48 hours of CEC discharge. Results: There were 894 CEC visits, of which 66 were excluded due to missing data. The dispositions from CEC were: 131/828 (15.8%) transferred to the regional ED; 264/828 (31.9%) discharged home; 488/828 (58.9%) discharged with a follow-up visit booked; and 11/828 (1.3%) left the CEC without being seen. There were 37/828 (4.5%) visits which relapsed back to emergency care, all of whom had been discharged from the CEC or left without being seen: 3/828 (0.4%) relapsed back to EMS (two taken to the regional ED and one to a CEC); 16/828 (1.9%) relapsed to the regional ED (by walking in); and 18/828 (2.2%) relapsed to the CEC (walk-in). 516/828 (62.3%) CEC visits were resolved in a single visit. Conclusion: This study was based on only two of the seven operating CECs because accessing paper-based charts across multiple health regions was not feasible. We also acknowledge the limitations of using relapse as a proxy for safety, and that low volumes and acuity will make detection of adverse events challenging.
Albeit a proxy measure, the rate of patients who relapse to emergency care was under 5% in this case series of two CECs. Most patients had their concern resolved in a single visit to a CEC. Further research is underway to determine the effectiveness, optimal utilization and safety of this collaborative model of rural emergency care.
In developing countries with limited access to ENT services, performing emergency cricothyroidotomy in patients with upper airway obstruction may be a life-saving last resort. An established Danish–Zimbabwean collaboration of otorhinolaryngologists enrolled Zimbabwean doctors into a video-guided simulation training programme on emergency cricothyroidotomy. This paper presents the positive effect of this training, illustrated by two case reports.
A 56-year-old female presented with upper airway obstruction due to a rapidly progressing infectious swelling of the head and neck progressing to cardiac arrest. Cardiopulmonary resuscitation was initiated and a secure surgical airway was established via an emergency cricothyroidotomy, saving the patient. A 70-year-old male presented with upper airway obstruction secondary to intubation for an elective procedure. When extubated, the patient exhibited severe stridor followed by respiratory arrest. Re-intubation attempts were unsuccessful and emergency cricothyroidotomy was performed to secure the airway, preserving the life of the patient.
Emergency cricothyroidotomy training should be considered for all surgeons, anaesthetists and, potentially, emergency and recovery room personnel in developing countries. A video-guided simulation training programme on emergency cricothyroidotomy in Zimbabwe proved its value in this regard.
Mutants of Bacillus subtilis can be developed to overproduce Val in vitro. It was hypothesized that addition of Bacillus subtilis mutants to pig diets can be a strategy to supply the animal with Val. The objective was to investigate the effect of Bacillus subtilis mutants on growth performance and blood amino acid (AA) concentrations when fed to piglets. Experiment 1 included 18 pigs (15.0±1.1 kg) fed one of three diets containing either 0.63 or 0.69 standardized ileal digestible (SID) Val : Lys, or 0.63 SID Val : Lys supplemented with a Bacillus subtilis mutant (mutant 1). Blood samples were obtained 0.5 h before feeding and at 1, 2, 3, 4, 5 and 6 h after feeding and analyzed for AAs. In Experiment 2, 80 piglets (9.1±1.1 kg) were fed one of four diets containing 0.63 or 0.67 SID Val : Lys, or 0.63 SID Val : Lys supplemented with another Bacillus subtilis mutant (mutant 2) or its parent wild type. Average daily feed intake, daily weight gain and feed conversion ratio were measured on days 7, 14 and 21. On day 17, blood samples were taken and analyzed for AAs. On days 24 to 26, six pigs from each dietary treatment were fitted with a permanent jugular vein catheter, and blood samples were taken for AA analysis 0.5 h before feeding and at 1, 2, 3, 4, 5 and 6 h after feeding. In Experiment 1, Bacillus subtilis mutant 1 tended (P<0.10) to increase the plasma levels of Val at 2 and 3 h post-feeding, but this was not confirmed in Experiment 2. In Experiment 2, Bacillus subtilis mutant 2 and the wild type did not result in growth performance different from that of the negative and positive controls. In conclusion, results obtained with the mutant strains of Bacillus subtilis were not better than results obtained with the wild-type strain, and for both strains, the results were not different from those of the negative control.
Common beans (Phaseolus vulgaris L.) are a nutrient-dense, low glycemic index food that supports healthy weight management in people; here, their potential was examined for dogs. The objectives of this study were to evaluate the apparent total tract digestibility (ATTD) and nutrient utilisation of navy (NB) and black (BB) bean-based diets in overweight or obese companion dogs undergoing a weight loss intervention. A nutritionally complete, dry extruded dog food was used as the control (CON) diet, and two isocaloric, nutrient-matched bean diets, containing either 25% w/w cooked BB or NB powder, formed the test diets. Diets were fed to adult, overweight companion dogs for either four weeks (short-term study, n = 30) or twenty-six weeks (long-term study, n = 15) at 60% of maintenance calories for ideal weight. Apparent weight loss increased over time in both the short- and long-term studies (P < 0.001) but was not different between the three study groups: apparent weight loss was between 4.05% and 6.14% for the short-term study and between 14.0% and 17.9% in the long-term study. The ATTD was within expected ranges for all groups, whereby total dry matter and crude protein ATTD were 7–8% higher in the BB diet compared to CON (P < 0.05), crude fat ATTD was similar across all diets, and nitrogen-free extract ATTD was 5–6% higher in both BB and NB compared to CON (P < 0.05). Metabolisable energy was similar for all diets, and ranged from 3,434 to 3,632 kcal/kg. At the end of each study period, dogs had haemoglobin levels ≥12 g/dl, packed cell volume ≥36%, albumin ≥2.4 g/dl, ALP ≤300 IU/l, and all median values for each group were within defined limits for nutritional adequacy. This investigation demonstrated that BB and NB diets were safe, digestible, and supported weight loss in calorically restricted, overweight or obese, adult companion dogs.
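For reference, apparent total tract digestibility is conventionally computed from nutrient intake and faecal output. A minimal sketch of the standard definition follows; the figures in the example are invented, not the study's data:

```python
def attd_percent(intake_g: float, faecal_output_g: float) -> float:
    """Apparent total tract digestibility (%), standard definition:
    the share of an ingested nutrient that does not reappear in faeces."""
    return 100.0 * (intake_g - faecal_output_g) / intake_g

# e.g. 500 g crude protein consumed, 90 g recovered in faeces -> 82.0% ATTD
```

The measure is "apparent" because faecal output also contains endogenous (non-dietary) material, which this simple ratio does not correct for.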
Two major processes underlie human decision-making: experiential (intuitive) and rational (conscious) thinking. The predominant thinking process used by working paramedics and student paramedics to make clinical decisions is unknown.
A survey was administered to ground ambulance paramedics and to primary care paramedic students. The survey included demographic questions and the Rational Experiential Inventory-40, a validated psychometric tool involving 40 questions. Twenty questions evaluated each thinking style: 10 assessed preference and 10 assessed ability to use that style. Responses were provided on a five-point Likert scale, with higher scores indicating higher affinity for the style in question. Analysis included both descriptive statistics and t tests to evaluate differences in thinking style.
The response rate was 88.4% (1172/1326). Paramedics (n=904) had a median age of 36 years (IQR 29–42) and most were male (69.5%) and primary or advanced care paramedics (PCP=55.5%; ACP=32.5%). Paramedic students (n=268) had a median age of 23 years (IQR 21–26), most were male (63.1%) and had completed high school (31.7%) or an undergraduate degree (25.4%) prior to paramedic training. Both groups scored their ability to use and favourability toward rational thinking significantly higher than experiential thinking. The mean score for rational thinking was 3.86/5 among paramedics and 3.97/5 among paramedic students (p<0.001). The mean score for experiential thinking was 3.41/5 among paramedics and 3.35/5 among paramedic students (p=0.06).
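The subscale scores reported above are simple means of 1–5 Likert ratings over the 20 items belonging to each thinking style. The sketch below illustrates that scoring; the item layout is a simplifying assumption (the real REI-40 interleaves its items and reverse-scores some of them):

```python
from statistics import mean

def rei_subscale_means(ratings):
    """Mean 1-5 Likert score per thinking style from 40 item ratings.

    Hypothetical layout: the first 20 ratings are assumed to be the
    rational-scale items and the last 20 the experiential-scale items.
    """
    assert len(ratings) == 40, "REI-40 has exactly 40 items"
    rational = mean(ratings[:20])
    experiential = mean(ratings[20:])
    return rational, experiential
```

A respondent answering 4 on every rational item and 3 on every experiential item would thus score 4.0 and 3.0, mirroring the group-level pattern reported in the results.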
Working paramedics and student paramedics prefer and perceive that they have the ability to use rational over experiential thinking. This information adds to our current knowledge on paramedic decision-making and is potentially important for developing continuing education and clinical support tools.
Access to drinking water is essential for animal welfare, but it is unclear if temporary water restriction during the night represents a welfare problem. The aim of the present study was to investigate the effect of various durations of nightly water restriction on thirst in loose-housed lactating sows from day 10 to 28 of lactation. A total of 48 sows were deprived of water for either 0 h (n=12; control), 3 h (n=12; 0500 to 0800 h), 6 h (n=12; 0200 to 0800 h) or 12 h (n=12; 2000 to 0800 h). Control sows consumed 22% of their water intake during the night (2000 to 0800 h), whereas water consumption during this time was reduced to 13%, 7% and 0% in sows restricted for 3, 6 and 12 h. With increased duration of nightly water restriction, a reduced latency to drink (26.8, 18.0, 5.3 and 6.7 min for 0, 3, 6 and 12 h sows; P<0.001) and an increased water intake during the first hour after water became accessible (2.1, 3.4, 4.7 and 5.6 l for 0, 3, 6 and 12 h sows; P<0.001) were seen. During the last 30 min before water became accessible, more of the water-deprived sows investigated (0%, 50%, 75% and 50% of 0, 3, 6 and 12 h sows; P<0.01) or forcefully manipulated (0%, 17%, 50% and 33% of 0, 3, 6 and 12 h sows; P<0.05) the water trough, suggesting frustration and a negative experience of thirst. When all signs of imminent water access were provided but access was delayed by 25 min, a tendency for more of the sows deprived of water for 6 and 12 h to interact forcefully with the water trough was seen (22%, 18%, 42% and 67% of 0, 3, 6 and 12 h sows; P=0.09). Duration of water restriction did not affect water consumption on a 24-h basis, nursing behaviour or performance. In conclusion, behavioural indicators of thirst increased with increasing duration of nightly water restriction in lactating sows.
To limit tail biting incidence, most pig producers in Europe tail dock their piglets. This is despite EU Council Directive 2008/120/EC banning routine tail docking and allowing it only as a last resort. The paper aims to understand what it takes to fulfil the intentions of the Directive by examining economic results of four management and housing scenarios, and by discussing their consequences for animal welfare in the light of legal and ethical considerations. The four scenarios compared are: ‘Standard Docked’, a conventional housing scenario with tail docking meeting the recommendations for Danish production (0.7 m2/pig); ‘Standard Undocked’, which is the same as ‘Standard Docked’ but with no tail docking; and ‘Efficient Undocked’ and ‘Enhanced Undocked’, which have an increased solid floor area (0.9 and 1.0 m2/pig, respectively), provision of loose manipulable materials (100 and 200 g of straw per pig per day) and no tail docking. A decision tree model based on data from Danish and Finnish pig production suggests that Standard Docked provides the highest economic gross margin with the least tail biting. Given our assumptions, Enhanced Undocked is the least economic, although Efficient Undocked is better economically, and both result in a lower incidence of tail biting than Standard Undocked but higher than Standard Docked. For a pig, being bitten is worse for welfare (repeated pain, risk of infections) than being docked, but to compare welfare consequences at a farm level means considering the number of affected pigs. Because of the high levels of biting in Standard Undocked, it has on average inferior welfare to Standard Docked, whereas the comparison of Standard Docked and Enhanced (or Efficient) Undocked is more difficult. In Enhanced (or Efficient) Undocked, more pigs than in Standard Docked suffer from being tail bitten, whereas all the pigs avoid the acute pain of docking endured by the pigs in Standard Docked.
We illustrate and discuss this ethical balance using numbers derived from the above-mentioned data. We discuss our results in the light of the EU Directive and its adoption and enforcement by Member States. Widespread use of tail docking seems to be accepted, mainly because the alternative steps that producers are required to take before resorting to it are not specified in detail. By tail docking, producers are acting in their own best interests. We suggest that for the practice of tail docking to be terminated in a way that benefits animal welfare, changes in the way pigs are housed and managed may first be required.
Tail biting is a serious animal welfare and economic problem in pig production. Tail docking, which reduces but does not eliminate tail biting, remains widespread. However, in the EU tail docking may not be used routinely, and some ‘alternative’ forms of pig production and certain countries do not allow tail docking at all. Against this background, using a novel approach focusing on research where tail injuries were quantified, we review the measures that can be used to control tail biting in pigs without tail docking. Using this strict criterion, there was good evidence that manipulable substrates and feeder space affect damaging tail biting. Only epidemiological evidence was available for effects of temperature and season, and the effect of stocking density was unclear. Studies suggest that group size has little effect, and the effects of nutrition, disease and breed require further investigation. The review identifies a number of knowledge gaps and promising avenues for future research into prevention and mitigation. We illustrate the diversity of hypotheses concerning how different proposed risk factors might increase tail biting through their effect on each other or on the proposed underlying processes of tail biting. A quantitative comparison of the efficacy of different methods of provision of manipulable materials, and a review of current practices in countries and assurance schemes where tail docking is banned, both suggest that daily provision of small quantities of destructible, manipulable natural materials can be of considerable benefit. Further comparative research is needed into materials, such as ropes, which are compatible with slatted floors. Also, materials which double as fuel for anaerobic digesters could be utilised. As well as optimising housing and management to reduce risk, it is important to detect and treat tail biting as soon as it occurs. 
Early warning signs before the first bloody tails appear, such as pigs holding their tails tucked under, could in future be automatically detected using precision livestock farming methods enabling earlier reaction and prevention of tail damage. However, there is a lack of scientific studies on how best to respond to outbreaks: the effectiveness of, for example, removing biters and/or bitten pigs, increasing enrichment, or applying substances to tails should be investigated. Finally, some breeding companies are exploring options for reducing the genetic propensity to tail bite. If these various approaches to reduce tail biting are implemented we propose that the need for tail docking will be reduced.
Optimal use of superabsorbent polymers (SAP) in cement-based materials relies on knowledge of how SAP absorbency is influenced by different physical and chemical parameters. These parameters include the salt concentration in the pore fluid, the temperature of the system and the SAP particle size. The present work presents experimental results on these parameters and introduces a new technique to measure the swelling of SAP particles. This new technique is compared with existing techniques that have recently been proposed for the measurement of pore fluid absorption by superabsorbent polymers. It is seen that the concentration of Na⁺, K⁺, Ca²⁺, OH⁻ and SO₄²⁻ in the exposure liquid influences the maximum absorption of SAP. Even very low concentrations of these ions may reduce the absorption to a third of the value measured in pure water at room temperature. Additionally, the influence of the SAP absorption on the ionic composition of the exposure liquid is investigated with atomic absorption spectroscopy. The paper provides the reader with knowledge about the absorption capacity of SAP in a cementitious environment and how the absorption process may influence the cement pore fluid.
The material characterization toolbox has recently experienced a number of parallel revolutionary advances, foreshadowing a time in the near future when materials scientists can quantify material structure evolution across spatial and temporal space simultaneously. This will provide insight into reaction dynamics in four dimensions, spanning multiple orders of magnitude in both temporal and spatial space. This study presents the authors’ viewpoint on the material characterization field, reviewing its recent past, evaluating its present capabilities, and proposing directions for its future development. Electron microscopy; atom probe tomography; X-ray, neutron and electron tomography; serial sectioning tomography; and diffraction-based analysis methods are reviewed, and opportunities for their future development are highlighted. Advances in surface probe microscopy have been reviewed recently and, therefore, are not included [D.A. Bonnell et al.: Rev. Modern Phys. in Review]. In this study particular attention is paid to studies that have pioneered the synergistic use of multiple techniques to provide complementary views of a single structure or process; several of these studies represent the state of the art in characterization and suggest a trajectory for the continued development of the field. Based on this review, a set of grand challenges for characterization science is identified, including suggestions for instrumentation advances, scientific problems in microstructure analysis, and complex structure evolution problems involving material damage. The future of microstructural characterization is proposed to be one not only where individual techniques are pushed to their limits, but where the community devises strategies of technique synergy to address complex multiscale problems in materials science and engineering.