Agitated behaviors are frequently encountered in the prehospital setting and require emergent treatment to prevent harm to patients and prehospital personnel. Chemical sedation with ketamine works faster than traditional pharmacologic agents, though it has a higher incidence of adverse events, including intubation. Outcomes following varying initial doses of prehospital intramuscular (IM) ketamine use have been incompletely described.
Objective:
To determine whether using a lower dose IM ketamine protocol for agitation is associated with more favorable outcomes.
Methods:
This study was a pre-/post-intervention retrospective chart review of prehospital care reports (PCRs). Adult patients who received chemical sedation in the form of IM ketamine for agitated behaviors were included. Patients were divided into two cohorts based on the standard IM ketamine dose of 4 mg/kg and the lower IM dose of 3 mg/kg with the option for an additional 1 mg/kg if required. Primary outcomes included intubation and hospital admission. Secondary outcomes included emergency department (ED) length of stay, additional chemical or physical restraints, assaults on prehospital or ED employees, and documented adverse events.
Results:
The standard dose cohort consisted of 211 patients. The lower dose cohort consisted of 81 patients, 17 of whom received supplemental ketamine administration. Demographics did not significantly differ between the cohorts (mean age 35.14 versus 35.65 years, P = .484; 67.8% versus 65.4% male, P = .89). Lower dose subjects were administered a lower ketamine dose (mean 3.24 mg/kg) compared to the standard dose cohort (mean 3.51 mg/kg). There was no statistically significant difference between the cohorts in intubation rate (14.2% versus 18.5%; P = .455), ED length of stay (14.31 versus 14.88 hours; P = .118), need for additional restraint and sedation (P = .787), or admission rate (26.1% versus 25.9%; P = .677). In the lower dose cohort, 41.2% (7/17) of patients who received supplemental ketamine doses were intubated, a higher rate than the patients in this cohort who did not receive supplemental ketamine (8/64, 12.5%; P < .01).
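The abstract does not name the statistical test behind the P < .01 for intubation after supplemental ketamine. As an illustration only, a Pearson chi-square on the 2 x 2 table implied by the reported counts (7/17 versus 8/64 intubated) reproduces a P value below .01; this is a sketch, not the authors' documented analysis:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square (no continuity correction) for the 2x2 table
    [[a, b], [c, d]]. For df = 1 the p-value equals erfc(sqrt(stat/2)),
    so no stats library is needed."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = n * (a * d - b * c) ** 2 / (row1 * row2 * col1 * col2)
    p = math.erfc(math.sqrt(stat / 2.0))
    return stat, p

# Intubation in the lower dose cohort: 7 of 17 with supplemental ketamine
# versus 8 of 64 without (counts taken from the abstract above).
stat, p = chi2_2x2(7, 10, 8, 56)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")  # p < .01, consistent with the abstract
```

With a minimum expected cell count of about 3, a Fisher exact test would also be a defensible choice; the chi-square is shown here only because it matches the reported P value.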
Conclusion:
Access to effective, fast-acting chemical sedation is paramount for prehospital providers. No significant differences in outcomes were found when a lower dose IM ketamine protocol was implemented for prehospital chemical sedation. Patients who received a second dose of ketamine had a significantly higher intubation rate. A lower dose regimen may be considered for agitation protocols to limit the amount of medication administered to this high-risk population.
Acute kidney injury leads to worse outcomes following paediatric cardiac surgery. There is a lack of literature focusing on acute kidney injury after the Hybrid Stage 1 palliation for single ventricle physiology. Patients undergoing the Hybrid Stage 1, as a primary option, may have a lower incidence of kidney injury than previously reported. When present, kidney injury may increase the risk of post-operative morbidity and mortality.
Methods:
A retrospective, single centre review was conducted in patients with hypoplastic left heart syndrome who underwent Hybrid Stage 1 from 2008 to 2018. Acute kidney injury was defined dichotomously as yes (meeting any injury criteria) or no (no injury), according to two different criteria used in paediatrics. The impact of kidney injury on perioperative characteristics and 30-day mortality was analysed.
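The abstract does not say which two paediatric criteria were applied. As an illustration of the dichotomous yes/no classification described above, a KDIGO-style flag might look as follows; the 1.5x-baseline and 0.3 mg/dL thresholds are KDIGO's, and mapping them onto this cohort is an assumption, not the study's documented method:

```python
def kdigo_aki(baseline_scr, peak_scr, rise_48h=None):
    """Dichotomous AKI flag in the spirit of the KDIGO definition:
    serum creatinine >= 1.5x baseline, or an absolute rise of
    >= 0.3 mg/dL within 48 hours (when that delta is known).
    Values are in mg/dL; this is an illustrative sketch only."""
    if peak_scr >= 1.5 * baseline_scr:
        return True
    if rise_48h is not None and rise_48h >= 0.3:
        return True
    return False

print(kdigo_aki(0.4, 0.7))        # True  (peak is 1.75x baseline)
print(kdigo_aki(0.5, 0.6, 0.05))  # False (neither threshold met)
```

A study using two criteria would compute such a flag under each definition and count a patient as injured if either fires, matching the "meeting any injury criteria" wording above.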
Results:
The incidence of acute kidney injury was 13.4–20.7%, depending on the criteria applied, with a severe injury rate of 2.4%. Patients without a prenatal diagnosis of hypoplastic left heart syndrome had a higher incidence of kidney injury than those prenatally diagnosed (40% versus 14.5%, p = 0.024). Patients with acute kidney injury had a significantly higher incidence of 30-day mortality than those without (27.3% versus 5.6%, p = 0.047).
Discussion:
The incidence of severe acute kidney injury after the Hybrid Stage 1 palliation is low. A prenatal diagnosis may be associated with a lower incidence of kidney injury following the Hybrid Stage 1. Though uncommon, severe acute kidney injury following Hybrid Stage 1 may be associated with higher 30-day mortality.
This article presents a detailed account (provenance, codicology and contents) of Surrey History Centre, Woking, MS LM/1083191/35, a late Restoration manuscript of lyra-viol and keyboard music. Originally from the papers of the More-Molyneux family of Loseley Park, LM/1083191/35 is a source of otherwise unknown music by John Moss and Gerhard Diesineer. Two of the lyra-viol pieces in particular demonstrate that the Woking manuscript dates to at least 1687 or 1688, making it the latest known English source of viol music in tablature. The primary purpose of the manuscript seems to have been didactic. It was copied by a single scribe, who was evidently a musician actively engaged with the popular music and current political events of mid- to late-1680s London. LM/1083191/35 allows us a rare glimpse into the amateur musical world of 1680s London.
Disease surveillance in wildlife populations presents a logistical challenge, yet is critical in gaining a deeper understanding of the presence and impact of wildlife pathogens. Erinaceus coronavirus (EriCoV), a clade C Betacoronavirus, was first described in Western European hedgehogs (Erinaceus europaeus) in Germany. Here, our objective was to determine whether EriCoV is present, and if it is associated with disease, in Great Britain (GB). An EriCoV-specific BRYT-Green® real-time reverse transcription PCR assay was used to test 351 samples of faeces or distal large intestinal tract contents collected from casualty or dead hedgehogs from a wide area across GB. Viral RNA was detected in 38 of 351 samples (10.8%); however, the virus was not detected in any of the 61 samples tested from Scotland. The full genome sequence of the British EriCoV strain was determined using next generation sequencing; it shared 94% identity with a German EriCoV sequence. Multivariate statistical models using hedgehog case history data, faecal specimen descriptions and post-mortem examination findings found no significant associations indicative of disease associated with EriCoV in hedgehogs. These findings indicate that the Western European hedgehog is a reservoir host of EriCoV in the absence of apparent disease.
Radiocarbon accelerator mass spectrometric (AMS) dates on the acid-insoluble fraction from 38 core tops from the western Ross Sea, Antarctica, are used to address these questions: (1) What are the apparent ages of sediments at or close to the present sediment/water interface? (2) Is there a statistically significant pattern to the spatial distribution of core top ages? and (3) Is there a “correction factor” that can be applied to these age determinations to obtain the best possible Holocene (downcore) chronologies? Ages of core top sediments range from 2000 to 21,000 14C yr B.P. Some “old” core top dates are from piston cores and probably represent the loss of sediment during the coring process, but some core top samples >6000 14C yr B.P. may represent little or no Holocene deposition. Four possible sources of variability in dates ≤6000 14C yr B.P. (n = 28) are associated with (1) different sample preparation methods, (2) different sediment recovery systems, (3) different geographic regions, and (4) within-sample lateral age variability. Statistical analysis on an a posteriori design indicates that geographic area is the major cause of variability; there is a difference in mean surface sediment age of nearly 2000 yr between sites in the western Ross Sea and sites east of Ross Bank in south-central Ross Sea. The systematic variability in surface age between areas may be attributed to: (a) variable sediment accumulation rates (SAR) (surface age is inversely related to SAR), (b) differences in the percentage of reworked (dead) carbon between each area, and/or (c) differences in the CO2 exchange between the ocean and the atmosphere.
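The "correction factor" discussed above amounts to subtracting an area-specific surface (core-top) age from raw downcore dates. A minimal sketch of that arithmetic follows; the ~2000-yr offset between the two areas reflects the abstract, but the absolute surface ages and the downcore date are invented for illustration:

```python
# Hypothetical area-specific mean core-top ages (14C yr B.P.). The
# ~2000-yr difference between areas mirrors the abstract; the absolute
# values are illustrative, not the study's measured means.
surface_age = {"western_ross": 5500, "south_central_ross": 3500}

def corrected_age(raw_age, area):
    """Approximate a Holocene chronology by subtracting the local
    core-top (surface sediment) age from a raw downcore 14C date."""
    return raw_age - surface_age[area]

# The same raw date yields different corrected ages by area.
print(corrected_age(9500, "western_ross"))        # 4000
print(corrected_age(9500, "south_central_ross"))  # 6000
```

This simple subtraction assumes the surface-age offset (old reworked carbon, ocean reservoir effects) is constant downcore, which is itself one of the questions the dating program addresses.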
Recent commentary has suggested that performance management (PM) is fundamentally “broken,” with negative feelings from managers and employees toward the process at an all-time high (Pulakos, Hanson, Arad, & Moye, 2015; Pulakos & O'Leary, 2011). In response, some high-profile organizations have decided to eliminate performance ratings altogether as a solution to the growing disenchantment. Adler et al. (2016) offer arguments both in support of and against eliminating performance ratings in organizations. Although both sides of the debate in the focal article make strong arguments for and against utilizing performance ratings in organizations, we believe there continue to be misunderstandings, mischaracterizations, and misinformation with respect to some of the measurement issues in PM. We offer the following commentary not to persuade readers to adopt one particular side over another but as a call to critically reconsider and reevaluate some of the assumptions underlying measurement issues in PM and to dispel some of the pervasive beliefs throughout the performance rating literature.
To aid in preparation of military medic trainers for a possible new curriculum in teaching junctional tourniquet use, the investigators studied the time to control hemorrhage and blood volume lost in order to provide evidence for ease of use.
Hypothesis
Models of junctional tourniquet could perform differentially by blood loss, time to hemostasis, and user preference.
Methods
In a laboratory experiment, 30 users controlled simulated hemorrhage from a manikin (Combat Ready Clamp [CRoC] Trainer) with three iterations each of three junctional tourniquets. There were 270 tests which included hemorrhage control (yes/no), time to hemostasis, and blood volume lost. Users also subjectively ranked tourniquet performance. Models included CRoC, Junctional Emergency Treatment Tool (JETT), and SAM Junctional Tourniquet (SJT). Time to hemostasis and total blood loss were log-transformed and analyzed using a mixed model analysis of variance (ANOVA) with the users represented as random effects and the tourniquet model used as the treatment effect. Preference scores were analyzed with ANOVA, and Tukey’s honestly significant difference (HSD) test was used for all post-hoc pairwise comparisons.
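The log-transform and pairwise-comparison steps described above can be sketched with standard-library Python; the blood-loss numbers below are invented (the real study had 30 users x 3 iterations per model), and full inference would require fitting the mixed model with users as random effects in a stats package such as statsmodels:

```python
import math
from itertools import combinations

# Hypothetical blood-loss volumes (mL) per tourniquet model -- made-up
# data standing in for the study's 30 users x 3 iterations each.
blood_loss = {
    "CRoC": [55, 60, 52, 58],
    "JETT": [110, 95, 120, 101],
    "SJT":  [62, 70, 57, 66],
}

def log_mean(xs):
    """Mean of log-transformed values, mirroring the study's
    log-transformation of blood loss before analysis."""
    return sum(math.log(x) for x in xs) / len(xs)

means = {model: log_mean(xs) for model, xs in blood_loss.items()}

# Pairwise mean differences on the log scale -- the quantities that
# Tukey's HSD would test after the mixed-model fit.
for a, b in combinations(means, 2):
    print(f"{a}-{b}: mean log difference {means[a] - means[b]:+.3f}")
```

On the log scale a mean difference corresponds to a ratio of geometric means, which is why log-transforming right-skewed volume and time data is a common choice before ANOVA.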
Results
All tourniquet uses were 100% effective for hemorrhage control. For blood loss, CRoC and SJT performed best with least blood loss and were significantly better than JETT; in pairwise comparison, CRoC-JETT (P < .0001) and SJT-JETT (P = .0085) were statistically significant in their mean difference, while CRoC-SJT (P = .35) was not. For time to hemostasis in pairwise comparison, the CRoC had a significantly shorter time compared to JETT and SJT (P < .0001, both comparisons); SJT-JETT was also significant (P = .0087). In responding to the directive, “Rank the performance of the models from best to worst,” users did not prefer junctional tourniquet models differently (P > .5, all models).
Conclusion
The CRoC and SJT performed best with the least blood loss, the CRoC performed best with the shortest time to hemostasis, and users did not differ in their preference of model. Models of junctional tourniquet performed differentially by blood loss and time to hemostasis.
Kragh JF Jr, Lunati MP, Kharod CU, Cunningham CW, Bailey JA, Stockinger ZT, Cap AP, Chen J, Aden JK 3rd, Cancio LC. Assessment of Groin Application of Junctional Tourniquets in a Manikin Model. Prehosp Disaster Med. 2016;31(4):358–363.
Stomatopods, or mantis shrimps, are malacostracan crustaceans. Known as “lean, mean, killing machines” (Watling et al., 2000, p. 1), modern stomatopods are obligate carnivores, feeding exclusively on live prey (Schram, 1986). Characteristically, their second thoracic appendages are enlarged to form powerful, raptorial claws. Modern stomatopods are divided into two broad functional groups based on the shape and usage of their raptorial claws: ‘spearing’ and ‘smashing’ stomatopods (Caldwell and Dingle, 1976).
Finch trichomonosis is an emerging infectious disease affecting European passerines caused by a clonal strain of Trichomonas gallinae. Migrating chaffinches (Fringilla coelebs) were proposed as the likely vector of parasite spread from Great Britain to Fennoscandia. To test for such parasite carriage, we screened samples of oesophagus/crop from 275 Apodiform, Passeriform and Piciform birds (40 species) which had no macroscopic evidence of trichomonosis (i.e. necrotic ingluvitis). These birds were found dead following the emergence of trichomonosis in Great Britain, 2009–2012, and were examined post-mortem. Polymerase chain reactions were used to detect (ITS1/5.8S rRNA/ITS2 region and small subunit rRNA gene) and to subtype (Fe-hydrogenase gene) T. gallinae. Trichomonas gallinae was detected in six finches [three chaffinches, two greenfinches (Chloris chloris) and a bullfinch (Pyrrhula pyrrhula)]. Sequence data had 100% identity to the European finch epidemic A1 strain for each species. While these results are consistent with finches being vectors of T. gallinae, alternative explanations include the presence of incubating or resolved T. gallinae infections. The inclusion of histopathological examination would help elucidate the significance of T. gallinae infection in the absence of macroscopic lesions.
The recent digitisation of the 1641 depositions has opened up that large and controversial collection of manuscripts to renewed study. The significance of a substantial section of that archive generated in 1653–4 by the work of the Cromwellian delinquency commissions has hitherto been poorly understood. This article sheds new light on the workings of the commissions and on the ways in which the ‘delinquency depositions’ that they collected helped to shape the implementation of the Cromwellian and Restoration land settlements in Ireland. It also compares the Irish delinquency proceedings to the approach adopted by the Long Parliament in its dealings with royalists in England in the 1640s. In analysing the actual content of the depositions, the article focuses particular attention on County Wexford. The surviving delinquency depositions enable in-depth exploration of many facets of the 1641 rebellion and its aftermath in that region.
Galactic cold clumps have been identified from the Planck data (Planck Collaboration, 2011a, 2011b, 2015) as 10 342 cold (7–19 K) sources that stand out against a warmer environment, with the Early Cold Cores as a subsample of the 915 most reliable detections. There is CO emission associated with the Planck Cold Clumps (PCCs), which has been observed with ground-based radio telescopes at higher resolution (Wu et al. 2012, Liu et al. 2014). A subset of PCCs have also been observed with Herschel at higher resolution (Juvela et al. 2012).
A southern sub-sample of the PCCs has been observed with the Mopra 22-m telescope to study the molecular gas. The Mopra telescope has 3-mm, 7-mm and 12-mm bands, with either a broadband correlator configuration 8 GHz wide with 0.27-MHz channels, or multiple zoom bands 137 MHz wide with 33-kHz channels, within the 8 GHz.
During the 2013 southern winter season we observed 10 clumps. This included observations in the 3-mm band of 12CO, 13CO and C18O and lines around 89 GHz (e.g. HCN, HCO+ and HNC), in the 7-mm band (e.g. CS) and in the 12-mm band (e.g. NH3). These observations were heterogeneous, with sources selected by LST in gaps between observations of other projects, and band chosen by weather (i.e. in conditions unsuitable for higher frequencies, lower frequency bands were observed). During the 2014 season we observed 34 positions in 22 clumps, with zoom mode observations of lines around 89 GHz. This was a better-defined sample of sources.
The mapping of the CO lines shows good spatial correlation of the CO with the dust column density. The CO isotopologues show high optical depth in 12CO and 13CO. The lines of HCN, HCO+ and HNC are weak, but detected in many sources of the 2014 sample. We are modelling the line results to determine column densities, excitation temperatures and abundances, using tools such as RADEX (van der Tak et al. 2007).
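One standard way to quantify the high optical depths mentioned above is to invert the 12CO/13CO brightness-temperature ratio for the 13CO optical depth. The sketch below assumes both lines share the same excitation temperature and beam filling, and takes a 12C/13C abundance ratio of ~60; these are textbook simplifications, not the modelling actually performed with RADEX:

```python
import math

def tau13_from_ratio(T12_over_T13, isotope_ratio=60.0):
    """Solve (1 - exp(-X*tau)) / (1 - exp(-tau)) = R for the 13CO
    optical depth tau by bisection, where R is the observed 12CO/13CO
    brightness ratio and X is an assumed 12C/13C abundance ratio.
    Assumes equal excitation temperature and beam filling for both lines."""
    R = T12_over_T13
    lo, hi = 1e-6, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        ratio = (1 - math.exp(-isotope_ratio * mid)) / (1 - math.exp(-mid))
        # The ratio falls from X (optically thin) toward 1 (both thick)
        # as tau grows, so a ratio above R means tau must increase.
        if ratio > R:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# A brightness ratio of 5 -- far below the abundance ratio of 60 --
# implies 13CO is mildly thick and 12CO is very thick.
print(f"tau(13CO) = {tau13_from_ratio(5.0):.3f}")
```

A non-LTE code like RADEX relaxes these assumptions by solving the statistical equilibrium directly, which is why it is the tool named in the abstract for the actual analysis.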
Studies of the development of organisms can reveal crucial information on homology of structures. Developmental data are not peculiar to living organisms, and they are routinely preserved in the mineralized tissues that comprise the vertebrate skeleton, allowing us to obtain direct insight into the developmental evolution of this most formative of vertebrate innovations. The pattern of developmental processes is recorded in fossils as successive stages inferred from the gross morphology of multiple specimens and, more reliably and routinely, through the ontogenetic stages of development seen in the skeletal histology of individuals. Traditional techniques are destructive and restricted to a 2-D plane with the third dimension inferred. Effective non-invasive methods of visualizing paleohistology to reconstruct developmental stages of the skeleton are necessary.
In a brief survey of paleohistological techniques we discuss the pros and cons of these methods. The use of tomographic methods to reconstruct development of organs is exemplified by the study of the placoderm dentition. Testing evidence for the presence of teeth in placoderms, the first jawed vertebrates, we compare the methods that have been used. These include inferring development from morphology, and using serial sectioning, microCT or synchrotron X-ray tomographic microscopy (SRXTM), to reconstruct growth stages and directions of growth. The ensuing developmental interpretations are biased by the methods and degree of inference. The most direct and reliable method is using SRXTM data to trace sclerochronology. The resulting developmental data can be used to resolve homology and test hypotheses on the origin of evolutionary novelties.
It is a commonplace in historical scholarship that the English understanding of what occurred in Ireland in 1641 helped to shape the severity of the conquest and the land settlement visited on the country after 1649. Among many other concrete connections, we have Oliver Cromwell's own words from 1650: ‘we are come to aske an accompt of the innocent blood that hath been shed, and to endeavour to bring them to an accompt … who, by appearing in arms, seeke to justifie the same’. A decade later, those seeking to defend the recent dispossession of Catholic landowners and their transplantation to Connacht insisted that ‘since above 300,000 Protestants were murdered without provocation… who could blame them for putting those Irish … into such a part of the kingdom, as might most probably confine them from the like wickedness in the future’? There were countless other contemporary instances in which the alleged 1641 massacres were cited as justification for the treatment doled out to Catholics. Yet problems quickly arise if we attempt to discern exactly what information and precisely which publications informed the views and the policies of men such as Cromwell and bodies such as the English parliament.
The explosion of printed material and the tangled complex of personal contacts that linked Protestant Ireland to London certainly offered innumerable channels through which English political and military responses to the Irish rebellion could be influenced. In their efforts to cut a path through this labyrinth, historians have understandably emphasized the contemporary importance of certain well-known publications. For example, Micheál Ó Siochrú and Toby Barnard have quite reasonably linked Cromwell's understanding of the Irish situation to the content of John Temple's infamous History of the Irish Rebellion published in 1646. This essay reassesses the contemporary significance of two further works to which historians have frequently drawn attention. The first, A Remonstrance of Divers Remarkeable Passages concerning the Church and Kingdom of Ireland, was published in London in 1642, while the second, An Abstract of Some Few of those Barbarous Cruell Massacres and Murthers of the Protestants and English in Some Parts of Ireland, appeared a decade later.
Knowledge of evolutionary history is based extensively on relatively rare fossils that preserve soft tissues. These fossils record a much greater proportion of anatomy than would be known solely from mineralized remains and provide key data for testing evolutionary hypotheses in deep time. Ironically, however, exceptionally preserved fossils are often among the most contentious because they are difficult to interpret. This is because their morphology has invariably been affected by the processes of decay and diagenesis, meaning that it is often difficult to distinguish preserved biology from artifacts introduced by these processes. Here we describe how a range of analytical techniques can be used to tease apart mineralization that preserves biological structures from unrelated geological mineralization phases. This approach involves using a series of X-ray, ion, electron and laser beam techniques to characterize the texture and chemistry of the different phases so that they can be differentiated in material that is difficult to interpret. This approach is demonstrated using a case study of its application to the study of fossils from the Ediacaran Doushantuo Biota.
Objective.
To establish a statewide network to detect, control, and prevent the spread of carbapenem-resistant Enterobacteriaceae (CRE) in a region with a low incidence of CRE infection.
Design.
Implementation of the Drug Resistant Organism Prevention and Coordinated Regional Epidemiology (DROP-CRE) Network.
Setting and Participants.
Oregon infection prevention and microbiology laboratory personnel, including 48 microbiology laboratories, 62 acute care facilities, and 140 long-term care facilities.
Methods.
The DROP-CRE working group, comprising representatives from academic institutions and public health, convened an interdisciplinary advisory committee to assist with planning and implementation of CRE epidemiology and control efforts. The working group established a statewide CRE definition and surveillance plan; increased the state laboratory capacity to perform the modified Hodge test and polymerase chain reaction for carbapenemases in real time; and administered surveys that assessed the needs and capabilities of Oregon infection prevention and laboratory personnel. Results of these inquiries informed CRE education and the response plan.
Results.
Of 60 CRE reported from November 2010 through April 2013, only 3 were identified as carbapenemase producers; the cases were not linked, and no secondary transmission was found. Microbiology laboratories, acute care facilities, and long-term care facilities reported lacking carbapenemase testing capability, reliable interfacility communication, and CRE awareness, respectively. Survey findings informed the creation of the Oregon CRE Toolkit, a state-specific CRE guide booklet.
Conclusions.
A regional epidemiology surveillance and response network has been implemented in Oregon in advance of widespread CRE transmission. Prospective surveillance will determine whether this collaborative approach will be successful at forestalling the emergence of this important healthcare-associated pathogen.