The primary objective of this study was to examine the impact of an electronic medical record (EMR)–driven intensive care unit (ICU) antimicrobial stewardship (AMS) service on clinician compliance with face-to-face AMS recommendations. AMS recommendations were defined by an internally developed “5 Moments of Antimicrobial Prescribing” metric: (1) escalation, (2) de-escalation, (3) discontinuation, (4) switch, and (5) optimization. The secondary objectives included measuring the impact of this service on (1) antibiotic appropriateness, and (2) use of high-priority target antimicrobials.
A prospective review was undertaken of the implementation and compliance with a new ICU-AMS service that utilized EMR data coupled with face-to-face recommendations. Additional patient data were collected when an AMS recommendation was made. The impact of the ICU-AMS round on antimicrobial appropriateness was evaluated using point-prevalence survey data.
For the 202 patients, 412 recommendations were made in accordance with the “5 Moments” metric. The most common recommendation made by the ICU-AMS team was moment 3 (discontinuation), which comprised 173 of 412 recommendations (42.0%), with an acceptance rate of 83.8% (145 of 173). Data collected for point-prevalence surveys showed an increase in prescribing appropriateness from 21 of 45 (46.7%) preintervention (October 2016) to 30 of 39 (76.9%) during the study period (September 2017).
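The headline proportions above can be reproduced directly; the Wilson interval helper below is an illustrative addition for readers, not part of the study's reported analysis.

```python
from math import sqrt

def prop_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Acceptance rate for moment 3 (discontinuation): 145 of 173 recommendations
print(f"acceptance: {145/173:.1%}")                   # 83.8%
# Prescribing appropriateness, pre-intervention vs study period
print(f"pre: {21/45:.1%}, during: {30/39:.1%}")       # 46.7%, 76.9%
print(tuple(round(x, 3) for x in prop_ci(30, 39)))    # interval for 30/39
```

With samples this small the Wilson intervals are wide, which is worth keeping in mind when comparing the two point-prevalence surveys.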
The integration of EMR with an ICU-AMS program allowed us to implement a new AMS service, which was associated with high clinician compliance with recommendations and improved antibiotic appropriateness. Our “5 Moments of Antimicrobial Prescribing” metric provides a framework for measuring AMS recommendation compliance.
Currently no national guidelines exist for the management of scabies outbreaks in residential or nursing care homes for the elderly in the United Kingdom. In this setting, diagnosis and treatment of scabies outbreaks are often delayed, and optimal drug treatment, environmental control measures and even outcome measures are unclear. We undertook a systematic review to establish the efficacy of outbreak management interventions and determine evidence-based recommendations. Four electronic databases were searched for relevant studies, which were assessed using a quality assessment tool drawing on STROBE guidelines to describe the quality of observational data. Nineteen outbreak reports were identified, describing both drug treatment and environmental management measures. The quality of data was poor; none reported all outcome measures and only four described symptom relief measures. We were unable to make definitive evidence-based recommendations. We draw on the results to propose a framework for data collection in future observational studies of scabies outbreaks. While high-quality randomised controlled trials are needed to determine optimal drug treatment, evidence on environmental measures will need to be augmented from other bodies of literature. The quality assessment tool we designed is a useful resource for reporting outcome measures, including patient-reported measures, in future outbreaks.
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
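For a closed triangle of baselines (b1 + b2 + b3 = 0), the direct estimator described above reduces to averaging the triple product of the three visibilities over the redundant triangle copies. A minimal numpy sketch, using synthetic visibilities so the numbers are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic complex visibilities for the three sides of a closed
# antenna-tile triangle, with n_tri redundant copies of each side.
n_tri = 50
vis = rng.normal(size=(3, n_tri)) + 1j * rng.normal(size=(3, n_tri))

def direct_bispectrum(v1, v2, v3):
    """Average the triple product V1*V2*V3 over redundant triangles.

    For baselines that close to zero, the triple product of
    visibilities estimates the bispectrum; averaging over redundant
    copies of the triangle beats down the noise.
    """
    return np.mean(v1 * v2 * v3)

B = direct_bispectrum(*vis)
print(B)
```

The gridded estimator works analogously on uv-plane cells rather than physical tile triangles; this sketch covers only the direct, redundancy-based case.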
In lifecourse studies that encompass the adolescent period, the assessment of pubertal status is important, but can be challenging. We aimed to identify current methods for pubertal assessment and assess their appropriateness for population-based research by combining a review of the literature with the views of experts in the field. We searched bibliographic databases, extracted data and assessed study quality to inform a workshop with 21 experts. Acceptability of different approaches was explored with a panel of ten adolescents. We screened 11,935 abstracts, assessed 157 articles and summarised results from 38 articles. Combining these with the opinions of experts, self-assessment was found to be a practical method for use in studies where agreement with the gold standard of clinical assessment by physical examination to within one Tanner stage was acceptable. Serial measures of height and foot size accurately indicated timing of the pubertal growth spurt and age at peak height velocity, and were seen as feasible within longitudinal studies. Hormonal and radiological methods did not offer a practical means of assessing pubertal status. Assessment of voice maturation was promising, but needed validation. Young people thought that self-assessment, foot size and voice assessments were acceptable, and preferred an assessor of the same sex for clinical assessment. This review thus informs researchers working in lifecourse and adolescent health, and identifies future directions for improving the validity of these methods.
The objective of this study was to explore preferred self-care practices among paramedics and emergency medical technicians (EMTs) who responded to the September 11, 2001 terrorist attack (9/11) in New York City (New York USA).
Design, Setting, and Participants:
Qualitative research methodology with convenience and subsequent snowball sampling was utilized. Participants were adult (at least 18 years of age) paramedics or EMTs who self-reported as responding to the 9/11 terrorist attack in New York City.
Main Outcome Measures:
Preferred self-care practices; participant characteristics; indications and patterns of self-care use; perceived benefits and harms; and views on appropriate availability of support and self-care services were the main outcome measures.
The 9/11 paramedic and EMT participants reported a delay in recognizing the need for self-care. Preferred physical self-care practices included exercise, good nutrition, getting enough sleep, and sticking to routine. Preferred psychosocial self-care practices included spending time with family and friends, participating in peer-support programs and online support forums, and routinely seeing a mental health professional. Self-care was important for younger paramedics and EMTs who reported having less-developed supportive infrastructure around them, as well as for retiring paramedics and EMTs who often felt left behind by a system they had dedicated their lives to. Access to cooking classes and subsidized gym memberships were viewed as favorable, as was the ability to include family members in self-care practices.
A range of physical and psychosocial self-care practices should be encouraged among paramedic students and implemented by Australian ambulance services to ensure the health and well-being of paramedics throughout their career and into retirement.
Antenna-pattern measurements obtained from a double-metal supra-terahertz-frequency (supra-THz) quantum cascade laser (QCL) are presented. The QCL is mounted within a mechanically micro-machined waveguide cavity containing dual diagonal feedhorns. Operating in continuous-wave mode at 3.5 THz, and at an ambient temperature of ~60 K, QCL emission has been directed via the feedhorns to a supra-THz detector mounted on a multi-axis linear scanner. Comparison of simulated and measured far-field antenna patterns shows an excellent degree of correlation between beamwidth (full-width-half-maximum) and sidelobe content and a very substantial improvement when compared with unmounted devices. Additionally, a single output has been used to successfully illuminate and demonstrate an optical breadboard arrangement associated with a future supra-THz Earth observation space-borne payload. Our novel device has therefore provided a valuable demonstration of the effectiveness of supra-THz diagonal feedhorns and QCL devices for future space-borne ultra-high-frequency Earth-observing heterodyne radiometers.
Following publication, errors were discovered in the y-axis labels of the electron and hole concentration plots in the following figure panels: figure 4c, figure 4d, figure 5c, figure 5d, figure 6c, figure 6d, figure 8c and figure 8d. The error does not affect the description, analysis or conclusions. The correct representations of the figure panels are shown here.
Introduction: Identification of latent safety threats (LSTs) in the emergency department is an important aspect of quality improvement that can lead to improved patient care. In situ simulation (ISS) takes place in the real clinical environment, and multidisciplinary teams can participate in diverse high-acuity scenarios to identify LSTs. The purpose of this study is to examine the influence that the profession of the participant (i.e., physician, registered nurse, or respiratory therapist) has on the identification of LSTs during ISS. Methods: Six resuscitation-based adult and pediatric simulated scenarios were developed and delivered to multidisciplinary teams in the Kingston General Hospital ED. Each ISS session consisted of a 10-minute scenario, followed by 3 minutes of individual survey completion and a 7-minute group debrief led by ISS facilitators. An objective assessor recorded LSTs identified during each debrief. Surveys were completed prior to debrief to reduce response bias. Data were collected on participant demographics and perceived LSTs, classified in the following categories: medication; equipment; resources and staffing; teamwork and communication; or other. Two reviewers evaluated survey responses and debrief notes to formulate a list of unique LSTs across scenarios and professions. The overall number and type of LSTs from surveys was identified and stratified by health care provider. Results: Thirteen ISS sessions were conducted with a total of 59 participants. Thirty-four unique LSTs (8 medication, 15 equipment, 5 resource, 4 communication, and 2 miscellaneous issues) were identified from surveys and debrief notes. Overall, MDs (n = 12) reported 19 LSTs, RNs (n = 41) reported 77 LSTs, and RTs (n = 6) reported 4 LSTs based on individual survey data. The most commonly identified category of LSTs reported by MDs (36.8%) and RTs (75%) was equipment issues, while RNs most commonly identified medication issues (36.4%).
Participants with ≤5 years of experience in their profession identified, on average, more LSTs in surveys than participants with >5 years of experience (1.9 vs 1.5 LSTs, respectively). Conclusion: Nursing staff identified the highest number of LSTs across all categories. Identification of major LSTs was largely consistent across professions; however, each profession offered unique perspectives on LSTs in survey responses. ISS programs with the purpose of LST identification would benefit from multidisciplinary participation.
The first ultraviolet photochemical oxidation (UVox) extraction method for marine dissolved organic carbon (DOC) as CO2 gas was established by Armstrong and co-workers in 1966. Subsequent refinement of the UVox technique has co-evolved with the need for high-precision isotopic (Δ14C, δ13C) analysis and smaller sample size requirements for accelerator mass spectrometry radiocarbon (AMS 14C) measurements. The UVox line at UC Irvine was established in 2004, and the system's reaction kinetics and efficiency for isolating seawater DOC were rigorously tested for quantitative isolation of ∼1 mg C for AMS 14C measurements. Since then, improvements have been made to sampling, storage, and UVox methods to increase overall efficiency. We discuss our progress, and key UVox system parameters for optimizing precision, accuracy, and efficiency, including (1) ocean to reactor: filtration, storage and preparation of DOC samples, (2) cryogenic trap design, efficiency and quantification of CO2 breakthrough, and (3) use of isotopic standards, blanks and small sample graphitization techniques for the correction of DOC concentrations and Fm values with propagated uncertainties. New DOC UVox systems are in use at many institutions. However, rigorous assessment of quantitative UVox DOC yields and blank contributions, DOC concentrations and carbon isotopic values needs to be made. We highlight the need for a community-wide inter-comparison study.
Childhood adversity is associated with poor mental and physical health outcomes across the life span. Alterations in the hypothalamic–pituitary–adrenal axis are considered a key mechanism underlying these associations, although findings have been mixed. These inconsistencies suggest that other aspects of stress processing may underlie variations in these associations, and that differences in adversity type, sex, and age may be relevant. The current study investigated the relationship between childhood adversity, stress perception, and morning cortisol, and examined whether differences in adversity type (generalized vs. threat and deprivation), sex, and age had distinct effects on these associations. Salivary cortisol samples, daily hassle stress ratings, and retrospective measures of childhood adversity were collected from a large sample of youth at risk for serious mental illness including psychoses (n = 605, mean age = 19.3). Results indicated that childhood adversity was associated with increased stress perception, which subsequently predicted higher morning cortisol levels; however, these associations were specific to threat exposures in females. These findings highlight the role of stress perception in stress vulnerability following childhood adversity and highlight potential sex differences in the impact of threat exposures.
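The mediation pathway reported above (adversity raises stress perception, which in turn predicts morning cortisol) can be sketched with two least-squares regressions. The data below are synthetic and the effect sizes arbitrary, chosen only to illustrate the indirect-effect calculation, not to reproduce the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 605  # matches the reported sample size; data here are synthetic

# Simulate the mediation chain: adversity -> stress perception -> cortisol
adversity = rng.normal(size=n)
stress = 0.4 * adversity + rng.normal(size=n)   # path a (assumed effect)
cortisol = 0.3 * stress + rng.normal(size=n)    # path b; no direct effect

def ols_slope(x, y):
    """Slope from a one-predictor least-squares fit with intercept."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

a = ols_slope(adversity, stress)
# Path b: cortisol regressed on stress, controlling for adversity
X = np.column_stack([np.ones(n), adversity, stress])
b = np.linalg.lstsq(X, cortisol, rcond=None)[0][2]
indirect = a * b  # the mediated (indirect) effect of adversity on cortisol
print(round(a, 2), round(b, 2), round(indirect, 2))
```

In practice the study's moderated associations (by adversity type and sex) would require interaction terms and a proper structural model; this sketch shows only the basic indirect-effect logic.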
Campylobacter is the leading cause of foodborne bacterial gastroenteritis in humans worldwide, often associated with the consumption of undercooked poultry. In Jordan, the majority of broiler chicken production occurs in semi-commercial farms, where poor housing conditions and low biosecurity are likely to promote campylobacter colonisation. While several studies have provided estimates of the key parameters describing the within-flock transmission dynamics of campylobacter in typical high-income country settings, these data are not available for Jordan and the Middle East in general. A Bayesian model framework was applied to a longitudinal dataset on Campylobacter jejuni infection in a Jordanian flock to quantify the transmission rate of C. jejuni in broilers within the farm, the day when the flock first became infected, and the within-flock prevalence (WFP) at clearance. Infection with C. jejuni is most likely to have occurred during the first 8 days of the production cycle, followed by a transmission rate of 0.13 new infections caused by one infected bird per day (95% CI 0.11–0.17), and a WFP at clearance of 34% (95% CI 24–47%). Our results differ from published studies conducted in intensive poultry production systems in high-income countries but are well aligned with the expectations obtained by means of structured questionnaires submitted to academics with expertise on campylobacter in Jordan. This study provides for the first time the most likely estimates and credible intervals of key epidemiological parameters driving the dynamics of C. jejuni infection in broiler production systems commonly found in Jordan and the Middle East, and could be used to inform Quantitative Microbial Risk Assessment models aimed at assessing the risk of human exposure to campylobacter through consumption of poultry meat.
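Using the reported point estimates (β = 0.13 new infections per infected bird per day, introduction around day 8), a deterministic SI sketch shows how within-flock prevalence builds over a production cycle. Flock size, cycle length, and the initial number of infected birds are assumptions here, so the end-of-cycle prevalence is illustrative only.

```python
# Deterministic within-flock SI model using the reported point estimates:
# beta = 0.13 new infections per infected bird per day, infection
# introduced on day 8 of the production cycle. Flock size, cycle length,
# and initial number of infected birds are assumed, not from the study.
N = 1000                  # birds in the flock (assumed)
beta = 0.13               # transmission rate (per infected bird per day)
t_intro, t_end = 8, 42    # day of introduction; end of cycle (assumed)

I = 5.0                   # initially infected birds (assumed)
prevalence = {}
for day in range(t_intro, t_end + 1):
    prevalence[day] = I / N
    S = N - I
    I = min(N, I + beta * I * S / N)  # frequency-dependent SI step

print(f"prevalence at day {t_end}: {prevalence[t_end]:.1%}")
```

With these assumed inputs the epidemic is still on its growth phase at the end of the cycle, which is consistent with the qualitative picture of a WFP well below saturation at clearance; a stochastic model, as used in the Bayesian framework above, would add variability around this trajectory.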