Aging is associated with numerous stressors that negatively impact older adults’ well-being. Resilience improves the ability to cope with stressors and can be enhanced in older adults. Senior housing communities are promising settings for delivering positive psychiatry interventions, owing to rising resident populations and the potential impact of delivering interventions directly in the community. However, few intervention studies have been conducted in these communities. We present a pragmatic stepped-wedge trial of a novel psychological group intervention intended to improve resilience among older adults in senior housing communities.
A pragmatic modified stepped-wedge trial design.
Five senior housing communities in three states in the US.
Eighty-nine adults over age 60 years residing in the independent living sector of senior housing communities.
Raise Your Resilience, a manualized 1-month group intervention that incorporated savoring, gratitude, and engagement in value-based activities, administered by unlicensed residential staff trained by researchers. There was a 1-month control period and a 3-month post-intervention follow-up.
Validated self-report measures of resilience, perceived stress, well-being, and wisdom collected at months 0 (baseline), 1 (pre-intervention), 2 (post-intervention), and 5 (follow-up).
Treatment adherence and satisfaction were high. Compared to the control period, perceived stress and wisdom improved from pre-intervention to post-intervention, while resilience improved from pre-intervention to follow-up. Effect sizes were small in this sample, which had relatively high baseline resilience. Physical and mental well-being did not improve significantly, and no significant moderators of change in resilience were identified.
This study demonstrates feasibility of conducting pragmatic intervention trials in senior housing communities. The intervention resulted in significant improvement in several measures despite ceiling effects. The study included several features that suggest high potential for its implementation and dissemination across similar communities nationally. Future studies are warranted, particularly in samples with lower baseline resilience or in assisted living facilities.
Europe’s roadmap to a low-carbon economy aims to cut greenhouse gas (GHG) emissions 80% below 1990 levels by 2050. Beef production is an important source of GHG emissions and is expected to increase as the world population grows. LIFE BEEF CARBON is a voluntary European initiative that aims to reduce GHG emissions per unit of beef (carbon footprint) by 15% over a 10-year period on 2172 farms in four large beef-producing countries. Changes in farms’ beef carbon footprints are normally estimated via simulation modelling, but the methods current models apply differ. Thus, our initial goal was to develop a common modelling framework to estimate beef farms’ carbon footprints. The framework was developed for a diverse set of Western European farms located in Ireland, Spain, Italy and France. Whole-farm and life cycle assessment (LCA) models were selected to quantify emissions for the different production contexts and harmonized. Carbon Audit was chosen for Ireland, Bovid-CO2 for Spain and CAP’2ER for France and Italy. All models were tested using 20 case study farms, that is, 5 per country, and quantified GHG emissions associated with on-farm live weight gain. The comparison showed that the ranking of beef systems’ gross carbon footprints was consistent across the three models. Suckler-to-weaning or store systems generally had the highest carbon footprint, followed by suckler-to-beef systems and fattening beef systems. When applied to the same farm, Carbon Audit’s footprint estimates were slightly lower than CAP’2ER’s, but marginally higher than Bovid-CO2’s. These differences occurred because the models were adapted to a specific region’s production circumstances, which meant their emission factors for key sources (i.e., methane from enteric fermentation and GHG emissions from concentrates) were less accurate when used outside their target region.
Thus, for the common modelling framework, region-specific LCA models were chosen to estimate beef carbon footprints instead of a single generic model. Additionally, the Carbon Audit and Bovid-CO2 models were updated to include carbon removal by soil and other environmental metrics included in CAP’2ER, for example, acidification. This allows all models to assess the effect carbon mitigation strategies have on other potential pollutants. Several options were identified to reduce beef farms’ carbon footprints, for example, improving genetic merit. These options were assessed for beef systems, and a mitigation plan was created by each nation. The cumulative mitigation effect of the LIFE BEEF CARBON plan was estimated to exceed the project’s reduction target (−15%).
Knowledge of a population’s structure and breed composition can be advantageous for a number of reasons; these include designing optimal (cross)breeding strategies in order to maximise non-additive genetic effects, maintaining flockbook integrity by authenticating animals being registered, and serving as a quality control measure in the genotyping process. The objectives of the present study were to 1) describe the population structure of 24 sheep breeds, 2) quantify the breed composition of both flockbook-recorded and crossbred animals using single nucleotide polymorphism BLUP (SNP-BLUP), and 3) quantify the accuracy of breed composition prediction from low-density genotype panels containing between 2000 and 6000 SNPs. In total, 9334 autosomal SNPs on 11 144 flockbook-recorded animals and 1172 crossbred animals were used. The population structure of all breeds was characterised by principal component analysis (PCA) as well as the pairwise breed fixation index (Fst). The total number of animals, all of which were purebred, included in the calibration population for SNP-BLUP was 2579, with the number of animals per breed ranging from 9 to 500. The remaining 9559 flockbook-recorded animals, composite breeds and crossbred animals represented the test population; three breeds were excluded from breed composition prediction. The breed composition predicted using SNP-BLUP with 9334 SNPs was considered the gold standard prediction. The pairwise breed Fst ranged from 0.040 (between the Irish Blackface and Scottish Blackface) to 0.282 (between the Border Leicester and Suffolk). Principal component analysis revealed that the Suffolk from Ireland and the Suffolk from New Zealand formed distinct, non-overlapping clusters. In contrast, the Texel from Ireland and that from New Zealand formed integrated, overlapping clusters. Composite animals such as the Belclare clustered close to their founder breeds (i.e., Finn, Galway, Lleyn and Texel).
When all 9334 SNPs were used to predict breed composition, an animal with a majority breed proportion predicted to be ≥0.90 was defined as purebred for the present study. As the panel density decreased, the predicted breed proportion threshold used to identify animals as purebred also decreased (from ≥0.85 with 6000 SNPs to ≥0.60 with 2000 SNPs). Overall, results from the study suggest that breed composition for purebred and crossbred animals can be determined with SNP-BLUP using ≥5000 SNPs.
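The pairwise fixation index reported above (e.g. Fst = 0.040 between the Irish and Scottish Blackface) can be illustrated with a short sketch. This is not the study’s estimator, which is not specified beyond “pairwise breed fixation index”; it is Wright’s classic Fst computed from per-breed allele frequencies, on invented frequencies.

```python
# Illustrative sketch (not the study's pipeline): Wright's pairwise Fst
# averaged across SNPs, from per-breed alternate-allele frequencies.
import numpy as np

def pairwise_fst(p1, p2):
    """Wright's Fst between two populations, given allele frequency
    arrays p1 and p2 with one entry per SNP."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    p_bar = (p1 + p2) / 2.0                        # pooled frequency
    h_t = 2.0 * p_bar * (1.0 - p_bar)              # total expected heterozygosity
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2.0  # mean within-population
    keep = h_t > 0                                 # skip monomorphic SNPs
    return float(np.mean((h_t[keep] - h_s[keep]) / h_t[keep]))

# Identical frequencies give Fst = 0; fixed differences give Fst = 1.
print(pairwise_fst([0.5, 0.3], [0.5, 0.3]))  # 0.0
print(pairwise_fst([0.0, 1.0], [1.0, 0.0]))  # 1.0
```

Real genotype pipelines typically use a sample-size-corrected estimator such as Weir and Cockerham’s, which matters when breeds contribute as few as 9 animals.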
In the present study, we aimed to compare anthropometric indicators as predictors of mortality in a community-based setting.
We conducted a population-based longitudinal study nested in a cluster-randomized trial. We assessed weight, height and mid-upper arm circumference (MUAC) on children 12 months after the trial began and used the trial’s annual census and monitoring visits to assess mortality over 2 years.
Children aged 6–60 months during the study.
Of 1023 children included in the study at baseline, height-for-age Z-score, weight-for-age Z-score, weight-for-height Z-score and MUAC classified 777 (76·0 %), 630 (61·6 %), 131 (12·9 %) and 80 (7·8 %) children as moderately to severely malnourished, respectively. Over the 2-year study period, 58 children (5·7 %) died. MUAC had the greatest AUC (0·68, 95 % CI 0·61, 0·75) and had the strongest association with mortality in this sample (hazard ratio = 2·21, 95 % CI 1·26, 3·89, P = 0·006).
MUAC appears to be a better predictor of mortality than other anthropometric indicators in this community-based, high-malnutrition setting in Niger.
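As a hedged illustration of the headline statistic, the AUC of a single continuous predictor such as MUAC can be computed directly from its rank-based (Mann–Whitney) definition; the handful of data points below are invented, not drawn from the Niger cohort.

```python
# Minimal sketch: AUC for a continuous risk score against a binary
# outcome, via the Mann-Whitney rank formulation (ties count 1/2).
def auc(scores, labels):
    """AUC = P(score of a random positive > score of a random negative)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented example: lower MUAC means higher risk, so negate it to get
# a risk score that increases with predicted mortality.
muac_cm = [11.0, 12.5, 13.8, 14.2, 13.0, 11.5]
died    = [1,    0,    0,    0,    1,    0]
print(auc([-m for m in muac_cm], died))  # 0.75
```

The same pairwise-comparison logic underlies library implementations such as scikit-learn’s `roc_auc_score`.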
An Early Intervention in Psychosis (EIP) programme aims to engage patients in early assessment and phase-specific interventions which are the key elements of the Irish National Clinical Programme for psychosis. This study aims to describe and review the EIP programme offered by Cork’s North Lee Mental Health Services over a 5-year period.
A retrospective descriptive study design was adopted to describe and review the EIP programme, patient demographics and treatments offered in the service over a 5-year period.
A total of 139 patients were accepted into the programme over the 5-year period. The mean age of onset was 30 years (median = 28, SD = 9.9), and the mean duration of untreated psychosis was 8 months (median = 2.5, SD = 15.3). Two-thirds of patients were single on initial assessment, had a history of substance misuse and were unemployed. The majority of the cohort engaged with the keyworkers and occupational therapy but did not complete the full psychological or family programmes offered. Hospital admission was required for 12% of the cohort.
Patients experiencing their first episode of psychosis can successfully be treated in the community with appropriate professional and family support. However, deficiencies were noted in physical health monitoring, as well as in the availability and engagement with family and psychological therapies. Properly resourced early interventions in psychosis teams are necessary to deliver services at internationally recognised standards.
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potentially lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13), suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
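For readers unfamiliar with how such odds ratios are read, a minimal sketch follows. It computes a crude OR and 95% Wald confidence interval from a 2×2 table of SES group by exposure; the counts are invented, and the study itself used logistic regression adjusted for demographic factors rather than a crude 2×2 analysis.

```python
# Crude odds ratio with a 95% Wald CI from a 2x2 table (invented counts).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = advantaged & exposed, b = advantaged & unexposed,
    c = disadvantaged & exposed, d = disadvantaged & unexposed."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for a salad-consumption exposure:
or_, lo, hi = odds_ratio_ci(60, 40, 45, 55)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1.0, as in the intervals quoted above, corresponds to statistical significance at the 5% level.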
Introduction: The number of seniors presenting to emergency departments after a fall is increasing. Head injury concerns in this population often lead to a head CT scan. The CT rate among physicians is variable and the reasons for this are unknown. This study examined the role of patient characteristics and country of practice in the decision to order a CT. Methods: This study used a case-based survey of physicians across multiple countries. Each survey included 9 cases pertaining to an 82-year-old man who falls. Each case varied in one aspect compared to a base case (aspirin, warfarin, or rivaroxaban use, occipital hematoma, amnesia, dementia, and fall with no head trauma). For each case, participants indicated how “likely” they were to order a head CT scan, measured on a 100-point scale. A response of 80 or more was defined a priori as ‘likely to order a CT scan’. The survey was piloted among emergency residents for feedback on design and comprehension, and was published in French and English. Recruitment was through the Canadian Association of Emergency Physicians, Twitter and CanadiEM. For each case we compared the proportion of physicians who were ‘likely to scan’ relative to the base case. We also compared the proportion of participants who were ‘likely to scan’ each case in the USA, UK and Australia, relative to Canada. Results: Data were collected from 484 respondents (Canada-308, USA-64, UK-67, Australia-27, and 18 from other countries). Social media distribution limited our ability to estimate the response rate. Physicians were most likely to scan in the anticoagulation cases (90% likely to order a scan compared to 36% for the base case, p < 0.001). Other features associated with increased scans were occipital hematoma (48%), multiple falls (68%), and amnesia (68%) (all p < 0.005). Compared to Canada, US physicians were more likely to order CT scans for all cases (p < 0.05).
Compared to Canada, UK physicians were significantly less likely to order CT for patients in every case except in the patient with amnesia. Finally, Australian physicians differed from Canada only for the occipital hematoma case, where they were significantly more likely to order a CT scan. Conclusion: Anticoagulation, amnesia and a history of multiple falls appear to drive the ordering of a head CT scan in elderly patients who have fallen. We observed variations in practice between countries. Future clinical decision rules will likely have variable impact on head CT scan rates depending on baseline practice variation.
Introduction: Simulation has assumed an integral role in the Canadian healthcare system with applications in quality improvement, systems development, and medical education. High quality simulation-based research (SBR) is required to ensure the effective and efficient use of this tool. This study sought to establish national SBR priorities and describe the barriers and facilitators of SBR in Emergency Medicine (EM) in Canada. Methods: Simulation leads (SLs) from all fourteen Canadian Departments or Divisions of EM associated with an adult FRCP-EM training program were invited to participate in three surveys and a final consensus meeting. The first survey documented active EM SBR projects. Rounds two and three established and ranked priorities for SBR and identified the perceived barriers and facilitators to SBR at each site. Surveys were completed by SLs at each participating institution, and priority research themes were reviewed by senior faculty for broad input and review. Results: Twenty SLs representing all 14 invited institutions participated in all three rounds of the study. 60 active SBR projects were identified, an average of 4.3 per institution (range 0-17). 49 priorities for SBR in Canada were defined and summarized into seven priority research themes. An additional theme was identified by the senior reviewing faculty. 41 barriers and 34 facilitators of SBR were identified and grouped by theme. Fourteen SLs representing 12 institutions attended the consensus meeting and vetted the final list of eight priority research themes for SBR in Canada: simulation in CBME, simulation for interdisciplinary and inter-professional learning, simulation for summative assessment, simulation for continuing professional development, national curricular development, best practices in simulation-based education, simulation-based education outcomes, and simulation as an investigative methodology. 
Conclusion: This study has summarized the current SBR activity in EM in Canada, as well as its perceived barriers and facilitators. We also provide a consensus on priority research themes in SBR in EM from the perspective of Canadian simulation leaders. This group of SLs has formed a national simulation-based research group which aims to address these identified priorities with multicenter collaborative studies.
The commercially available collar device MooMonitor+ was evaluated with regard to its accuracy and application potential for measuring grazing behavior. These automated measurements are crucial because cows’ feed intake behavior at pasture is an important parameter of animal performance, health and welfare, as well as being an indicator of feed availability. Compared to laborious and time-consuming visual observation, the continuous and automated measurement of grazing behavior may support and improve the grazing management of dairy cows on pasture. Therefore, two experiments and a literature analysis were conducted to evaluate the MooMonitor+ under grazing conditions. The first experiment compared the automated measurement of the sensor against visual observation. In a second experiment, the MooMonitor+ was compared to a noseband sensor (RumiWatch), which also allows continuous measurement of grazing behavior. The first experiment, on n = 12 cows, revealed that the automated sensor MooMonitor+ and visual observation were highly correlated, as indicated by a Spearman’s rank correlation coefficient (rs) of 0.94 and a concordance correlation coefficient (CCC) of 0.97 for grazing time. An rs-value of 0.97 and a CCC of 0.98 were observed for rumination time. In a second experiment with n = 12 cows over 24-h periods, a high correlation between the MooMonitor+ and the RumiWatch was observed for grazing time, as indicated by an rs-value of 0.91 and a CCC-value of 0.97. Similarly, a high correlation was observed for rumination time, with an rs-value of 0.96 and a CCC-value of 0.99. While a higher level of agreement between the MooMonitor+ and both visual observation and RumiWatch was observed for rumination time compared to grazing time, the overall results showed a high level of accuracy of the collar device in measuring grazing and rumination times. Therefore, the collar device can be applied to monitor cow behavior at pasture on farms.
With regard to its application potential, the collar device may not only be used on commercial farms but can also be applied to research questions when a data resolution of 15 min is sufficient. Thus, at farm level, the farmer can obtain an accurate and continuous measurement of the grazing behavior of each individual cow and may then use those data for decision-making to optimize animal management.
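Since the evaluation leans on two agreement statistics, here is a minimal sketch of both: Spearman’s rs and Lin’s concordance correlation coefficient (CCC), computed on invented sensor-versus-observer grazing times; the study’s actual data and analysis software are not reproduced here.

```python
# Sketch of the two agreement statistics, on invented data.
import numpy as np

def lins_ccc(x, y):
    """Lin's CCC: like Pearson's r, but penalised for any shift in
    mean or scale between the two measurement methods."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return float(2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2))

def spearman_rs(x, y):
    """Spearman's rs = Pearson correlation of the ranks (no ties here)."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return float(np.corrcoef(rank(x), rank(y))[0, 1])

sensor   = [412, 388, 455, 430, 398, 470]  # grazing min/day, collar device
observed = [405, 395, 450, 428, 401, 462]  # grazing min/day, observer
print(round(lins_ccc(sensor, observed), 2), round(spearman_rs(sensor, observed), 2))
```

Note that rs only checks whether the two methods rank days the same way, while CCC also requires the minute values themselves to agree, which is why both are reported in the evaluation.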
Infant protein intake has been associated with child growth; however, research on maternal protein intake during pregnancy is limited. Insulin-like growth factors (IGF) play a role in early fetal development, and maternal protein intake may influence child body composition via IGF-1. The aim of this study was to investigate the association of maternal protein intake throughout pregnancy with cord blood IGF-1 and child body composition from birth to 5 years of age. Analysis was carried out on 570 mother–child dyads from the Randomised cOntrol trial of LOw glycaemic index diet study. Protein intake was recorded using 3-d food diaries in each trimester of pregnancy, and protein intake per kg of maternal weight (g/d per kg) was calculated. Cord blood IGF-1 was measured at birth. Infant anthropometry was measured at birth, 6 months, 2 and 5 years of age. Mixed modelling, linear regression and mediation analysis were carried out. Birth weight centiles were positively associated with early-pregnancy protein intake (g/d per kg), while weight centiles from 6 months to 5 years were negatively associated (B=−21·6, P<0·05). These associations were not mediated by IGF-1. Our findings suggest that high protein intake in early pregnancy may exert an in utero effect on offspring body composition, with a higher weight initially at birth but slower growth rates into childhood. Further research is needed to elucidate the exact mechanisms by which dietary protein modulates fetal growth.
Important Bird and Biodiversity Areas (IBAs) are sites identified as being globally important for the conservation of bird populations on the basis of an internationally agreed set of criteria. We present the first review of the development and spread of the IBA concept since it was launched by BirdLife International (then ICBP) in 1979 and examine some of the characteristics of the resulting inventory. Over 13,000 global and regional IBAs have so far been identified and documented in terrestrial, freshwater and marine ecosystems in almost all of the world’s countries and territories, making this the largest global network of sites of significance for biodiversity. IBAs have been identified using standardised, data-driven criteria that have been developed and applied at global and regional levels. These criteria capture multiple dimensions of a site’s significance for avian biodiversity and relate to populations of globally threatened species (68.6% of the 10,746 IBAs that meet global criteria), restricted-range species (25.4%), biome-restricted species (27.5%) and congregatory species (50.3%); many global IBAs (52.7%) trigger two or more of these criteria. IBAs range in size from < 1 km2 to over 300,000 km2 and have an approximately log-normal size distribution (median = 125.0 km2, mean = 1,202.6 km2). They cover approximately 6.7% of the terrestrial, 1.6% of the marine and 3.1% of the total surface area of the Earth. The launch in 2016 of the KBA Global Standard, which aims to identify, document and conserve sites that contribute to the global persistence of wider biodiversity, and whose criteria for site identification build on those developed for IBAs, is a logical evolution of the IBA concept. The role of IBAs in conservation planning, policy and practice is reviewed elsewhere. 
Future technical priorities for the IBA initiative include completion of the global inventory, particularly in the marine environment, keeping the dataset up to date, and improving the systematic monitoring of these sites.
Mycobacterium ulcerans is recognised as the third most common mycobacterial infection worldwide. It causes necrotising infections of skin and soft tissue and is classified as a neglected tropical disease by the World Health Organization (WHO). However, despite extensive research, the environmental reservoir of the organism and the mode of transmission of the infection to humans remain unknown. This limits the ability to design and implement public health interventions to effectively and consistently prevent the spread and reduce the incidence of this disease. In recent years, the epidemiology of the disease has changed. In most endemic regions of the world, the number of cases reported to the WHO is falling, with a 64% reduction in cases reported worldwide over the last 9 years. Conversely, in a smaller number of countries including Australia and Nigeria, reported cases are increasing at a rapid rate, new endemic areas continue to appear, and in Australia cases are becoming more severe. The reasons for this changing epidemiology are unknown. We review the epidemiology of M. ulcerans disease worldwide and document recent changes. We also outline and discuss the current state of knowledge on the ecology of M. ulcerans, possible transmission mechanisms to humans and what may be enabling the spread of M. ulcerans into new endemic areas.
We compared sepsis “time zero” and Centers for Medicare and Medicaid Services (CMS) SEP-1 pass rates among 3 abstractors in 3 hospitals. Abstractors agreed on time zero in 29 of 80 (36%) cases. Perceived pass rates ranged from 9 of 80 cases (11%) to 19 of 80 cases (23%). Variability in time zero and perceived pass rates limits the utility of SEP-1 for measuring quality.
To date, Ireland has been a leading light in the provision of youth mental health services. However, notwithstanding the efforts of governmental and non-governmental agencies working in youth mental health, much remains to be done. Barriers to accessing care, as well as discontinuity of care across the spectrum of services, remain key challenges. This editorial provides guidance for the next stage of development in youth mental health care and support, which will require significant national engagement and resource investment.
Current standard-of-care for glioblastoma (GBM) includes surgery, radiation and temozolomide. Most tumors recur within a year from diagnosis, and median survival for recurrent GBM (rGBM) is 3-9 months. Unmethylated promoter status for O6-methylguanine-DNA-methyltransferase (MGMT) is a validated biomarker for temozolomide-resistance, exhibited by most GBM patients. VAL-083 is a DNA-targeting agent with a mechanism-of-action that is independent of MGMT. VAL-083 overcomes temozolomide-resistance in GBM cell-lines, cancer stem cells, and in vivo models. VAL-083 readily crosses the blood-brain barrier and accumulates in brain-tumor tissue. We recently completed a VAL-083 dose-escalation trial in temozolomide- and bevacizumab-refractory rGBM and determined that 40 mg/m2/day given intravenously on days 1, 2, 3 of a 21-day cycle is generally well-tolerated. This dosing regimen was selected for subsequent GBM trials, including an ongoing single-arm, biomarker-driven Phase 2 trial (N=48) in temozolomide-refractory, bevacizumab-naïve, MGMT-unmethylated rGBM (Clinicaltrials.gov:NCT02717962). The primary objective of this study is to determine if VAL-083 improves OS compared to a historical control of 7.15 months for MGMT-unmethylated rGBM patients treated with lomustine (EORTC26101). In addition, another single-arm, biomarker-driven Phase 2 study (N=25) of VAL-083 in combination with radiotherapy in newly diagnosed, MGMT-unmethylated GBM is ongoing (Clinicaltrials.gov:NCT03050736). This trial aims to determine a dose for further study of VAL-083 in combination with radiotherapy and to explore whether VAL-083 improves PFS and OS compared to historical results in newly diagnosed GBM. Enrollment and safety data updates will be provided at the meeting. The results of these studies, if successful, may support VAL-083 as part of a new chemotherapeutic treatment paradigm for GBM.
Introduction: Head injury is a common presentation to all emergency departments. Previous research has shown that such injuries may be complicated by delayed intracranial hemorrhage (D-ICH) after the initial scan is negative. Exposure to anticoagulant or anti-platelet medications (ACAP) may be a risk factor for D-ICH. We conducted a systematic review and meta-analysis to determine the incidence of delayed traumatic intracranial hemorrhage in patients taking anticoagulants, anti-platelets or both. Methods: The literature search was conducted in March 2017 with an update in April 2017. Keyword and MeSH terms were used to search OVID Medline, Embase and the Cochrane database as well as grey literature sources. All cohort and experimental studies were eligible for selection. Inclusion criteria included pre-injury exposure to oral anticoagulant and/or anti-platelet medication and a negative initial CT scan of the brain (CT1). The primary outcome was delayed intracranial hemorrhage present on repeat CT scan (CT2) within 48 hours of the presentation. Only patients who were rescanned or, at a minimum, observed were included. Clinically significant D-ICH were those that required neurosurgery, caused death or necessitated a change in management strategy, such as admission. Results: Fifteen primary studies were ultimately identified, comprising a total of 3801 patients. Of this number, 2111 had a control CT scan. Thirty-nine cases of D-ICH were identified, with the incidence of D-ICH calculated to be 1.31% (95% CI [0.56, 2.27]). No more than 12 of these patients had a clinically significant D-ICH, representing 0.09% (95% CI [0.00, 0.31]). Ten of them were on warfarin and two were on aspirin. There were three deaths recorded and three patients needed neurosurgery. Conclusion: The relatively low incidence suggests that repeat CT should not be mandatory for patients without ICH on the first CT. This is further supported by the negligibly low rate of clinically significant D-ICH.
Evidence-based assessments should be utilised to indicate the appropriate discharge plan, with further research required to guide the balance between clinical observation and repeat CT.
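As a back-of-the-envelope check on figures like “39 of 2111 (1.31%, 95% CI [0.56, 2.27])”, a Wilson score interval for a raw pooled proportion can be computed as below. Note the review used formal meta-analytic pooling across studies, so the crude interval here is only an approximation and differs from the reported one.

```python
# Simplified illustration (not the review's meta-analytic model):
# Wilson score 95% CI for an event proportion.
import math

def wilson_ci(events, n, z=1.96):
    p = events / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

lo, hi = wilson_ci(39, 2111)  # 39 delayed bleeds among 2111 repeat scans
print(f"{100 * 39 / 2111:.2f}% (95% CI {100 * lo:.2f}%-{100 * hi:.2f}%)")
```

The Wilson interval is preferred over the naive Wald interval for small proportions like these because it never extends below zero and has better coverage near the boundary.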
Introduction: Inspired by the Choosing Wisely® campaign, St. Michael’s Hospital (SMH) launched an initiative to reduce unnecessary tests, treatments and procedures that may cause patient harm. Stakeholder engagement identified inappropriate ordering of urine culture & sensitivities (C&S) in the emergency department (ED) as a focus area. Inappropriate urine C&S increase workload, healthcare costs and detection of asymptomatic bacteriuria, which can lead to unnecessary antibiotics. The project’s purposes were to describe the scope of inappropriately ordered urine C&S in the SMH ED and to conduct a root-cause analysis to inform future quality improvement interventions. Methods: Criteria for determining appropriateness were developed a priori using evidence-based guidelines from the University Health Network together with additional literature review. A retrospective chart review was performed on all urine C&S ordered in the ED from Jun 1 to Aug 30, 2016. Each chart was reviewed for order appropriateness, demographic information and ordering provider. All inappropriate urine C&S were reviewed to identify root causes, which were then grouped into common themes. A Pareto chart was constructed to analyze the frequency of causes. Results: Of 425 urine C&S ordered, 75 (17.7%) were inappropriate. The top 3 reasons were: inappropriate urosepsis work-ups (53%), order processing errors (17%) and inappropriate work-ups for weakness (16%). Inappropriate urosepsis work-ups were defined as urine C&S that were ordered empirically despite there being a clear focus of infection elsewhere (e.g. cough, cellulitis) and in the absence of urinary symptoms. Order processing errors were defined as urine C&S which were sent despite there being no documented order. Inappropriate testing was more likely to occur overnight, in females and when a urine routine and microscopy was not ordered prior to C&S. Of the patients with inappropriate C&S, 29% received antibiotics.
Conclusion: 17.7% of urine C&S ordered in the SMH ED during the 3-month study period were inappropriate. The top cause was septic patients who were empirically tested despite having another source for infection identified from the outset. A possible reason for this is the recent ED emphasis on early recognition of sepsis which may encourage early use of antibiotics and empiric urine C&S. One question to resolve is whether a 17.7% overutilization rate is sufficient to make it a target for change. Interventions designed to reduce inappropriate urine C&S may inadvertently increase the number of missed cultures in patients admitted with sepsis not yet diagnosed. Next steps involve discussions between the ED, Internal Medicine, Infectious Disease and Microbiology, and patient partners to identify patient-centered change ideas and sustainable strategies. This may involve establishing guidelines for ordering urine C&S and incorporating lab services to provide oversight into urine C&S processing.
We present a multi-frequency study of the intermediate spiral SAB(r)bc type galaxy NGC 6744, using available data from the Chandra X-Ray telescope, radio continuum data from the Australia Telescope Compact Array and Murchison Widefield Array, and Wide-field Infrared Survey Explorer infrared observations. We identify 117 X-ray sources and 280 radio sources. Of these, we find nine sources in common between the X-ray and radio catalogues, one of which is a faint central black hole with a bolometric radio luminosity similar to that of the Milky Way’s central black hole. We classify 5 objects as supernova remnant (SNR) candidates, 2 objects as likely SNRs, 17 as H II regions and 1 source as an AGN; the remaining 255 radio sources are categorised as background objects, and one X-ray source is classified as a foreground star. We find the star-formation rate (SFR) of NGC 6744 to be in the range 2.8–4.7 M⊙ yr−1, signifying that the galaxy is still actively forming stars. The specific SFR of NGC 6744 is greater than that of late-type spirals such as the Milky Way, but considerably less than that of a typical starburst galaxy.