This study empirically examines the association of preparedness (having a kit, medication, and a disaster plan) with disaster outcomes including perceived recovery, property damage, and use of medical or mental health services.
Using a cross-sectional, retrospective study design, 1114 households in New York City were interviewed 21-34 months following Superstorm Sandy. Bivariate associations were examined and logistic regression models were fit to predict the odds of disaster outcomes given the level of preparedness.
Respondents with an evacuation plan were more likely to report not being recovered (odds ratio [OR] = 2.4; 95% confidence interval [CI]: 1.5-3.8), property damage (OR = 1.4; 95% CI: 1.1-1.9), and use of medical services (OR = 2.3; 95% CI: 1.1-4.5). Respondents reporting a supply of prescription medication were more likely to report using mental health (OR = 3.5; 95% CI: 1.2-9.8) and medical services (OR = 2.3; 95% CI: 1.1-4.8).
Having a kit, plan, and medication did not reduce risk of adverse outcomes in Superstorm Sandy in this sample. Disaster managers should consider the lack of evidence for preparedness when making public education and resource allocation decisions. Additional research is needed to identify preparedness measures that lead to better outcomes for more efficient and effective response and recovery.
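The odds ratios and confidence intervals reported above can be reproduced from a 2x2 exposure-outcome table with the standard Wald formula. A minimal sketch in Python; the counts below are hypothetical, since the abstract reports only the resulting ORs:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Wald odds ratio and 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts chosen only to illustrate the calculation
or_, lo, hi = odds_ratio_ci(60, 140, 40, 220)
print(f"OR = {or_:.2f} (95% CI: {lo:.2f}-{hi:.2f})")  # -> OR = 2.36 (95% CI: 1.50-3.71)
```

The study's regression models additionally adjust for covariates, which this unadjusted 2x2 calculation does not capture.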
To sustainably improve cleaning of high-touch surfaces (HTSs) in acute-care hospitals using a multimodal approach to education, reduction of barriers to cleaning, and culture change for environmental services workers.
The study was conducted in 2 academic acute-care hospitals, 2 community hospitals, and an academic pediatric and women’s hospital.
Frontline environmental services workers.
A 5-module educational program, using principles of adult learning theory, was developed and presented to environmental services workers. Audience response system (ARS), videos, demonstrations, role playing, and graphics were used to illustrate concepts of and the rationale for infection prevention strategies. Topics included hand hygiene, isolation precautions, personal protective equipment (PPE), cleaning protocols, and strategies to overcome barriers. Program evaluation included ARS questions, written evaluations, and objective assessments of occupied patient room cleaning. Changes in hospital-onset C. difficile infection (CDI) and methicillin-resistant S. aureus (MRSA) bacteremia were evaluated.
On average, 357 environmental services workers participated in each module. Most rated the presentations as ‘excellent’ or ‘very good’ (93%) and agreed that they were useful (95%); after the program, they reported being more comfortable donning/doffing PPE (91%) and performing hand hygiene (96%) and better understood the importance of disinfecting HTSs (96%). The frequency of cleaning individual HTSs in occupied rooms increased from 26% to 62% (P < .001) following the intervention. Improvement was sustained 1 year post-intervention (P < .001). A significant decrease in CDI was associated with the program.
A novel program that addressed environmental services workers’ knowledge gaps, challenges, and barriers was well received and appeared to result in learning, behavior change, and sustained improvements in cleaning.
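The pre/post change in cleaning frequency (26% to 62%) can be checked with a pooled two-proportion z test. A sketch with illustrative denominators; the abstract reports percentages only, so the surface counts below are assumptions:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for a difference in proportions, using the pooled SE."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1/n1 + 1/n2))
    return (p2 - p1) / se

# Assumed: 400 high-touch surfaces audited in each period
# (104/400 = 26% cleaned pre, 248/400 = 62% cleaned post)
z = two_proportion_z(104, 400, 248, 400)
print(f"z = {z:.2f}")  # -> z = 10.26
```

A |z| above 3.29 corresponds to a two-sided P < .001, consistent with the result reported.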
Childhood maltreatment (CM) plays an important role in the development of major depressive disorder (MDD). The aim of this study was to examine whether CM severity and type are associated with MDD-related brain alterations, and how they interact with sex and age.
Within the ENIGMA-MDD network, severity and subtypes of CM were assessed using the Childhood Trauma Questionnaire, and structural magnetic resonance imaging data from patients with MDD and healthy controls were analyzed in a mega-analysis comprising a total of 3872 participants aged between 13 and 89 years. Cortical thickness and surface area were extracted at each site using FreeSurfer.
CM severity was associated with reduced cortical thickness in the banks of the superior temporal sulcus and supramarginal gyrus as well as with reduced surface area of the middle temporal lobe. Participants reporting both childhood neglect and abuse had a lower cortical thickness in the inferior parietal lobe, middle temporal lobe, and precuneus compared to participants not exposed to CM. In males only, regardless of diagnosis, CM severity was associated with higher cortical thickness of the rostral anterior cingulate cortex. Finally, a significant interaction between CM and age in predicting thickness was seen across several prefrontal, temporal, and temporo-parietal regions.
Severity and type of CM may impact cortical thickness and surface area. Importantly, CM may influence age-dependent brain maturation, particularly in regions related to the default mode network, perception, and theory of mind.
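A CM x age interaction of the kind reported here means the CM-thickness slope differs with age. A toy illustration (invented numbers, not ENIGMA data, and a deliberate simplification of the study's interaction model) comparing least-squares slopes in younger versus older subgroups:

```python
def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Invented data: CM severity scores (1-5) and mean cortical thickness (mm)
young_cm = [1, 2, 3, 4, 5];  young_th = [2.60, 2.55, 2.49, 2.44, 2.40]
older_cm = [1, 2, 3, 4, 5];  older_th = [2.45, 2.44, 2.45, 2.43, 2.44]

# An interaction shows up as a steeper CM-thickness slope in one age group
print(slope(young_cm, young_th))  # clearly negative
print(slope(older_cm, older_th))  # near zero
```

In the actual mega-analysis the interaction is tested as a CM x age product term in a regression with covariates, not by splitting the sample.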
Low- and middle-income countries (LMICs) bear a disproportionately high burden of sepsis, contributing to an estimated 90% of global sepsis-related deaths. Critical care capabilities needed for septic patients, such as continuous vital sign monitoring, are often unavailable in LMICs.
This study aimed to assess the feasibility and accuracy of using a small wireless, wearable biosensor device linked to a smartphone, and a cloud analytics platform for continuous vital sign monitoring in emergency department (ED) patients with suspected sepsis in Rwanda.
This was a prospective observational study of adult and pediatric patients (aged ≥2 months) with suspected sepsis presenting to the Kigali University Teaching Hospital ED. Biosensor devices were applied to patients’ chest walls and continuously recorded vital signs (including heart rate and respiratory rate) for the duration of their ED course. These vital signs were compared to intermittent, manually collected vital signs measured by a research nurse every 6-8 hours. Pearson’s correlation coefficients were calculated over the study population to determine the correlation between the vital signs obtained from the biosensor device and those collected manually.
In total, 42 patients (20 adults, 22 children) were enrolled. Mean duration of monitoring with the biosensor device was 34.4 hours. Biosensor and manual vital signs were strongly correlated for heart rate (r = 0.87, p < 0.001) and respiratory rate (r = 0.74, p < 0.001). Feasibility issues occurred in 9 of 42 (21%) patients but were minor: the biosensor falling off (4.8%), technical/connectivity problems (7.1%), removal by a physician (2.4%), removal for a procedure (2.4%), and patient/parent desire to remove the device (4.8%).
Wearable biosensor devices can be feasibly implemented and provide accurate continuous vital sign measurements in critically ill pediatric and adult patients with suspected sepsis in a resource-limited setting. Further prospective studies evaluating the impact of biosensor devices on improving clinical outcomes for septic patients are needed.
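The agreement statistic used in this study is Pearson's r between paired biosensor and manual readings. A minimal sketch with invented paired heart-rate values (not the study data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented paired readings: biosensor vs manual heart rate (bpm)
sensor = [88, 102, 95, 120, 76, 110, 99]
manual = [90, 100, 97, 118, 78, 112, 95]
print(f"r = {pearson_r(sensor, manual):.2f}")  # close to 1: strong agreement
```

Note that Pearson's r measures linear association, not absolute agreement; Bland-Altman limits of agreement are a common complement for device-validation work.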
Herbicides registered in vegetable soybean often fail to control waterhemp. The objective of this research was to quantify vegetable soybean tolerance to preemergence herbicides for early-season waterhemp control, including flumioxazin applied alone PRE or in mixture with chlorimuron, metribuzin, or pyroxasulfone at use rates in grain-type soybean. Crop tolerance to the herbicides was tested in field trials with 20 vegetable soybean cultivars and four grain-type cultivars through 4 wk after treatment (WAT). Flumioxazin-based treatments were equally safe, resulting in only minor, transitory crop response (<5% injury 2 WAT) and no effect on crop emergence or early season growth. Flumioxazin mixtures provided greater than 99% control of waterhemp 4 WAT, as evidenced by reduced weed density from 29.7 plants m−2 in the nontreated control to no waterhemp. Flumioxazin applied alone or in tank mixture with chlorimuron, metribuzin, or pyroxasulfone was as safe in vegetable soybean as previously reported in grain-type soybean. Registration of these products in vegetable soybean would provide the industry with additional options for managing waterhemp.
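The control level reported above is simple arithmetic on weed densities relative to the nontreated check, which can be sketched as:

```python
def percent_control(treated_density, untreated_density):
    """Weed control (%) relative to the nontreated check,
    given densities in plants per square metre."""
    return 100 * (1 - treated_density / untreated_density)

# From the abstract: 29.7 plants m^-2 nontreated vs none after flumioxazin mixtures
print(percent_control(0.0, 29.7))   # -> 100.0
```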
Decreases in Fe status have been reported in military women during initial training periods of 8–10 weeks. The present study aimed to characterise Fe status and associations with physical performance in female New Zealand Army recruits during a 16-week basic combat training (BCT) course. Fe status indicators – Hb, serum ferritin (sFer), soluble transferrin receptor (sTfR), transferrin saturation (TS) and erythrocyte distribution width (RDW) – were assessed at the beginning (baseline) and end of BCT in seventy-six volunteers without Fe-deficiency non-anaemia (sFer <12 µg/l; Hb ≥120 g/l) or Fe-deficiency anaemia (sFer <12 µg/l; Hb <120 g/l) at baseline or a C-reactive protein >10 mg/l at baseline or end. A timed 2·4 km run followed by maximum press-ups was performed at baseline and midpoint (week 8) to assess physical performance. Changes in Fe status were investigated using paired t tests and associations between Fe status and physical performance evaluated using Pearson correlation coefficients. sFer (56·6 (sd 33·7) v. 38·4 (sd 23·8) µg/l) and TS (38·8 (sd 13·9) v. 34·4 (sd 11·5) %) decreased (P<0·001 and P=0·014, respectively), while sTfR (1·21 (sd 0·27) v. 1·39 (sd 0·35) mg/l) and RDW (12·8 (sd 0·6) v. 13·2 (sd 0·7) %) increased (P<0·001) from baseline to end. Hb (140·6 (sd 7·5) v. 142·9 (sd 7·9) g/l) increased (P=0·009) during BCT. At end, sTfR was positively (r 0·29, P=0·012) and TS inversely associated (r –0·32, P=0·005) with midpoint run time. There were no significant correlations between Fe status and press-ups. Storage and functional Fe parameters indicated a decline in Fe status in female recruits during BCT. Correlations between tissue-Fe indicators and run times suggest that poorer Fe status was associated with slower runs, consistent with impaired aerobic fitness. Optimal Fe status appears paramount for enabling success in female recruits during military training.
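The baseline-to-end comparisons above use paired t tests on within-subject change. A sketch of the test statistic with toy ferritin values (invented, not the study data):

```python
import math

def paired_t(before, after):
    """Paired t statistic for within-subject change (after - before)."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Toy data: serum ferritin (ug/l) at baseline vs end of training
baseline = [55, 62, 48, 71, 58, 66]
end      = [39, 41, 35, 50, 37, 44]
print(f"t = {paired_t(baseline, end):.2f}")  # -> t = -12.81
```

The p-value then comes from the t distribution with n − 1 degrees of freedom; a strongly negative t, as here, indicates a consistent decline.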
The National Institute for Health and Care Excellence (NICE) invited the manufacturer of olaratumab (Lartruvo®), Eli Lilly & Company Limited, to submit evidence for the clinical and cost effectiveness of this drug, in combination with doxorubicin, for advanced soft tissue sarcoma (STS) not amenable to surgery or radiotherapy, as part of the Institute's Single Technology Appraisal. The Peninsula Technology Assessment Group critically reviewed the submitted evidence.
Clinical effectiveness was derived from an open-label, randomized controlled trial, JGDG. The economic analysis was based on a partitioned survival model with a time horizon of 25 years. The perspective was that of the UK National Health Service (NHS) and Personal Social Services. Costs and benefits were discounted at 3.5 percent per year. The company's evidence was submitted in anticipation that olaratumab would be considered as an alternative to doxorubicin, which has been used as a first-line treatment for advanced STS. To improve the cost effectiveness of olaratumab, the company offered a discount through a Commercial Access Agreement with NHS England.
In the company's submission, the mean base-case and probabilistic incremental cost-effectiveness ratios (ICERs) for olaratumab plus doxorubicin versus doxorubicin alone were GBP 46,076 (USD 61,403) and GBP 47,127 (USD 62,804) per quality-adjusted life-year (QALY) gained, respectively; the probability of this treatment being cost effective at the willingness-to-pay threshold of GBP 50,000 (USD 66,632) per QALY gained, applicable to end-of-life treatments, was 0.54. The respective estimates in our analysis were approximately GBP 60,000 (USD 79,959) per QALY gained, and the probability of cost-effectiveness was 0.21. The increase in the ICERs was primarily due to differences in extrapolation of overall survival, and drug administration costs.
Based on the available evidence, olaratumab in combination with doxorubicin improves the survival of patients with advanced STS. However, this treatment is unlikely to be cost-effective. Olaratumab is now recommended for use within the Cancer Drugs Fund.
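An ICER is incremental cost divided by incremental QALYs, and the probability of cost-effectiveness is the share of probabilistic sensitivity analysis (PSA) draws with positive net monetary benefit at the threshold. A sketch with illustrative figures, not the appraisal's actual inputs:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

def prob_cost_effective(psa_draws, threshold):
    """Share of probabilistic (delta_cost, delta_qaly) draws whose net
    monetary benefit (delta_qaly * threshold - delta_cost) is positive."""
    return sum(dq * threshold - dc > 0 for dc, dq in psa_draws) / len(psa_draws)

# Illustrative deterministic inputs (not the appraisal's figures):
print(icer(146_076, 3.0, 100_000, 2.0))  # -> 46076.0 GBP per QALY gained

# Four hypothetical PSA draws of (incremental cost, incremental QALYs):
draws = [(46_000, 1.00), (52_000, 0.90), (40_000, 0.95), (55_000, 1.30)]
print(prob_cost_effective(draws, threshold=50_000))  # -> 0.75
```

Real appraisals use thousands of PSA draws; the differing survival extrapolations mentioned above change the incremental QALYs, which is why the Assessment Group's ICER exceeded the company's.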
Radiocarbon (14C) dating is widely used to determine the age of organic material in palaeoenvironmental research. Here we compare 14C dates (n=17) resulting from macro-charcoal (>250 μm), short-lived plant macrofossils and pollen-rich residues isolated from two mire environments in eastern Australia. In most samples we found that short-lived plant macrofossils were the youngest organic component, the charcoal samples most often fell in the middle, and the pollen-rich residues consistently returned older dates than the other samples. Although pollen-rich residues have been widely used for 14C dating in Australasia, we suggest some caution in their use: in our fire-prone environments these samples often also contain fine charcoal and other oxidation-resistant organic matter that is older than the surrounding sediment matrix. The macro-charcoal samples also often returned older calibrated ages compared to short-lived plant macrofossils from the same depth, although this difference was relatively small (<245 years). Our results demonstrate that 14C dating of short-lived plant macrofossils is likely to yield more accurate chronologies, and we advocate their routine use in palaeoenvironmental research whenever such macrofossils are available.
Sepsis – a syndrome of infection complicated by organ dysfunction – is responsible for over 750 000 hospitalisations and 200 000 deaths in the USA annually. Despite potential nutritional benefits, the association of diet and sepsis is unknown. Therefore, we sought to determine the association between adherence to a Mediterranean-style diet (Med-style diet) and long-term risk of sepsis in the REasons for Geographic and Racial Differences in Stroke (REGARDS) cohort. We analysed data from REGARDS, a population-based cohort of 30 239 community-dwelling adults aged ≥45 years. We determined dietary patterns from a baseline FFQ. We defined Med-style diet as high consumption of fruit, vegetables, legumes, fish and cereal and low consumption of meat, dairy products, fat and alcohol, categorising participants into Med-style diet tertiles (low: 0–3, moderate: 4–5, high: 6–9). We defined sepsis events as hospital admission for serious infection with at least two systemic inflammatory response syndrome criteria. We used Cox proportional hazards models to determine the association between Med-style diet tertiles and first sepsis events, adjusting for socio-demographics, lifestyle factors and co-morbidities. We included 21 256 participants with complete dietary data. Dietary patterns were: low Med-style diet 32·0 %, moderate Med-style diet 42·1 % and high Med-style diet 26·0 %. There were 1109 (5·2 %) first sepsis events. High Med-style diet was independently associated with lower sepsis risk (low Med-style diet as referent): moderate Med-style diet adjusted hazard ratio (HR) 0·93 (95 % CI 0·81, 1·08); high Med-style diet adjusted HR 0·74 (95 % CI 0·61, 0·88). High Med-style diet adherence is associated with lower risk of sepsis. Dietary modification may potentially provide an option for reducing sepsis risk.
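The tertile categorisation of the 0-9 Med-style diet score described above can be sketched directly:

```python
def med_diet_group(score):
    """Categorise a 0-9 Mediterranean-style diet score into the tertiles
    used in the abstract: low 0-3, moderate 4-5, high 6-9."""
    if not 0 <= score <= 9:
        raise ValueError("score must be between 0 and 9")
    if score <= 3:
        return "low"
    if score <= 5:
        return "moderate"
    return "high"

print([med_diet_group(s) for s in (2, 4, 7)])  # -> ['low', 'moderate', 'high']
```

Each participant's score is the count of favourable dietary components (high fruit, vegetable, legume, fish and cereal intake; low meat, dairy, fat and alcohol intake) met on the baseline FFQ.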
Whether leading a small team or a multinational corporation, within the public or private sector, a thorough understanding of the theory and best practice of leadership is essential. Leadership: Regional and Global Perspectives provides a fresh approach to leading in contemporary business environments. The theory component is complemented by a focus on strategic application. Each chapter features case studies highlighting the practical application of key concepts by organisational leaders in the Australasian region. Case studies at the end of each chapter provide a more nuanced analysis of the theory, while accompanying questions encourage students to think critically. Learning is further supported through the inclusion of learning objectives, key terms, further readings and review questions. An extensive bank of web resources is available to lecturers to support their teaching. Written by an expert team of academics from across Australia, Leadership gives students the tools they need to navigate their leadership journey.