Background: The effectiveness of PPE in preventing self-contamination of healthcare workers (HCWs) and transmission of pathogens (airborne and contact) in the emergency department (ED) is highly dependent on consistent, appropriate use of PPE and on other interactions with it (eg, storing, cleaning). Pre–COVID-19 studies focused primarily on individual HCW contributions to incorrect or suboptimal PPE use. We analyzed ED video recordings using a human-factors engineering framework (ie, the Systems Engineering Initiative for Patient Safety, SEIPS) to identify work-system–level contributions to inappropriate PPE use by HCWs as they provided care in their actual clinical environment. Methods: In total, 47 video sessions (each ~15 minutes) were recorded between June 2020 and May 2021 using GoPro cameras in an 8-bed pod area, designated for persons under investigation (PUI) and confirmed COVID-19–positive patients, in the ED of a large, tertiary-care, academic medical center. These recordings captured a ‘landscape view’: 2 video cameras were set up to capture the entire ED pod area and HCWs as they provided care. A team with expertise in hemorrhagic fevers, infection prevention and control, and emergency medicine reviewed each video together and extracted data using a semistructured form. Results: Guided by the 5 components of the SEIPS work-system model (ie, task, physical environment, person, organization, and tools and technology), multiple work-system failure points influencing HCWs’ appropriate use of PPE were identified. For example, under the task component, HCWs were observed not doffing and donning in the recommended sequence. Under the physical-environment component, inconsistent COVID-19 status signage on patients’ doors and ambiguous labeling of work areas designated as clean (donning) and dirty (doffing) sites acted as barriers to appropriate PPE use.
Conclusions: Human factors–based analysis of video recordings of actual ED work identified a variety of work-system factors that impede appropriate or correct use of PPE by HCWs. Future efforts to improve PPE use should focus on eliminating or mitigating the effects of these work-system factors.
Funding: US CDC
Disclosures: The authors gratefully acknowledge the CDC for funding this work. This material is based upon work supported by the Naval Sea Systems Command (under contract No. N00024-13-D-6400, Task Order NH076). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Naval Sea Systems Command (NAVSEA) or the US CDC.
Background: Candidemia is associated with high morbidity and mortality. Although risk factors for candidemia and other bloodstream infections (BSIs) overlap, little is known about the patient characteristics and outcomes of polymicrobial infections. We used data from the CDC Emerging Infections Program (EIP) candidemia surveillance to describe polymicrobial candidemia infections and to assess clinical differences compared with Candida-only BSIs. Methods: During January–December 2017, active, population-based candidemia surveillance was conducted through the CDC EIP in 45 counties in 9 states, covering ~6% of the US population. A case was defined as a blood culture with Candida spp in a surveillance-area resident; a blood culture >30 days after the initial culture was considered a second case. Demographic and clinical characteristics were abstracted from medical records by trained EIP staff. We examined characteristics of polymicrobial cases, in which Candida and ≥1 non-Candida organism were isolated from a blood specimen on the same day, and compared them with Candida-only cases using logistic regression or t tests in SAS version 9.4. Results: Of the 1,221 candidemia cases identified during 2017, 215 (10.2%) were polymicrobial. Among polymicrobial cases, 50 (23%) involved ≥3 organisms. The most common non-Candida organisms were Staphylococcus epidermidis (n = 30, 14%), Enterococcus faecalis (n = 26, 12%), Enterococcus faecium (n = 17, 8%), and Staphylococcus aureus, Klebsiella pneumoniae, and Stenotrophomonas maltophilia (n = 15 each, 7%). Patients with polymicrobial cases were significantly younger than those with Candida-only cases (54.3 vs 60.7 years; P < .0004). Healthcare exposures commonly associated with candidemia, such as total parenteral nutrition (relative risk [RR], 0.82; 95% CI, 0.60–1.13) and surgery (RR, 0.99; 95% CI, 0.77–1.29), were similar between the 2 groups.
Polymicrobial cases had a shorter median time from admission to positive culture (1 vs 4 days; P < .001), were more commonly associated with injection drug use (RR, 1.95; 95% CI, 1.46–2.61), and were more likely to be community-onset, healthcare-associated (RR, 1.91; 95% CI, 1.50–2.44). Polymicrobial cases were also associated with shorter hospitalization (14 vs 17 days; P = .031), less ICU care (RR, 0.7; 95% CI, 0.51–0.83), and lower mortality (RR, 0.7; 95% CI, 0.50–0.92). Conclusions: One in 10 candidemia cases was polymicrobial, and nearly one-quarter of those involved ≥3 organisms. Lower mortality among polymicrobial cases is surprising but may reflect the younger age and lower severity of infection in this population. Greater injection drug use, central venous catheter use, and long-term care exposures among polymicrobial cases suggest that injection or catheter practices play a role in these infections and may guide prevention opportunities.
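The comparisons above are reported as relative risks with 95% confidence intervals. A minimal sketch of how an RR and its log-normal CI can be computed from a 2×2 table, using entirely hypothetical counts (not the study's data):

```python
import math

def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk with a 95% CI via the log-normal approximation."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    rr = risk_exposed / risk_unexposed
    # Standard error of log(RR) from the four cell counts
    se = math.sqrt(1 / exposed_cases - 1 / exposed_total
                   + 1 / unexposed_cases - 1 / unexposed_total)
    lower = math.exp(math.log(rr) - 1.96 * se)
    upper = math.exp(math.log(rr) + 1.96 * se)
    return rr, lower, upper

# Hypothetical counts for illustration only
rr, lo, hi = relative_risk(40, 215, 96, 1006)
print(f"RR = {rr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```

In practice such estimates would come from the regression models the abstract describes; this shows only the unadjusted calculation.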
A new fossil site in a previously unexplored part of western Madagascar (the Beanka Protected Area) has yielded remains of many recently extinct vertebrates, including giant lemurs (Babakotia radofilai, Palaeopropithecus kelyus, Pachylemur sp., and Archaeolemur edwardsi), carnivores (Cryptoprocta spelea), the aardvark-like Plesiorycteropus sp., and giant ground cuckoos (Coua). Many of these represent considerable range extensions. Extant species that were extirpated from the region (e.g., Prolemur simus) are also present. Calibrated radiocarbon ages for 10 bones from extinct primates span the last three millennia. The largely undisturbed taphonomy of bone deposits supports the interpretation that many specimens fell in from a rock ledge above the entrance. Some primates and other mammals may have been prey items of avian predators, but human predation is also evident. Strontium isotope ratios (87Sr/86Sr) suggest that fossils were local to the area. Pottery sherds and bones of extinct and extant vertebrates with cut and chop marks indicate human activity in previous centuries. Scarcity of charcoal and human artifacts suggests only occasional visitation to the site by humans. The fossil assemblage from this site is unusual in that, while it contains many sloth lemurs, it lacks ratites, hippopotami, and crocodiles typical of nearly all other Holocene subfossil sites on Madagascar.
Objectives: Amphetamine improves vigilance as assessed by continuous performance tests (CPT) in children and adults with attention deficit hyperactivity disorder (ADHD). Less is known, however, regarding amphetamine effects on vigilance in healthy adults. Thus, it remains unclear whether amphetamine produces general enhancement of vigilance or whether these effects are constrained to the remediation of deficits in patients with ADHD. Methods: We tested 69 healthy adults (35 female) on a standardized CPT (Conners’ CPT-II) after receiving 10- or 20-mg d-amphetamine or placebo. To evaluate potential effects on learning, impulsivity, and perseveration, participants were additionally tested on the Iowa Gambling Task (IGT) and the Wisconsin Card Sorting Task (WCST). Results: Participants receiving placebo exhibited the classic vigilance decrement, demonstrated by a significant reduction in attention (d′) across the task. This vigilance decrement was not observed, however, after either dose of amphetamine. Consistent with enhanced vigilance, the 20-mg dose also reduced reaction time variability across the task and the ADHD confidence index. The effects of amphetamine appeared to be selective to vigilance, since no effects were observed on the IGT, the WCST, or response inhibition/perseveration measures from the CPT. Conclusions: The present data support the premise that amphetamine improves vigilance irrespective of disease state. Given that amphetamine is a norepinephrine/dopamine transporter inhibitor and releaser, these effects are informative regarding the neurobiological substrates of attentional control. (JINS, 2018, 24, 283–293)
Skin is the parchment upon which identity is written; class, race, ethnicity, and gender are all legible upon the human surface. Removing skin tears away identity, and leaves a blank slate upon which law, punishment, sanctity, or monstrosity can be inscribed; whether as an act of penal brutality, as a comic device, or as a sign of spiritual sacrifice, it leaves a lasting impression about the qualities and nature of humanity. Flaying often functioned as an imaginative resource for medieval and early modern artists and writers, even though it seems to have been rarely practiced in reality. From images of Saint Bartholomew holding his skin in his arms, to scenes of execution in Havelok the Dane, to laws that prescribed it as a punishment for treason, this volume explores the idea and the reality of skin removal - flaying - in the Middle Ages. It interrogates the connection between reality and imagination in depictions of literal skin removal, rather than figurative or theoretical interpretations of flaying, and offers a multilayered view of medieval and early modern perceptions of flaying and its representations in European culture. Its two parts consider practice and representation, capturing the evolution of flaying as both an idea and a practice in the premodern world.
Larissa Tracy is Associate Professor, Longwood University.
Contributors: Frederika Bain, Peter Dent, Kelly DeVries, Valerie Gramling, Perry Neil Harrison, Jack Hartnell, Emily Leverett, Michael Livingston, Sherry C.M. Lindquist, Asa Mittman, Mary Rambaran-Olm, William Sayers, Christina Sciacca, Susan Small, Larissa Tracy, Renée Ward
The Proterozoic carbonate stromatolites of the Pahrump Group from the Crystal Spring formation exhibit interesting layering patterns. In continuous vertical formations, there are sections of chevron-shaped stromatolites alternating with sections of simple horizontal layering. This apparent cycle of stromatolite formation and lack of formation repeats several times over a vertical distance of at least 30 m at the locality investigated. Small representative samples from each layer were taken and analysed using X-ray diffraction (XRD), X-ray fluorescence (XRF), environmental scanning electron microscopy – energy dispersive X-ray spectrometry, and were optically analysed in thin section. Optical and spectroscopic analyses of stromatolite and of non-stromatolite samples were undertaken with the objective of determining the differences between them. Elemental analysis of samples from within each of the four stromatolite layers and the four intervening layers shows that the two types of layers are chemically and mineralogically distinct. In the layers that contain stromatolites the Ca/Si ratio is high; in layers without stromatolites the Ca/Si ratio is low. In the high Si layers, both K and Al are positively correlated with the presence and levels of Si. This, together with XRD analysis, suggested a high K-feldspar (microcline) content in the non-stromatolitic layers. This variation between these two types of rocks could be due to changes in biological growth rates in an otherwise uniform environment or variations in detrital influx and the resultant impact on biology. The current analysis does not allow us to choose between these two alternatives. A Mars rover would have adequate resolution to image these structures and instrumentation capable of conducting a similar elemental analysis.
Long-acting injectable formulations of antipsychotics are treatment alternatives to oral agents.
Aims
To assess the efficacy of aripiprazole once-monthly compared with oral aripiprazole for maintenance treatment of schizophrenia.
Method
A 38-week, double-blind, active-controlled, non-inferiority study; randomisation (2:2:1) to aripiprazole once-monthly 400 mg, oral aripiprazole (10–30 mg/day) or aripiprazole once-monthly 50 mg (a dose below the therapeutic threshold, included for assay sensitivity). (Trial registration: clinicaltrials.gov, NCT00706654.)
Results
A total of 1118 patients were screened, and 662 responders to oral aripiprazole were randomised. Kaplan–Meier estimated impending relapse rates at week 26 were 7.12% for aripiprazole once-monthly 400 mg and 7.76% for oral aripiprazole. This difference (−0.64%, 95% CI −5.26 to 3.99) excluded the predefined non-inferiority margin of 11.5%. Both treatments were superior to aripiprazole once-monthly 50 mg (21.80%, P⩽0.001).
Conclusions
Aripiprazole once-monthly 400mg was non-inferior to oral aripiprazole, and the reduction in Kaplan–Meier estimated impending relapse rate at week 26 was statistically significant v. aripiprazole once-monthly 50 mg.
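The non-inferiority conclusion reduces to a simple decision rule: the upper bound of the confidence interval around the (test − reference) relapse-rate difference must fall below the predefined margin. A sketch of that check using the figures reported above (the CI bound is taken as given, not recomputed):

```python
def non_inferior(rate_test, rate_ref, ci_upper, margin):
    """Non-inferiority holds when the upper CI bound of the
    (test - reference) rate difference lies below the margin."""
    difference = rate_test - rate_ref
    return difference, ci_upper < margin

# Figures from the abstract: 7.12% vs 7.76%, CI upper bound 3.99%, margin 11.5%
diff, ok = non_inferior(7.12, 7.76, 3.99, 11.5)
print(f"difference = {diff:.2f}%, non-inferior: {ok}")
```

Because 3.99% < 11.5%, the once-monthly 400 mg arm meets the prespecified criterion even though the point estimate slightly favours it.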
Individuals with life-threatening illness often engage in some form of spirituality to meet increased needs for meaning and purpose. This study aimed to identify the role of spirituality in persons who had reported positive, life-transforming change in relation to life-threatening cancer or cardiac events, and to connect these roles to palliative and supportive care.
Method:
A purposive sample of 10 cardiac survivors and 9 cancer survivors was recruited. Once the participants had given informed consent and passed screening in relation to life-transforming change and distress, they engaged in a semistructured one-hour qualitative interview on the theme of how their life-transforming change occurred in the context of their life-threatening illness. In the present article, our phenomenological analysis focuses on participants' references to purpose and meaning in their lives, with particular attention to the role and context of participants' spirituality.
Results:
Participants mentioned spirituality, meaning, and purpose in many contexts, including connecting with family and friends, nature, art, music, and sometimes creating a relationship with God. Participants often accessed spirituality by enhancing connections in their own lives: with a higher power, people, their work, or themselves. These enhanced connections gave participants greater meaning and purpose in their lives, and substantially helped participants to adjust to their life-threatening illnesses.
Significance of results:
Understanding the roles and contexts of spirituality among patients with a life-threatening illness allows us to develop better palliative and supportive care plans. Spiritually oriented supportive care may include support groups, yoga, meditation, nature, music, prayer, or referral to spiritual or religious counselors. A quantitative scale is needed to help healthcare clinicians assess the spiritual and coping needs of individuals with life-threatening illness.
Sensorimotor inhibition, or the ability to filter out excessive or irrelevant information, theoretically supports a variety of higher-level cognitive functions. Impaired inhibition may be associated with increased impulsive and risky behavior in everyday life. Individuals infected with HIV frequently show impairment on tests of neurocognitive function, but sensorimotor inhibition in this population has not been studied and may be a contributor to the profile of HIV-associated neurocognitive disorders (HAND). Thirty-seven HIV-infected individuals (15 with HAND) and 48 non-infected comparison subjects were assessed for prepulse inhibition (PPI), an eyeblink startle paradigm measuring sensorimotor gating. Although HIV status alone was not associated with PPI deficits, HIV-positive participants meeting criteria for HAND showed impaired PPI compared to cognitively intact HIV-positive subjects. In HIV-positive subjects, PPI was correlated with working memory but was not associated with antiretroviral therapy or illness factors. In conclusion, sensorimotor disinhibition in HIV accompanies deficits in higher-order cognitive functions, although the causal direction of this relationship requires investigation. Subsequent research on the role of sensorimotor gating on decision-making and risk behaviors in HIV may be indicated. (JINS, 2013, 19, 1–9)
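Prepulse inhibition in eyeblink-startle paradigms is conventionally quantified as the percent reduction in startle magnitude when a weak prepulse precedes the startling pulse. A minimal sketch of that standard calculation (generic formula with hypothetical magnitudes, not this study's processing pipeline):

```python
def percent_ppi(pulse_alone, prepulse_pulse):
    """Percent prepulse inhibition: relative reduction in startle
    magnitude on prepulse+pulse trials vs pulse-alone trials."""
    return 100.0 * (1.0 - prepulse_pulse / pulse_alone)

# Hypothetical mean eyeblink EMG magnitudes (arbitrary units)
print(percent_ppi(200.0, 120.0))  # 40% inhibition
```

Lower percent-PPI values indicate weaker sensorimotor gating, which is the direction of impairment reported for the HAND subgroup above.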