An impairment in recognizing distress is implicated in the development and severity of antisocial behavior. It has been hypothesized that a lack of attention to the eyes plays a role, but supporting evidence is limited. We developed a computerized training to improve emotion recognition in children and examined the role of eye gaze before and after training. Children referred into an intervention program to prevent antisocial outcomes completed an emotion recognition task with concurrent eye tracking. Those with emotion recognition impairments (n = 54, mean age: 8.72 years, 78% male) completed the training, while others (n = 38, mean age: 8.95 years, 84% male) continued with their usual interventions. Emotion recognition and eye gaze were reassessed in all children 8 weeks later. Impaired negative emotion recognition was significantly related to the severity of behavioral problems at pretest. Children who completed the training improved significantly in emotion recognition; eye gaze contributed to neither the impairment nor the improvement. This study confirms the role of emotion recognition in the severity of disruptive behavior and shows that a targeted intervention can quickly remediate emotion recognition impairments. The training appears to work by improving children's ability to appraise emotional stimuli rather than by influencing their visual attention.
Objective:
To develop a pediatric research agenda focused on pediatric healthcare-associated infections and antimicrobial stewardship topics that will yield the highest impact on child health.
Participants:
The study included 26 geographically diverse adult and pediatric infectious diseases clinicians with expertise in healthcare-associated infection prevention and/or antimicrobial stewardship (topic identification and ranking of priorities), as well as members of the Division of Healthcare Quality and Promotion at the Centers for Disease Control and Prevention (topic identification).
Methods:
Using a modified Delphi approach, expert recommendations were generated through an iterative process for identifying pediatric research priorities in healthcare-associated infection prevention and antimicrobial stewardship. The multistep, 7-month process included a literature review, interactive teleconferences, web-based surveys, and 2 in-person meetings.
Results:
A final list of 12 high-priority research topics was generated across the 2 domains. High-priority healthcare-associated infection topics included judicious testing for Clostridioides difficile infection, chlorhexidine (CHG) bathing, measuring and preventing hospital-onset bloodstream infection rates, surgical site infection prevention, and surveillance and prevention of multidrug-resistant gram-negative rod infections. Antimicrobial stewardship topics included β-lactam allergy de-labeling, judicious use of perioperative antibiotics, intravenous-to-oral conversion of antimicrobial therapy, developing a patient-level “harm index” for antibiotic exposure, and benchmarking and/or peer comparison of antibiotic use for common inpatient conditions.
Conclusions:
We identified 6 healthcare-associated infection topics and 6 antimicrobial stewardship topics as potentially high-impact targets for pediatric research.
Background: Antimicrobial prophylaxis is one of the strongest surgical site infection (SSI) prevention measures. Current guidelines recommend the use of cefazolin as antimicrobial prophylaxis for abdominal hysterectomy procedures. However, there is growing evidence that anaerobes play a role in abdominal hysterectomy SSIs. We assessed the impact of adding anaerobic coverage on abdominal hysterectomy SSI rates in our institution. Methods: The University of Iowa Hospitals & Clinics is an 811-bed academic medical center that serves as a referral center for Iowa and neighboring states. Each year, ~33,000 major surgical operations are performed here, and on average, 600 are abdominal hysterectomies. Historically, patients received cefazolin only, but beginning November 2017, patients undergoing abdominal hysterectomy received cefazolin + metronidazole for antimicrobial prophylaxis. Order sets within the electronic medical record were modified, and education was provided to surgeons, anesthesiologists, and other ordering providers. Procedures and subsequent SSIs were monitored and reported using National Healthcare Safety Network (NHSN) definitions. Infection rates were calculated using all depths (superficial, deep, and organ space) and deep and organ space only, as the latter is how they are publicly reported. We used numerator (SSIs) and denominator (number of abdominal hysterectomy procedures) data from the NHSN from January 2015 through September 2019. We performed an interrupted time-series analysis to determine how the addition of metronidazole was associated with abdominal hysterectomy SSIs (all depths, and deep and organ space). Results: From January 2015 through October 2017, the hysterectomy SSI rates were 3.2% (all depths) and 1.5% (deep and organ space). After antimicrobial prophylaxis was adjusted in November 2017, the rates decreased to 1.6% (all depths) and 0.6% (deep and organ space). Of the SSIs with pathogens identified, the proportion of anaerobes decreased from 59% to 25% among all depths and from 82% to 50% among deep and organ-space SSIs. The decline in SSI rates after the intervention was statistically significant (P = .01) for deep and organ-space infections but not for all depths (P = .73). Conclusions: The addition of anaerobic coverage with metronidazole was associated with a decrease in deep and organ-space abdominal hysterectomy SSI rates at our institution. Hospitals should assess the microbiology of abdominal hysterectomy SSIs and should consider adding metronidazole to their antimicrobial prophylaxis.
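The abstract reports an interrupted time-series analysis but does not include the analysis itself; a minimal segmented-regression sketch in Python, using placeholder monthly rates rather than study data, might look like the following.

```python
# Interrupted time-series (segmented regression) sketch for monthly SSI rates.
# The rates below are random placeholders, not data from the study.
import numpy as np
import statsmodels.api as sm

months = np.arange(24)                    # months 0-11 pre, 12-23 post
post = (months >= 12).astype(int)         # 1 after metronidazole was added
time_since = np.where(post == 1, months - 12, 0)

ssi_rate = np.random.default_rng(0).uniform(0.5, 3.5, size=24)  # placeholder %

X = sm.add_constant(np.column_stack([months, post, time_since]))
fit = sm.OLS(ssi_rate, X).fit()
# params: [intercept, baseline trend, level change at intervention, slope change]
print(fit.params, fit.pvalues)
```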
Background: Including infection preventionists (IPs) in hospital design, construction, and renovation projects is important. According to the Joint Commission, “Infection control oversights during building design or renovations commonly result in regulatory problems, millions lost and even patient deaths.” We evaluated the number of active major construction projects at our 800-bed hospital with 6.0 IP FTEs and the IP time required for oversight. Methods: We reviewed construction records from October 2018 through October 2019. We classified projects as active if any construction occurred during the study period. We describe the types of projects (inpatient, outpatient, and non–patient care) and their potential impact on patient health as classified by infection control risk assessments (ICRAs). ICRAs were classified as class I (non–patient-care area and minimal construction activity), class II (patients are not likely to be in the area and work is small scale), class III (patient care area and work requires demolition that generates dust), and class IV (any area requiring environmental precautions). We calculated the time spent visiting construction sites and in design meetings. Results: During October 2018–October 2019, there were 51 active construction projects, with an average of 15 active sites per week. These sites spanned a wide range of projects, from a new bone marrow transplant unit, a labor and delivery expansion and renovation, and a space conversion to an inpatient unit, to multiple air-handler replacements. All 51 projects were classified as class III or class IV. We visited, on average, 4 construction sites each week for 30 minutes per site, leaving 11 sites unobserved due to time constraints. We spent an average of 120 minutes weekly, but 450 minutes would have been required to observe all 15 sites. Observing all active construction sites once weekly would have required 390 hours per year. In addition to the observational hours, 124 hours were spent in design meetings alone, not counting the preparation and follow-up these meetings require. Conclusions: In a large academic medical center, IPs had time available to visit only about a quarter of active projects on an ongoing basis. Increasing dedicated IP time in construction projects is essential to mitigating infection control risks in large hospitals.
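The staffing arithmetic in the Results follows directly from the reported figures; the short check below reproduces it (450 minutes per week, 390 hours per year, roughly a quarter of sites observed).

```python
# Reproduces the oversight arithmetic reported in the abstract.
sites_per_week = 15
minutes_per_site = 30
sites_visited_per_week = 4

needed_weekly_min = sites_per_week * minutes_per_site          # 450 minutes
actual_weekly_min = sites_visited_per_week * minutes_per_site  # 120 minutes
yearly_hours_needed = needed_weekly_min * 52 / 60              # 390 hours

print(needed_weekly_min, actual_weekly_min, yearly_hours_needed)
print(f"Share of sites observed: {sites_visited_per_week / sites_per_week:.0%}")
```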
Background: Manual cleaning is the recommended method of environmental disinfection; it plays a key role in the prevention of healthcare-associated infections. Recently, automated no-touch disinfection technologies, such as ultraviolet (UV) light, have been proposed as a supplement to manual cleaning. However, UV light adds time to the cleaning process and may decrease the quality of manual cleaning. We evaluated the impact of adding UV light on the quality of manual cleaning and on room turnover times. Methods: During January–September 2019, we assessed the thoroughness of disinfection cleaning (TDC) of environmental surfaces in rooms identified for discharge. According to hospital policy, contact precautions rooms receive UV light after manual cleaning with an EPA-approved sporicidal agent (bleach). Noncontact precautions rooms are disinfected using quaternary ammonium only. Rooms were identified after patient admission, selected randomly, and marked once discharge orders were placed. Fluorescent markers were applied to high-touch surfaces before discharge and were assessed after the cleaning process was completed. TDC scores were defined as the percentage of cleaned surfaces out of the total surfaces examined. UV-light disinfection time is determined automatically based on room size. We compared TDC scores and manual cleaning times between contact precautions rooms and noncontact precautions rooms. We also calculated UV-light cycle durations. Results: We assessed 2,383 surfaces across 24 contact precautions rooms (with UV-light disinfection) and 201 noncontact precautions rooms (without UV-light disinfection). The TDC score was similar in contact precautions rooms (243 of 273 surfaces) and noncontact precautions rooms (1,835 of 2,110 surfaces; 89% vs 87%). The median manual cleaning time for contact precautions rooms was 56 minutes (IQR, 37–79), and for noncontact precautions rooms it was 33 minutes (IQR, 22–43). UV-light use added a median of 49 minutes (IQR, 35–67) to the overall cleaning process. The median turnover time for contact precautions rooms was 156 minutes (IQR, 87–216) versus 58 minutes (IQR, 40–86) in noncontact precautions rooms. Conclusions: In a setting with an objective assessment of environmental cleaning, there was no difference in the quality of manual cleaning between contact precautions rooms (UV light) and noncontact precautions rooms (no UV light). Adding UV light after manual disinfection increased the overall cleaning time and delayed room availability.
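The TDC metric is a simple proportion; the snippet below computes it from the surface counts reported above.

```python
# TDC score: cleaned surfaces as a percentage of surfaces examined,
# computed from the counts reported in the abstract.
def tdc_score(cleaned: int, examined: int) -> float:
    return 100 * cleaned / examined

print(f"Contact precautions (UV): {tdc_score(243, 273):.0f}%")           # 89%
print(f"Noncontact precautions (no UV): {tdc_score(1835, 2110):.0f}%")   # 87%
```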
Background: Inappropriate antibiotic prescription leads to increased Clostridioides difficile infections, adverse effects including organ toxicity, and generation of antibiotic-resistant bacteria. Despite efforts to improve antibiotic use in acute-care settings, unnecessary and inappropriate prescriptions still occur in 30%–50% of patients. Objectives: We assessed factors associated with inappropriate antibiotic prescription at 2 time points: (1) initial, empiric therapy and (2) 3–5 days after therapy initiation. Methods: As part of a multicenter study investigating strategies to reduce antibiotic therapy after 3–5 days of use, antibiotic prescription data were collected from 11 adult and pediatric intensive care and general medical units at 6 hospitals in Maryland in 2014 and 2015. We performed a retrospective cohort study of all hospitalized patients who received any of 23 common antibiotics for at least 3 days. Each medical record was reviewed for demographics, admission and discharge dates, patient comorbidities, and antibiotic regimen by at least 1 infectious disease physician or pharmacist. Classification of antibiotic inappropriateness was based on each institution’s guidelines and standards. Bivariate analyses were performed using logistic regression for both initial therapy and therapy at days 3–5. Multivariable logistic regression was performed using covariates meeting the significance level of P < .05. Results: In total, 3,436 antibiotic courses were assessed at the time of initial therapy, and 1,541 regimens were continued and reviewed again at days 3–5 of therapy. For initial therapy, 1,255 regimens (37%) were inappropriate; 45% of these were considered unnecessary and 41% were too broad in spectrum. In the multivariable regression, older age and antibiotic prescription during the summer were associated with the receipt of inappropriate antibiotics (Table 1). Having end-stage renal disease as a comorbid condition was protective against inappropriate use. At days 3–5 of therapy, 688 (45%) of the antibiotic courses were inappropriate. Reasons regimens were considered inappropriate included unnecessary antibiotic prescriptions (49%) and antibiotics being too broad in spectrum (38%). Older age and receiving cefepime or piperacillin-tazobactam on day 3 of therapy were factors associated with inappropriate use (Table 2). Having undergone a transplant or a surgical procedure was protective against inappropriate antimicrobial use at days 3–5 of therapy. Conclusions: Older patients are more likely to receive inappropriate antibiotics both at the initial regimen and 3–5 days later. Patients receiving cefepime or piperacillin-tazobactam are at greater risk of receiving inappropriate antibiotics at days 3–5 due to failure to de-escalate. Antibiotic stewardship strategies targeting these patient populations may limit inappropriate use.
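The abstract describes a bivariate-then-multivariable logistic regression workflow; a minimal sketch of that pattern in Python is below. The variable names and simulated data are illustrative only, not the study's covariates or results.

```python
# Sketch of a multivariable logistic regression for inappropriate prescribing.
# Outcome and covariates are simulated placeholders, not study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "inappropriate": rng.integers(0, 2, 500),  # 1 = inappropriate course
    "age": rng.normal(60, 15, 500),
    "summer": rng.integers(0, 2, 500),         # prescribed June-August
    "esrd": rng.integers(0, 2, 500),           # end-stage renal disease
})

X = sm.add_constant(df[["age", "summer", "esrd"]])
fit = sm.Logit(df["inappropriate"], X).fit(disp=False)
print(np.exp(fit.params))  # odds ratios; >1 suggests higher odds of misuse
```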
Background: Central-line–associated bloodstream infection (CLABSI) rates have steadily decreased as evidence-based prevention bundles have been implemented. Bone marrow transplant (BMT) patients are at increased risk for CLABSI due to immunosuppression, prolonged central-line utilization, and frequent central-line accesses. We assessed the impact of an enhanced prevention bundle on BMT nonmucosal barrier injury CLABSI rates. Methods: The University of Iowa Hospitals & Clinics is an 811-bed academic medical center that houses the only BMT program in Iowa. During October 2018, we added 3 interventions to the ongoing CLABSI prevention bundle in our BMT inpatient unit: (1) a standardized 2-person dressing change team, (2) enhanced-quality daily chlorhexidine treatments, and (3) staff and patient line-care stewardship. The bundle included training of nurse champions to execute a team approach to changing central-line dressings. A standard process description and supplies are contained in a cart. In addition, 2 sets of sterile hands and a second person to monitor for breaches in sterile procedure are available. Site disinfection with chlorhexidine scrub and dry time are monitored. Training on quality chlorhexidine bathing includes evaluation of the preferred product, application per the product instructions for use, and protection of the central-line site with a waterproof shoulder-length glove. In addition to routine BMT education, staff and patients are instructed on device stewardship during dressing changes. CLABSIs are monitored using NHSN definitions. We performed an interrupted time-series analysis to determine the impact of our enhanced prevention bundle on CLABSI rates in the BMT unit. We used monthly CLABSI rates from January 2017 until the intervention (October 2018) as the baseline. Because the BMT unit changed locations in December 2018, we included both time points in our analysis. As a sensitivity analysis, we assessed the impact of the enhanced prevention bundle in a hematology-oncology unit (March 2019) that did not change locations. Results: During the period preceding bundle implementation, the CLABSI rate was 2.2 per 1,000 central-line days. After the intervention, the rate decreased to 0.6 CLABSI per 1,000 central-line days (P = .03). The move in unit location did not have a significant impact on CLABSI rates (P = .85). CLABSI rates also decreased from 1.6 per 1,000 central-line days to 0 per 1,000 central-line days (P < .01) in the hematology-oncology unit. Conclusions: An enhanced CLABSI prevention bundle was associated with significant decreases in CLABSI rates in 2 high-risk units. Novel infection prevention bundle elements should be considered for special populations when all other evidence-based recommendations have already been implemented.
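For readers unfamiliar with the denominator, CLABSI rates are expressed per 1,000 central-line days; the snippet below shows the calculation, with hypothetical infection counts and line-day totals chosen only to reproduce the quoted 2.2 and 0.6 rates.

```python
# CLABSI rate per 1,000 central-line days. The counts below are hypothetical
# placeholders chosen to match the rates quoted in the abstract.
def clabsi_rate(infections: int, central_line_days: int) -> float:
    return 1000 * infections / central_line_days

print(clabsi_rate(11, 5000))  # 2.2 per 1,000 line-days (baseline example)
print(clabsi_rate(3, 5000))   # 0.6 per 1,000 line-days (post-bundle example)
```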
Background: Surveillance for surgical site infections (SSIs) is recommended by the CDC. Currently, colon and abdominal hysterectomy SSI rates are publicly available and affect hospital reimbursement. However, the CDC NHSN allows surgical procedures to be abstracted based on International Classification of Diseases, Tenth Revision (ICD-10) or Current Procedural Terminology (CPT) codes. We assessed the impact of using ICD and/or CPT codes on the number of cases abstracted and on SSI rates. Methods: We retrieved administrative codes (ICD and/or CPT) for procedures performed at the University of Iowa Hospitals & Clinics over 1 year: October 2018–September 2019. We included 10 procedure types: colon, hysterectomy, cesarean section, breast, cardiac, craniotomy, spinal fusion, laminectomy, hip prosthesis, and knee prosthesis surgeries. We then calculated the number of procedures that would be abstracted under different permutations of administrative codes: (1) ICD codes only, (2) CPT codes only, (3) both ICD and CPT codes, and (4) at least 1 code from either ICD or CPT. We then calculated the impact on SSI rates of each of the 4 coding permutations. Results: In total, 9,583 surgical procedures and 180 SSIs were detected during the study period using the fourth method (ICD or CPT codes). Denominators varied according to procedure type and coding method used. The number of procedures abstracted for breast surgery differed by >10-fold when reported based on ICD only versus ICD or CPT codes (104 vs 1,109). Hip prosthesis had the lowest variation (638 vs 767). For SSI rates, cesarean section showed almost a 3-fold increase (2.6% using ICD only vs 7.32% requiring both ICD and CPT codes), whereas abdominal hysterectomy showed nearly a 2-fold increase (1.14% using CPT only vs 2.22% requiring both ICD and CPT codes). However, SSI rates remained fairly similar for craniotomy (0.14% absolute difference), hip prosthesis (0.24% absolute difference), and colon (0.09% absolute difference) despite differences in the number of abstracted procedures and coding methods. Conclusions: Denominators and SSI rates vary depending on the coding method used. Variations in the number of procedures abstracted and their subsequent impact on SSI rates were not predictable. Variations in the coding methods used by hospitals could affect interhospital comparisons and benchmarking, potentially leading to disparities in public reporting and hospital penalties.
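The four coding permutations are, in effect, set operations over per-procedure code flags; the toy example below (with made-up procedure IDs) shows how the choice of permutation changes both the denominator and the resulting SSI rate.

```python
# The four denominator definitions as set operations; IDs are illustrative.
icd_cases = {"p1", "p2", "p3"}           # procedures carrying an ICD-10 code
cpt_cases = {"p2", "p3", "p4", "p5"}     # procedures carrying a CPT code
ssis = {"p2"}                            # procedures that developed an SSI

permutations = {
    "ICD only":    icd_cases,
    "CPT only":    cpt_cases,
    "ICD and CPT": icd_cases & cpt_cases,  # intersection
    "ICD or CPT":  icd_cases | cpt_cases,  # union
}
for name, cases in permutations.items():
    rate = 100 * len(ssis & cases) / len(cases)
    print(f"{name}: {len(cases)} procedures, SSI rate {rate:.1f}%")
```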
Background: The CDC recently updated its recommendations on tuberculosis (TB) screening in healthcare facilities, suggesting the discontinuation of annual TB screening. However, hospitals may opt to continue based on their local TB epidemiology. We assessed TB infection control parameters in our facility to guide the implementation of the new CDC recommendations. Methods: We retrieved data for patients with an International Classification of Diseases, Tenth Revision (ICD-10) code for TB treated at the University of Iowa Hospitals and Clinics during 2016–2019. We supplemented our search with microbiology data: culture or PCR for Mycobacterium tuberculosis. Based on manual chart review, we adjudicated each patient as active TB, latent TB, previously treated TB, unclear history, or no TB. We further labeled active TB cases based on their risk of transmission (pulmonary cases, or extrapulmonary cases that underwent an aerosol-generating procedure). We then calculated the number of exposure events associated with those patients and tuberculin skin test (TST) conversion rates among the exposed. Results: During 2016–2019, we identified 197 patients based on ICD-10 codes. In total, 10 additional patients were detected by microbiology data review. Of these 207 patients, 48 (23.2%) had active TB: lung, n = 24 (50%); lymph node, n = 9 (19%); bone or spine, n = 5 (10%); eye, n = 3 (6%); disseminated, n = 2 (4%); pleura, n = 2 (4%); skin abscess, n = 2 (4%); and meningitis, n = 1 (2%). Of the 24 pulmonary patients, 6 (25%) had either a positive smear or a cavity on imaging. In total, 159 patients were excluded: no TB, n = 22 (14%); latent TB, n = 27 (17%); old or treated TB, n = 93 (58%); and unclear history, n = 9 (6%). Of the 48 cases with active TB, 31 (65%) were deemed potentially infectious. Also, 10 cases (32%) led to the exposure of 204 healthcare workers (HCWs). Baseline and postexposure TSTs were available for 179 HCWs (88%); 72 (35%) followed up in the employee health clinic within 8–12 weeks after exposure. Of 161 HCWs with a negative TST at baseline, no conversions occurred. Of 18 HCWs with a positive TST at baseline, none developed symptoms during the observation period. Conclusions: Nearly one-third of infectious TB cases led to HCW exposures in a low-incidence setting. However, no TST conversions or active TB infections were seen. Exposure and conversion rates are useful indicators of TB infection control in healthcare facilities and may help guide implementation of the new CDC TB control recommendations.
We sought to characterize patients’ preferences for the role of religious and spiritual (R&S) beliefs and practices during cancer treatment and describe the R&S resources desired by patients during the perioperative period.
Method
A cross-sectional survey was administered to individuals who underwent cancer-directed surgery. Data on demographics and R&S beliefs/preferences were collected and analyzed.
Results
Among 236 participants, the average age was 58.8 (SD = 12.10) years; the majority were female (76.2%), white (94.1%), had a significant other or spouse (60.2%), and were breast cancer survivors (43.6%). Overall, more than one-half (55.9%) of individuals identified themselves as religious, while others identified as only spiritual (27.9%) or neither (16.2%). Patients who identified as religious wanted R&S integrated into their care more often than patients who were only spiritual or neither (p < 0.001). Nearly half of participants (49.6%) wanted R&S resources when admitted to the hospital, including the opportunity to speak with an R&S leader (e.g., a rabbi; 72.1%), R&S texts (64.0%), and journaling materials (54.1%). Irrespective of R&S identification, 68.0% of patients did not want their physician to engage with them about R&S topics.
Significance of results
Access to R&S resources is important during cancer treatment, and incorporating R&S into cancer care may be especially important to patients who identify as religious. R&S needs should be addressed as part of the cancer care plan.
We performed a retrospective analysis of the impact of using International Classification of Diseases, Tenth Revision (ICD-10) procedure codes or Current Procedural Terminology (CPT) codes to calculate surgical site infection (SSI) rates. Denominators and SSI rates vary depending on the coding method used, and the coding method chosen may influence interhospital performance comparisons.
Demographic and clinical characteristics have been reported to modulate the risk for suicide. This study analysed demographic and clinical characteristics with respect to lifetime suicide attempts in 500 individuals affected with schizophrenic or affective disorders. Suicide attempts were associated with poor premorbid social adjustment, younger age at onset, low scores on the “Global Assessment Scale”, and childlessness in females.
The aim of this systematic review was to locate and analyze United States state crisis standards of care (CSC) documents to determine their prevalence and quality. Following PRISMA guidelines, we performed a Google search for “allocation of scarce resources” and “crisis standards of care” for each state. We analyzed the plans based on the 2009 Institute of Medicine (IOM) report, which provided guidance for establishing CSC for use in disaster situations, as well as the 2014 CHEST consensus statement’s 11 core topic areas. The search yielded 42 state documents, and we excluded 11 that were not CSC plans. Of the 31 included plans, 13 were written for an “all-hazards” approach, while 18 were specific to pandemic influenza. Eighteen had strong ethical grounding. Twenty-one plans had integrated and ongoing community and provider engagement, education, and communication. Twenty-two had assurances regarding legal authority and environment. Sixteen plans had clear indicators, triggers, and lines of responsibility. Finally, 28 had evidence-based clinical processes and operations. Five plans contained all 5 IOM elements: Arizona, Colorado, Minnesota, Nevada, and Vermont. Colorado and Minnesota have all-hazards documents and processes for both adult and pediatric populations and could be considered exemplars for other states.
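The element-by-element tally in this review amounts to checking each plan against a fixed set of criteria; a small sketch of that bookkeeping (with invented plan contents) is shown below.

```python
# Tally plans meeting all five IOM elements; plan contents are invented.
iom_elements = {"ethics", "engagement", "legal", "triggers", "clinical"}
plans = {
    "State A": {"ethics", "legal", "triggers", "clinical"},
    "State B": {"ethics", "engagement", "legal", "triggers", "clinical"},
}

for element in sorted(iom_elements):
    count = sum(element in found for found in plans.values())
    print(f"{element}: {count} plans")

complete = [state for state, found in plans.items() if found >= iom_elements]
print("Plans with all 5 elements:", complete)
```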
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band (∼60%), the system temperature is approximately 22 K and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
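As a rough illustration of what the quoted numbers imply (and not a calculation from the paper), the standard single-polarisation radiometer equation gives the ideal noise level achievable with this bandwidth and system temperature; the one-second integration time below is an arbitrary example.

```python
# Back-of-envelope radiometer sensitivity: sigma_T = T_sys / sqrt(B * tau).
# Illustrative only; band edges and T_sys are taken from the abstract.
import math

f_lo, f_hi = 704e6, 4032e6   # band edges, Hz
t_sys = 22.0                 # K, quoted for ~60% of the band
bandwidth = f_hi - f_lo      # 3.328e9 Hz of instantaneous bandwidth
tau = 1.0                    # s, arbitrary example integration time

sigma_t = t_sys / math.sqrt(bandwidth * tau)
print(f"Bandwidth {bandwidth / 1e9:.3f} GHz -> ideal noise {sigma_t * 1e3:.2f} mK")
```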
There is variation in the care of secundum atrial septal defects. Defects <3 mm and patent foramen ovale are not clinically significant. Defects >3 mm are often followed clinically and may require closure. Variation in how these lesions are monitored may result in over-utilisation of routine studies and higher-than-necessary patient charges.
Purpose:
To determine utilisation patterns for patients with secundum atrial septal defects diagnosed within the first year of life and to compare them with a locally developed optimal utilisation standard to assess potential charge savings.
Methods:
This was a retrospective chart review of patients with secundum atrial septal defects diagnosed within the first year of life. Patients with co-existing cardiac lesions were excluded. Total number of clinic visits, electrocardiograms, and echocardiograms were recorded. Total charge was calculated based on our standard institutional charges. Patients were stratified based on lesion and provider type and then compared to “optimal utilisation” using analysis of variance statistical analysis.
Results:
Ninety-seven patients were included: 40 had patent foramen ovale (or atrial septal defect <3 mm), 43 had atrial septal defects not requiring intervention, and 14 had atrial septal defects requiring intervention. There was a statistically significant difference in mean charge above optimal for these lesions of $1033, $2885, and $5722 (p < 0.02), respectively. There was also statistically significant variation in charge among provider types. Average charge savings per patient would be $2530, with total charge savings of $242,472, if the optimal utilisation pathway were followed.
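The quoted per-patient average follows from the group sizes and per-lesion mean excess charges; the check below reproduces it (the small gap between 97 × $2,530 and the quoted $242,472 total presumably reflects rounding of the group means).

```python
# Weighted mean of per-lesion mean charges above optimal, from the Results.
groups = {  # lesion: (patients, mean charge above optimal in $)
    "PFO / ASD < 3 mm": (40, 1033),
    "ASD, no intervention": (43, 2885),
    "ASD, intervention": (14, 5722),
}
n_total = sum(n for n, _ in groups.values())                      # 97 patients
average = sum(n * charge for n, charge in groups.values()) / n_total
print(f"{n_total} patients, average excess charge ${average:,.0f}")  # ~$2,531
```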
Conclusion:
Following the optimal utilisation pathway and decreasing variation could spare patients significant unnecessary charges.
Brannerite (UTi2O6) is among the major uranium-bearing minerals found in ore deposits; however, because it has long been considered refractory to leaching, it is currently disregarded when such deposits are evaluated. Brannerite is found in a variety of geological environments, the most common occurrences being hydrothermal and pegmatitic. On the basis of scanning electron microscopy observations coupled with electron probe micro-analyses and laser-ablation inductively coupled plasma mass spectrometry analyses, this study describes the morphological features and the major- and trace-element abundances of brannerite samples from five hydrothermal and five pegmatitic localities across the world. Mineral compositions are also compared with observations from transmission electron microscopy and Raman spectroscopy, which show that the brannerite is amorphous. Significant results include the definition of substitution trends and REE patterns that are characteristic of either an occurrence or a genetic type (hydrothermal or pegmatitic). In combination, these provide reliable constraints for establishing a geochemical classification of brannerite. The inferred fingerprints have direct implications for forensic science and the exploration industry; they also contribute to a better understanding of metallogenic processes and to optimising the extraction of uranium.