To evaluate antibiotic prescribing behavior (APB) among physicians with various specialties in five Asian countries.
Survey of antibiotic prescribing behavior across three stages (initial, on-treatment, and de-escalation).
Participants included internists, infectious diseases (ID) specialists, hematologists, intensivists, and surgeons. Participants’ characteristics, patterns of APB, and perceptions of antimicrobial stewardship were collected. A multivariate analysis was conducted to evaluate factors associated with appropriate APB.
There were 367 participants; the survey response rate was 82.5% (367/445). For the initial stage, choices of empiric treatment varied by specialty. For the on-treatment stage, if the patient did not respond to empiric treatment, most respondents would step up to broader-spectrum antibiotics (273/367: 74.39%). For the de-escalation stage, the rate of de-escalation was 10%–60% depending on the specialty, and most respondents would de-escalate antibiotics based on guidelines (250/367: 68.12%). De-escalation was most often reported by ID specialists (66/106: 62.26%). Performing laboratory investigations prior to empirical antibiotic prescription (aOR = 2.83) was associated with appropriate use; ID consultation was associated with appropriate antibiotic management of infections not responding to empiric treatment (aOR = 40.87); and adherence to national guidelines (aOR = 2.57) was associated with reported successful carbapenem de-escalation.
This study highlights the variation in practices and the gaps in appropriate APB across the three stages of antibiotic prescription among different specialties. Education on appropriate investigation, partnership with ID specialists, and the availability of and adherence to national guidelines are critical to guide appropriate APB across specialties.
Background: Although small- and medium-sized hospitals comprise most healthcare providers in South Korea, data on antibiotic usage in these facilities are limited. We evaluated the pattern of antibiotic usage and its appropriateness in hospitals with <400 beds in South Korea. Methods: A multicenter retrospective study was conducted in 10 hospitals (6 long-term care hospitals, 3 acute-care hospitals, and 1 orthopedic hospital) with <400 beds in South Korea. We analyzed patterns of antibiotic prescription and their appropriateness in the participating hospitals. Data on monthly antibiotic prescriptions and patient days for hospitalized patients were collected from each hospital’s electronic database. To avoid the effect of the COVID-19 pandemic, data were collected from January to December 2019. To evaluate the appropriateness of prescriptions, 25 patients under antibiotic therapy were randomly selected at each hospital over 2 separate periods. Because of the heterogeneity of its characteristics, the orthopedic hospital was excluded from this analysis. The collected data were reviewed, and the appropriateness of antibiotic prescriptions was evaluated, by 5 specialists in infectious diseases (adult and pediatric), with data from 2 hospitals assigned to each specialist. The appropriateness of each prescription was evaluated on 3 aspects: route of administration, dose, and class. If all 3 aspects were ‘optimal,’ the prescription was considered ‘optimal.’ If the route was ‘optimal’ but the dose and/or class was ‘suboptimal’ (though not ‘inappropriate’), it was considered ‘suboptimal.’ If even 1 aspect was ‘inappropriate,’ it was classified as ‘inappropriate.’ Results: The most commonly prescribed antibiotics in long-term care hospitals were fluoroquinolones, followed by antipseudomonal β-lactam/β-lactamase inhibitors.
In acute-care hospitals, the most commonly prescribed antibiotics were third-generation cephalosporins, followed by first- and second-generation cephalosporins. The major antibiotic prescribed in the orthopedic hospital was a first-generation cephalosporin. Only 2.3% of the antibiotics were administered inappropriately, whereas 15.3% of patients were prescribed an inappropriate dose. Inappropriate antibiotic prescriptions comprised 30.6% of the total. Conclusions: Antibiotic usage patterns vary between small- and medium-sized hospitals in South Korea, and the proportion of inappropriate prescriptions exceeded 30% of total antibiotic prescriptions.
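The three-level appropriateness rating described in the Methods can be sketched as a small decision rule. This is an illustrative reconstruction, not the study's actual instrument; the abstract does not specify the rating when the route itself is suboptimal (but nothing is inappropriate), so that case is conservatively labeled suboptimal here, as noted in the comment.

```python
# Hypothetical sketch of the three-aspect appropriateness rule: each
# prescription is rated on route, dose, and class; any "inappropriate"
# aspect dominates, all-"optimal" yields "optimal", otherwise "suboptimal".
def rate_prescription(route: str, dose: str, antibiotic_class: str) -> str:
    aspects = (route, dose, antibiotic_class)
    if "inappropriate" in aspects:
        return "inappropriate"
    if all(a == "optimal" for a in aspects):
        return "optimal"
    # The abstract only spells out the case where the route is optimal and the
    # dose/class is suboptimal; a suboptimal route is treated the same way here.
    return "suboptimal"
```

Under this rule, a prescription given by the optimal route at a suboptimal dose rates "suboptimal", while any single inappropriate aspect makes the whole prescription "inappropriate".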
Rapid diagnostic testing (RDT) can provide prompt, accurate identification of infectious organisms and can be a key component of antimicrobial stewardship (AMS) programs. However, its use is less widespread in the Asia-Pacific region than in Western countries. Cost can be prohibitive, particularly in less resource-replete settings, so a selective approach is required, possibly focusing on the initiation of antimicrobials, on differentiating bacterial from viral infections, and on identifying locally relevant tropical diseases. Across the Asia-Pacific region, more data are needed on RDT use within AMS, focusing on its impact on antimicrobial usage, patient morbidity and mortality, and cost-effectiveness. Moreover, in the absence of formal guidelines, regional consensus statements to guide clinical practice are warranted. These will provide a regionally relevant definition of RDT; greater consensus on its role in managing infections; advice on implementation and overcoming barriers; and guidance on optimizing human resource capacity. Addressing these issues should improve the outcomes of AMS programs.
We calculated the human resources required for an antimicrobial stewardship program (ASP) in Korean hospitals.
Multicenter retrospective study.
Eight Korean hospitals ranging in size from 295 to 1,337 beds.
The time required to perform ASP activities for all hospitalized patients under antibiotic therapy was estimated and converted into hours per week. The actual time spent on patient reviews for each ASP activity was measured in a small number of cases, and the total time was then estimated by applying these times to a larger number of cases. Full-time equivalents (FTEs) were calculated according to Korean labor law (52 hours per week).
In total, 225 cases were reviewed to measure time spent on patient reviews. The median time spent per patient review for ASP activities ranged from 10 to 16 minutes. The total time spent on the review for all hospitalized patients was estimated using the observed number of ASP activities for 1,534 patients who underwent antibiotic therapy on surveillance days. The most commonly observed ASP activity was ‘review of surgical prophylactic antibiotics’ (32.7%), followed by ‘appropriate antibiotics recommendations for patients with suspected infection without a proven site of infection but without causative pathogens’ (28.6%). The personnel requirement was calculated as 1.20 FTEs (interquartile range [IQR], 1.02–1.38) per 100 beds and 2.28 FTEs (IQR, 1.93–2.62) per 100 patients who underwent antibiotic therapy.
The estimated time required for human resources performing extensive ASP activities on all hospitalized patients undergoing antibiotic therapy in Korean hospitals was ~1.20 FTEs (IQR, 1.02–1.38) per 100 beds.
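The workload-to-FTE conversion described above is simple arithmetic: weekly review minutes are summed, converted to hours, and divided by the 52-hour statutory work week. The sketch below illustrates that conversion; the 52-hour figure comes from the abstract, but the example caseload numbers are invented for illustration and are not the study's data.

```python
# Korean labor law work week, as stated in the abstract.
HOURS_PER_FTE_PER_WEEK = 52

def fte_required(reviews_per_week: int, minutes_per_review: float) -> float:
    """Convert a weekly patient-review workload into full-time equivalents."""
    hours_per_week = reviews_per_week * minutes_per_review / 60
    return hours_per_week / HOURS_PER_FTE_PER_WEEK

def fte_per_100_beds(total_fte: float, beds: int) -> float:
    """Normalize an FTE requirement to a per-100-bed figure."""
    return total_fte * 100 / beds

# Hypothetical example: 300 reviews/week at 13 minutes each is 65 review-hours,
# i.e. 1.25 FTEs; in a 500-bed hospital that is 0.25 FTEs per 100 beds.
total = fte_required(300, 13)
print(round(fte_per_100_beds(total, 500), 2))  # 0.25
```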
Background: Contact isolation (ie, patient isolation with contact precautions) is frequently used to prevent healthcare-associated infections caused by epidemiologically important pathogens (eg, vancomycin-resistant enterococci [VRE]) transmitted via direct or indirect contact with patients. Citing unintended components of routine contact isolation (eg, fewer healthcare provider visits), some studies have reported an association between contact isolation and the likelihood of adverse events. Objective: Because evidence for this association is weak owing to invalid study designs and systematic misclassification in most studies, we compared adverse events between a VRE isolation cohort and a matched comparison cohort, using a propensity-score–matched cohort design. Methods: This study was conducted at a 1,337-bed, tertiary-care, university-affiliated Korean hospital with a full electronic medical record (EMR) system for all patient records. With institutional review board approval, all relevant EMR records were extracted for the study period 2015–2017. All contact isolation information for VRE patients was confirmed through manual EMR review by 1 trained research nurse. For propensity score matching, risk factors for adverse events (ie, decubitus ulcer, fall, and cardiopulmonary resuscitation [CPR]) were selected based on literature review: length of stay, age, gender, diabetes mellitus, hypertension, albumin, Charlson comorbidity index, Braden scale score, and Hendrich II fall risk. For each VRE case, a 1:1 matched comparison patient was selected by nearest-neighbor matching on the calculated propensity scores. The retrospective observation period ran from the cohort entry date (ie, contact isolation start date) to the cohort exit date (ie, discharge or discontinuation of contact isolation). A time-to-event analysis with a Cox proportional hazards model was conducted using SAS version 9.4 software.
Results: Among the 98,527 inpatients (323 VRE positive; 98,204 VRE negative), the VRE cohort (N = 141 of 216; 65% of VRE patients admitted to general wards without an adverse-event history before contact isolation) and the matched comparison (no isolation) cohort (N = 141; 0.1%) showed no differences in baseline characteristics (Table 1). The Cox proportional hazards model was not applicable for CPR because no CPR case occurred in the matched comparison cohort. The hazard ratios for adverse events showed no statistically significant difference between the cohorts: decubitus ulcer (hazard ratio [HR], 1.049; 95% CI, 0.328–3.352) and fall (HR, 0.418; 95% CI, 0.051–3.349) (Table 2). Conclusions: Based on 3 years of full EMR records, our propensity-score–matched cohort study found no association between contact isolation and the likelihood of adverse events.
Funding: This work was supported by the Collaborative Research Program of Medical Science and Nursing Science from Seoul National University College of Medicine (Grant no. 800-20180001 & 810-20180001).
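The 1:1 nearest-neighbor matching step in the Methods above can be sketched in a few lines. This is a minimal, stdlib-only illustration under stated assumptions: propensity scores (which the study would obtain from a model of isolation status on the listed risk factors) are taken as precomputed inputs, matching is greedy and without replacement, and no caliper is applied; the study's actual matching procedure may differ.

```python
# Greedy 1:1 nearest-neighbor matching on precomputed propensity scores.
def nearest_neighbor_match(case_scores, control_scores):
    """Return {case_index: control_index} pairs, matched without replacement."""
    available = dict(enumerate(control_scores))
    matches = {}
    for i, ps in enumerate(case_scores):
        if not available:
            break  # no unmatched controls remain
        # Pick the unmatched control whose score is closest to this case's.
        j = min(available, key=lambda k: abs(available[k] - ps))
        matches[i] = j
        del available[j]  # 1:1 matching: each control is used at most once
    return matches

# Two cases matched against four candidate controls:
pairs = nearest_neighbor_match([0.30, 0.72], [0.10, 0.35, 0.70, 0.90])
print(pairs)  # {0: 1, 1: 2}
```

Greedy matching is order-dependent; production analyses often sort cases or use optimal matching instead, but the idea of pairing each case with the closest-scoring unused control is the same.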
Background: After the Middle East respiratory syndrome coronavirus outbreak in Korea in 2015, the government established an additional reimbursement for infection prevention to encourage infection control activities in hospitals. The new policy was announced in December 2015 and implemented in September 2016. We evaluated how infection control activities in Korean hospitals improved after this policy change. Methods: Three cross-sectional surveys using the WHO Hand Hygiene Self-Assessment Framework (HHSAF) were conducted in 2013, 2015, and 2017. Using a multivariable linear regression model including hospital characteristics, we analyzed changes in total HHSAF scores according to survey time. Results: In total, 32 hospitals participated in the survey in 2013, 52 in 2015, and 101 in 2017. The number of inpatient beds per infection control professional decreased from 324 in 2013 to 303 in 2015 and 179 in 2017. Most hospitals were at intermediate or advanced levels of progress (90.6% in 2013, 86.6% in 2015, and 94.1% in 2017). In the multivariable linear regression model, total HHSAF scores were significantly associated with hospital teaching status (β coefficient of major teaching hospital, 52.6; 95% CI, 8.9–96.4; P = .018), bed size (β coefficient per 100-bed increase, 5.1; 95% CI, 0.3–9.8; P = .038), and survey time (β coefficient of 2017 survey, 45.1; 95% CI, 19.3–70.9; P = .001). Conclusions: After the national policy was implemented, the number of infection control professionals increased and the promotion of hand hygiene activities was strengthened in Korean hospitals.
Decreased hemoglobin levels increase the risk of developing dementia among the elderly. However, the underlying mechanisms linking decreased hemoglobin levels to incident dementia remain unclear, possibly because few studies have examined the relationship between low hemoglobin levels and neuroimaging markers. We therefore investigated the relationships between decreased hemoglobin levels, cerebral small-vessel disease (CSVD), and cortical atrophy in cognitively healthy women and men.
Cognitively normal women (n = 1,022) and men (n = 1,018) who underwent medical check-ups and magnetic resonance imaging (MRI) were enrolled at a health promotion center. We measured hemoglobin levels, white matter hyperintensity (WMH) scales, lacunes, and microbleeds. Cortical thickness was measured automatically using surface-based methods. Multivariate regression analyses were performed after controlling for possible confounders.
Decreased hemoglobin levels were not associated with the presence of WMH, lacunes, or microbleeds in women and men. Among women, decreased hemoglobin levels were associated with decreased cortical thickness in the frontal (estimate, −0.007; 95% CI, −0.013 to −0.001), temporal (−0.010; −0.018 to −0.002), parietal (−0.009; −0.015 to −0.003), and occipital (−0.011; −0.019 to −0.003) regions. Among men, however, no associations were observed between hemoglobin levels and cortical thickness.
Our findings suggest that decreased hemoglobin levels are associated with cortical atrophy, but not with increased CSVD, among women, although the association is modest. Given the paucity of modifiable risk factors for age-related cognitive decline, our results have important public health implications.
To compare the characteristics and risk factors for surgical site infections (SSIs) after total hip arthroplasty (THA) and total knee arthroplasty (TKA) in a nationwide survey, using shared case detection and recording systems.
Retrospective cohort study.
Twenty-six hospitals participating in the Korean Nosocomial Infections Surveillance System (KONIS).
From 2006 to 2009, all patients undergoing THA and TKA in KONIS were enrolled.
SSI occurred in 161 (2.35%) of 6,848 cases (3,422 THAs and 3,426 TKAs). Pooled mean SSI rates were 1.69% and 2.82% for THA and TKA, respectively. Of the cases we examined, 42 (26%) were superficial-incisional SSIs and 119 (74%) were “severe” SSIs; of the latter, 24 (15%) were deep-incisional SSIs and 95 (59%) were organ/space SSIs. In multivariate analysis, a duration of preoperative hospital stay of greater than 3 days was a risk factor for total SSI after both THA and TKA. Diabetes mellitus, revision surgery, prolonged duration of surgery (above the 75th percentile), and the need for surgery due to trauma were independent risk factors for total and severe SSI after THA, while male sex and an operating room without artificial ventilation were independent risk factors for total and severe SSI after TKA. A large volume of surgeries (more than 10 procedures per month) protected against total and severe SSI, but only in patients who underwent TKA.
Risk factors for SSI after arthroplasty differ according to the site of the arthroplasty. Therefore, clinicians should take into account the site of arthroplasty in the analysis of SSI and the development of strategies for reducing SSI.
To evaluate the risk factors for surgical site infection (SSI) after gastric surgery in patients in Korea.
A nationwide prospective multicenter study.
Twenty university-affiliated hospitals in Korea.
The Korean Nosocomial Infections Surveillance System (KONIS), a Web-based system, was developed. Patients in 20 Korean hospitals from 2007 to 2009 were prospectively monitored for SSI for up to 30 days after gastric surgery. Demographic data, hospital characteristics, and potential perioperative risk factors were collected and analyzed, using multivariate logistic regression models.
Of the 4,238 case patients monitored, 64.9% (2,752) were male, and mean age (±SD) was 58.8 (±12.3) years. The SSI rates were 2.92, 6.45, and 10.87 per 100 operations for the National Nosocomial Infections Surveillance system risk index categories of 0, 1, and 2 or 3, respectively. The majority (69.4%) of the SSIs observed were organ or space SSIs. The most frequently isolated microorganisms were Staphylococcus aureus and Klebsiella pneumoniae. Male sex (odds ratio [OR], 1.67 [95% confidence interval (CI), 1.09–2.58]), increased operation time (1.20 [1.07–1.34] per 1-hour increase), reoperation (7.27 [3.68–14.38]), combined multiple procedures (1.79 [1.13–2.83]), prophylactic administration of the first antibiotic dose after skin incision (3.00 [1.09–8.23]), and prolonged duration (≥7 days) of surgical antibiotic prophylaxis (SAP; 2.70 [1.26–5.64]) were independently associated with increased risk of SSI.
Male sex, inappropriate SAP, and operation-related variables are independent risk factors for SSI after gastric surgery.
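The adjusted odds ratios reported above come from multivariate logistic regression, where each fitted coefficient maps to an odds ratio via exponentiation. The sketch below shows only that standard conversion with illustrative values, not the study's actual model fit.

```python
import math

def odds_ratio(beta: float) -> float:
    """The OR for a one-unit increase in a covariate is exp(beta)."""
    return math.exp(beta)

# Illustrative: a logistic coefficient of ln(1.20) per hour of operation time
# corresponds to the reported OR of 1.20 per 1-hour increase.
print(round(odds_ratio(math.log(1.20)), 2))  # 1.2
```

The same relationship runs in reverse: taking the natural log of a reported OR recovers the underlying regression coefficient, which is useful when pooling estimates across studies.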
To evaluate the clinical features of ciprofloxacin-resistant Enterobacter bacteremia and to examine the risk factors for ciprofloxacin resistance in Enterobacter species isolates causing bacteremia.
A case-control study.
A 1,500-bed, tertiary-care university hospital and referral center.
All patients older than 16 years with Enterobacter species isolated from blood were enrolled. The medical records of 183 patients with clinically significant Enterobacter bacteremia from January 1998 to December 2002 were identified. We compared patients with bacteremia caused by ciprofloxacin-susceptible isolates with patients with bacteremia caused by ciprofloxacin-resistant isolates.
Twenty-three (12.6%) of the patients had bacteremia caused by isolates resistant to ciprofloxacin. There were no significant differences in age, gender, underlying diseases, primary site of infection, or Acute Physiology and Chronic Health Evaluation II score between the ciprofloxacin-resistant and the ciprofloxacin-susceptible groups. Broad-spectrum cephalosporin resistance, defined as resistance to cefotaxime or ceftazidime in vitro, was detected in 21 (91.3%) of 23 ciprofloxacin-resistant isolates compared with 65 (40.6%) of 160 ciprofloxacin-susceptible isolates (P < .001). Multivariate analysis revealed that independent risk factors for ciprofloxacin resistance were the prior receipt of fluoroquinolones (P < .001) and broad-spectrum cephalosporin resistance (P < .001).
In Enterobacter species isolates causing bacteremia, ciprofloxacin resistance was closely associated with the prior receipt of fluoroquinolones and broad-spectrum cephalosporin resistance. The close relationship between ciprofloxacin resistance and broad-spectrum cephalosporin resistance is particularly troublesome because it severely restricts the therapeutic options for Enterobacter species infection.
To evaluate risk factors and treatment outcomes of bloodstream infections caused by extended-spectrum beta-lactamase-producing Klebsiella pneumoniae (ESBL-KP).
Retrospective case-control study. Stored blood isolates of K. pneumoniae were tested for ESBL production by NCCLS guidelines, double-disk synergy test, or both.
A 1,500-bed, tertiary-care university hospital and referral center.
Sixty case-patients with bacteremia due to ESBL-KP were compared with 60 matched control-patients with non-ESBL-KP.
There were no significant differences in age, gender, APACHE II score, or underlying diseases between the groups. Independent risk factors for infections caused by ESBL-KP were urinary catheterization, invasive procedure within the previous 72 hours, and an increasing number of antibiotics administered within the previous 30 days. Complete response rate, evaluated 72 hours after initial antimicrobial therapy, was higher among control-patients (13.3% vs 36.7%; P = .003). Treatment failure rate was higher among case-patients (35.0% vs 15%; P = .011). Overall 30-day mortality rate was 30% for case-patients and 28.3% for control-patients (P = .841). Case-patients who received imipenem or ciprofloxacin as a definitive antibiotic had 10.5% mortality. The mortality rate for initially ineffective therapy was no higher than that for initially effective therapy (9.1% vs 11.1%; P = 1.000), but statistical power was low for evaluating mortality in the absence of septic shock.
For K. pneumoniae bacteremia, patients with ESBL-KP had a higher initial treatment failure rate but did not have a higher mortality rate when antimicrobial therapy was appropriately adjusted, although this study had limited statistical power.
Carbon nanotubes (CNTs) have attracted remarkable attention as reinforcement for composites owing to their outstanding properties. CNT/Cu nanocomposites were fabricated by mixing nano-sized Cu powders with multi-walled carbon nanotubes, followed by spark plasma sintering. The CNT/Cu nanocomposite fabricated from nano-sized Cu powders shows a more homogeneous distribution of CNTs in the matrix than that fabricated from macro-sized Cu powders. The hardness of the CNT/Cu nanocomposite fabricated from nano-sized Cu powders increases with the volume fraction of CNTs, whereas the hardness of that fabricated from macro-sized Cu powders decreases with the addition of CNTs.
To evaluate the outcome of attempted Hickman catheter salvage in neutropenic cancer patients with Staphylococcus aureus bacteremia who were not using antibiotic lock therapy.
Retrospective cohort study.
A university-affiliated, tertiary-care hospital with 1,500 beds for adult patients.
All neutropenic cancer patients who had a Hickman catheter and S. aureus bacteremia (32 episodes in 29 patients) between January 1998 and March 2002.
Salvage attempts were defined as cases where the Hickman catheter was not removed until we obtained the results of follow-up blood cultures performed 48 to 72 hours after starting treatment with antistaphylococcal antibiotics. Salvage was considered to be successful if the Hickman catheter was still in place 3 months later without recurrent bacteremia or death.
Catheter salvage was attempted in 24 (75%) of the 32 episodes. Of the salvage attempts, the success rate was 50% (12 of 24). Salvage attempts were successful in 14% (1 of 7) of the episodes with positive follow-up blood cultures, and in 65% (11 of 17) of those with negative follow-up blood cultures (P = .07). If the analysis is confined to cases with no external signs of catheter infection, salvage attempts were successful in 14% (1 of 7) of the episodes with positive follow-up blood cultures and in 80% (8 of 10) of those with negative follow-up blood cultures (P = .02).
In neutropenic cancer patients with S. aureus bacteremia, attempted catheter salvage without antibiotic lock therapy was successful in 50% of the cases.