Background: Although small- and medium-sized hospitals comprise most healthcare providers in South Korea, data on antibiotic usage are limited in these facilities. We evaluated the pattern of antibiotic usage and its appropriateness in hospitals with <400 beds in South Korea. Methods: A multicenter retrospective study was conducted in 10 hospitals (6 long-term care hospitals, 3 acute-care hospitals, and 1 orthopedic hospital) with <400 beds in South Korea. We analyzed patterns of antibiotic prescription and their appropriateness in the participating hospitals. Data on the monthly antibiotic prescriptions and patient days for hospitalized patients were collected using electronic databases from each hospital. To avoid the effect of the COVID-19 pandemic, data were collected from January to December 2019. For the evaluation of the appropriateness of prescription, 25 patients under antibiotic therapy were randomly selected at each hospital over 2 separate periods. Due to the heterogeneity of its characteristics, the orthopedic hospital was excluded from this analysis. The collected data were reviewed, and the appropriateness of antibiotic prescriptions was evaluated by 5 specialists in infectious diseases (adult and pediatric). Data from 2 hospitals were assigned to each specialist. The appropriateness of antibiotic prescriptions was evaluated from 3 aspects: route of administration, dose, and class. If all 3 aspects were ‘optimal,’ the prescription was considered ‘optimal.’ If only the route was ‘optimal’ and the dose and/or class was ‘suboptimal,’ but not ‘inappropriate,’ it was considered ‘suboptimal.’ If even 1 aspect was ‘inappropriate,’ it was classified as ‘inappropriate.’ Results: The most commonly prescribed antibiotic class in long-term care hospitals was fluoroquinolones, followed by antipseudomonal β-lactam/β-lactamase inhibitors.
In acute-care hospitals, the most commonly prescribed were third-generation cephalosporins, followed by first-generation and second-generation cephalosporins. The major antibiotic prescribed in the orthopedic hospital was a first-generation cephalosporin. Only 2.3% of the antibiotics were administered via an inappropriate route. In comparison, 15.3% of patients were prescribed an inappropriate dose. Inappropriate prescriptions constituted 30.6% of the total antibiotic prescriptions. Conclusions: Antibiotic usage patterns vary among small- and medium-sized hospitals in South Korea. The proportion of inappropriate prescriptions exceeded 30% of the total antibiotic prescriptions.
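The abstract above normalizes antibiotic prescriptions by patient days. A minimal sketch of that normalization, assuming use is expressed per 1,000 patient-days (a common convention, not stated explicitly in the abstract), with made-up numbers:

```python
# Hypothetical sketch: antibiotic use density per 1,000 patient-days.
# The study collected monthly prescriptions and patient days; expressing
# use per 1,000 patient-days is an assumed convention for illustration.

def use_density(days_of_therapy: float, patient_days: float) -> float:
    """Antibiotic use per 1,000 patient-days."""
    if patient_days <= 0:
        raise ValueError("patient_days must be positive")
    return 1000.0 * days_of_therapy / patient_days

# Illustrative numbers: 420 days of therapy over 6,000 patient-days.
print(round(use_density(420, 6000), 1))  # 70.0
```

Normalizing by patient-days lets hospitals of different sizes and occupancy levels be compared on the same scale.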
The risk factors of environmental contamination by SARS-CoV-2 are largely unknown. We analyzed 1,320 environmental samples obtained from COVID-19 patients over 1 year. The risk factors for contamination of COVID-19 patients’ surrounding environment were higher viral load in the respiratory tract and shorter duration from symptom onset to sample collection.
This study aimed to determine the effect of donor-transmitted atherosclerosis on the late aggravation of cardiac allograft vasculopathy in paediatric heart recipients aged ≥7 years.
In total, 48 patients were included and 23 had donor-transmitted atherosclerosis (baseline maximal intimal thickness of >0.5 mm on intravascular ultrasonography). Logistic regression analyses were performed to identify risk factors for donor-transmitted atherosclerosis. Rates of survival free from the late aggravation of cardiac allograft vasculopathy (new or worsening cardiac allograft vasculopathy on following angiograms, starting 1 year after transplantation) in each patient group were estimated using the Kaplan–Meier method and compared using the log-rank test. The effect of the results of intravascular ultrasonography at 1 year after transplantation on the late aggravation of cardiac allograft vasculopathy, correcting for possible covariates including donor-transmitted atherosclerosis, was examined using the Cox proportional hazards model.
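The survival analysis described above relies on the Kaplan–Meier (product-limit) estimator. A minimal pure-Python sketch of that estimator, using illustrative data rather than the study's (event = late aggravation of cardiac allograft vasculopathy):

```python
# Minimal Kaplan-Meier (product-limit) estimator sketch. Each subject is a
# (time, event) pair, event=1 if aggravation was observed, 0 if censored.
# Data below are illustrative only, not taken from the study.

def kaplan_meier(subjects):
    """Return [(event_time, survival_probability)] at each event time."""
    subjects = sorted(subjects)                  # sort by follow-up time
    n_at_risk = len(subjects)
    surv = 1.0
    curve = []
    i = 0
    while i < len(subjects):
        t = subjects[i][0]
        deaths = sum(1 for (tt, e) in subjects if tt == t and e == 1)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk     # product-limit step
            curve.append((t, surv))
        n_at_risk -= sum(1 for (tt, _) in subjects if tt == t)
        while i < len(subjects) and subjects[i][0] == t:
            i += 1
    return curve

data = [(1.0, 1), (2.0, 0), (3.0, 1), (4.0, 0), (5.0, 1)]
print(kaplan_meier(data))
```

In practice the study's comparison of the two curves (log-rank test) and the covariate-adjusted analysis (Cox proportional hazards model) would be done with a dedicated survival-analysis library rather than by hand.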
The mean follow-up duration after transplantation was 5.97 ± 3.58 years. The log-rank test showed that patients with donor-transmitted atherosclerosis had worse survival outcomes than those without (p = 0.008). In the multivariable model, which considered the change in maximal intimal thickness between baseline and 1 year after transplantation (hazard ratio, 22.985; 95% confidence interval, 1.948–271.250; p = 0.013), donor-transmitted atherosclerosis was a significant covariate (hazard ratio, 4.013; 95% confidence interval, 1.047–15.376; p = 0.043).
Paediatric heart transplant recipients aged ≥7 years with donor-transmitted atherosclerosis had worse survival free from late aggravation of cardiac allograft vasculopathy.
The explosive outbreak of COVID-19 led to a shortage of medical resources, including isolation rooms in hospitals, healthcare workers (HCWs) and personal protective equipment. Here, we constructed a new model, non-contact community treatment centres, to monitor and quarantine asymptomatic and mildly symptomatic COVID-19 patients who recorded their own vital signs using a smartphone application. This new model proved useful in Korea for overcoming shortages of medical resources and minimising the risk of infection transmission to HCWs.
Given the dynamic character of an individual’s drinking behaviour, comprehensive consideration of alcohol consumption variation using repeated measures may improve insight into the nature of its association with blood pressure (BP) change. We examined the association between longitudinal alcohol consumption (trajectory and quantity) and changes in BP and pulse pressure (PP) among Koreans aged ≥40 years living in rural areas. In total, 1682 hypertension-free participants who completed all three health examinations (median, 5·3 years) were included. All three visits were used to determine the cumulative trajectory and quantity of alcohol consumption; the latest two visits and the last visit were used for the recent trajectory and the most recent quantity of alcohol consumption, respectively. Changes in BP and PP from baseline to the third visit were used as outcomes. In men, a cumulative average alcohol consumption of ≥30 ml/d was associated with the greatest increase in systolic BP (SBP) in both the baseline outcome-unadjusted (2·9 mmHg, P = 0·032) and -adjusted models (3·6 mmHg, P = 0·001), and the corresponding association for the most recent alcohol consumption was observed in the baseline outcome-adjusted model (3·9 mmHg, P = 0·003). For PP, similar associations were observed only in the baseline outcome-adjusted model. No meaningful associations were found for diastolic BP in men or for any BP or PP measure in women. The quantity of alcohol consumption, rather than the trajectory, may be significantly related to raised SBP, and a possible short-term influence of the most recent alcohol consumption may exist when baseline SBP is adjusted for in men.
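The three exposure definitions used above (cumulative average over all three visits, recent average over the latest two, most recent quantity at the last visit) can be sketched as follows, with illustrative values:

```python
# Sketch of the three repeated-measure exposure definitions in the abstract,
# given alcohol consumption (ml/d) reported at three visits.
# The input values below are illustrative, not study data.

def exposures(v1: float, v2: float, v3: float) -> dict:
    return {
        "cumulative_average": (v1 + v2 + v3) / 3,  # all three visits
        "recent_average": (v2 + v3) / 2,           # latest two visits
        "most_recent": v3,                         # last visit only
    }

print(exposures(10.0, 20.0, 60.0))
# {'cumulative_average': 30.0, 'recent_average': 40.0, 'most_recent': 60.0}
```

Note how a drinker whose intake rises over time scores much higher on the most recent measure than on the cumulative average, which is exactly the distinction the trajectory-versus-quantity analysis exploits.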
Lipid metabolism and inflammation contribute to CVD development. This study investigated whether the consumption of cranberries (CR; Vaccinium macrocarpon) can alter HDL metabolism and prevent inflammation in mice expressing a human apo A-I transgene (hApoAITg), which have HDL profiles similar to those of humans. Male hApoAITg mice were fed a modified American Institute of Nutrition-93M high-fat/high-cholesterol diet (16 % fat, 0·25 % cholesterol, w/w; n 15) or the high-fat/high-cholesterol diet containing CR (5 % dried CR powder, w/w; n 16) for 8 weeks. There were no significant differences in body weight between the groups. Serum total cholesterol, non-HDL-cholesterol and TAG concentrations were significantly lower in the control than in the CR group, with no significant differences in serum HDL-cholesterol and apoA-I. Mice fed CR showed significantly lower serum lecithin–cholesterol acyltransferase activity than the control. Liver weight and steatosis were not significantly different between the groups, but hepatic expression of genes involved in cholesterol metabolism was significantly lower in the CR group. In the epididymal white adipose tissue (eWAT), the CR group showed higher weights with decreased expression of genes for lipogenesis and fatty acid oxidation. The mRNA abundance of F4/80, a macrophage marker, and the number of crown-like structures were lower in the CR group. In the soleus muscle, the CR group also demonstrated higher expression of genes for fatty acid β-oxidation and mitochondrial biogenesis than the control. In conclusion, although CR consumption elicited minor effects on HDL metabolism, it prevented obesity-induced inflammation in eWAT with concomitant alterations in soleus muscle energy metabolism.
To date, there have been few studies on dietary supplement (DS) use in Korean children and adolescents using nationally representative data. This study aimed to investigate the current status of DS use and its related factors among Korean children and adolescents, based on Korean National Health and Nutrition Examination Survey (KNHANES) data.
A cross-sectional study.
Data from the KNHANES 2015–2017. Participants completed 24-h dietary recall interviews, including DS products that the subjects consumed.
The study population was 4380 children and adolescents aged 1–18 years.
Approximately 20 % of children and adolescents were using DS; the highest use was among children aged 1–3 years, and the lowest use was among adolescents aged 16–18 years. The most frequently used DS was prebiotics/probiotics, followed by multivitamin/mineral supplements. Factors associated with DS use were lower birth weight in children aged <4 years; younger age, higher household income, regular breakfast intake and lower BMI in children aged 4–9 years; and regular breakfast intake and use of the nutrition facts label in adolescents aged 10–18 years. Feeding patterns in infancy and having chronic diseases were not associated with DS use.
We report that over 20 % of children and adolescents use DS. Nutritional education for parents and children about proper DS consumption is needed.
Background: Recently, healthcare-associated infections (HAIs) in long-term care hospitals (LTCHs) have markedly increased, but no infection control policy has been established in South Korea. We investigated the current HAI surveillance system and executed a point-prevalence pilot study in LTCHs. Methods: HAIs were defined by a newly established surveillance manual based on the McGeer criteria as revised in 2012. Three LTCHs in Seoul and Gyeonggi province were voluntarily recruited, and data were collected from up to 50 patients who were hospitalized on August 1. The medical records from September to November 2018 were retrospectively reviewed by the charge nurse for infection control at each hospital after 1 day of training specific to LTCH surveillance. All data were reviewed by a senior researcher visiting on site. Results: The participating hospitals had 272.33 ± 111.01 beds. Only 1 hospital had an onsite microbiological laboratory. In total, 156 patients were enrolled and 5 HAIs were detected, for a prevalence rate of 3.2%. The average patient age was 79.04 ± 9.92 years. The HAIs included 2 urinary tract infections, a skin and soft-tissue infection, a lower respiratory tract infection, and conjunctivitis. Conclusions: This is the first survey of HAIs in LTCHs in South Korea. The 3.2% prevalence rate is lower than those in previous reports from the European Union or the United States. This study supports the development of a national HAI surveillance and infection control system for LTCHs, although implementation may be limited by the lack of laboratory support and infection control infrastructure in Korea.
Early replacement of a new central venous catheter (CVC) may pose a risk of persistent or recurrent infection in patients with a catheter-related bloodstream infection (CRBSI). We evaluated the clinical impact of early CVC reinsertion after catheter removal in patients with CRBSIs.
We conducted a retrospective chart review of adult patients with confirmed CRBSIs in 2 tertiary-care hospitals over a 7-year period.
To treat their infections, 316 patients with CRBSIs underwent CVC removal. Among them, 130 (41.1%) underwent early CVC reinsertion (≤3 days after CVC removal), 39 (12.4%) underwent delayed reinsertion (>3 days), and 147 (46.5%) did not undergo CVC reinsertion. There were no differences in baseline characteristics among the 3 groups, except for nontunneled CVC, presence of septic shock, and reason for CVC reinsertion. The rate of persistent CRBSI in the early CVC reinsertion group (22.3%) was higher than that in the no CVC reinsertion group (7.5%; P = .002) but was similar to that in the delayed CVC reinsertion group (17.9%; P > .99). The other clinical outcomes did not differ among the 3 groups, including rates of 30-day mortality, complicated infection, and recurrence. After controlling for several confounding factors, early CVC reinsertion was not significantly associated with persistent CRBSI (OR, 1.59; P = .35) or 30-day mortality compared with delayed CVC reinsertion (OR, 0.81; P = .68).
Early CVC reinsertion in the setting of CRBSI may be safe. Replacement with a new CVC should not be delayed in patients who still require a CVC for ongoing management.
The Internet is commonly used in modern society; however, Internet use may become a problematic behaviour. There is an increasing need for research on problematic Internet use (PIU) and its associated risk factors.
This study aims to explore the prevalence and health correlates of problematic Internet use among South Korean adults.
We recruited participants aged 18–84 years from the online panel of an online research service. The sample size of the survey was 500. Of these 500 participants, 51.4% (n = 257) were men and 48.6% (n = 243) were women. A participant was classified as a problematic Internet user (PIU) if his/her total score on Young's Internet Addiction Scale (YIA) was above 50. The Stress Response Index (SRI), the Fagerström test for nicotine dependence, lifetime average caffeine consumption, and a sociodemographic query form were used for data collection. The t test and chi-square test were used for data analysis.
One hundred ninety-seven (39.4%) of the participants were classified into the PIU group. There were no differences in gender or education between PIU and normal users. However, the PIU group was younger (mean 39.5 years) than normal users (mean 45.8 years). The PIU group was more likely to have high levels of perceived stress and nicotine dependence, and to drink caffeinated beverages more often (P < 0.05).
These data indicate that problematic Internet use is associated with perceived stress level and with nicotine and caffeine use among South Korean Internet users. More research is needed to better understand the relationship between Internet use and mental health issues.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Mycoplasma pneumoniae is a major pathogen causing community-acquired pneumonia in children and young adults. Since the emergence of macrolide-resistant M. pneumoniae in Japan in the early 2000s, it has been increasingly reported worldwide as a growing problem in the treatment of children. Given the rise of macrolide-resistant M. pneumoniae and the limited data on its characterization and molecular analysis, we investigated the dominant M. pneumoniae strains during the recent outbreak in South Korea and evaluated whether a specific type was associated with macrolide resistance. Between October 2014 and December 2016 in South Korea, 249 respiratory specimens obtained from patients with confirmed M. pneumoniae pneumonia were genotyped for the P1 adhesin gene, and the mutations associated with resistance (A2063G and A2064G) were detected by sequencing the targeted domain V regions of the 23S ribosomal RNA gene. Results revealed that M. pneumoniae type 1 was predominant and strongly associated with macrolide resistance throughout the study period. This is the first study assessing whether M. pneumoniae subtype is related to macrolide resistance during an outbreak of M. pneumoniae.
Treatment of liver fibrosis is very limited, as there is currently no effective anti-fibrotic therapy. Spirulina platensis (SP) is a blue-green alga that is widely used in health food supplements. The objective of this study was to determine whether SP supplementation can prevent obesity-induced liver fibrosis in vivo. Male C57BL/6J mice were randomly assigned to a low-fat diet, a high-fat (HF)/high-sucrose/high-cholesterol diet, or the HF diet supplemented with 2·5 % SP (w/w) (HF/SP) for 16 or 20 weeks. There were no significant differences in body weight, activity, energy expenditure, serum lipids or glucose tolerance between mice on the HF and HF/SP diets. However, the plasma alanine aminotransferase level was significantly reduced by SP at 16 weeks. Expression of fibrotic markers and trichrome staining showed no differences between HF and HF/SP. Splenocytes isolated from HF/SP-fed mice had lower inflammatory gene expression and cytokine secretion than splenocytes from HF-fed mice. SP supplementation did not attenuate HF-induced liver fibrosis. However, the expression and secretion of inflammatory genes in splenocytes were significantly reduced by SP supplementation, demonstrating the anti-inflammatory effects of SP in vivo. Although SP did not show an appreciable effect on the prevention of liver fibrosis in this mouse model, it may be beneficial for other inflammatory conditions.
This study aimed to investigate associations among spirituality, coping strategies, quality of life (QOL), and the effects of depression and anxiety thereon in cancer patients.
In total, 237 cancer patients referred to a psycho-oncology clinic at a university hospital in Korea were enrolled. After identifying predictors of patient QOL in a stepwise regression model, we developed a hypothetical path model wherein interpersonal coping was considered as a mediating variable between spirituality (meaning/peace) and QOL and wherein depression and anxiety affected each of these three variables.
The direct effect of spirituality (meaning/peace) on QOL was 36.7%. In an indirect model, interpersonal coping significantly mediated the relationship between spirituality (meaning/peace) and QOL. Depression exerted the largest negative effect on spirituality (meaning/peace), interpersonal coping, and QOL. Anxiety had negative effects on spirituality (meaning/peace) and QOL, but a positive effect on interpersonal coping.
Significance of results
Interpersonal coping strategies work as a partial mediator of the relationship between meaning/peace subscales of spirituality and QOL. Effective management of depression may help in achieving better outcomes associated therewith. Greater attention and efforts to improve social connectedness and meaning of life in spiritual well-being may improve the QOL of cancer patients.
To verify the validity of a semiautomated surgical site infection (SSI) surveillance system using electronic screening algorithms in 38 categories of surgery.
A cohort study for the validation of a semiautomated SSI surveillance system using screening algorithms.
A 1,989-bed tertiary-care referral center in Seoul, Republic of Korea.
A dataset of 40,516 surgical procedures in 38 categories stored in the conventional SSI surveillance registry at the Samsung Medical Center between January 2013 and December 2014 was used as the reference standard. In the semiautomated surveillance system, electronic screening algorithms flagged cases meeting at least 1 of 3 criteria: antibiotic prescription, microbial culture, and infectious disease consultation. Flagged cases were audited by infection preventionists. Analyses of sensitivity, specificity, and positive predictive value (PPV) were conducted for the semiautomated surveillance system, and its effect on reducing the workload for chart review was evaluated.
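The screening step described above reduces to a simple disjunction: a case is flagged for manual review if it meets at least 1 of the 3 criteria. A sketch with hypothetical field names (the actual data model is not given in the abstract):

```python
# Sketch of the electronic screening step: flag a surgical case for chart
# review by an infection preventionist if it meets at least 1 of 3 criteria.
# Field names are hypothetical, for illustration only.

def flag_for_review(case: dict) -> bool:
    return (case["antibiotic_prescribed"]
            or case["microbial_culture_ordered"]
            or case["id_consultation"])

cases = [
    {"antibiotic_prescribed": True,  "microbial_culture_ordered": False, "id_consultation": False},
    {"antibiotic_prescribed": False, "microbial_culture_ordered": False, "id_consultation": False},
]
flagged = [c for c in cases if flag_for_review(c)]
print(len(flagged))  # 1
```

A rule this permissive deliberately trades precision for sensitivity: nearly all true SSIs are flagged (few false negatives), and the human chart review then filters the many false positives, which is consistent with the high sensitivity and low PPV reported in the results.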
A total of 575 SSI events (1·42%) were identified by conventional SSI surveillance. The sensitivity of the semiautomated SSI surveillance was 96·7%, and the PPV of the screening algorithms alone was 4·1%. Semiautomated SSI surveillance reduced the chart review workload of the infection preventionists from 1,283 to 482 person hours per year (a 62·4% decrease).
Compared to conventional surveillance, semiautomated surveillance using electronic screening algorithms followed by chart review of selected cases can provide high-validity surveillance results and can significantly reduce the workload of infection preventionists.
Rising costs and the rapidly increasing volume of findings from research in health care are driving the demand for comprehensive information to inform the allocation of resources. Health technology assessment (HTA) applies rigorous processes to provide high-quality synthesized information to policymakers and healthcare payers. HTA involves combining large amounts of research publications to systematically evaluate the properties, effects, and impacts on a topic of interest.
The time and resources required to complete a full HTA are often demanding. There is an opportunity to apply high-performance computing (inclusive of artificial intelligence and machine learning disciplines) to HTA. This project applied high-computing technology to create a research synthesis tool to support HTA and then developed a service that integrates as much relevant data as possible to strengthen HTA. This was a joint project that combined expertise from the areas of health technology, machine learning, information technology, and innovation.
The information gathered for this phased project from HTA subject matter experts and other stakeholders was collated to inform a research synthesis tool and a broader concept of the project.
The results of this study will inform the design of a research synthesis tool that covers the entire HTA process (literature search, screening titles and abstracts, data extraction, quality assessment, and analysis). The collaborators included Alberta Innovates, the Alberta Machine Intelligence Institute, the University of Alberta, Cybera, and PolicyWise. Alberta Innovates, which is an accelerator and innovator of research in the province of Alberta, Canada, was the primary source of funding for this project.
It has not been well established whether dietary folate intake reduces the risk of diabetes development. We aimed to clarify the prospective association between dietary folate intake and type 2 diabetes (T2D) risk among 7333 Korean adults aged 40 years or older who were included in the Multi-Rural Communities Cohort. Dietary folate intake was estimated from all 106 food items listed on a FFQ, not including folate intake from supplements. Two different measurements of dietary folate intake were used: the baseline consumption and the average consumption from baseline until just before the end of follow-up. The association between folate intake and T2D risk was determined through a modified Poisson regression model with a robust error estimator, controlling for potential confounders. Over 29 745 person-years, 319 cases of diabetes were ascertained. In multivariable analyses, dietary folate intake was inversely associated with risk of T2D for women but not for men. For women, the incidence rate ratio of diabetes in the third tertile compared with the first tertile was 0·57 (95 % CI 0·38–0·87, P for trend = 0·0085) in the baseline consumption model and 0·64 (95 % CI 0·43–0·95, P for trend = 0·0244) in the average consumption model. These inverse associations were found in both the normal fasting blood glucose group and the impaired fasting glucose group among women. Among non-users of multinutrient and vitamin supplements, the significant inverse association remained. Thus, higher dietary intake of folate is prospectively associated with lower risk of diabetes in women.
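The incidence quantities reported above can be sketched directly: a crude incidence rate from cases and person-years, and the incidence rate ratio (IRR) comparing an exposure tertile with the reference tertile. The overall figures (319 cases over 29 745 person-years) come from the abstract; the per-tertile counts below are illustrative only, since the abstract reports only the adjusted IRRs:

```python
# Sketch: crude incidence rate per 1,000 person-years, and the unadjusted
# incidence rate ratio (IRR) between two groups. Overall figures are from
# the abstract; per-tertile counts are illustrative assumptions.

def rate_per_1000py(cases: int, person_years: float) -> float:
    return 1000.0 * cases / person_years

def irr(cases_exposed, py_exposed, cases_ref, py_ref) -> float:
    return (cases_exposed / py_exposed) / (cases_ref / py_ref)

print(round(rate_per_1000py(319, 29745), 1))  # 10.7 cases per 1,000 PY overall
print(round(irr(40, 10000, 70, 10000), 2))    # 0.57 with illustrative counts
```

The study's reported IRRs are not this crude ratio but come from a modified Poisson regression with a robust error estimator, which adjusts the comparison for confounders.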
We present the results of linear polarisation observations of methanol masers at 44 and 95 GHz towards 39 massive star-forming regions (Kang et al. 2016). These two lines were observed simultaneously with the 21-m Korean VLBI Network (KVN) telescope in single-dish mode. About 60% of the observed sources showed fractional polarisation of a few per cent in at least one of the two transition lines. We note that this study provides the first detection of linear polarisation of the 44 GHz methanol maser, including both single-dish and interferometer observations. We find that the polarisation properties of these two lines are similar, as expected, since they trace similar regions. As a follow-up study, we have carried out VLBI polarisation observations toward some 44 GHz maser targets using the KVN telescope. We present preliminary VLBI polarisation results for G10.34-0.14, which show consistent polarisation properties across multiple epochs of observation.
The HoDoo English game was developed to take advantage of the benefits attributed to on-line games while teaching English to native Korean speakers. We expected that the improvements in the subjects’ English language abilities after playing the HoDoo English game would be associated with increased brain functional connectivity in the areas of the brain involved in the language production (Broca’s area) and comprehension (Wernicke’s area) networks. Twelve children, aged nine and ten, were asked to play the on-line English education game for 50 minutes per day, five days per week, for twelve weeks. At baseline, and again at the end of twelve weeks of game play, each child’s English language ability was assessed and a functional magnetic resonance imaging (fMRI) scan was conducted. The on-line English education game play effectively improved English language skills, especially in terms of non-verbal pragmatic skills. Following twelve weeks of on-line English education game play, the children showed positive connectivity between Broca’s area and the left frontal cortex, as well as between Wernicke’s area and the left parahippocampal gyrus and the right medial frontal gyrus. Changes in pragmatic scores were positively correlated with average peak brain activity in the left parahippocampal gyrus. To the best of our knowledge, this is the first study to report an improvement in English ability and changes in brain activity within language areas after on-line language education game play.
During the past decade, carbapenemase-producing Enterobacteriaceae (CPE) have emerged and spread across the world.1 The major carbapenemase enzymes currently being reported are KPC, NDM-1, VIM, IMP, and OXA.2 Because carbapenemase genes can be efficiently transmitted via mobile genetic elements, and current therapeutic options for CPE infections are extremely limited, CPE may be one of the most serious contemporary threats to public health. However, very little is known about the characteristics of CPE carriage during hospitalization. The aims of this study were to investigate the clearance rate of CPE carriage and determine the number of consecutive negative cultures required to confirm CPE clearance. We also examined CPE transmission among hospitalized patients.
Infect. Control Hosp. Epidemiol. 2015;36(11):1361–1362