We conducted a replication of Shafir (1993), who showed that people's preferences are inconsistent between choosing and rejecting decision-making scenarios. The effect was demonstrated using an enrichment paradigm that asks subjects to choose between enriched and impoverished alternatives, where the enriched alternative has both more positive and more negative features than the impoverished one. Using eight different decision scenarios, Shafir found support for a compatibility principle: subjects both chose and rejected enriched alternatives in the choose and reject scenarios, respectively (d = 0.32 [0.23, 0.40]), and indicated greater preference for the enriched alternative in the choice task than in the rejection task (d = 0.38 [0.29, 0.46]). In a preregistered, very close replication of the original study (N = 1026), we found no consistent support for the hypotheses across the eight problems: two showed similar effects, two showed opposite effects, and four showed no effect (overall d = −0.01 [−0.06, 0.03]). Seeking alternative explanations, we tested an extension and found support for the accentuation hypothesis.
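The effect sizes above are standardized mean differences (Cohen's d) with 95% confidence intervals. As an illustration only (the data below are synthetic, not Shafir's or the replication's), a minimal sketch of computing d with a percentile-bootstrap CI:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical preference scores under choose vs. reject framings
choose = rng.normal(0.6, 1.0, 500)
reject = rng.normal(0.3, 1.0, 500)

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

def d_ci(a, b, level=0.95, n_boot=2000, seed=1):
    """Percentile bootstrap confidence interval for Cohen's d."""
    rng = np.random.default_rng(seed)
    ds = [cohens_d(rng.choice(a, len(a)), rng.choice(b, len(b)))
          for _ in range(n_boot)]
    lo, hi = np.percentile(ds, [(1 - level) / 2 * 100, (1 + level) / 2 * 100])
    return lo, hi

d = cohens_d(choose, reject)
lo, hi = d_ci(choose, reject)
print(f"d = {d:.2f} [{lo:.2f}, {hi:.2f}]")
```

The original study may have used analytic rather than bootstrap intervals; the bootstrap is shown here only because it is compact and assumption-light.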
We investigated the effects of transcranial alternating current stimulation (tACS) in patients with insomnia. Nine patients with chronic insomnia underwent two in-laboratory polysomnography studies, 2 weeks apart, and were randomized to receive tACS during either the first or the second study. Stimulation was applied simultaneously and bilaterally at the F3/M1 and F4/M2 electrodes (0.75 mA, 0.75 Hz, 5 minutes). Sleep onset latency and wake after sleep onset decreased on the stimulation night, but the changes did not reach statistical significance; however, there were significant improvements in spontaneous and total arousals, sleep quality, quality of life, recall memory, sleep duration, sleep efficiency, and daytime sleepiness.
This study investigates how International Relations (IR) as an academic discipline emerged and evolved in South Korea, focusing on the country's peculiar colonial and postcolonial experiences. In the process, it examines why South Korean IR has been so state-centric and positivist (American-centric), while also disclosing the ways in which international history has shaped the current state of IR in South Korea, institutionally and intellectually. It is argued that IR intellectuals in South Korea have largely reflected the political arrangements of their time, rather than demonstrating academic independence or leadership vis-à-vis the government and/or civil society, as they have navigated difficult power structures in world politics. Relatedly, the study reveals South Korean IR's twisted postcoloniality: the absence – or weakness – of non-Western Japanese colonial legacies in its knowledge production/system, alongside its embrace of the West/America as an ideal and better model of modernity for South Korea's security and development. It also reveals that South Korean IR's recent quest to build a Korean School of IR to overcome its Western dependency appears to operate within a colonial mentality towards mainstream American IR.
Two aphid-transmitted RNA viruses, broad bean wilt virus 2 (BBWV2) and cucumber mosaic virus (CMV), are the most prevalent viruses in Korean pepper fields and cause chronic damage in pepper production. In this study, we employed a screening system for pathotype-specific resistance of pepper germplasm to BBWV2 and CMV by utilizing infectious cDNA clones of different pathotypes of the viruses (two BBWV2 strains and three CMV strains). We first examined pathogenic characteristics of the BBWV2 and CMV strains in various plant species and their phylogenetic positions in the virus population structures. We then screened 34 commercial pepper cultivars and seven accessions for resistance. While 21 pepper cultivars were resistant to CMV Fny strain, only two cultivars were resistant to CMV P1 strain. We also found only one cultivar partially resistant to BBWV2 RP1 strain. However, all tested commercial pepper cultivars were susceptible to the resistance-breaking CMV strain GTN (CMV-GTN) and BBWV2 severe strain PAP1 (BBWV2-PAP1), suggesting that breeding new cultivars resistant to these virus strains is necessary. Fortunately, we identified several pepper accessions that were resistant or partially resistant to CMV-GTN and one symptomless accession despite systemic infection with BBWV2-PAP1. These genetic resources will be useful in pepper breeding programs to deploy resistance to BBWV2 and CMV.
This study was performed to improve the production efficiency of transgenic cloned pigs used for xenotransplantation at the level of both recipient pig and donor nuclei. To generate transgenic pigs, human endothelial protein C receptor (hEPCR) and human thrombomodulin (hTM) genes were introduced using the F2A expression vector into GalT–/–/hCD55+ porcine neonatal ear fibroblasts used as donor cells, and cloned embryos were transferred to sows and gilts. Cloned fetal kidney cells were also used as donor cells for recloning to increase production efficiency. Pregnancy and parturition rates after embryo transfer and preimplantation developmental competence were compared between cloned embryos derived from adult and fetal cells. Significantly higher parturition rates were observed for sows (50.0 vs. 4.1%), natural oestrus (20.8 vs. 0%), and ovulated ovaries (16.7 vs. 5.6%) compared with gilts, induced oestrus, and non-ovulated ovaries, respectively (P < 0.05). When gilts were used as recipients, parturitions occurred only in the fetal cell group, and blastocyst rates were significantly higher in the fetal cell group (21.3% vs. 15.1%) (P < 0.05). Additionally, gene expression levels related to pluripotency were significantly higher in the fetal cell group (P < 0.05). In conclusion, sows can be recommended as recipients because of their higher efficiency in generating transgenic cloned pigs, and cloned fetal cells can also be recommended as donor cells given their correct nuclear reprogramming.
No consensus exists on which cost form (one accounting solely for explicit costs, the other for both explicit and opportunity costs, i.e., relative opportunity cost) should be used when calculating return on investment (ROI) for conservation-related decisions. This research examines how the cost of conservation investment, with and without inclusion of the opportunity cost of the protected area, leads to different solutions in a multi-objective optimization framework at the county level in the Central and Southern Appalachian Region of the USA. We maximize rates of ROI for both forest-dependent biodiversity and the economic impact generated by forest-based payments for ecosystem services. We find that the conservation budget is optimally distributed more narrowly, among counties that are more likely to be rural, when the investment cost measure is relative opportunity cost than when it is explicit cost. We also find that the sacrifice in forest-dependent biodiversity per unit increase in economic impact is higher when investment cost is measured by relative opportunity cost than when measured by explicit cost. By understanding the consequences of using one cost measure over the other, a conservation agency can decide which cost measure is more appropriate for informing its decision-making process.
To evaluate the impact of a vancomycin-resistant Enterococcus (VRE) screening policy change on the incidence of healthcare-associated (HA)-VRE bacteremia in an endemic hospital setting.
A quasi-experimental before-and-after study.
A 1,989-bed tertiary-care referral center in Seoul, Republic of Korea.
Since May 2010, our hospital has scaled back VRE screening for admitted patients transferred from other healthcare facilities. We assessed the impact of this policy change on the incidence of HA-VRE bacteremia using segmented autoregression analysis of an interrupted time series from January 2006 to December 2014, at the hospital and unit levels. In addition, we compared the molecular characteristics of VRE blood isolates collected before and after the screening policy change using multilocus sequence typing and pulsed-field gel electrophoresis.
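As a sketch of the segmented-regression idea behind this analysis (simplified here to ordinary least squares on synthetic monthly data; the study itself used segmented autoregression, which additionally models serial correlation in the residuals), the level-change and slope-change terms look like this:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic monthly incidence series; the "intervention" (policy change)
# occurs at month 54 of a 108-month series (all values hypothetical).
rng = np.random.default_rng(42)
n, change = 108, 54
t = np.arange(n)
post = (t >= change).astype(int)                  # 1 after the intervention
time_after = np.where(post == 1, t - change, 0)   # months since intervention
y = 1.0 + 0.005 * t + 0.2 * post + 0.01 * time_after + rng.normal(0, 0.1, n)

df = pd.DataFrame({"y": y, "t": t, "post": post, "time_after": time_after})

# Segmented regression: baseline level (intercept) and trend (t), plus a
# level change (post) and a slope change (time_after) at the intervention.
fit = smf.ols("y ~ t + post + time_after", data=df).fit()
print(fit.params)
```

In practice the autocorrelation of monthly counts would be handled, for example, with ARIMA errors or Newey-West standard errors; this OLS version only illustrates how the change-in-level and change-in-slope coefficients are parameterized.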
After the VRE screening policy change, the incidence of hospital-wide HA-VRE bacteremia increased, although no significant changes of level or slope were observed. In addition, a significant slope change in the incidence of HA-VRE bacteremia (change in slope, 0.007; 95% CI, 0.001–0.013; P = .02) was observed in the hemato-oncology department. Molecular analysis revealed that various VRE sequence types appeared after the policy change and that clonally related strains became more predominant (increasing from 26.1% to 59.3%).
The incidence of HA-VRE bacteremia increased significantly after the VRE screening policy change, and this increase was mainly driven by high-risk patient populations. When planning VRE control programs in hospitals, different approaches that consider patients' risk of severe VRE infection may be required.
The longitudinal relationship between depression and the risk of non-alcoholic fatty liver disease is uncertain. We examined: (a) the association between depressive symptoms and incident hepatic steatosis (HS), both with and without liver fibrosis; and (b) the influence of obesity on this association.
A cohort of 142 005 Korean adults with neither HS nor excessive alcohol consumption at baseline were followed for up to 8.9 years. The validated Center for Epidemiologic Studies-Depression (CES-D) score was assessed at baseline, and subjects were categorised as non-depressed (CES-D < 8, reference) or depressed (CES-D ⩾ 16). HS was diagnosed by ultrasonography. Liver fibrosis was assessed by the fibrosis-4 index (FIB-4). Parametric proportional hazards models were used to estimate the adjusted hazard ratios (aHRs) and 95% confidence intervals (CIs).
During a median follow-up of 4.0 years, 27 810 people with incident HS and 134 with incident HS plus high FIB-4 were identified. Compared with the non-depressed category, the aHR (95% CIs) for incident HS was 1.24 (1.15–1.34) for CES-D ⩾ 16 among obese individuals, and 1.00 (0.95–1.05) for CES-D ⩾ 16 among non-obese individuals (p for interaction with obesity <0.001). The aHR (95% CIs) for developing HS plus high FIB-4 was 3.41 (1.33–8.74) for CES-D ⩾ 16 among obese individuals, and 1.22 (0.60–2.47) for CES-D ⩾ 16 among non-obese individuals (p for interaction = 0.201).
Depression was associated with an increased risk of incident HS and HS plus high probability of advanced fibrosis, especially among obese individuals.
Background: We describe and evaluate an outbreak of carbapenem-resistant Klebsiella pneumoniae transmitted by contaminated duodenoscopes during endoscopic retrograde cholangiopancreatography (ERCP) procedures. Methods: An outbreak investigation was performed when K. pneumoniae carbapenemase-producing K. pneumoniae (KPC-KP) was identified in bile specimens from 4 patients. The investigation included medical record review, practice audits, and surveillance cultures of duodenoscopes and environmental sites. When available, clinical specimens were obtained from patients who had undergone ERCP in the previous 3 months. Carbapenem-resistant Enterobacteriaceae (CRE) screening cultures were performed to identify additional patients until no CRE cases were detected for 2 consecutive weeks. Pulsed-field gel electrophoresis (PFGE) of KPC-KP isolates was performed. Results: In total, 12 cases with exposure to a duodenoscope were identified from February 2019 through April 2019, including 6 infections and 6 asymptomatic carriers. Case-control analysis indicated that 2 specific duodenoscopes were associated with the KPC-KP outbreak. Duodenoscope reprocessing procedures did not deviate from manufacturer recommendations. The outbreak was terminated after ethylene oxide (EO) gas sterilization. Conclusions: Meticulous cleaning protocols and enhanced surveillance are necessary to prevent outbreaks of CRE. Notably, enhanced cleaning measures, such as sterilization of duodenoscopes, may be required after procedures involving KPC-KP carriers.
Early reinsertion of a new central venous catheter (CVC) may pose a risk of persistent or recurrent infection in patients with a catheter-related bloodstream infection (CRBSI). We evaluated the clinical impact of early CVC reinsertion after catheter removal in patients with CRBSIs.
We conducted a retrospective chart review of adult patients with confirmed CRBSIs in 2 tertiary-care hospitals over a 7-year period.
To treat their infections, 316 patients with CRBSIs underwent CVC removal. Among them, 130 (41.1%) underwent early CVC reinsertion (≤3 days after CVC removal), 39 (12.4%) underwent delayed reinsertion (>3 days), and 147 (46.5%) did not undergo CVC reinsertion. There were no differences in baseline characteristics among the 3 groups, except for nontunneled CVC, presence of septic shock, and reason for CVC reinsertion. The rate of persistent CRBSI in the early CVC reinsertion group (22.3%) was higher than that in the no CVC reinsertion group (7.5%; P = .002) but was similar to that in the delayed CVC reinsertion group (17.9%; P > .99). The other clinical outcomes did not differ among the 3 groups, including rates of 30-day mortality, complicated infection, and recurrence. After controlling for several confounding factors, early CVC reinsertion was not significantly associated with persistent CRBSI (OR, 1.59; P = .35) or 30-day mortality compared with delayed CVC reinsertion (OR, 0.81; P = .68).
Early CVC reinsertion in the setting of CRBSI may be safe. Reinsertion of a new CVC should not be delayed in patients who still require one for ongoing management.
We report our experience with an emergency room (ER) shutdown related to an accidental exposure to a patient with coronavirus disease 2019 (COVID-19) who had not been isolated.
A 635-bed, tertiary-care hospital in Daegu, South Korea.
To prevent nosocomial transmission of the disease, we subsequently isolated patients with suspected symptoms, relevant radiographic findings, or epidemiologic risk factors. Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) reverse-transcriptase polymerase chain reaction (RT-PCR) assays were performed for most patients requiring hospitalization. A universal mask policy and comprehensive use of personal protective equipment (PPE) were implemented. We analyzed the effects of these interventions.
From the pre-shutdown period (February 10–25, 2020) to the post-shutdown period (February 28 to March 16, 2020), the mean turnaround time for test results decreased from 23:31 ± 6:43 hours to 9:27 ± 3:41 hours (P < .001). As a result, the proportion of patients tested increased from 5.8% (N = 1,037) to 64.6% (N = 690) (P < .001), and the average number of tests per day increased from 3.8 ± 4.3 to 24.7 ± 5.0 (P < .001). All 23 patients with COVID-19 in the post-shutdown period were isolated in the ER without any problematic accidental exposure or nosocomial transmission. Several metrics increased after the shutdown: the median ER stay among hospitalized patients increased from 4:30 hours (interquartile range [IQR], 2:17–9:48) to 14:33 hours (IQR, 6:55–24:50) (P < .001), intensive care unit admission rates increased from 1.4% to 2.9% (P = .023), and mortality increased from 0.9% to 3.0% (P = .001).
Problematic accidental exposure and nosocomial transmission of COVID-19 can be successfully prevented through active isolation and surveillance policies and comprehensive PPE use despite longer ER stays and the presence of more severely ill patients during a severe COVID-19 outbreak.
To evaluate the bidirectional relationship between blood pressure (BP) and depressive symptoms using a large prospective cohort study.
A prospective cohort study was performed in 276 244 adults who participated in a regular health check-up and were followed annually or biennially for up to 5.9 years. BP levels were categorised according to the 2017 American College of Cardiology and American Heart Association hypertension guidelines. Depressive symptoms were assessed using the Centre for Epidemiologic Studies-Depression (CESD) questionnaire, and a cut-off score of ≥25 was regarded as case-level depressive symptoms.
During 672 603.3 person-years of follow-up, 5222 participants developed case-level depressive symptoms. The multivariable-adjusted hazard ratios (HRs) [95% confidence interval (CI)] for incident case-level depressive symptoms comparing hypotension, elevated BP, hypertension stage 1 and hypertension stage 2 to normal BP were 1.07 (0.99–1.16), 0.93 (0.82–1.05), 0.89 (0.81–0.97) and 0.81 (0.62–1.06), respectively (p for trend <0.001). During 583 615.3 person-years of follow-up, 27 787 participants developed hypertension. The multivariable-adjusted HRs (95% CI) for incident hypertension comparing CESD 16–24 and ⩾25 to CESD < 16 were 1.05 (1.01–1.11) and 1.12 (1.03–1.20), respectively (p for trend <0.001) and in the time-dependent models, corresponding HRs (95% CI) were 1.12 (1.02–1.24) and 1.29 (1.10–1.50), respectively (p for trend <0.001).
In this large cohort study of young and middle-aged individuals, higher BP levels were independently associated with a decreased risk for developing case-level depressive symptoms and depressive symptoms were also associated with incident hypertension. Further studies are required to elucidate the mechanisms underlying the bidirectional association between BP levels and incident depression.
Here, we present an overview of how a tertiary hospital responded to maintain necessary activities and protect patients and staff from the coronavirus disease (COVID-19) outbreak.
Gil Medical Center, a tertiary hospital in Incheon, has operated a special response team since January 21, 2020. All visitors were assessed for body temperature and respiratory symptoms, and screened for recent overseas travel. Suspected COVID-19 patients were taken to a screening clinic. All febrile patients with or without respiratory symptoms were taken to a respiratory safety clinic. An isolation ward, which consisted of 10 negative-pressure rooms, was used to treat confirmed cases. More than 120 beds were prepared for the outbreak, and patients with pneumonia were preemptively isolated.
By May 5, 480 960 visitors had been assessed at the control station, 3350 patients had visited the triage center, and 1794 had been treated in the respiratory safety clinic. Seventeen confirmed cases were admitted to the negative-pressure isolation ward, and 350 patients with pneumonia were preemptively isolated. A total of 2977 severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) polymerase chain reaction tests were performed.
While tertiary hospitals play an important role in treating both COVID-19 patients and non-COVID-19 patients, hospital staff have to protect themselves from unexpected in-hospital transmission. A multifaceted response must be undertaken to protect tertiary hospitals and their staff during the COVID-19 epidemic.
To propose a new anthropometric index that can better predict percent body fat (PBF) among young adults, and to compare it with current anthropometric indices.
All measurements were taken in a controlled laboratory setting in Seoul (South Korea), between 1 December 2015 and 30 June 2016.
Data from eighty-seven young adults (18–35 years) who underwent dual-energy X-ray absorptiometry (DXA) were analysed. Multiple regression analyses were conducted to develop a body fat index (BFI) using simple demographic and anthropometric information. Correlations of DXA-measured PBF (DXA_PBF) with previously developed anthropometric indices and the BFI were analysed. Receiver operating characteristic curve analyses were conducted to compare the ability of the anthropometric indices to identify obese individuals.
BFI showed a strong correlation with DXA_PBF (r = 0·84), which was higher than the correlations of DXA_PBF with the traditional (waist circumference, r = 0·49; waist to height ratio, r = 0·68; BMI, r = 0·36) and alternate anthropometric indices (a body shape index, r = 0·47; body roundness index, r = 0·68; body adiposity index, r = 0·70). Moreover, the BFI showed higher accuracy at identifying obese individuals (area under the curve (AUC) = 0·91), compared with the other anthropometric indices (AUC = 0·71–0·86).
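The AUC comparison above can be sketched with a minimal Mann-Whitney implementation of ROC AUC; the data and index names below are hypothetical stand-ins, not the study's measurements:

```python
import numpy as np

def auc(scores, labels):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen obese subject (label 1) scores higher than a randomly
    chosen non-obese subject (label 0), counting ties as half."""
    scores, labels = np.asarray(scores), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    wins = (pos[:, None] > neg[None, :]).sum() \
        + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Hypothetical indices for 87 subjects: a better index separates obese (1)
# from non-obese (0) subjects more cleanly, yielding a higher AUC.
rng = np.random.default_rng(3)
labels = rng.binomial(1, 0.4, 87)
weak_index = rng.normal(0, 1, 87) + 0.8 * labels     # weaker separation
strong_index = rng.normal(0, 1, 87) + 2.0 * labels   # stronger separation

print(f"weaker index AUC:   {auc(weak_index, labels):.2f}")
print(f"stronger index AUC: {auc(strong_index, labels):.2f}")
```

This pairwise-comparison formulation is equivalent to integrating the ROC curve, which is why it is convenient for comparing competing indices such as BMI and the BFI on the same subjects.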
The BFI can accurately predict DXA_PBF in young adults, using simple demographic and anthropometric information that are commonly available in research and clinical settings. However, larger representative studies are required to build on our findings.
In the Republic of Korea, despite the introduction of one-dose universal varicella vaccination in 2005 and a high coverage rate of 98.9% in 2012, the incidence rate has increased sevenfold. This study aimed to investigate time trends in the varicella incidence rate, assessing age, period and birth cohort effects. We used national data on the annual number of reported cases from 2006 to 2017. A log-linear Poisson regression model was used to estimate age–period–cohort effects on the varicella incidence rate. From 2006 to 2017, the incidence of varicella increased from 22.5 to more than 154.8 cases per 100 000. Peak incidence shifted from 4 to 6 years of age. The estimated period and cohort effects showed significant upward patterns, with a linear increasing trend by net drift. The incidence among the Korean population has thus increased with respect to both period and cohort despite universal varicella vaccination. Our data suggest the need for additional studies to address the current gap in herd immunity.
For decades, fructose intake has been recognised as an environmental risk for metabolic syndromes and diseases. Here we comprehensively examined the effects of fructose intake on the mouse liver transcriptome. Fructose-supplemented water (34%; w/v) was fed ad libitum to both male and female C57BL/6N mice for 6 weeks, followed by hepatic transcriptomics analysis. Based on our criteria, differentially expressed genes (DEG) were selected and subjected to further computational analyses to predict key pathways and upstream regulator(s). Subsequently, predicted genes and pathways from the transcriptomics dataset were validated via quantitative RT-PCR analyses. As a result, we identified eighty-nine down-regulated and eighty-eight up-regulated mRNAs in the livers of fructose-fed mice. Bioinformatics analysis showed that the DEG were mainly enriched in xenobiotic metabolic processes; further, the Ingenuity Pathway Analysis software suggested that the aryl hydrocarbon receptor (AhR) is an upstream regulator governing the overall changes and that fructose suppresses the AhR signalling pathway. Our quantitative RT-PCR validation confirmed that fructose suppressed AhR signalling through modulating expression of the transcription factor AhR nuclear translocator (Arnt) and the upstream regulators Ncor2 and Rb1. Altogether, we demonstrated that ad libitum fructose intake suppresses the canonical AhR signalling pathway in C57BL/6N mouse liver. Based on these observations, further studies are warranted, especially on the effects of co-exposure to fructose with (1) other types of carcinogens and (2) inflammation-inducing agents (or diets such as a high-fat diet), to clarify the implications of fructose-induced AhR suppression.
The complete chloroplast (cp) genome sequences of three Amaranthus species (Amaranthus hypochondriacus, A. cruentus and A. caudatus) were determined by next-generation sequencing. The cp genome sequences of A. hypochondriacus, A. cruentus and A. caudatus were 150,523, 150,757 and 150,523 bp in length, respectively, each containing 84 genes with identical contents and orders. Expansion or contraction of the inverted repeat region was not observed among the three Amaranthus species. The coding regions were highly conserved, with 99.3% homology in nucleotide and amino acid sequences. Five genes – matK, accD, ndhJ, ccsA and ndhF – showed relatively high non-synonymous/synonymous values (Ka/Ks > 0.1). Sequence comparison identified two insertions/deletions (InDels) greater than 40 bp in length, and polymerase chain reaction markers amplifying these InDel regions were applied to diverse Korean Genbank accessions and could discriminate the three Amaranthus species. Phylogenetic analyses based on 62 protein-coding genes showed that the core Caryophyllales were monophyletic and that Amaranthoideae formed a sister group with the Betoideae and Chenopodioideae clade. Comparison of homologous loci among the three Amaranthus species identified eight regions with high Pi values (>0.03). Seven of these loci, all except rps19-trnH (GUG), were considered useful molecular markers for further phylogenetic studies.
Given its diverse disease courses and symptom presentations, multiple phenotype dimensions with different biological underpinnings are expected with bipolar disorders (BPs). In this study, we aimed to identify lifetime BP psychopathology dimensions. We also explored the differing associations with bipolar I (BP-I) and bipolar II (BP-II) disorders.
We included a total of 307 subjects with BPs in the analysis. For the factor analysis, we chose 6 variables related to clinical course, 29 indicators covering lifetime symptoms of mood episodes, and 6 specific comorbid conditions. To determine the relationships among the identified phenotypic dimensions and their effects on differentiating BP subtypes, we applied structural equation modeling.
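As a sketch of the scree criterion used in the factor selection, eigenvalues of the correlation matrix can be inspected for an "elbow"; the data below are synthetic responses for 307 subjects on 41 indicators (matching the 6 + 29 + 6 variables above), with an arbitrary, hypothetical factor structure:

```python
import numpy as np

# Simulate 307 subjects x 41 indicators generated from 6 latent factors
# plus noise (purely illustrative; not the study's data or loadings).
rng = np.random.default_rng(11)
n_subjects, n_items, n_factors = 307, 41, 6
loadings = rng.normal(0, 1, (n_items, n_factors))
factor_scores = rng.normal(0, 1, (n_subjects, n_factors))
data = factor_scores @ loadings.T + rng.normal(0, 2.0, (n_subjects, n_items))

# Scree inspection: sorted eigenvalues of the item correlation matrix;
# retained factors correspond to eigenvalues before the curve flattens.
corr = np.corrcoef(data, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
print("top 8 eigenvalues:", eigvals[:8].round(2))
```

The study combined this scree inspection with Velicer's minimum average partial test and face validity, so the eigenvalue curve alone would not have determined the six-factor solution.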
We selected a six-factor solution through scree plot, Velicer's minimum average partial test, and face validity evaluations; the six factors were cyclicity, depression, atypical vegetative symptoms, elation, psychotic/irritable mania, and comorbidity. In the path analysis, five factors excluding atypical vegetative symptoms were associated with one another. Cyclicity, depression, and comorbidity had positive associations, and they correlated negatively with psychotic/irritable mania; elation showed positive correlations with cyclicity and psychotic/irritable mania. Depression, cyclicity, and comorbidity were stronger in BP-II than in BP-I, and they contributed significantly to the distinction between the two disorders.
We identified six phenotype dimensions; in addition to symptom features of manic and depressive episodes, various comorbidities and high cyclicity constructed separate dimensions. Except for atypical vegetative symptoms, all factors showed a complex interdependency and played roles in discriminating BP-II from BP-I.