This study aimed to explore perceptions of the meaning of life among Korean patients living with advanced cancer.
The study employed a mixed-methods design, and 16 participants were included in the analysis. Qualitative data gathered from in-depth interviews were analyzed using Colaizzi's phenomenological method. Quantitative survey data were analyzed using descriptive statistics, the Mann–Whitney U test, the Kruskal–Wallis test, and Spearman's ρ correlation.
Participants experienced both the existence of meaning and the will to find meaning in terms of four categories: “interpersonal relationships based on attachment and cohesion” (three themes — family as the core meaning of one's life, supportive and dependent interconnectedness with significant others, and existential responsibility embedded in familism), “therapeutic relationships based on trust” (one theme — communication and trust between the patient and medical staff), “optimism” (two themes — positivity embodied through past experiences and a positive attitude toward the current situation), and “a sense of purpose with advanced cancer” (two themes — the will to survive and expectations for the near future). The Meaning in Life Questionnaire (MLQ) and the Purpose in Life scale (PIL) showed a significant positive correlation tendency with the Functional Assessment of Chronic Illness Therapy–Spiritual Well-Being scale (FACIT-Sp). The Patient Health Questionnaire (PHQ-9) showed a significant negative correlation tendency with both the MLQ-Presence of Meaning (MLQ-PM) and PIL-Initiative (PIL-I) scores.
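The rank-based statistics used in the quantitative analysis can be sketched in plain Python. This is a minimal illustration of Spearman's ρ on hypothetical scores, not the study's data; `ranks` and `spearman_rho` are illustrative names.

```python
# Spearman's rho: a Pearson correlation computed on ranks.
# Data below are hypothetical, for illustration only.

def ranks(xs):
    """1-based average ranks, with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# A perfectly monotonic pairing yields rho = 1.0.
x = [3, 1, 4, 1, 5, 9, 2, 6]
y = [v * 2 + 1 for v in x]
print(spearman_rho(x, y))
```

In practice one would use a statistics library (e.g., `scipy.stats.spearmanr`, with `mannwhitneyu` and `kruskal` for the group comparisons) rather than hand-rolled code; the sketch only makes the rank-transformation step explicit.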
Significance of results
Finding meaning in life helps advanced cancer patients realize their will to live. It also acts as a coping mechanism that palliates negative experiences in the fight against the disease. In particular, among advanced cancer patients in Korean culture, the dynamics of relationships with family and medical staff were a key axis that instilled optimism and the will to live. These results suggest that attending to the meaning of life in advanced cancer patients, in a way that reflects Korean culture, can improve the quality of care.
Nosocomial transmission of COVID-19 among immunocompromised hosts can have a serious impact on COVID-19 severity, underlying disease progression and SARS-CoV-2 transmission to other patients and healthcare workers within hospitals. We experienced a nosocomial outbreak of COVID-19 in a daycare unit for paediatric and young adult cancer patients. Between 9 and 18 November 2020, 473 individuals (181 patients, 247 caregivers/siblings and 45 staff members) were exposed to the index case, a member of the nursing staff. Among them, three patients and four caregivers were infected. Two 5-year-old cancer patients with COVID-19 were not severely ill, but a 25-year-old cancer patient showed prolonged shedding of SARS-CoV-2 RNA for at least 12 weeks and probably infected his mother at home approximately 7–8 weeks after his initial diagnosis. Apart from this case, no secondary transmission was observed from the confirmed cases in either the hospital or the community. In conclusion, in the daycare setting for immunocompromised children and young adults, the rate of in-hospital transmission of SARS-CoV-2 was 1.6% under a stringent infection prevention and control policy, including universal masking and rapid, extensive contact investigation. Severely immunocompromised children and young adults with COVID-19 should be managed carefully after the mandatory isolation period, keeping in mind the possibility of prolonged shedding of live virus.
Large herbivores can disperse seeds over long distances through endozoochory. The Korean water deer (Hydropotes inermis argyropus), internationally classified as vulnerable but locally considered vermin, is a potential endozoochorous seed dispersal vector. In this study, feeding experiments were conducted to test the efficiency of seed dispersal through gut ingestion by the Korean water deer, its temporal pattern, and the effect of gut passage on seed recovery and germination rates. Eight plant species, including species that had previously germinated from its faeces, were used to feed three Korean water deer. Once the deer had consumed all the provided seeds, their faeces were collected after 24, 48, 72 and 96 h. The collected faeces were air-dried, and the number of seeds retrieved from the faeces was counted for each 24 h interval (0–24, 24–48, 48–72 and 72–96 h). Among the eight plant species, six were retrieved as intact seeds. Panicum bisulcatum had the highest recovery rate (33.7%), followed by Amaranthus mangostanus (24.5%) and Chenopodium album (14.4%). Most seeds were recovered within the 24–48 h interval. Germination tests were conducted on ingested and uningested seeds for the four species with a sufficient recovery rate. The effects of gut passage on seed germination differed according to plant species, and the germination rate substantially decreased after gut passage. The results suggest that the Korean water deer can disperse seeds, potentially over long distances, albeit at the cost of low seed recovery and germination rates.
Accumulating evidence suggests that alterations in inflammatory biomarkers are important in depression. However, previous meta-analyses disagree on these associations, and errors in data extraction may account for these discrepancies.
PubMed/MEDLINE, Embase, PsycINFO, and the Cochrane Library were searched from database inception to 14 January 2020. Meta-analyses of observational studies examining the association between depression and levels of tumor necrosis factor-α (TNF-α), interleukin 1-β (IL-1β), interleukin-6 (IL-6), and C-reactive protein (CRP) were eligible. Errors were classified as follows: incorrect sample sizes, incorrectly used standard deviation, incorrect participant inclusion, calculation error, or analysis with insufficient data. We determined their impact on the results after correction thereof.
Errors were noted in 14 of the 15 meta-analyses included. Across 521 primary studies, 118 (22.6%) showed the following errors: incorrect sample sizes (20 studies, 16.9%), incorrect use of standard deviation (35 studies, 29.7%), incorrect participant inclusion (7 studies, 5.9%), calculation errors (33 studies, 28.0%), and analysis with insufficient data (23 studies, 19.5%). After correcting these errors, 11 (29.7%) of 37 pooled effect sizes changed by a magnitude of more than 0.1, ranging from 0.11 to 1.15. The updated meta-analyses showed that elevated levels of TNF-α, IL-6, and CRP, but not IL-1β, are associated with depression.
These results show that data extraction errors in meta-analyses can affect pooled findings. Efforts to reduce such errors are important in studies of the association between depression and peripheral inflammatory biomarkers, for which high heterogeneity and conflicting results have been continuously reported.
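A concrete way to see how an extraction error propagates: a standardized mean difference divides the mean difference by the pooled standard deviation, so mistaking a standard error for a standard deviation inflates the effect size by roughly √n. The numbers below are illustrative, not drawn from the reviewed meta-analyses.

```python
import math

def cohens_d(m1, m2, sd1, sd2, n1, n2):
    """Standardized mean difference with a pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                       / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Hypothetical biomarker levels in depressed vs. control groups.
m_dep, m_ctrl, sd, n = 5.0, 4.0, 2.0, 50
correct = cohens_d(m_dep, m_ctrl, sd, sd, n, n)   # d = 0.5

se = sd / math.sqrt(n)         # standard error mistaken for the SD
inflated = cohens_d(m_dep, m_ctrl, se, se, n, n)  # ~7x larger here
print(correct, inflated)
```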
We examined what causes L1-L2 differences in sensitivity to prominence cues in discourse processing. Participants listened to recorded stories in segment-by-segment fashion at their own pace. Each story established a pair of contrasting items, and one item from the pair was rementioned and manipulated to carry either a contrastive or presentational pitch accent. By directly comparing the current self-paced listening data to previously obtained experimenter-paced listening data, we tested whether reducing online-processing demands allows L2 learners to show nativelike behavior, such that contrastive pitch accents facilitate later ruling out of the salient alternative. However, reduced time pressure failed to lead even higher proficiency L1-Korean learners of English to reach a nativelike level, suggesting that L2 learners’ nonnativelike processing and representation of the prominence cue in spoken discourse processing may stem from the inherent difficulty of fully learning a complex form-function mapping rather than from online-processing demands.
Background: After the Middle East respiratory syndrome coronavirus outbreak in Korea in 2015, the government established an additional reimbursement for infection prevention to encourage infection control activities in hospitals. The new policy was announced in December 2015 and implemented in September 2016. We evaluated how infection control activities in hospitals improved after this policy change. Methods: Three cross-sectional surveys using the WHO Hand Hygiene Self-Assessment Framework (HHSAF) were conducted in 2013, 2015, and 2017. Using a multivariable linear regression model including hospital characteristics, we analyzed changes in total HHSAF scores according to survey time. Results: In total, 32 hospitals participated in the survey in 2013, 52 in 2015, and 101 in 2017. The number of inpatient beds per infection control professional decreased from 324 in 2013 to 303 in 2015 and 179 in 2017. Most hospitals were at intermediate or advanced levels of progress (90.6% in 2013, 86.6% in 2015, and 94.1% in 2017). In the multivariable linear regression model, total HHSAF scores were significantly associated with hospital teaching status (β coefficient of major teaching hospital, 52.6; 95% CI, 8.9–96.4; P = .018), bed size (β coefficient of 100-bed increase, 5.1; 95% CI, 0.3–9.8; P = .038), and survey time (β coefficient of 2017 survey, 45.1; 95% CI, 19.3–70.9; P = .001). Conclusions: After the national policy was implemented, the number of infection control professionals increased, and the promotion of hand hygiene activities was strengthened in Korean hospitals.
Early reinsertion of a central venous catheter (CVC) may pose a risk of persistent or recurrent infection in patients with a catheter-related bloodstream infection (CRBSI). We evaluated the clinical impact of early CVC reinsertion after catheter removal in patients with CRBSIs.
We conducted a retrospective chart review of adult patients with confirmed CRBSIs in 2 tertiary-care hospitals over a 7-year period.
To treat their infections, 316 patients with CRBSIs underwent CVC removal. Among them, 130 (41.1%) underwent early CVC reinsertion (≤3 days after CVC removal), 39 (12.4%) underwent delayed reinsertion (>3 days), and 147 (46.5%) did not undergo CVC reinsertion. There were no differences in baseline characteristics among the 3 groups, except for nontunneled CVC, presence of septic shock, and reason for CVC reinsertion. The rate of persistent CRBSI in the early CVC reinsertion group (22.3%) was higher than that in the no CVC reinsertion group (7.5%; P = .002) but was similar to that in the delayed CVC reinsertion group (17.9%; P > .99). The other clinical outcomes did not differ among the 3 groups, including rates of 30-day mortality, complicated infection, and recurrence. After controlling for several confounding factors, early CVC reinsertion was not significantly associated with persistent CRBSI (OR, 1.59; P = .35) or 30-day mortality compared with delayed CVC reinsertion (OR, 0.81; P = .68).
Early CVC reinsertion in the setting of CRBSI may be safe. Insertion of a new CVC should not be delayed in patients who still require a CVC for ongoing management.
We report our experience with an emergency room (ER) shutdown related to an accidental exposure to a patient with coronavirus disease 2019 (COVID-19) who had not been isolated.
A 635-bed, tertiary-care hospital in Daegu, South Korea.
To prevent nosocomial transmission of the disease, we subsequently isolated patients with suspected symptoms, relevant radiographic findings, or epidemiology. Severe acute respiratory coronavirus 2 (SARS-CoV-2) reverse-transcriptase polymerase chain reaction assays (RT-PCR) were performed for most patients requiring hospitalization. A universal mask policy and comprehensive use of personal protective equipment (PPE) were implemented. We analyzed effects of these interventions.
From the pre-shutdown period (February 10–25, 2020) to the post-shutdown period (February 28 to March 16, 2020), the mean turnaround time for SARS-CoV-2 test results (hours:minutes) decreased from 23:31 ± 6:43 to 9:27 ± 3:41 (P < .001). As a result, the proportion of patients tested increased from 5.8% (N = 1,037) to 64.6% (N = 690) (P < .001), and the average number of tests per day increased from 3.8 ± 4.3 to 24.7 ± 5.0 (P < .001). All 23 patients with COVID-19 in the post-shutdown period were isolated in the ER without any problematic accidental exposure or nosocomial transmission. After the shutdown, several metrics increased. The median duration of stay in the ER among hospitalized patients increased from 4:30 hours (interquartile range [IQR], 2:17–9:48) to 14:33 hours (IQR, 6:55–24:50) (P < .001). Rates of intensive care unit admission increased from 1.4% to 2.9% (P = .023), and mortality increased from 0.9% to 3.0% (P = .001).
Problematic accidental exposure and nosocomial transmission of COVID-19 can be successfully prevented through active isolation and surveillance policies and comprehensive PPE use despite longer ER stays and the presence of more severely ill patients during a severe COVID-19 outbreak.
Understanding alternatives to prominent information contributes to successful native language discourse comprehension. Several past studies have suggested that the way second language (L2) learners encode and represent an alternative set in L2 speech is not exactly native-like. However, because these studies involved contrastive pitch accents in running speech, these native language–second language differences may reflect the demands of comprehending running speech in L2 rather than an intrinsic deficit in discourse processing per se. Here, we tested L2 learners’ discourse encoding and representation using a different cue to prominence: font emphasis in self-paced reading. We found that, in this temporally less demanding modality, L2 learners’ encoding of salient alternatives became native-like. Font emphasis facilitated L2 learners’ memory for the discourse by ruling out salient alternatives, just as it does for native speakers. L2 learners were also similar to native speakers in using the situation model to constrain an alternative set. The results suggest that L2 learners can show native-like processing of prominence and that previous underuse of contrastive accents in L2 comprehension could reflect the cognitive demands of processing running speech in L2.
This study examined whether L1-Mandarin learners of L2-English use verb bias and complementizer cues to process temporarily ambiguous English sentences the same way native speakers do. SVO word order places verbs early in sentences in both languages, allowing the use of verb-based knowledge to anticipate what could follow. The two languages differ, however, in whether an optional complementizer signals embedded clauses. In a self-paced reading experiment, native English speakers and L1-Mandarin learners of L2-English read sentences containing temporary ambiguity about whether a noun was the direct object of the verb preceding it or the subject of an embedded clause. Native speakers replicated previous work showing an optimally efficient interactive pattern of cue use, while non-native learners showed additive effects of the two cues, consistent with predictions of the Competition Model about learning how to use multiple cues in a second language that sometimes agree and sometimes do not.
Our objective was to evaluate long-term altered appearance, distress, and body image in posttreatment breast cancer patients and compare them with those of patients undergoing active treatment and with general population controls.
We conducted a cross-sectional survey between May and December of 2010. We studied 138 breast cancer patients undergoing active treatment and 128 posttreatment patients from 23 Korean hospitals and 315 age- and area-matched subjects drawn from the general population. Breast, hair, and skin changes, distress, and body image were assessed using visual analogue scales and the EORTC BR–23. Average levels of distress were compared across groups, and linear regression was utilized to identify the factors associated with body image.
Compared to active-treatment patients, posttreatment patients reported similar breast changes (6.6 vs. 6.2), hair loss (7.7 vs. 6.7), and skin changes (5.8 vs. 5.4), and both groups had significantly more severe changes than the general population controls (p < 0.01). For a similar level of altered appearance, however, breast cancer patients experienced significantly higher levels of distress than the general population. In multivariate analysis, patients with high altered-appearance distress reported significantly poorer body image (−20.7; 95% CI −28.3 to −13.1) than patients with low distress.
Significance of results
Posttreatment breast cancer patients experienced similar levels of altered appearance, distress, and body-image disturbance relative to patients undergoing active treatment but significantly higher distress and poorer body image than members of the general population. Healthcare professionals should acknowledge the possible long-term effects of altered appearance among breast cancer survivors and help them to manage the associated distress and psychological consequences.
Personality may predispose family caregivers to experience caregiving differently in similar situations and influence the outcomes of caregiving. A limited body of research has examined the role of some personality traits for health-related quality of life (HRQoL) among family caregivers of persons with dementia (PWD) in relation to burden and depression.
Data from a large clinic-based national study in South Korea, the Caregivers of Alzheimer's Disease Research (CARE), were analyzed (N = 476). Path analysis was performed to explore the association between family caregivers’ personality traits and HRQoL. With depression and burden as mediating factors, direct and indirect associations between five personality traits and HRQoL of family caregivers were examined.
Results demonstrated the mediating role of caregiver burden and depression in linking two personality traits (neuroticism and extraversion) with HRQoL. Neuroticism and extraversion influenced caregivers' mental HRQoL both directly and indirectly, but influenced their physical HRQoL only indirectly. Neuroticism increased caregiver depression, whereas extraversion decreased it. Only the effect of neuroticism on depression and on mental and physical HRQoL was mediated by burden.
Personality traits can influence caregiving outcomes and be viewed as an individual resource of the caregiver. A family caregiver's personality characteristics need to be assessed for tailoring support programs to get the optimal benefits from caregiver interventions.
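The direct and indirect effects reported above follow standard path-analysis arithmetic: an indirect effect is the product of its component paths, and the total effect is the direct path plus the indirect one. The coefficients below are hypothetical, not the study's estimates.

```python
# Hypothetical standardized path coefficients (illustrative only).
a = 0.40          # neuroticism -> caregiver depression (mediator)
b = -0.55         # depression -> mental HRQoL
c_direct = -0.20  # direct path: neuroticism -> mental HRQoL

indirect = a * b             # effect carried through the mediator
total = c_direct + indirect  # overall association
print(indirect, total)
```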
Few epidemiological data are available on the associations of intakes of sodium (Na) and potassium (K) with non-alcoholic fatty liver disease (NAFLD). We aimed to examine the associations of dietary Na and K intake with the prevalence of ultrasound-diagnosed NAFLD. We performed a cross-sectional study of 100 177 participants (46 596 men and 53 581 women) who underwent a health screening examination and completed a FFQ at the Kangbuk Samsung Hospital Total Healthcare Centers, South Korea, between 2011 and 2013. NAFLD was defined by ultrasonographic detection of fatty liver in the absence of excessive alcohol intake or other known causes of liver disease. The prevalence of NAFLD was 35·6 % in men and 9·8 % in women, and it increased with increasing Na intake. The multivariable-adjusted prevalence ratios (PR) of NAFLD comparing the highest with the lowest quintile of energy-adjusted Na intake were 1·25 (95 % CI 1·18, 1·32; Ptrend < 0·001) in men and 1·32 (95 % CI 1·18, 1·47; Ptrend < 0·001) in women. However, when we additionally adjusted for body fat percentage, the association was attenuated; the corresponding PR of NAFLD were 1·15 (95 % CI 1·09, 1·21) in men and 1·06 (95 % CI 0·95, 1·17) in women. No inverse association was observed for energy-adjusted K intake. Our findings suggest that higher Na intake is associated with a greater prevalence of NAFLD in young and middle-aged asymptomatic adults, and that this association might be partly mediated by adiposity.
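The abstract does not spell out how energy adjustment was performed; a common choice in nutritional epidemiology is the residual method, sketched here with hypothetical intakes (`energy_adjust` is an illustrative name, not the study's code).

```python
def energy_adjust(nutrient, energy):
    """Residual method: regress nutrient on energy, then re-centre the
    residuals at the intake expected for mean energy."""
    n = len(nutrient)
    me = sum(energy) / n
    mn = sum(nutrient) / n
    b = (sum((e - me) * (x - mn) for e, x in zip(energy, nutrient))
         / sum((e - me) ** 2 for e in energy))
    a = mn - b * me
    return [x - (a + b * e) + mn for x, e in zip(nutrient, energy)]

# Sodium intakes exactly proportional to energy collapse to the sample
# mean after adjustment: they carry no energy-independent information.
na = [2000.0, 4000.0, 6000.0]    # mg/day (hypothetical)
kcal = [1000.0, 2000.0, 3000.0]  # kcal/day (hypothetical)
print(energy_adjust(na, kcal))
```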
Contrastive pitch accents benefit native English speakers’ memory for discourse by enhancing a representation of a specific relevant contrast item (Fraundorf et al., 2010). This study examines whether and how second language (L2) listeners differ in how contrastive accents affect their encoding and representation of a discourse, as compared to native speakers. Using the same materials as Fraundorf et al. (2010), we found that low and mid proficiency L2 learners showed no memory benefit from contrastive accents. High proficiency L2 learners revealed some sensitivity to contrastive accents, but failed to fully integrate information conveyed by contrastive accents into their discourse representation. The results suggest that L2 listeners’ non-native performance in processing contrastive accents, observed in this and other prior studies, may be attributed at least in part to a difference in the depth of processing of the information conveyed by contrastive accents.
Decreased hemoglobin levels increase the risk of developing dementia among the elderly. However, the mechanisms linking decreased hemoglobin levels to incident dementia remain unclear, possibly because few studies have reported on the relationship between low hemoglobin levels and neuroimaging markers. We therefore investigated the relationships between decreased hemoglobin levels, cerebral small-vessel disease (CSVD), and cortical atrophy in cognitively healthy women and men.
Cognitively normal women (n = 1,022) and men (n = 1,018) who underwent medical check-ups and magnetic resonance imaging (MRI) were enrolled at a health promotion center. We measured hemoglobin levels, white matter hyperintensities (WMH) scales, lacunes, and microbleeds. Cortical thickness was automatically measured using surface based methods. Multivariate regression analyses were performed after controlling for possible confounders.
Decreased hemoglobin levels were not associated with the presence of WMH, lacunes, or microbleeds in women or men. Among women, decreased hemoglobin levels were associated with decreased cortical thickness in the frontal (estimate −0.007; 95% CI −0.013 to −0.001), temporal (−0.010; −0.018 to −0.002), parietal (−0.009; −0.015 to −0.003), and occipital (−0.011; −0.019 to −0.003) regions. Among men, however, no associations were observed between hemoglobin levels and cortical thickness.
Our findings suggested that decreased hemoglobin levels contribute to cortical atrophy, but not to increased CSVD, among women, although the association is modest. Given the paucity of modifiable risk factors for age-related cognitive decline, our results have important public health implications.
This study aimed to investigate the influences of age, education, and gender on the two total scores (TS-I and TS-II) of the Consortium to Establish a Registry for Alzheimer's Disease Neuropsychological assessment battery (CERAD-NP) and to provide normative information based on an analysis of a large number of elderly persons with a wide range of educational levels.
In the study, 1,987 community-dwelling healthy volunteers (620 males and 1,367 females; 50–90 years of age; and zero to 25 years of education) were included. People with serious neurological, medical, and psychiatric disorders (including dementia) were excluded. All participants underwent the CERAD-NP assessment. TS-I was generated by summing raw scores from the CERAD-NP subtests, excluding Mini-Mental State Examination and Constructional Praxis (CP) recall subtests. TS-II was calculated by adding CP recall score to TS-I.
Both TS-I and TS-II were significantly influenced by demographic variables, with education accounting for the greatest proportion of score variance. An interaction effect between age and gender was also found. Based on these results, normative data for the CERAD-NP total scores were stratified by age (six overlapping tables), education (four strata), and gender.
This normative information will be useful for interpreting the CERAD-NP total scores in various clinical and research settings and for comparing individuals’ performance on the battery across countries.
At the apical tip of the Drosophila testis, there is a stem cell niche known as the proliferation center, where stem cells are maintained by the hub cell cluster, which regulates their differentiation and proliferation. Germline stem cells go through four rounds of mitosis from one primary spermatogonial cell to the 16-cell stage before maturation. The cells derived from the same germline stem cell are located within one cyst, a compartment enclosed by two cyst cells, and are connected by intercellular bridges called ring canals. In this study, the three-dimensional (3D) structure of the Drosophila testis tip was reconstructed from serial sections. The volumes of cells at each stage were compared using the 3D structure. The stages of cells in a cyst could be distinguished exactly by counting the cells linked by intercellular bridges in the 3D-reconstructed structure. Cysts containing cells at the same stage appeared in the same horizontal plane. Both the germline stem cells directly attached to the hub cells and the spermatogonial cells detached from the hub cells divided in a direction almost perpendicular to the spermatogonial cell layers. The dividing phase in one cyst was delayed gradually through the cytoplasmic region of the intercellular bridge.
Sources of variation in nutrient intake have been examined for Western diets, but little is known about the sources of variation and their differences by age and sex among Koreans. We examined sources of variation in nutrient intake and calculated the number of days needed to estimate usual intake using 12 d of dietary records (DR). To this end, four 3 d DR including two weekdays and one weekend day were collected throughout four seasons of 1 year from 178 male and 236 female adults aged 20–65 years residing in Seoul, Korea. The sources of variation were estimated using the random-effects model, and the variation ratio (within-individual:between-individual) was calculated to determine a desirable number of days. Variations attributable to the day of the week, recording sequence and seasonality were generally small, although the degree of variation differed by sex and age (20–45 years and 46–65 years). The correlation coefficient between the true intake and the observed intake (r) increased with additional DR days, reaching 0·7 at 3–4 d and 0·8 at 6–7 d. However, the degree of increase became attenuated with additional days: r increased by 13·0–26·9 % from 2 to 4 d, by 6·5–16·4 % from 4 to 7 d and by 4·0–11·6 % from 7 to 12 d for energy and fifteen nutrients. In conclusion, the present study suggests that the day of the week, recording sequence and seasonality minimally contribute to the variation in nutrient intake. To measure Korean usual dietary intake using open-ended dietary instruments, 3–4 d may be needed to achieve modest precision (r>0·7) and 6–7 d for high precision (r>0·8).
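The precision figures above follow the standard relation r = √(D / (D + VR)), where D is the number of replicate days and VR is the within- to between-individual variance ratio. The VR value below is illustrative, not the study's estimate.

```python
import math

def expected_r(days, vr):
    """Correlation between the D-day mean and true usual intake."""
    return math.sqrt(days / (days + vr))

def days_needed(target_r, vr):
    """Invert the relation: D = VR * r^2 / (1 - r^2), rounded up."""
    return math.ceil(vr * target_r**2 / (1 - target_r**2))

vr = 3.0  # hypothetical within:between variance ratio
print(days_needed(0.7, vr))  # days for modest precision (r > 0.7)
print(days_needed(0.8, vr))  # days for high precision (r > 0.8)
```

With VR = 3.0 this reproduces the reported ballpark: about 3 days for r > 0·7 and about 6 days for r > 0·8; nutrients with larger variance ratios need correspondingly more days.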
Using a self-paced reading task, this study examines whether second language (L2) learners are flexible enough to learn L2 parsing strategies that are not useful in their first language (L1). Native Korean-speaking learners of English were compared with native English speakers on resolving a temporary ambiguity about the relationship between a verb and the noun following it (e.g., The student read [that] the article . . .). Consistent with previous studies, native English reading times showed the usual interaction between the optional complementizer that and the particular verb's bias about the structures that can follow it. Lower proficiency L1-Korean learners of L2-English did not show a similar interaction, but higher proficiency learners did. Thus, despite native language word order differences (English: SVO; Korean: SOV) that determine the availability of verbs early enough in sentences to generate predictions about upcoming sentence structure, higher proficiency L1-Korean learners were able to learn to optimally combine verb bias and complementizer cues on-line during sentence comprehension just as native English speakers did, while lower proficiency learners had not yet learned to do so. Optimal interactive cue combination during L2 sentence comprehension can probably be achieved only after sufficient experience with the target language.