Accurate prognostication is important for patients and their families to prepare for the end of life. The Objective Prognostic Score (OPS) is an easy-to-use tool that does not require the clinician's prediction of survival (CPS), whereas the Palliative Prognostic Score (PaP) requires CPS, so inexperienced clinicians may hesitate to use it. We aimed to evaluate the accuracy of OPS compared with that of PaP among inpatients in palliative care units (PCUs) in three East Asian countries.
This study was a secondary analysis of a cross-cultural, multicenter cohort study. We enrolled inpatients with far-advanced cancer in PCUs in Japan, Korea, and Taiwan from 2017 to 2018. We calculated the area under the receiver operating characteristic curve (AUROC) to compare the accuracy of OPS and PaP.
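As an illustrative aside (not the study's actual code or data), the AUROC of a prognostic score against an observed binary outcome can be computed with the rank-based (Mann-Whitney) formulation: the proportion of death/survivor pairs in which the patient who died received the higher risk score. The scores and outcomes below are synthetic.

```python
def auroc(scores, outcomes):
    """AUROC via the Mann-Whitney formulation.

    scores   -- higher score = higher predicted risk of death
    outcomes -- 1 if the patient died within the time frame, else 0
    """
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    if not pos or not neg:
        raise ValueError("need both positive and negative outcomes")
    # Count concordant pairs; tied scores contribute 0.5
    total = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                total += 1.0
            elif p == n:
                total += 0.5
    return total / (len(pos) * len(neg))

# Hypothetical toy data: prognostic scores and a 3-week death indicator
scores = [4.5, 1.0, 3.0, 2.5, 0.5, 3.5, 2.0, 4.0]
outcomes = [1, 0, 0, 1, 0, 1, 0, 1]
print(auroc(scores, outcomes))  # → 0.9375
```

An AUROC of 0.5 means the score discriminates no better than chance; values in the 0.7–0.8 range, as reported above, indicate acceptable-to-good discrimination.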
A total of 1,628 inpatients in 33 PCUs in Japan and Korea were analyzed. OPS and PaP were calculated in 71.7% of the Japanese patients and 80.0% of the Korean patients. In Taiwan, PaP was calculated for 81.6% of the patients. The AUROC for 3-week survival was 0.74 for OPS in Japan, 0.68 for OPS in Korea, 0.80 for PaP in Japan, and 0.73 for PaP in Korea. The AUROC for 30-day survival was 0.70 for OPS in Japan, 0.71 for OPS in Korea, 0.79 for PaP in Japan, and 0.74 for PaP in Korea.
Significance of results
Both OPS and PaP showed good performance in Japan and Korea. Compared with PaP, OPS could be more useful for inexperienced physicians who hesitate to estimate CPS.
Several studies have supported the usefulness of “the surprise question” for predicting 1-year mortality. “The surprise question” requires a “Yes” or “No” answer to the question “Would I be surprised if this patient died in [specific time frame]?” However, the 1-year time frame is often too long for advanced cancer patients seen by palliative care personnel, so “the surprise question” with shorter time frames is needed for decision-making. We examined the accuracy of “the surprise question” for 7-day, 21-day, and 42-day survival in hospitalized patients admitted to palliative care units (PCUs).
This was a prospective multicenter cohort study of 130 adult patients with advanced cancer admitted to 7 hospital-based PCUs in South Korea. The accuracy of “the surprise question” was compared with that of the temporal question for clinician's prediction of survival.
We analyzed 130 inpatients who died in PCUs during the study period. The median survival was 21.0 days. The sensitivity, specificity, and overall accuracy for the 7-day “the surprise question” were 46.7, 88.7, and 83.9%, respectively. The sensitivity, specificity, and overall accuracy for the 7-day temporal question were 6.7, 98.3, and 87.7%, respectively. The c-indices of the 7-day “the surprise question” and 7-day temporal question were 0.662 (95% CI: 0.539–0.785) and 0.521 (95% CI: 0.464–0.579), respectively. The c-indices of the 42-day “the surprise question” and 42-day temporal question were 0.554 (95% CI: 0.509–0.599) and 0.616 (95% CI: 0.569–0.663), respectively.
Significance of results
Surprisingly, “the surprise question” and the temporal question had similar accuracy. The high specificities of the 7-day “the surprise question” and the 7- and 21-day temporal questions suggest that they may be useful for ruling in death when positive.
The explosive outbreak of COVID-19 led to a shortage of medical resources, including isolation rooms in hospitals, healthcare workers (HCWs) and personal protective equipment. Here, we constructed a new model, non-contact community treatment centres, to monitor and quarantine asymptomatic and mildly symptomatic COVID-19 patients who recorded their own vital signs using a smartphone application. This new model proved useful in Korea for overcoming shortages of medical resources and minimising the risk of infection transmission to HCWs.
Mycobacterial infections are widely distributed in animals and cause considerable economic losses, especially in livestock animals. Bovine paratuberculosis and bovine tuberculosis, which are representative mycobacterial infections in cattle, are difficult to diagnose using current-generation diagnostics due to their relatively long incubation periods. Thus, alternative diagnostic tools are needed for the detection of mycobacterial infections in cattle. A biomarker is an indicator present in biological fluids that reflects the biological state of an individual during the progression of a specific disease. Therefore, biomarkers are considered a potential diagnostic tool for various diseases. Recently, the number of studies investigating biomarkers as tools for diagnosing mycobacterial infections has increased. In human medicine, many diagnostic biomarkers have been developed and applied in clinical practice. In veterinary medicine, however, many such developments are still in the early stages. In this review, we summarize the current progress in biomarker research related to the development of diagnostic biomarkers for mycobacterial infections in cattle.
Spirituality is what gives people meaning and purpose in life, and it has been recognized as a critical factor in patients’ well-being, particularly at the ends of their lives. Studies have demonstrated relationships between spirituality and patient-reported outcomes such as quality of life and mental health. Although a number of studies have suggested that spiritual belief can be associated with mortality, the results are inconsistent. We aimed to determine whether spirituality was related to survival in advanced cancer inpatients in Korea.
For this multicenter study, we recruited adult advanced cancer inpatients who had been admitted to seven palliative care units with estimated survival of <3 months. We measured spirituality at admission using the Korean version of the Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being (FACIT-sp), which comprises two subscales: meaning/peace and faith. We calculated a Kaplan-Meier curve for spirituality, dichotomized at the predefined cutoffs and medians for the total scale and each of the two subscales, and performed univariate regression with a Cox proportional hazard model.
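For readers unfamiliar with the method, the Kaplan-Meier estimate multiplies, at each observed death time, the fraction of at-risk patients surviving that time. A minimal sketch with hypothetical follow-up data (not the study's data):

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each observed death time.

    times  -- follow-up time for each patient (e.g. days)
    events -- 1 if death was observed at that time, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Deaths and total removals (deaths + censorings) at time t
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk  # S(t) = prod(1 - d_i/n_i)
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve

# Hypothetical follow-up in days (event 0 = censored)
times = [5, 8, 8, 12, 20, 25, 30]
events = [1, 1, 0, 1, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(f"day {t}: S = {s:.3f}")
```

Censored patients (here at days 8 and 25) still count in the at-risk denominator up to their censoring time, which is what distinguishes this estimator from a naive survival fraction.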
We enrolled a total of 204 adults (mean age: 64.5 ± 13.0; 48.5% female) in the study. The most common primary cancer diagnoses were lung (21.6%), colorectal (18.6%), and liver/biliary tract (13.0%). Median survival was 19.5 days (95% confidence interval [CI95%]: 23.5, 30.6). Total FACIT-sp score was not related to survival time (hazard ratio [HR] = 0.981, CI95% = 0.957, 1.007), and neither were the scores for its two subscales, meaning/peace (HR = 0.969, CI95% = 0.932, 1.008) and faith (HR = 0.981, CI95% = 0.938, 1.026).
Significance of results
Spirituality was not related to survival in advanced cancer inpatients in Korea. Plausible mechanisms merit further investigation.
We investigated the extent of delays in the response time of emergency medical services (EMS) caused by mass casualty incidents (MCIs) in the same area.
We defined an MCI case as an event that resulted in 6 or more patients being transported by EMS, and prehospital response time as the time from the call to arrival at the scene. We matched patients before and after MCIs by dividing them into categories of 3 hours before, 0-1 hour after, 1-2 hours after, and 2-3 hours after the MCIs. We compared prehospital response times using multiple linear regression.
A total of 33,276 EMS-treated patients were matched. The prehospital response time for the category of 3 hours before the MCIs was 8.8 minutes (SD: 8.2), treated as the reference, whereas that for the category of 0-1 hour after the MCI was 11.3 minutes (P<0.01). The multiple linear regression analysis revealed that prehospital response time increased by 2.5 minutes (95% CI: 2.3-2.8) during the first hour and by 0.3 minutes (95% CI: 0.1-0.6) during the second hour after MCIs.
There were significant delays in the prehospital response time for emergency patients after MCIs, and the delays persisted for 2 hours as a spillover effect. (Disaster Med Public Health Preparedness. 2018;12:94–100)
Personality may predispose family caregivers to experience caregiving differently in similar situations and influence the outcomes of caregiving. A limited body of research has examined the role of some personality traits for health-related quality of life (HRQoL) among family caregivers of persons with dementia (PWD) in relation to burden and depression.
Data from a large clinic-based national study in South Korea, the Caregivers of Alzheimer's Disease Research (CARE), were analyzed (N = 476). Path analysis was performed to explore the association between family caregivers’ personality traits and HRQoL. With depression and burden as mediating factors, direct and indirect associations between five personality traits and HRQoL of family caregivers were examined.
Results demonstrated the mediating role of caregiver burden and depression in linking two personality traits (neuroticism and extraversion) with HRQoL. Neuroticism and extraversion influenced the mental HRQoL of caregivers both directly and indirectly, but influenced their physical HRQoL only indirectly. Neuroticism increased the caregiver's depression, whereas extraversion decreased it. Only neuroticism influenced depression and mental and physical HRQoL through the mediation of burden.
Personality traits can influence caregiving outcomes and can be viewed as an individual resource of the caregiver. A family caregiver's personality characteristics should be assessed so that support programs can be tailored to obtain optimal benefit from caregiver interventions.
The Sewol ferry disaster is one of the most tragic events in Korea’s modern history. Among the 476 people on board, which included Danwon High School students (324) and teachers (14), 304 passengers died in the disaster (295 recovered corpses and 9 missing) and 172 survived. Of the rescued survivors, 72 were attending Danwon High School, located in Ansan City, and residing in a residence nearby. Because the students were young, emotionally susceptible adolescents, both the government and the parents requested the students be grouped together at a single hospital capable of appropriate psychiatric care. Korea University Ansan Hospital was the logical choice, as the only third-tier university-grade hospital with the necessary faculty and facilities within the residential area of the families of the students. We report the experiences and the lessons learned from the processes of preparing for and managing the surviving young students as a community-based hospital. (Disaster Med Public Health Preparedness. 2017;11:389–393)
To examine the hypothesis that the association between vitamin D deficiency and depressive symptoms is dependent upon total cholesterol level in a representative national sample of the South Korean population.
This was a population-based cross-sectional study.
The Fifth Korean National Health and Nutrition Examination Survey (KNHANES V, 2010–2012).
We included 7198 adults aged 20–88 years.
The incidence of depressive symptoms in individuals with vitamin D deficiency (serum 25-hydroxyvitamin D<20 ng/ml) was 1·54-fold (95 % CI 1·20, 1·98) greater than in individuals without vitamin D deficiency (serum 25-hydroxyvitamin D ≥20 ng/ml). The relationship was stronger in individuals with normal-to-borderline serum total cholesterol (serum total cholesterol<240 mg/dl; OR=1·60; 95 % CI 1·23, 2·08) and non-significant in individuals with high serum total cholesterol (OR=0·97; 95 % CI 0·52, 1·81) after adjustment for confounding variables (age, sex, BMI, alcohol consumption, smoking status, regular exercise, income level, education level, marital status, changes in body weight, perceived body shape, season of examination date and cholesterol profiles).
The association between vitamin D deficiency and depressive symptoms was weakened by high serum total cholesterol status. These findings suggest that both vitamin D and total cholesterol are important targets for the prevention and treatment of depression.
Cerebral white matter hyperintensities (WMH) are prevalent incidental findings on brain MRI scans in elderly people and have been consistently implicated in cognitive dysfunction. However, the region-specific roles of WMH in cognitive function remain unclear. The aim of this study was to ascertain the differential role of regional WMH in predicting progression from mild cognitive impairment (MCI) to different subtypes of dementia.
Participants were recruited from the Clinical Research Center for Dementia of South Korea (CREDOS) study. A total of 622 participants with MCI diagnoses at baseline and follow-up evaluations were included for the analysis. Initial MRI scans were rated for WMH on a visual rating scale developed for the CREDOS. Differential effects of regional WMH in predicting incident dementia were evaluated using the Cox proportional hazards model.
Of the 622 participants with MCI at baseline, 139 (22.3%) converted to all-cause dementia over a median of 14.3 (range 6.0–36.5) months. Severe periventricular WMH (PWMH) predicted incident all-cause dementia (hazard ratio (HR) 2.22; 95% confidence interval (CI) 1.43–3.43) and Alzheimer's disease (AD) (HR 1.86; 95% CI 1.12–3.07). Subcortical vascular dementia (SVD) was predicted by both PWMH (HR 16.14; 95% CI 1.97–132.06) and deep WMH (DWMH) (HR 8.77; 95% CI 1.77–43.49) in their more severe forms (≥10 mm).
WMH differentially predict dementia by region and severity. Our findings suggest that PWMH may play an independent role in the pathogenesis of dementia, especially in AD.
Epidemiological studies have reported that higher education (HE) is associated with a reduced risk of incident Alzheimer's disease (AD). However, after the clinical onset of AD, patients with HE levels show more rapid cognitive decline than patients with lower education (LE) levels. Although education level and cognition have been linked, there have been few longitudinal studies investigating the relationship between education level and cortical decline in patients with AD. The aim of this study was to compare the topography of cortical atrophy longitudinally between AD patients with HE (HE-AD) and AD patients with LE (LE-AD).
We prospectively recruited 36 patients with early-stage AD and 14 normal controls. The patients were classified into two groups according to educational level: 23 HE-AD (>9 years) and 13 LE-AD (≤9 years).
As AD progressed over the 5-year longitudinal follow-up, the HE-AD group showed a significant group-by-time interaction in the right dorsolateral frontal and precuneus regions and the left parahippocampal region compared to the LE-AD group.
These preliminary longitudinal findings reveal that HE is associated with accelerated cortical atrophy in AD patients over time, underlining the importance of education level in predicting prognosis.
This study examined changes in health-related quality of life (HRQoL) and quality of care (QoC) as perceived by terminally ill cancer patients and a stratified set of HRQoL or QoC factors that are most likely to influence survival at the end of life (EoL).
We administered questionnaires to 619 consecutive patients immediately after they were diagnosed with terminal cancer by physicians at 11 university hospitals and at the National Cancer Center in Korea. Subjects were followed up over 161.2 person-years until their deaths. We measured HRQoL using the core 30-item European Organization for Research and Treatment of Cancer Quality of Life Questionnaire, and QoC using the Quality Care Questionnaire–End of Life (QCQ–EoL). We evaluated changes in HRQoL and QoC issues during the first three months after enrollment, performing sensitivity analysis by using data generated via four methods (complete case analysis, available case analysis, the last observation carried forward, and multiple imputation).
Emotional and cognitive functioning decreased significantly over time, while dyspnea, constipation, and pain increased significantly. Dignity-conserving care, care by healthcare professionals, family relationships, and QCQ–EoL total score decreased significantly. Global QoL, appetite loss, and Eastern Cooperative Oncology Group Performance Status (ECOG–PS) scores were significantly associated with survival.
Significance of results
Future standardization of palliative care should focus on assessing these deteriorating aspects of quality of life and care. Accurate estimates of the length of life remaining for terminally ill cancer patients, informed by such EoL factors as global QoL, appetite loss, and ECOG–PS, are needed to help patients experience a dignified and comfortable death.
Liposomal drug delivery products have already been commercialized for tumor therapy; they achieve passive tumor targeting via the enhanced permeability and retention (EPR) effect resulting from the leaky tumor vasculature. To control drug release from the liposomes, thermo-sensitive liposomes (TSLs) have been developed, in which locally applied thermal stimuli trigger an abrupt exposure of tumor tissue to highly concentrated drug. As interest in TSLs has increased along with ongoing clinical trials, several types of TSLs with different pharmacokinetic properties and drug-release mechanisms have been formulated. However, few protocols have been established that pair a suitable heat source with a given TSL to maximize its efficacy in treating tumors. In this study, we examined different protocols for the most effective application of different TSLs to tumor therapy. First, we examined whether enhancing the accumulation of TSLs within tumor tissue before triggering drug release could increase anti-tumor efficacy. Second, we compared the efficiency of two different heat sources for use with TSLs: a warm water bath (42°C) and high-intensity focused ultrasound (HIFU). Our study suggests that protocols should be tailored to TSLs with different physical properties so that they function optimally in tumor therapy.
Objectives: The aim of this study was to systematically assess the long-term (≥ 6 months) benefits of epidural steroid injection therapies for patients with low back pain.
Methods: We identified randomized controlled trials by database searches up to October 2011 and by additional hand searches without language restrictions. Randomized controlled trials on the effects of epidurals for low back pain with follow-up for at least 6 months were included. Outcomes considered were pain relief, functional improvement in 6 to 12 months after epidural steroid injection treatment and the number of patients who underwent subsequent surgery. Meta-analysis was performed using a random-effects model.
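As a hedged illustration of the pooling step (synthetic effect sizes, not data from the included trials), a random-effects weighted mean difference can be computed with the classic DerSimonian-Laird estimator of between-study variance:

```python
import math

def dersimonian_laird(effects, ses):
    """Pooled effect and 95% CI under a DerSimonian-Laird random-effects model.

    effects -- per-study mean differences
    ses     -- per-study standard errors of those differences
    """
    w = [1.0 / se**2 for se in ses]  # fixed-effect (inverse-variance) weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (e - fixed)**2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance, floored at 0
    # Random-effects weights add tau^2 to each study's variance
    w_re = [1.0 / (se**2 + tau2) for se in ses]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Hypothetical mean pain-score differences (and SEs) from three trials
pooled, lo, hi = dersimonian_laird([-0.50, -0.30, -0.45], [0.20, 0.15, 0.25])
print(f"WMD {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

When heterogeneity is negligible (Q below its degrees of freedom, as in this toy example), tau² is zero and the random-effects result coincides with the fixed-effect one; larger heterogeneity widens the confidence interval.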
Results: Twenty-nine articles were selected. The meta-analysis suggested that a significant treatment effect on pain was noted at 6 months of follow-up (weighted mean difference [WMD], −0.41; 95 percent confidence interval [CI], −0.66 to −0.16), but was no longer statistically significant after adjusting for the baseline pain score (WMD, −0.19; 95 percent CI, −0.61 to 0.24). Epidural steroid injection did not improve back-specific disability more than a placebo or other procedure. Epidural steroid injection did not significantly decrease the number of patients who underwent subsequent surgery compared with a placebo or other treatments (relative risk, 1.02; 95 percent CI, 0.83 to 1.24).
Conclusions: A long-term benefit of epidural steroid injections for low back pain was not suggested at 6 months or longer. Introduction of selection bias in the majority of injection studies seems apparent. Baseline adjustment is essential when we evaluate pain as a main outcome of injection therapy.
Direct energy conversion between thermal and electrical energy, based on the thermoelectric (TE) effect, has the potential to recover waste heat and convert it into clean electric power. The energy conversion efficiency is governed by the thermoelectric figure of merit, ZT = S²σT/κ, where T is the absolute temperature, S is the Seebeck coefficient, σ is the electrical conductivity, and κ is the thermal conductivity. To lower the thermal conductivity κ while maintaining a high power factor (S²σ), our current strategy is the development of rhombohedrally strained, highly oriented, twinned single-crystalline SiGe materials. The development of a SiGe "twin lattice structure (TLS)" plays a key role in phonon scattering: the TLS increases the electrical conductivity and decreases the thermal conductivity through phonon scattering at stacking faults generated from the 60°-rotated primary twin structure. To develop high-performance materials, the substrate temperature, chamber working pressure, and DC sputtering power were controlled for the aligned growth of the SiGe layer and TLS on c-plane sapphire. Additionally, a new elevated-temperature thermoelectric characterization system that measures thermal diffusivity and the Seebeck effect nondestructively was developed. The material properties were characterized at various temperatures, and optimized process conditions were determined experimentally. The present paper encompasses the technical discussion of the development of these thermoelectric materials and the associated measurement techniques.
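Written out, the figure of merit and the standard (textbook) decomposition of the thermal conductivity that the phonon-scattering strategy exploits are:

```latex
ZT = \frac{S^{2}\sigma}{\kappa}\,T,
\qquad
\kappa = \kappa_{\mathrm{el}} + \kappa_{\mathrm{lat}},
```

where \(\kappa_{\mathrm{el}}\) is the electronic and \(\kappa_{\mathrm{lat}}\) the lattice (phonon) contribution. Stacking-fault scattering in the twin lattice structure suppresses \(\kappa_{\mathrm{lat}}\) while leaving the power factor \(S^{2}\sigma\) comparatively intact, which is what raises ZT.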
Despite numerous previous studies, the relationships between watershed land use and adjacent streams and rivers at various scales in Korea remain unclear. This study investigated the relationships between land uses and the physical, chemical, and biological characteristics of 720 stream and river sites across the country. Land uses at two spatial scales, a 1-km buffer and the base watershed management region (BWMR), were computed in a geographical information system (GIS) with a digital land use/land cover map. Land-use characteristics at the two spatial scales were then correlated with the monitored multidimensional characteristics of the streams and rivers. The results indicate that land use types have significant effects on stream and river characteristics. Specifically, most characteristics were negatively correlated with the proportions of urban, rice paddy, agricultural, and bare soil areas and positively correlated with the amount of forest. The comparison of the two scales suggests that BWMR-scale land use patterns were more strongly related to ecological integrity than site-scale land use patterns were. Improving our understanding of land use effects will largely depend on relating the results of site-specific studies that use similar response techniques and measures to evaluate ecological integrity. In addition, our results clearly indicate that the characteristics of streams and rivers are closely linked and that land use types affect those characteristics differentially. Thus, effective restoration and management for the ecological integrity of lotic systems should consider physical, chemical, and biological factors in combination.
Introduction: Internet game overuse is an emerging disorder featuring diminished impulse control and poor reward processing. To understand the neurobiological bases of Internet game overuse, we investigated differences in resting-state regional cerebral glucose metabolism between young individuals with Internet game overuse and those with normal use, using 18F-fluorodeoxyglucose positron emission tomography (FDG-PET).
Methods: Twenty right-handed male participants (9 normal users: 24.7±2.4 years of age, 11 overusers: 23.5±2.9 years of age) participated. A trait measure of impulsivity was also completed after scanning.
Results: Internet game overusers showed greater impulsiveness than normal users, and there was a positive correlation between the severity of Internet game overuse and impulsiveness. Imaging data showed that, compared to normal users, the overusers had increased glucose metabolism in the right middle orbitofrontal gyrus, left caudate nucleus, and right insula, and decreased metabolism in the bilateral postcentral gyrus, left precentral gyrus, and bilateral occipital regions.
Conclusion: Internet game overuse may be associated with abnormal neurobiological mechanisms in the orbitofrontal cortex, striatum, and sensory regions, which are implicated in impulse control, reward processing, and somatic representation of previous experiences. Our results support the idea that Internet game overuse shares psychological and neural mechanisms with other types of impulse control disorders and substance/non-substance-related addiction.
The levels and distribution of polychlorinated biphenyl (PCB) congeners were analysed in fourteen soil and eight lichen (Usnea aurantiaco-atra) samples from King George Island, West Antarctica. A total of 32 PCB congeners were found in five soil samples collected in 2006, and the mean concentration of total PCBs was 20.4 pg g⁻¹ dry weight (range, 8.0–33.8 pg g⁻¹ dry weight). The most abundant PCB homolog groups in soil samples were di-, tri-, and penta-CBs, which accounted for more than 75% of the total residues. Twelve dioxin-like PCBs were also detected in nine soil and eight lichen samples, and the levels of dioxin-like PCBs were 5-fold higher in lichens than in soil. PCBs were detected at very low levels in most soil and lichen samples. Among the dioxin-like PCBs, the highest congener concentrations were found for PCB 118 (6.63 and 21.93 pg g⁻¹ in soil and lichen, respectively). PCB levels in air samples were highly correlated with those in soil and lichen samples, as were PCB levels in soil and lichen samples collected at the same site. Long-range atmospheric transport is thought to be the main source of PCBs on King George Island. However, PCB levels in soil and lichen samples were also apparently influenced by local sources of PCBs.
Over the past century, the population of Korea has aged rapidly as a result of decreasing fertility and mortality. Furthermore, the percentage of the population aged 65 and older is expected to double from 7% to 14% within 18 years, a much shorter doubling period than in most other developed countries. As Korean society ages, interest in healthy and successful ageing has increased. However, although previous studies have examined various determinants of successful ageing, such as socioeconomic status, gender differences have been neglected. This study investigated gender differences as factors in successful ageing among elderly men and women. Successful ageing has been defined as having high levels of physical and social functioning. Physical functioning includes having no difficulties with activities of daily living (ADL) or instrumental activities of daily living (IADL). Social functioning is defined as participation in at least one of the following social activities: paid work, religious gatherings or volunteer service. Data for this study were obtained from a representative sample of 761 community-living individuals aged 65–84 years (340 males, 421 females); the respondents were interviewed face-to-face as part of the third wave of the Hallym Ageing Study (2007). Socioeconomic status appears to have a greater gender-specific effect on physical functioning than on social functioning. Especially for elderly men, a higher monthly individual income was significantly related to a higher level of physical functioning. Among elderly women, a higher level of education was associated with a higher level of physical functioning. In a major metropolis, elderly men had low social functioning and elderly women had low physical functioning. As Korea's population ages, successful ageing has received much attention. This study shows that policies promoting successful ageing must consider gender differences and associated socioeconomic factors.