There are growing concerns about the impact of the COVID-19 pandemic on the mental health of older adults. We examined the effect of the pandemic on the risk of depression in older adults.
Methods
We analyzed data from a prospective cohort study of Korean older adults that has been followed up every 2 years. Among the 2308 participants who completed both the third and fourth follow-up assessments, 58.4% completed the fourth follow-up before the COVID-19 outbreak and the rest completed it during the pandemic. We conducted face-to-face diagnostic interviews using the Mini International Neuropsychiatric Interview and administered the Geriatric Depression Scale. We performed generalized estimating equation (GEE) and logistic regression analyses.
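As an illustration of the repeated-measures analysis named above, the sketch below fits a GEE in Python with statsmodels on synthetic stand-in data; all variable names (gds, pandemic, subject_id) and values are assumptions, not the study's actual coding, and the logistic regression for incident depression would be analogous with smf.logit.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in data: two follow-up waves per participant.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "subject_id": np.repeat(np.arange(n), 2),
    "pandemic": np.tile([0, 1], n),              # 0 = pre-pandemic wave, 1 = pandemic wave
    "age": np.repeat(rng.integers(60, 91, n), 2),
    "gds": rng.normal(5, 3, 2 * n).clip(0, 30),  # Geriatric Depression Scale score
})

# GEE with an exchangeable working correlation to account for
# repeated measurements within participants.
gee = smf.gee("gds ~ pandemic + age", groups="subject_id", data=df,
              cov_struct=sm.cov_struct.Exchangeable(),
              family=sm.families.Gaussian()).fit()
print(gee.summary())
```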
Results
The COVID-19 pandemic was associated with increased depressive symptoms in older adults [b (standard error) = 0.42 (0.20), p = 0.040] and with a doubled risk of incident depressive disorder, even in euthymic older adults without a history of depression (odds ratio = 2.44, 95% confidence interval 1.18–5.02, p = 0.016). Less frequent social activity, which was associated with the risk of depressive disorder before the pandemic, was not associated with that risk during the pandemic. Conversely, less frequent family gatherings, which were not associated with the risk of depressive disorder before the pandemic, were associated with a doubled risk of depressive disorder during the pandemic.
Conclusions
The COVID-19 pandemic significantly influenced the risk of late-life depression in the community. Older adults who lack family gatherings may be particularly vulnerable.
This study aims to identify factors associated with divorce following breast cancer diagnosis and to measure the impact of divorce on the quality of life (QoL) of patients.
Methods
We used cross-sectional survey data collected at breast cancer outpatient clinics in South Korea from November 2018 to April 2019. Adult breast cancer survivors who had completed active treatment without recurrence at the time of the survey (N = 4,366) were included. Based on marital status at diagnosis and at the survey, participants were classified into two groups: “maintained marriage” and “became divorced.” We performed logistic regression and linear regression to identify factors associated with divorce after cancer diagnosis and to compare the QoL of divorced and non-divorced survivors.
Results
Approximately 11.1 per 1,000 married breast cancer survivors experienced divorce after their cancer diagnosis. Younger age, lower education, and being employed at diagnosis were associated with divorce. Divorced survivors had significantly lower QoL (coefficient [Coef] = −7.50; 95% CI = −13.63, −1.36), social functioning (Coef = −9.47; 95% CI = −16.36, −2.57), and body image (Coef = −8.34; 95% CI = −6.29, −0.39) than survivors who remained married. They also experienced more symptoms, including pain, insomnia, financial difficulties, and distress due to hair loss.
Conclusion
Identifying risk factors for divorce will ultimately help ascertain the resources necessary for early intervention.
The network approach has been applied to a wide variety of psychiatric disorders. The aim of the present study was to identify the network structures of remitters and non-remitters among patients with first-episode psychosis (FEP) at baseline and at the 6-month follow-up.
Methods
Participants (n = 252) from the Korean Early Psychosis Study (KEPS) were enrolled. They were classified as remitters or non-remitters according to Andreasen's criteria. We estimated network structures using a Gaussian graphical model, with 10 symptoms as nodes (three from the Positive and Negative Syndrome Scale, one depressive symptom, and six related to schema and rumination). Global and local network metrics were compared within and between the networks over time.
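Symptom networks of this kind are commonly estimated with a regularised Gaussian graphical model; the abstract does not name the software used, so the sketch below is only a rough Python analogue using scikit-learn's graphical lasso on synthetic data, with node "strength" computed in the usual way as the sum of absolute edge weights.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Synthetic stand-in for 252 patients x 10 symptom scores.
rng = np.random.default_rng(1)
X = rng.normal(size=(252, 10))

# Regularised Gaussian graphical model: edge weights are partial
# correlations obtained from the estimated precision matrix.
model = GraphicalLassoCV().fit(X)
P = model.precision_
d = np.sqrt(np.diag(P))
partial_corr = -P / np.outer(d, d)
np.fill_diagonal(partial_corr, 0.0)

# Node strength: sum of absolute edge weights for each symptom.
strength = np.abs(partial_corr).sum(axis=0)
print(strength.round(2))
```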
Results
Global network metrics did not differ between the remitters and non-remitters at baseline or 6 months. However, the network structure and nodal strengths associated with positive-self and positive-others scores changed significantly in the remitters over time. Unique central symptoms for remitters and non-remitters were cognitive brooding and negative-self, respectively. The correlation stability coefficients for nodal strength were within the acceptable range.
Conclusion
Our findings indicate that network structure and some nodal strengths were more flexible in remitters. Negative-self could be an important target for therapeutic intervention.
Background: After the Middle East respiratory syndrome coronavirus outbreak in Korea in 2015, the government established an additional reimbursement for infection prevention to encourage infection control activities in hospitals. The new policy was announced in December 2015 and implemented in September 2016. We evaluated how infection control activities in hospitals improved after this policy change. Methods: Three cross-sectional surveys using the WHO Hand Hygiene Self-Assessment Framework (HHSAF) were conducted in 2013, 2015, and 2017. Using a multivariable linear regression model including hospital characteristics, we analyzed changes in total HHSAF scores according to survey time. Results: In total, 32 hospitals participated in the survey in 2013, 52 in 2015, and 101 in 2017. The number of inpatient beds per infection control professional decreased from 324 in 2013 to 303 in 2015 and 179 in 2017. Most hospitals were at the intermediate or advanced level of progress (90.6% in 2013, 86.6% in 2015, and 94.1% in 2017). In the multivariable linear regression model, total HHSAF scores were significantly associated with hospital teaching status (β coefficient of major teaching hospital, 52.6; 95% CI, 8.9–96.4; P = .018), bed size (β coefficient per 100-bed increase, 5.1; 95% CI, 0.3–9.8; P = .038), and survey time (β coefficient of 2017 survey, 45.1; 95% CI, 19.3–70.9; P = .001). Conclusions: After the national policy was implemented, the number of infection control professionals increased and the promotion of hand hygiene activities was strengthened in Korean hospitals.
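A minimal sketch of the kind of multivariable linear model described above, in Python with statsmodels; the hospital-level data and variable names below are invented placeholders, not the survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented hospital-level data; names and values are placeholders.
rng = np.random.default_rng(2)
n = 185
data = pd.DataFrame({
    "hhsaf_total": rng.normal(300, 60, n),
    "year": rng.choice(["2013", "2015", "2017"], n),
    "teaching": rng.choice(["nonteaching", "minor", "major"], n),
    "beds_per_100": rng.uniform(1, 15, n),
})

# Total HHSAF score on survey year, teaching status, and bed size;
# the 2017-vs-2013 contrast plays the role of the reported 45.1 coefficient.
fit = smf.ols("hhsaf_total ~ C(year) + C(teaching) + beds_per_100",
              data=data).fit()
print(fit.summary())
```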
Early reinsertion of a new central venous catheter (CVC) may pose a risk of persistent or recurrent infection in patients with a catheter-related bloodstream infection (CRBSI). We evaluated the clinical impact of early CVC reinsertion after catheter removal in patients with CRBSIs.
Methods:
We conducted a retrospective chart review of adult patients with confirmed CRBSIs in 2 tertiary-care hospitals over a 7-year period.
Results:
To treat their infections, 316 patients with CRBSIs underwent CVC removal. Among them, 130 (41.1%) underwent early CVC reinsertion (≤3 days after CVC removal), 39 (12.4%) underwent delayed reinsertion (>3 days), and 147 (46.5%) did not undergo CVC reinsertion. There were no differences in baseline characteristics among the 3 groups, except for nontunneled CVC, presence of septic shock, and reason for CVC reinsertion. The rate of persistent CRBSI in the early CVC reinsertion group (22.3%) was higher than that in the no CVC reinsertion group (7.5%; P = .002) but was similar to that in the delayed CVC reinsertion group (17.9%; P > .99). The other clinical outcomes did not differ among the 3 groups, including rates of 30-day mortality, complicated infection, and recurrence. After controlling for several confounding factors, early CVC reinsertion was not significantly associated with persistent CRBSI (OR, 1.59; P = .35) or 30-day mortality compared with delayed CVC reinsertion (OR, 0.81; P = .68).
Conclusions:
Early CVC reinsertion in the setting of CRBSI may be safe. Reinsertion of a new CVC should not be delayed in patients who still require a CVC for ongoing management.
Gosan-ri-type pottery (GTP) is a unique plant-fiber-tempered pottery from Korea that has been found only in Early Neolithic sites on Jeju Island. In this study, we conducted radiocarbon (14C) dating of one GTP sample and 10 charcoal samples collected in 2012 from archaeological structures in which GTP was found. The measurement conditions, the internal quality assurance test, and the reliability test indicate that each 14C date is highly reliable. However, the 14C dates of the charcoal samples were more accurate than that of the GTP sample, which was affected by contamination from younger humic acids. Summarizing all 14C dates of the charcoal samples with the KDE model, we conclude that GTP was manufactured and used throughout the period 9610–9490 cal BP (7670–7550 BC) at the 95.4% confidence level. This age corroborates the inference that GTP is the oldest known Korean Neolithic pottery.
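KDE summaries of calibrated 14C dates are typically produced in a calibration program such as OxCal; as a rough, simplified analogue, the sketch below pools hypothetical posterior samples of calibrated ages and reports the densest ~95.4% interval. All values are illustrative, not the study's measurements.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical posterior samples of calibrated ages (cal BP) for the
# 10 charcoal dates, as would come from a calibration program.
rng = np.random.default_rng(3)
samples = np.concatenate(
    [rng.normal(loc, 25, 5000) for loc in np.linspace(9500, 9600, 10)]
)

# Pool the samples and summarise with a kernel density estimate.
kde = gaussian_kde(samples)
grid = np.arange(9400, 9701)
density = kde(grid)

# Densest region holding ~95.4% of the pooled probability mass.
order = np.argsort(density)[::-1]
mass = np.cumsum(density[order]) / density.sum()
keep = grid[np.sort(order[mass <= 0.954])]
print(f"~95.4% interval: {keep.min()}-{keep.max()} cal BP")
```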
Self-poisoning with pesticides is among the major methods of suicide worldwide and accounted for one-fifth of suicides in South Korea in 2006–2010. We investigated long-term trends in pesticide suicide rates in South Korea and factors related to these trends.
Methods
We calculated age-standardised rates of pesticide suicide in South Korea (1983–2014) using registered death data. We used a graphical approach and joinpoint regression analysis to examine secular trends in pesticide suicide by sex, age and area, and a time-series analysis to investigate the association of the pesticide suicide rate with socioeconomic and agriculture-related factors. Age, period and cohort effects were examined using the intrinsic estimator method.
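Joinpoint software estimates the breakpoints themselves; within any one segment, the annual percent change (APC) comes from a log-linear fit, as in this minimal Python sketch (the rates are invented for illustration).

```python
import numpy as np
import statsmodels.api as sm

def annual_percent_change(years, rates):
    """APC from a log-linear fit: rate = exp(a + b*year), APC = 100*(e^b - 1)."""
    X = sm.add_constant(np.asarray(years, dtype=float))
    b = sm.OLS(np.log(np.asarray(rates, dtype=float)), X).fit().params[1]
    return 100 * (np.exp(b) - 1)

# Illustrative segment with a steep rise, akin to the 2000-2003 period.
print(round(annual_percent_change([2000, 2001, 2002, 2003],
                                  [10.0, 13.0, 16.9, 22.0]), 1))
```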
Results
The age-standardised rate of pesticide suicide fluctuated between 1983 and 2000, increased markedly in 2000–2003 (annual percent change 29.7%) and then fell gradually (annual percent change −6.3%) in 2003–2011. Following the paraquat ban (2011–2012), there was a marked reduction (annual percent change −28.2%) in 2011–2014. The trend in pesticide suicide was associated with the divorce rate but not with the other factors studied. Declines in pesticide suicide in 2003–2011 were most noticeable in younger groups and in metropolises; by contrast, adults aged 70+ living in rural areas showed an upward trend until the 2011–2012 paraquat ban, after which it turned downward. In the age–period–cohort modelling, being born between 1938 and 1947 was associated with higher pesticide suicide rates.
Conclusions
The pesticide suicide trend changed substantially in South Korea over the last three decades. Effective prevention should include close monitoring of trends and strong regulation of toxic pesticides.
A lack of understanding of the effects of single- and multiple-weed interference on soybean yield has led to inadequate weed management in Primorsky Krai, resulting in a much lower average yield than in neighboring regions. A 2-yr field experiment was conducted in a soybean field in Bogatyrka (43.82°N, 131.6°E), Primorsky Krai, Russia, in 2013 and 2014 to investigate and model the effects of single- and multiple-weed interference by naturally established weeds on soybean yield. Aboveground dry weight was the trait most negatively affected by weed interference, followed by the numbers of pods and seeds. Soybean yield under single-weed interference was best described by a rectangular hyperbolic model, which showed that common ragweed and barnyardgrass were the most competitive weed species, followed by annual sowthistle, American sloughgrass, and common lambsquarters. Under multiple-weed interference, soybean yield loss was accurately described by a multivariate rectangular hyperbolic model with total density equivalent as the independent variable. Parameter estimates indicated that weed-free soybean yields were similar in 2013 and 2014 (estimated at 1.72 and 1.75 t ha−1, respectively) and that the competitiveness of each weed species did not differ significantly between the two years. Economic thresholds for single-weed interference were 0.74, 0.66, 1.15, 1.23, and 1.45 plants m−2 for common ragweed, barnyardgrass, annual sowthistle, American sloughgrass, and common lambsquarters, respectively. The economic threshold for multiple-weed interference was 0.70 density equivalent m−2. These results, including the model, can thus be applied to a decision support system for weed management in soybean cultivation under single- and multiple-weed interference in Primorsky Krai and neighboring regions of Russia.
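For reference, the single-species rectangular hyperbolic yield-loss model referred to above (the Cousens 1985 form) can be fitted by non-linear least squares as sketched below; the densities and yields are invented for illustration, with the weed-free yield set near the reported ~1.7 t ha−1. The economic threshold is then the density at which the value of the predicted yield loss equals the cost of control.

```python
import numpy as np
from scipy.optimize import curve_fit

def cousens_yield(d, ywf, i, a):
    """Rectangular hyperbolic yield-loss model (Cousens 1985).
    d: weed density (plants per m2); ywf: weed-free yield (t/ha);
    i: % yield loss per unit density as d -> 0; a: maximum % loss."""
    return ywf * (1 - (i * d) / (100 * (1 + i * d / a)))

# Invented densities and yields (t/ha) for illustration only.
d = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])
y = np.array([1.74, 1.55, 1.40, 1.18, 0.95, 0.78])

params, _ = curve_fit(cousens_yield, d, y, p0=[1.7, 15.0, 60.0])
print(dict(zip(["ywf", "i", "a"], params.round(2))))
```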
Personality may predispose family caregivers to experience caregiving differently in similar situations and may influence caregiving outcomes. However, little research has examined the role of personality traits in the health-related quality of life (HRQoL) of family caregivers of persons with dementia (PWD) in relation to burden and depression.
Methods:
Data from a large clinic-based national study in South Korea, the Caregivers of Alzheimer's Disease Research (CARE) study, were analyzed (N = 476). Path analysis was performed to explore the associations between family caregivers' personality traits and HRQoL. With depression and burden as mediating factors, the direct and indirect associations between the five personality traits and the HRQoL of family caregivers were examined.
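Path analysis of this kind is usually run in dedicated SEM software; the core mediation logic behind it, an indirect effect as the product of two regression coefficients, can be sketched in plain Python as follows (all data below are synthetic and the variable names are assumptions).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data; variable names are assumptions.
rng = np.random.default_rng(4)
n = 476
neuro = rng.normal(size=n)
depression = 0.4 * neuro + rng.normal(size=n)
hrqol = -0.5 * depression - 0.1 * neuro + rng.normal(size=n)
data = pd.DataFrame({"neuro": neuro, "depression": depression,
                     "hrqol": hrqol})

# Indirect effect of neuroticism on HRQoL via depression:
# a (neuro -> depression) times b (depression -> hrqol, adjusting for neuro).
a = smf.ols("depression ~ neuro", data=data).fit().params["neuro"]
full = smf.ols("hrqol ~ neuro + depression", data=data).fit()
print(f"indirect = {a * full.params['depression']:.3f}, "
      f"direct = {full.params['neuro']:.3f}")
```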
Results:
The results demonstrated the mediating role of caregiver burden and depression in linking two personality traits (neuroticism and extraversion) with HRQoL. Neuroticism and extraversion influenced caregivers' mental HRQoL both directly and indirectly, but influenced their physical HRQoL only indirectly. Neuroticism increased caregiver depression, whereas extraversion decreased it. Only neuroticism acted through burden to influence depression and mental and physical HRQoL.
Conclusions:
Personality traits can influence caregiving outcomes and may be viewed as an individual resource of the caregiver. A family caregiver's personality characteristics should be assessed so that support programs can be tailored to obtain optimal benefit from caregiver interventions.
This study evaluated the impact of earlier traumatic events on the mental health of older adults, in terms of mental disorders and mental well-being, according to sociodemographic variables, trauma-related characteristics, and personality traits, in a nationally representative sample of older Koreans.
Methods:
A total of 1,621 subjects aged 60 to 74 years from a Korean national epidemiological survey of mental disorders completed face-to-face interviews. The Korean Composite International Diagnostic Interview was used to investigate lifetime trauma exposure (LTE) and psychiatric diagnoses. The EuroQol health classification system and a life satisfaction scale were used to assess quality of life (QoL), and the Big Five Inventory-10 (BFI-10) was used to measure personality traits.
Results:
Five hundred seventy-seven subjects (35.6%) reported a history of LTE (mean age at trauma, 30.8 years). Current mental disorders were more prevalent among older people with LTE, whereas better current QoL was more frequent among those without LTE. Among older people with LTE, lower extraversion and higher neuroticism increased the risk of current mood or anxiety disorders, whereas higher extraversion increased the probability of experiencing mental well-being, after adjusting for sociodemographic and trauma-related variables.
Conclusion:
Personality traits, especially extraversion and neuroticism, may be useful for predicting the mental health outcomes of LTE in older adults. Further longitudinal studies investigating the relationship between traumatic events and mental health outcomes are needed.
Some clinical studies have reported reduced peripheral glial cell line-derived neurotrophic factor (GDNF) levels in elderly patients with major depressive disorder (MDD). We examined whether a reduction in plasma GDNF level was associated with MDD.
Method
Plasma GDNF level was measured in 23 healthy control subjects and 23 MDD patients before and after 6 weeks of treatment.
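The comparisons described amount to independent-samples and paired t-tests; a minimal Python sketch with invented values follows (the abstract does not state which tests were used, so this is only one plausible reading).

```python
import numpy as np
from scipy import stats

# Invented plasma GDNF values (pg/mL) for illustration only.
rng = np.random.default_rng(6)
controls = rng.normal(50, 10, 23)
mdd_baseline = rng.normal(48, 10, 23)
mdd_week6 = mdd_baseline + rng.normal(1, 5, 23)

# Between-group comparison at baseline (independent samples).
print(stats.ttest_ind(controls, mdd_baseline))
# Within-patient change from baseline to week 6 (paired samples).
print(stats.ttest_rel(mdd_baseline, mdd_week6))
```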
Results
Plasma GDNF levels in MDD patients at baseline did not differ from those in healthy controls, nor did they change significantly from baseline to the end of treatment. However, GDNF levels were significantly lower in recurrent-episode MDD patients than in first-episode patients, both before and after treatment.
Conclusions
Our findings revealed significantly lower plasma GDNF levels in recurrent-episode MDD patients, although plasma GDNF levels in MDD patients and healthy controls did not differ significantly. The discrepancy between our study and previous studies might arise from differences in the recurrence of depression or in the ages of the MDD patients.
Decreased hemoglobin levels increase the risk of developing dementia in the elderly. However, the mechanisms linking decreased hemoglobin levels to incident dementia remain unclear, possibly because few studies have examined the relationship between low hemoglobin levels and neuroimaging markers. We therefore investigated the relationships between decreased hemoglobin levels, cerebral small-vessel disease (CSVD), and cortical atrophy in cognitively healthy women and men.
Methods:
Cognitively normal women (n = 1,022) and men (n = 1,018) who underwent medical check-ups and magnetic resonance imaging (MRI) were enrolled at a health promotion center. We assessed hemoglobin levels, white matter hyperintensity (WMH) scales, lacunes, and microbleeds. Cortical thickness was measured automatically using surface-based methods. Multivariate regression analyses were performed after controlling for possible confounders.
Results:
Decreased hemoglobin levels were not associated with the presence of WMH, lacunes, or microbleeds in either women or men. Among women, decreased hemoglobin levels were associated with decreased cortical thickness in the frontal (estimate −0.007; 95% confidence interval −0.013, −0.001), temporal (−0.010; −0.018, −0.002), parietal (−0.009; −0.015, −0.003), and occipital regions (−0.011; −0.019, −0.003). Among men, however, no associations were observed between hemoglobin levels and cortical thickness.
Conclusion:
Our findings suggest that decreased hemoglobin levels contribute to cortical atrophy, but not to increased CSVD, among women, although the association is modest. Given the paucity of modifiable risk factors for age-related cognitive decline, our results have important public health implications.
There is increasing evidence of a relationship between underweight or obesity and dementia risk. Several studies have investigated the relationship between body weight and brain atrophy, a pathological change preceding dementia, but their results are inconsistent. Therefore, we aimed to evaluate the relationship between body mass index (BMI) and cortical atrophy among cognitively normal participants.
Methods:
We recruited cognitively normal participants (n = 1,111) who underwent medical check-ups and detailed neurologic screening, including magnetic resonance imaging (MRI), during health screening visits between September 2008 and December 2011. The main outcome was cortical thickness measured using MRI. The numbers of men/women in the five BMI groups (underweight, normal, overweight, mild obesity, and moderate-to-severe obesity) were 9/9, 148/258, 185/128, 149/111, and 64/50, respectively. Linear and non-linear relationships between BMI and cortical thickness were examined using multiple linear regression analysis and generalized additive models, after adjustment for potential confounders.
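The non-linear analysis described corresponds to a generalized additive model; the sketch below fits a B-spline smooth of BMI on synthetic data using statsmodels (all variable names and values are assumptions, not the study data).

```python
import numpy as np
import pandas as pd
from statsmodels.gam.api import BSplines, GLMGam

# Synthetic stand-in data with a mild inverted-U shape around BMI ~ 25.
rng = np.random.default_rng(5)
n = 1111
bmi = rng.uniform(16, 35, n)
age = rng.uniform(45, 75, n)
thickness = (2.45 - 0.0006 * (bmi - 25) ** 2 - 0.002 * age
             + rng.normal(0, 0.05, n))
data = pd.DataFrame({"bmi": bmi, "age": age, "thickness": thickness})

# B-spline smooth for BMI; age enters linearly via the formula.
bs = BSplines(data[["bmi"]], df=[8], degree=[3])
gam = GLMGam.from_formula("thickness ~ age", data=data, smoother=bs).fit()
print(gam.summary())
```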
Results:
Among men, underweight participants showed significant cortical thinning in the frontal and temporal regions compared with normal-weight participants, while overweight and mildly obese participants had greater cortical thickness in the frontal region and in the frontal, temporal, and occipital regions, respectively. However, cortical thickness in each brain region did not differ significantly between the normal-weight and moderate-to-severe obesity groups. Among women, the association between BMI and cortical thickness was not statistically significant.
Conclusions:
Our findings suggest that underweight may be an important risk factor for pathological changes in the brain, whereas overweight or mild obesity may be inversely associated with cortical atrophy, in cognitively normal elderly males.
Epidemiological studies have reported that higher education (HE) is associated with a reduced risk of incident Alzheimer's disease (AD). After the clinical onset of AD, however, patients with HE levels show more rapid cognitive decline than patients with lower education (LE) levels. Although education level and cognition have been linked, few longitudinal studies have investigated the relationship between education level and cortical decline in patients with AD. The aim of this study was to compare the topography of cortical atrophy longitudinally between AD patients with HE (HE-AD) and AD patients with LE (LE-AD).
Methods:
We prospectively recruited 36 patients with early-stage AD and 14 normal controls. The patients were classified into two groups according to educational level: 23 HE-AD (>9 years of education) and 13 LE-AD (≤9 years).
Results:
As AD progressed over the 5-year longitudinal follow-up, the HE-AD group showed a significant group-by-time interaction in the right dorsolateral frontal and precuneus regions and the left parahippocampal region compared with the LE-AD group.
Conclusion:
Our study provides preliminary longitudinal evidence that HE accelerates cortical atrophy in AD patients over time, underlining the importance of education level in predicting prognosis.
Background: Holt–Oram syndrome is characterised by CHD and limb anomalies. Mutations in the TBX5 gene, which encodes a T-box transcription factor, are responsible for Holt–Oram syndrome, but such mutations are detected in only 30–75% of patients. Methods: Eight clinically diagnosed Holt–Oram syndrome patients from six families were evaluated for their clinical characteristics, focusing in particular on cardiac manifestations, and their molecular aetiologies. In addition to TBX5, the SALL4, NKX2.5, and GATA4 genes, which are known to regulate cardiac development by interacting physically and functionally with TBX5, were also analysed. Multiple ligation-dependent probe amplification analysis was performed to detect exonic deletions and duplications in these genes. Results: All included patients showed cardiac septal defects and upper-limb anomalies. Of the eight patients, seven underwent cardiac surgery and four suffered from conduction abnormalities such as severe sinus bradycardia and complete atrioventricular block. Although our patients showed typical clinical findings of Holt–Oram syndrome, only three distinct TBX5 mutations were detected, in three families: one nonsense, one splicing, and one missense mutation. No mutations were identified in the SALL4, NKX2.5, or GATA4 genes. Conclusions: All Holt–Oram syndrome patients in this study showed cardiac septal anomalies, and half of the families carried TBX5 mutations. Understanding the genetic causes of inherited CHD such as Holt–Oram syndrome helps in caring for patients and their families. Further large-scale genomic research is required to identify the genes responsible for cardiac manifestations and genotype–phenotype relations in Holt–Oram syndrome.
Non-destructive high-throughput phenotyping based on phenomics is an emerging technology for assessing the genetic diversity of various traits and for screening in breeding programmes. In this study, non-destructive measurements of leaf temperature and chlorophyll fluorescence were used to investigate the physiological responses of soybean (Glycine max) to salt stress and to establish a non-destructive screening method. Two-week-old soybean seedlings at the V2 stage were treated with 0, 12.5, 25, 50 and 100 mM NaCl to induce salt stress. Photosynthetic rate, stomatal conductance and chlorophyll fluorescence decreased significantly, while soybean leaf temperature increased, showing a positive correlation with NaCl concentration (P < 0.001). Leaf temperature increased significantly at 50 mM NaCl compared with the untreated control, although no visual symptoms were observed. We selected leaf temperature as the major physiological parameter of salt stress because it is much easier, faster and cheaper to measure than the other physiological parameters. Leaf temperature can therefore be used as a non-destructive, phenomic parameter for evaluating responses to salt stress in soybean. The results of this study suggest that non-destructive parameters such as chlorophyll fluorescence and leaf temperature are useful tools for assessing the genetic diversity of soybean with regard to salt stress tolerance and for screening salt-tolerant soybean for breeding.
It is controversial whether Borna disease virus (BDV) infects humans and causes psychiatric disorders.
Objectives:
The relationship between BDV infection and schizophrenia with deficit syndrome was investigated.
Study design:
Using the Schedule for the Deficit Syndrome, 62 schizophrenic in-patients were selected from three psychiatric hospitals. RNA was extracted from peripheral blood mononuclear cells and analyzed using nested reverse transcriptase-polymerase chain reaction with primers to detect BDV p24 and p40.
Results and conclusions:
BDV transcripts were not detected in samples from any of the 62 schizophrenic patients. These data do not support an etiologic association between BDV infection and the deficit form of schizophrenia.
Sesame (Sesamum indicum L.) is one of the oldest oil crops and is widely cultivated in Asia and Africa. The aim of this study was to assess the genetic diversity, phylogenetic relationships and population structure of 277 sesame core collection accessions from 15 countries on four continents. A total of 158 alleles were detected among the sesame accessions, varying from 3 to 25 alleles per locus with an average of 11.3. Polymorphism information content (PIC) values ranged from 0.34 to 0.84, with an average of 0.568, indicating high genetic diversity at the 14 loci both among and within the populations. Forty-four genotype-specific alleles were identified in 12 of the 14 polymorphic simple sequence repeat markers. Because the core collection preserved a high level of genetic variation, 10.1% was selected as the best sampling percentage from the whole collection for constructing the core collection. The 277 core collection accessions formed four robust clusters in the unweighted pair group method with arithmetic mean (UPGMA) dendrogram, although the clustering did not show any clear division among the sesame accessions based on geographical origin. Similar patterns were obtained using model-based structure analysis and country-based dendrograms, as some accessions from geographically distant locations were grouped in the same cluster. These analyses improve our understanding of the genotype-specific alleles, genetic diversity and population structure of core collections, and the information can be used to develop future breeding strategies to improve sesame yield.
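For reference, the polymorphism information content quoted above is the standard Botstein et al. (1980) statistic, computable from allele frequencies as follows (the example locus is hypothetical).

```python
import numpy as np

def pic(freqs):
    """Polymorphism information content (Botstein et al. 1980):
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    p = np.asarray(freqs, dtype=float)
    assert np.isclose(p.sum(), 1.0), "allele frequencies must sum to 1"
    homo = float(np.sum(p ** 2))
    cross = sum(2 * p[i] ** 2 * p[j] ** 2
                for i in range(len(p)) for j in range(i + 1, len(p)))
    return 1.0 - homo - cross

# A locus with four equally frequent alleles (illustrative only).
print(round(pic([0.25, 0.25, 0.25, 0.25]), 3))  # 0.703
```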
To investigate whether low vitamin D status was related to insulin resistance (IR) or impaired fasting glucose (IFG) in Korean adolescents, after adjusting for total body fat mass (FM).
Design
A cross-sectional study.
Setting
Korea National Health and Nutrition Examination Survey (KNHANES) 2009–2010.
Subjects
In total, 1466 participants (769 males) aged 10–19 years were assessed for serum 25-hydroxyvitamin D (25(OH)D) levels, for FM by whole-body dual-energy X-ray absorptiometry and for IR by homeostasis model assessment (HOMA-IR) after an 8 h fast.
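For reference, HOMA-IR is derived from the fasting measurements by the standard formula, as in this minimal sketch.

```python
def homa_ir(insulin_uU_per_ml: float, glucose_mmol_per_l: float) -> float:
    """HOMA-IR = fasting insulin (uU/mL) x fasting glucose (mmol/L) / 22.5.
    If glucose is reported in mg/dL, divide it by 18 to get mmol/L first."""
    return insulin_uU_per_ml * glucose_mmol_per_l / 22.5

# Example: insulin 10 uU/mL and glucose 5.0 mmol/L (90 mg/dL) -> ~2.22.
print(round(homa_ir(10.0, 5.0), 2))
```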
Results
Age-, sex-, season- and physical-activity-adjusted regression models showed that serum 25(OH)D levels were significantly related to markers of adiposity (P = 0·016 for FM (g), P = 0·023 for FM (%) and P = 0·035 for fat mass index). When the participants were stratified into three 25(OH)D categories (<37·5 nmol/l (n 553), 37·5 to <50 nmol/l (n 543) and ≥50 nmol/l (n 370)), significantly decreasing trends were observed for fasting insulin (all P < 0·001), HOMA-IR (all P < 0·001) and the odds ratios for IFG (all P for trend < 0·05) from the lowest to the highest 25(OH)D category, after adjustment for age, sex, physical activity and all markers of adiposity. In the multivariate logistic regression analysis, the odds of IFG among participants in the lowest serum 25(OH)D category were 2·96–3·15 times those of participants in the highest category (all P < 0·05).
Conclusions
There was a significant inverse relationship between vitamin D status and IR and the risk of IFG, independent of adiposity, in Korean adolescents.