There are growing concerns about the impact of the COVID-19 pandemic on the mental health of older adults. We examined the effect of the pandemic on the risk of depression in older adults.
Methods
We analyzed data from a prospective cohort study of Korean older adults that has been followed every 2 years. Among the 2308 participants who completed both the third and fourth follow-up assessments, 58.4% completed the fourth follow-up before the outbreak of COVID-19 and the rest completed it during the pandemic. We conducted face-to-face diagnostic interviews using the Mini International Neuropsychiatric Interview and assessed depressive symptoms with the Geriatric Depression Scale. We performed generalized estimating equation and logistic regression analyses.
Results
The COVID-19 pandemic was associated with increased depressive symptoms in older adults [b (standard error) = 0.42 (0.20), p = 0.040] and a doubling of the risk for incident depressive disorder even in euthymic older adults without a history of depression (odds ratio = 2.44, 95% confidence interval 1.18–5.02, p = 0.016). Fewer social activities, which were associated with the risk of depressive disorder before the pandemic, were not associated with that risk during the pandemic. However, fewer family gatherings, which were not associated with the risk of depressive disorder before the pandemic, were associated with a doubled risk of depressive disorder during the pandemic.
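The doubled risk reported above is an odds ratio with a Wald-type confidence interval. As a minimal sketch of how such an estimate is derived from a 2×2 table (the counts below are hypothetical, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20 incident cases among 500 participants assessed
# during the pandemic vs 10 cases among 600 assessed before the outbreak.
or_, lo, hi = odds_ratio_ci(20, 480, 10, 590)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

The study's multivariable estimate comes from logistic regression rather than a raw table, but the Wald construction of the interval on the log-odds scale is the same.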
Conclusions
The COVID-19 pandemic significantly increased the risk of late-life depression in the community. Older adults who lack family gatherings may be particularly vulnerable.
Background: The purpose of this study was to investigate the relationship between the appropriateness of antibiotic prescription and clinical outcomes in patients with community-acquired acute pyelonephritis (CA-APN). Methods: A multicenter prospective cohort study was performed in 8 Korean hospitals from September 2017 to August 2018. All hospitalized patients aged ≥19 years diagnosed with CA-APN at admission were recruited. Pregnant women and patients with insufficient data were excluded, as were patients whose hospitalization was prolonged by medical problems unrelated to APN treatment. The appropriateness of empirical and definitive antibiotics was classified as “optimal,” “suboptimal,” or “inappropriate,” with optimal and suboptimal regarded as appropriate antibiotic use. The standard for classifying empirical antibiotics was based on the 2018 Korean national guideline for antibiotic use in urinary tract infections. The standards for classifying definitive antibiotics were based on the in vitro susceptibility results of the causative organisms. Clinical outcomes, including clinical failure (mortality or recurrence) rate, hospitalization days, and medical costs, were compared between patients who were prescribed antibiotics appropriately and those who were not. Results: In total, 397 and 318 patients were eligible for the analysis of the appropriateness of empirical and definitive antibiotics, respectively. Of these, 10 (2.5%) and 18 (5.7%) were prescribed empirical and definitive antibiotics inappropriately, respectively, and 28 (8.8%) were prescribed either empirical or definitive antibiotics inappropriately.
Patients who were prescribed empirical antibiotics appropriately showed a lower mortality rate (0% vs 10%; P = .025), shorter hospitalization (9 vs 12.5 days; P = .014), and lower medical costs (US$2,333 vs US$4,531; P = .007) than those who were prescribed empirical antibiotics inappropriately. In contrast, we detected no significant differences in clinical outcomes between patients who were prescribed definitive antibiotics appropriately and those who were not. Patients who were prescribed both empirical and definitive antibiotics appropriately showed a lower clinical failure rate (0.3% vs 7.1%; P = .021) and shorter hospitalization (9 vs 10.5 days; P = .041) than those who were prescribed either empirical or definitive antibiotics inappropriately. Conclusions: Appropriate use of antibiotics leads to better clinical outcomes in patients with CA-APN, including fewer hospitalization days and lower medical costs.
Serotonergic dysfunction may play an important role in motor and nonmotor symptoms of Parkinson’s disease (PD). The loudness dependence of auditory evoked potentials (LDAEP) has been used to evaluate serotonergic activity. Therefore, this study aimed to assess central serotonergic activity in de novo PD using LDAEP according to age at onset, and to evaluate changes in serotonergic activity after dopaminergic treatment.
Methods:
A total of 30 patients with unmedicated PD were enrolled: 16 in the early-onset group and 14 in the late-onset group. All subjects underwent a comprehensive neurological examination, laboratory tests, the Unified Parkinson’s Disease Rating Scale, and LDAEP. The LDAEP was calculated as the slope of the N1/P2 peak amplitudes measured at the Cz electrode, first at baseline (pretreatment) and again after 12 weeks of dopaminergic medication (post-treatment).
Results:
The absolute values of the pretreatment N1/P2 LDAEP (early-onset vs late-onset, 0.99 ± 0.68 vs 1.62 ± 0.88, p = 0.035) and the post-treatment N1 LDAEP (−0.61 ± 0.61 vs −1.26 ± 0.91, p = 0.03) were significantly lower in the early-onset group than in the late-onset group. In addition, a higher pretreatment N1/P2 LDAEP was significantly associated with late onset (coefficient = 1.204, p = 0.044). The absolute value of the N1 LDAEP decreased after 12 weeks of dopaminergic medication (pretreatment vs post-treatment, −1.457 ± 1.078 vs −0.904 ± 0.812, p = 0.0018).
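The LDAEP is conventionally the least-squares slope of the N1/P2 peak-to-peak amplitude regressed on stimulus loudness. A minimal sketch of that slope calculation (the five intensities and amplitude values below are illustrative, not data from this study):

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Hypothetical tone intensities (dB SPL) and N1/P2 amplitudes (microvolts) at Cz
intensities = [60, 70, 80, 90, 100]
amplitudes = [4.0, 5.5, 7.1, 8.4, 10.2]
# A steeper slope is typically interpreted as lower serotonergic activity
ldaep = ols_slope(intensities, amplitudes)
```

In practice the amplitudes are averaged over many trials per intensity before the slope is fitted; only the final regression step is sketched here.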
Conclusions:
Based on the results of this study, LDAEP could be a marker for serotonergic neurotransmission in PD. Central serotonergic activity assessed by LDAEP may be more preserved in early-onset PD patients and can be altered with dopaminergic medication.
Refugees commonly experience difficulties with emotional processing, such as alexithymia, due to stressful or traumatic experiences. However, the functional connectivity of the amygdala, which is central to emotional processing, has yet to be assessed in refugees. Thus, the present study investigated the resting-state functional connectivity of the amygdala and its association with emotional processing in North Korean (NK) refugees.
Methods
This study included 45 NK refugees and 40 native South Koreans (SK). All participants were administered the Toronto Alexithymia Scale (TAS), the Beck Depression Inventory (BDI), and the Clinician-Administered PTSD Scale (CAPS), and differences between NK refugees and native SK in the resting-state functional connectivity of the amygdala were assessed. Additionally, the association between the strength of amygdala connectivity and the TAS score was examined.
Results
Resting-state connectivity values from the left amygdala to the bilateral dorsolateral prefrontal cortex (dlPFC) and dorsal anterior cingulate cortex (dACC) were higher in NK refugees than in native SK. Additionally, the strength of connectivity between the left amygdala and right dlPFC was positively associated with TAS score after controlling for the number of traumatic experiences and BDI and CAPS scores.
Conclusions
The present study found that NK refugees exhibited heightened frontal–amygdala connectivity, and that this connectivity was correlated with alexithymia. The present results suggest that increased frontal–amygdala connectivity in refugees may represent frontal down-regulation of the amygdala, which in turn may produce alexithymia.
Our objective was to evaluate long-term altered appearance, distress, and body image in posttreatment breast cancer patients and compare them with those of patients undergoing active treatment and with general population controls.
Method:
We conducted a cross-sectional survey between May and December of 2010. We studied 138 breast cancer patients undergoing active treatment and 128 posttreatment patients from 23 Korean hospitals and 315 age- and area-matched subjects drawn from the general population. Breast, hair, and skin changes, distress, and body image were assessed using visual analogue scales and the EORTC BR-23. Average levels of distress were compared across groups, and linear regression was utilized to identify the factors associated with body image.
Results:
Compared to active-treatment patients, posttreatment patients reported similar breast changes (6.6 vs. 6.2), hair loss (7.7 vs. 6.7), and skin changes (5.8 vs. 5.4), and both groups had significantly more severe changes than the general population controls (p < 0.01). For a similar level of altered appearance, however, breast cancer patients experienced significantly higher levels of distress than the general population. In multivariate analysis, patients with high altered-appearance distress reported significantly poorer body image (−20.7, 95% CI = −28.3 to −13.1) than patients with low distress.
Significance of results:
Posttreatment breast cancer patients experienced similar levels of altered appearance, distress, and body-image disturbance relative to patients undergoing active treatment but significantly higher distress and poorer body image than members of the general population. Healthcare professionals should acknowledge the possible long-term effects of altered appearance among breast cancer survivors and help them to manage the associated distress and psychological consequences.
Personality may predispose family caregivers to experience caregiving differently in similar situations and influence the outcomes of caregiving. A limited body of research has examined the role of personality traits in health-related quality of life (HRQoL) among family caregivers of persons with dementia (PWD) in relation to burden and depression.
Methods:
Data from a large clinic-based national study in South Korea, the Caregivers of Alzheimer's Disease Research (CARE), were analyzed (N = 476). Path analysis was performed to explore the association between family caregivers’ personality traits and HRQoL. With depression and burden as mediating factors, direct and indirect associations between five personality traits and HRQoL of family caregivers were examined.
Results:
Results demonstrated the mediating role of caregiver burden and depression in linking two personality traits (neuroticism and extraversion) with HRQoL. Neuroticism and extraversion influenced the mental HRQoL of caregivers both directly and indirectly, but influenced their physical HRQoL only indirectly. Neuroticism increased caregiver depression, whereas extraversion decreased it. Only neuroticism influenced depression and mental and physical HRQoL through the mediation of burden.
Conclusions:
Personality traits can influence caregiving outcomes and may be viewed as an individual resource of the caregiver. A family caregiver's personality characteristics should be assessed so that support programs can be tailored to obtain optimal benefit from caregiver interventions.
To examine the hypothesis that the association between vitamin D deficiency and depressive symptoms is dependent upon total cholesterol level in a representative national sample of the South Korean population.
Design
This was a population-based cross-sectional study.
Setting
The Fifth Korean National Health and Nutrition Examination Survey (KNHANES V, 2010–2012).
Subjects
We included 7198 adults aged 20–88 years.
Results
The odds of depressive symptoms in individuals with vitamin D deficiency (serum 25-hydroxyvitamin D < 20 ng/ml) were 1·54-fold (95 % CI 1·20, 1·98) greater than in individuals without vitamin D deficiency (serum 25-hydroxyvitamin D ≥ 20 ng/ml). The relationship was stronger in individuals with normal-to-borderline serum total cholesterol (serum total cholesterol < 240 mg/dl; OR=1·60; 95 % CI 1·23, 2·08) and non-significant in individuals with high serum total cholesterol (OR=0·97; 95 % CI 0·52, 1·81) after adjustment for confounding variables (age, sex, BMI, alcohol consumption, smoking status, regular exercise, income level, education level, marital status, changes in body weight, perceived body shape, season of examination date and cholesterol profiles).
Conclusions
The association between vitamin D deficiency and depressive symptoms was weakened by high serum total cholesterol status. These findings suggest that both vitamin D and total cholesterol are important targets for the prevention and treatment of depression.
During the past decade, carbapenemase-producing Enterobacteriaceae (CPE) have emerged and spread across the world.1 The major carbapenemase enzymes currently being reported are KPC, NDM-1, VIM, IMP, and OXA.2 Because carbapenemase genes can be efficiently transmitted via mobile genetic elements, and current therapeutic options for CPE infections are extremely limited, CPE may be one of the most serious contemporary threats to public health. However, very little is known about the characteristics of CPE carriage during hospitalization. The aims of this study were to investigate the clearance rate of CPE carriage and to determine the number of consecutive negative cultures required to confirm CPE clearance. We also examined CPE transmission among hospitalized patients.
Infect. Control Hosp. Epidemiol. 2015;36(11):1361–1362
Decreased hemoglobin levels increase the risk of developing dementia among the elderly. However, the mechanisms that link decreased hemoglobin levels to incident dementia remain unclear, possibly because few studies have examined the relationship between low hemoglobin levels and neuroimaging markers. We therefore investigated the relationships between decreased hemoglobin levels, cerebral small-vessel disease (CSVD), and cortical atrophy in cognitively healthy women and men.
Methods:
Cognitively normal women (n = 1,022) and men (n = 1,018) who underwent medical check-ups and magnetic resonance imaging (MRI) were enrolled at a health promotion center. We measured hemoglobin levels, white matter hyperintensities (WMH) scales, lacunes, and microbleeds. Cortical thickness was automatically measured using surface based methods. Multivariate regression analyses were performed after controlling for possible confounders.
Results:
Decreased hemoglobin levels were not associated with the presence of WMH, lacunes, or microbleeds in women or men. Among women, decreased hemoglobin levels were associated with decreased cortical thickness in the frontal (estimate −0.007, 95% confidence interval −0.013 to −0.001), temporal (−0.010, −0.018 to −0.002), parietal (−0.009, −0.015 to −0.003), and occipital regions (−0.011, −0.019 to −0.003). Among men, however, no associations were observed between hemoglobin levels and cortical thickness.
Conclusion:
Our findings suggest that decreased hemoglobin levels are associated with cortical atrophy, but not with increased CSVD, among women, although the association is modest. Given the paucity of modifiable risk factors for age-related cognitive decline, our results have important public health implications.
During the past decades, a rapid nutritional transition has been observed along with economic growth in the Republic of Korea. Since this dramatic change in diet has been frequently associated with cancer and other non-communicable diseases, dietary monitoring is essential to understand the association. Benefiting from pre-existing standardised dietary methodologies, the present study aimed to evaluate the feasibility and describe the development of a Korean version of the international computerised 24 h dietary recall method (GloboDiet software) and its complementary tools, developed at the International Agency for Research on Cancer (IARC), WHO. Following established international Standard Operating Procedures and guidelines, about seventy common and country-specific databases on foods, recipes, dietary supplements, quantification methods and coefficients were customised and translated. The main results of the present study highlight the specific adaptations made to tailor the GloboDiet software for research and dietary surveillance in Korea. New (sub-)subgroups were added to the existing common food classification, and new descriptors were added to the facets to classify and describe specific Korean foods. Quantification methods were critically evaluated and adapted considering the foods and food packages available on the Korean market. Furthermore, a picture book of foods/dishes was prepared, including new pictures and food portion sizes relevant to the Korean diet. The development of the Korean version of GloboDiet demonstrated that it was possible to adapt the IARC-WHO international dietary tool to an Asian context without compromising its concept of standardisation and software structure. It thus confirms that this international dietary methodology, used so far only in Europe, is flexible and robust enough to be customised for other regions worldwide.
To determine the influence of early pain relief for patients with suspected appendicitis on the diagnostic performance of surgical residents.
Methods
A prospective, randomized, double-blind, placebo-controlled trial was conducted in patients with suspected appendicitis. The patients were randomized to receive placebo (intravenous [IV] normal saline) or the study drug (morphine 5 mg IV) infused over 5 minutes. All clinical evaluations by surgical residents were performed 30 minutes after administration of the study drug or placebo. After the surgical residents recorded the clinical probability of appendicitis, abdominal computed tomography was performed. The primary objective was to compare the influence of IV morphine on the ability of surgical residents to diagnose appendicitis.
Results
A total of 213 patients with suspected appendicitis were enrolled. Of these, 107 received morphine and 106 received placebo saline. The negative appendectomy percentages in the two groups were similar (3.8% in the placebo group and 3.2% in the pain control group, p=0.62), as were the perforation rates (18.9% in the placebo group and 14.3% in the pain control group, p=0.75). Receiver operating characteristic analysis revealed similar overall diagnostic accuracy in the two groups (area under the curve, 0.63 in the placebo group vs. 0.61 in the pain control group, p=0.81).
Conclusions
Early pain control in patients with suspected appendicitis does not affect the diagnostic performance of surgical residents.
There is increasing evidence of a relationship between underweight or obesity and dementia risk. Several studies have investigated the relationship between body weight and brain atrophy, a pathological change preceding dementia, but their results are inconsistent. Therefore, we aimed to evaluate the relationship between body mass index (BMI) and cortical atrophy among cognitively normal participants.
Methods:
We recruited cognitively normal participants (n = 1,111) who underwent medical checkups and detailed neurologic screening, including magnetic resonance imaging (MRI), during health screening visits between September 2008 and December 2011. The main outcome was cortical thickness measured using MRI. The numbers of men/women in the five BMI groups were 9/9, 148/258, 185/128, 149/111, and 64/50 for underweight, normal weight, overweight, mild obesity, and moderate-to-severe obesity, respectively. Linear and non-linear relationships between BMI and cortical thickness were examined using multiple linear regression analysis and generalized additive models after adjustment for potential confounders.
Results:
Among men, underweight participants showed significant cortical thinning in the frontal and temporal regions compared to normal weight participants, while overweight and mildly obese participants had greater cortical thicknesses in the frontal region and the frontal, temporal, and occipital regions, respectively. However, cortical thickness in each brain region was not significantly different in normal weight and moderate to severe obesity groups. Among women, the association between BMI and cortical thickness was not statistically significant.
Conclusions:
Our findings suggested that underweight might be an important risk factor for pathological changes in the brain, while overweight or mild obesity may be inversely associated with cortical atrophy in cognitively normal elderly males.
Epidemiological studies have reported that higher education (HE) is associated with a reduced risk of incident Alzheimer's disease (AD). However, after the clinical onset of AD, patients with HE levels show more rapid cognitive decline than patients with lower education (LE) levels. Although education level and cognition have been linked, there have been few longitudinal studies investigating the relationship between education level and cortical decline in patients with AD. The aim of this study was to compare the topography of cortical atrophy longitudinally between AD patients with HE (HE-AD) and AD patients with LE (LE-AD).
Methods:
We prospectively recruited 36 patients with early-stage AD and 14 normal controls. The patients were classified into two groups according to educational level, 23 HE-AD (>9 years) and 13 LE-AD (≤9 years).
Results:
As AD progressed over the 5-year longitudinal follow-up, the HE-AD group showed a significant group-by-time interaction in the right dorsolateral frontal and precuneus regions and the left parahippocampal region compared to the LE-AD group.
Conclusion:
Our preliminary longitudinal findings suggest that HE accelerates cortical atrophy in AD patients over time, underlining the importance of education level in predicting prognosis.
Visual hallucination (VH) is a common psychotic symptom in patients with Parkinson's disease (PD) and may be a significant predictor of cognitive impairment (CI) in such patients.
Objective:
This study aimed to investigate the pattern of glucose metabolism of VH and the relationship between VH and CI in PD.
Methods:
We studied 28 PD patients, including 15 with VH (PD-VH) and 13 without VH (PD-NVH). Of the 15 PD-VH patients, 8 patients had cognitive impairment (PD-VHCI) whereas 7 did not (PD-VHNCI). All patients underwent [18F] fluorodeoxyglucose positron emission tomography ([18F] FDG PET) followed by statistical parametric mapping (SPM) analyses.
Results:
Compared to patients with PD-NVH, PD-VHNCI patients showed glucose hypometabolism in the inferior and middle temporal cortices, fusiform gyri, and frontal areas, suggesting involvement of the ventral visual pathway. Compared to patients with PD-NVH, PD-VHCI patients showed glucose hypometabolism in the temporoparietal association cortices with scattered frontal areas.
Conclusion:
Dysfunction of the ventral visual pathway involving the temporal lobe may play a key role in VH development in PD patients. The evolving distribution of hypometabolism from the ventral visual pathway to more extensive posterior cortices in PD-VHCI patients suggests that VH may be a prodromal symptom occurring prior to CI in PD patients.
Cataract, defined as opacity of the lens in one or both eyes, is a major cause of blindness throughout the world and is not uncommon, particularly in the elderly population. Congenital cataracts, however, are rare, occurring in about 30 per 100,000 births. About one-third of cases are inherited without systemic abnormality. Importantly, congenital cataracts can produce deprivation amblyopia, refractive amblyopia, and retinal detachment, leading to lifelong visual impairment. Successful management depends on early diagnosis and referral for surgery when indicated. Here we present a case of hereditary bilateral cataracts in a dizygotic twin, detected on prenatal ultrasound examinations and postnatally confirmed as congenital cataracts associated with posterior lenticonus.
Sources of variation in nutrient intake have been examined for Western diets, but little is known about the sources of variation and their differences by age and sex among Koreans. We examined sources of variation in nutrient intake and calculated the number of days needed to estimate usual intake using 12 d of dietary records (DR). To this end, four 3 d DR including two weekdays and one weekend day were collected throughout four seasons of 1 year from 178 male and 236 female adults aged 20–65 years residing in Seoul, Korea. The sources of variation were estimated using the random-effects model, and the variation ratio (within-individual:between-individual) was calculated to determine a desirable number of days. Variations attributable to the day of the week, recording sequence and seasonality were generally small, although the degree of variation differed by sex and age (20–45 years and 46–65 years). The correlation coefficient between the true intake and the observed intake (r) increased with additional DR days, reaching 0·7 at 3–4 d and 0·8 at 6–7 d. However, the degree of increase became attenuated with additional days: r increased by 13·0–26·9 % from 2 to 4 d, by 6·5–16·4 % from 4 to 7 d and by 4·0–11·6 % from 7 to 12 d for energy and fifteen nutrients. In conclusion, the present study suggests that the day of the week, recording sequence and seasonality minimally contribute to the variation in nutrient intake. To measure Korean usual dietary intake using open-ended dietary instruments, 3–4 d may be needed to achieve modest precision (r>0·7) and 6–7 d for high precision (r>0·8).
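The mapping from the within:between-individual variance ratio (VR) to the number of record days needed relies on the standard attenuation relation r = sqrt(n / (n + VR)). A minimal sketch, with a hypothetical VR (the study estimates VR separately by nutrient, sex, and age group):

```python
import math

def observed_corr(vr, n):
    """Correlation between an n-day mean intake and true usual intake,
    given within:between-individual variance ratio vr."""
    return math.sqrt(n / (n + vr))

def days_needed(vr, r):
    """Smallest whole number of record days achieving correlation r."""
    return math.ceil(vr * r**2 / (1 - r**2))

vr = 3.0  # hypothetical variance ratio for illustration
n_modest = days_needed(vr, 0.7)  # days for modest precision (r > 0.7)
n_high = days_needed(vr, 0.8)    # days for high precision (r > 0.8)
```

With vr = 3.0 this yields 3 and 6 days, consistent with the 3–4 d and 6–7 d figures reported above for variance ratios in that range.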
We used a database approach in developing a dish-based, semi-quantitative FFQ for Korean diet and cancer research. Cancer-related dietary factors (CRDF) recognised in the scientific community and dietary intake data from the 2001 Korean National Health and Nutrition Examination Survey and the 2002 Korean National Nutrition Survey by Season were used. The list of dishes (n 993) was those reported to be consumed by individuals over 30 years of age during all four seasons. The resulting 112-dish list was selected using contribution analyses and variability analyses to detect between-person variation for CRDF and non-CRDF nutrients. Variations of each dish were grouped into one dish for the final list of 112 dishes, which were then linked to the nutrient database. The final 112 dish items consisted of nine Korean staple dishes, including rice and noodles, twenty-five soups and stews, fifty-four side dishes, nine beverages, nine fruit dishes and six alcoholic beverages. The percentage coverages of energy, protein, fat, carbohydrate and alcohol intake in the selected 112 dishes were 82·4, 76·4, 68·9, 86·0 and 99·8 %, respectively. Dietary exposure to cancer-related Korean dietary factors can be assessed by this new dish-based, semi-quantitative FFQ. This new instrument can calculate the intake of CRDF along with non-CRDF nutrient intake for cancer research.
Little is known about the bioavailability of isoflavones in children. Previous studies have shown that children excrete more isoflavone in urine compared with adults. Thus we examined the relationship between usual dietary isoflavone intake and the urinary excretion of isoflavonoids in Korean girls of pubertal age. Twelve girls each were selected from the lowest and the highest quartiles of isoflavone intake among 252 Korean girls aged 8–11 years. Age, BMI and sexual maturation stage were matched between the two groups. Dietary intakes for 3 d by diet record and overnight urine samples were collected at baseline and at 6 and 12 months. Total and individual isoflavone (daidzein, genistein and glycitein) intakes were calculated from diet records. The parent isoflavone compounds (daidzein, genistein and glycitein) and their metabolites (equol, O-desmethylangolensin (O-DMA), dihydrodaidzein and dihydrogenistein) present in the urine samples were analysed using liquid chromatography–MS. Intake levels of total and individual isoflavone compounds were significantly higher in the high isoflavone (HI) group than the levels in the low isoflavone (LI) group (P < 0·05). Urinary excretion of all isoflavone parent compounds was significantly higher in the HI group than in the LI group (P < 0·0001). Among isoflavone metabolites, only O-DMA and total metabolites were significantly different (P < 0·05). Total isoflavone intake was highly correlated with the urinary excretion of total parent compounds (r 0·68; P < 0·01), parent compounds plus their metabolites (r 0·66–0·69; P < 0·01) and total isoflavonoids (r 0·72; P < 0·0001). In conclusion, overnight urinary excretion of total isoflavonoids is a reliable biomarker of usual isoflavone intake in Korean girls of pubertal age.
This study compared the developmental competence of somatic cell nuclear transfer (SCNT) embryos reconstructed with different donor cells and analysed gene expression in the resulting embryos. Bovine fetal/adult ear fibroblasts and cumulus cells were used as donor cells and the developmental competence of the reconstructed embryos was monitored. The cell number and allocation in blastocysts were determined by differential staining. The Bax, E-cad, IF-tau, Hsp (heat shock protein) 70, Igf2r (insulin-like growth factor 2 receptor), DNMT (DNA methyltransferase) 1 and Mash (mammalian achaete-scute homologue) 2 genes were selected for gene expression analysis. The relative abundance (ratio to GAPDH mRNA) of gene transcripts in blastocysts was measured by semiquantitative reverse transcription-polymerase chain reaction. In experiment 1, development of SCNT preimplantation embryos and the cell numbers of inner cell masses and trophoblasts were not different among SCNT embryos derived from different cell types. In experiment 2, the relative expression of GAPDH and Hsp 70 transcripts was similar in all embryos. The expression of Bax, Igf2r and Mash2 transcripts was significantly increased in SCNT embryos reconstructed with adult fibroblasts. The E-cad transcript levels were reduced in SCNT embryos reconstructed with fetal fibroblasts. Relative abundance of DNMT1 in SCNT embryos derived from fetal fibroblasts was increased, and IF-tau expression in SCNT embryos derived from cumulus cells was increased. In conclusion, depending on the type of donor cells, preimplantation SCNT embryos displayed marked differences in gene expression. This may affect the developmental competence of SCNT embryos reconstructed with different cell types after implantation or during fetal growth in vivo.
The Korean Twin Registry is the first nationwide twin study in Korea. We compiled 154,783 twin pairs from existing nationwide data sources, mainly address and national health insurance data. The coverage of this registry is almost complete for twins born since 1970, but less complete with increasing age, so that there were only 990 pairs born before 1930. The twins' health examination (N = 54,390 persons) and questionnaire (N = 44,546 persons) results were incorporated into the registry, yielding 12,894 and 9,074 concordantly informative pairs, respectively. Morbidity and mortality outcomes have been followed up since 1990 for most diseases. For a preliminary analysis of complex diseases, we selected ventricular septal defects (VSD) in young twins and stomach and colorectal cancers in adult twins. We identified 353 VSDs, 284 stomach cancers, and 116 colorectal cancers among the twins. The prevalence rates of the cancers, but not of VSD, were lower in twins than in the general population. The difference in cancer prevalence was marked for twins born before 1926, implying some degree of selection. Like-sex (LS) twins showed familial recurrence risks (λLS) of 41.2 for VSD, 22.4 for colorectal cancers, and 1.74 for stomach cancers. For opposite-sex (OS) twins, we could estimate λOS (19.8) for VSD only. These results were compatible with previous studies for VSD and colorectal cancers, but not for stomach cancers. Despite the strengths in size, availability of health outcomes, and some lifestyle and basic laboratory data, we need accurate zygosity information to improve the validity of the results.