Nosocomial transmission of COVID-19 among immunocompromised hosts can have a serious impact on COVID-19 severity, underlying disease progression and SARS-CoV-2 transmission to other patients and healthcare workers within hospitals. We experienced a nosocomial outbreak of COVID-19 in a daycare unit for paediatric and young adult cancer patients. Between 9 and 18 November 2020, 473 individuals (181 patients, 247 caregivers/siblings and 45 staff members) were exposed to the index case, a member of the nursing staff. Among them, three patients and four caregivers were infected. Two 5-year-old cancer patients with COVID-19 were not severely ill, but a 25-year-old cancer patient showed prolonged shedding of SARS-CoV-2 RNA for at least 12 weeks and probably infected his mother at home approximately 7–8 weeks after his initial diagnosis. Apart from this case, no secondary transmission was observed from the confirmed cases in either the hospital or the community. In conclusion, in the daycare setting for immunocompromised children and young adults, the rate of in-hospital transmission of SARS-CoV-2 was 1.6% under a stringent infection prevention and control policy, including universal masking and rapid, extensive contact investigation. Severely immunocompromised children and young adults with COVID-19 should be managed carefully after the mandatory isolation period, keeping in mind the possibility of prolonged shedding of live virus.
There are growing concerns about the impact of the COVID-19 pandemic on the mental health of older adults. We examined the effect of the pandemic on the risk of depression in older adults.
Methods
We analyzed data from the prospective cohort study of Korean older adults, which has been followed up every 2 years. Among the 2308 participants who completed both the third and fourth follow-up assessments, 58.4% completed the fourth follow-up before the outbreak of COVID-19 and the rest completed it during the pandemic. We conducted face-to-face diagnostic interviews using the Mini International Neuropsychiatric Interview and administered the Geriatric Depression Scale. We performed generalized estimating equation and logistic regression analyses.
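The analytic approach can be illustrated with a minimal sketch. This is not the authors' code: the file and variable names (subject_id, gds, pandemic, incident_mdd, age, sex) are hypothetical placeholders for the cohort variables described above, and statsmodels is assumed as the analysis library.

```python
# Minimal sketch (hypothetical variable names, not the study code): a GEE for
# repeated depressive-symptom scores and a logistic regression for incident
# depressive disorder, assuming statsmodels is available.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.genmod.cov_struct import Exchangeable

df = pd.read_csv("cohort_long.csv")  # hypothetical long format: one row per follow-up visit

# GEE: depressive symptoms (GDS) vs. pandemic exposure, with within-subject
# correlation handled by an exchangeable working correlation structure
gee = smf.gee("gds ~ pandemic + age + sex", groups="subject_id", data=df,
              cov_struct=Exchangeable(), family=sm.families.Gaussian()).fit()
print(gee.summary())  # the 'pandemic' coefficient corresponds to b (SE) in the Results

# Logistic regression: incident depressive disorder among participants who were
# euthymic and had no history of depression at the previous assessment
baseline_euthymic = df[(df["visit"] == 4) & (df["prior_depression"] == 0)]
logit = smf.logit("incident_mdd ~ pandemic + age + sex",
                  data=baseline_euthymic).fit()
print(np.exp(logit.params))      # odds ratios
print(np.exp(logit.conf_int()))  # 95% confidence intervals
```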
Results
The COVID-19 pandemic was associated with increased depressive symptoms in older adults [b (standard error) = 0.42 (0.20), p = 0.040] and with a doubling of the risk of incident depressive disorder, even in euthymic older adults without a history of depression (odds ratio = 2.44, 95% confidence interval 1.18–5.02, p = 0.016). Less social activity, which was associated with the risk of depressive disorder before the pandemic, was not associated with that risk during the pandemic. In contrast, fewer family gatherings, which were not associated with the risk of depressive disorder before the pandemic, were associated with a doubled risk of depressive disorder during the pandemic.
Conclusions
The COVID-19 pandemic significantly influenced the risk of late-life depression in the community. Older adults who lack family gatherings may be particularly vulnerable.
Accumulating evidence suggests that alterations in inflammatory biomarkers are important in depression. However, previous meta-analyses disagree on these associations, and errors in data extraction may account for these discrepancies.
Methods
PubMed/MEDLINE, Embase, PsycINFO, and the Cochrane Library were searched from database inception to 14 January 2020. Meta-analyses of observational studies examining the association between depression and levels of tumor necrosis factor-α (TNF-α), interleukin 1-β (IL-1β), interleukin-6 (IL-6), and C-reactive protein (CRP) were eligible. Errors were classified as follows: incorrect sample size, incorrect use of the standard deviation, incorrect participant inclusion, calculation error, or analysis with insufficient data. We determined the impact of these errors on the results after correcting them.
Results
Errors were noted in 14 of the 15 meta-analyses included. Across 521 primary studies, 118 (22.6%) contained the following errors: incorrect sample sizes (20 studies, 16.9%), incorrect use of the standard deviation (35 studies, 29.7%), incorrect participant inclusion (7 studies, 5.9%), calculation errors (33 studies, 28.0%), and analysis with insufficient data (23 studies, 19.5%). After correcting these errors, 11 (29.7%) of the 37 pooled effect sizes changed by more than 0.1, with changes ranging from 0.11 to 1.15. The updated meta-analyses showed that elevated levels of TNF-α, IL-6, and CRP, but not IL-1β, are associated with depression.
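As an illustration of how one of the error types above can shift a pooled estimate, the sketch below uses invented numbers (not data from the reviewed studies) to show how mistakenly extracting a standard error as the standard deviation inflates a standardized mean difference far beyond the 0.1 threshold used here.

```python
# Minimal sketch with invented numbers: effect of an SD/SE extraction error
# on a standardized mean difference (Cohen's d with a pooled SD).
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical primary study: IL-6 (pg/mL) in depressed vs. control groups
m_dep, sd_dep, n_dep = 3.1, 2.0, 50
m_con, sd_con, n_con = 2.5, 1.8, 50

d_correct = cohens_d(m_dep, sd_dep, n_dep, m_con, sd_con, n_con)

# Extraction error: the standard error (SD / sqrt(n)) is recorded as the SD
d_error = cohens_d(m_dep, sd_dep / math.sqrt(n_dep), n_dep,
                   m_con, sd_con / math.sqrt(n_con), n_con)

print(f"correct d   = {d_correct:.2f}")  # ~0.32
print(f"erroneous d = {d_error:.2f}")    # ~2.23, grossly inflated
```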
Conclusions
These findings show that data extraction errors in meta-analyses can affect the conclusions. Efforts to reduce such errors are important in studies of the association between depression and peripheral inflammatory biomarkers, for which high heterogeneity and conflicting results have repeatedly been reported.
Background: The purpose of this study was to determine the relationship between the appropriateness of antibiotic prescription and clinical outcomes in patients with community-acquired acute pyelonephritis (CA-APN). Methods: A multicenter prospective cohort study was performed in 8 Korean hospitals from September 2017 to August 2018. All hospitalized patients aged ≥19 years diagnosed with CA-APN at admission were recruited. Pregnant women and patients with insufficient data were excluded, as were patients whose hospitalization was prolonged by medical problems unrelated to APN treatment. The appropriateness of empirical and definitive antibiotics was classified as “optimal,” “suboptimal,” or “inappropriate,” with optimal and suboptimal regarded as appropriate antibiotic use. Empirical antibiotics were classified according to the 2018 Korean national guideline on antibiotic use in urinary tract infections, and definitive antibiotics were classified according to the results of in vitro susceptibility testing of the causative organisms. Clinical outcomes, including the clinical failure (mortality or recurrence) rate, length of hospitalization, and medical costs, were compared between patients who were prescribed antibiotics appropriately and those who were prescribed them inappropriately. Results: In total, 397 and 318 patients were eligible for the analysis of the appropriateness of empirical and definitive antibiotics, respectively. Of these, 10 (2.5%) and 18 (5.7%) were prescribed empirical and definitive antibiotics inappropriately, respectively, and 28 (8.8%) were prescribed either empirical or definitive antibiotics inappropriately. Patients who were prescribed empirical antibiotics appropriately had a lower mortality rate (0% vs 10%; P = .025), shorter hospitalization (9 vs 12.5 days; P = .014), and lower medical costs (US$2,333 vs US$4,531; P = .007) than those who were prescribed empirical antibiotics inappropriately. In contrast, we detected no significant differences in clinical outcomes between patients who were prescribed definitive antibiotics appropriately and those who were prescribed them inappropriately. Patients who were prescribed both empirical and definitive antibiotics appropriately had a lower clinical failure rate (0.3% vs 7.1%; P = .021) and shorter hospitalization (9 vs 10.5 days; P = .041) than those who were prescribed either empirical or definitive antibiotics inappropriately. Conclusions: Appropriate use of antibiotics leads to better clinical outcomes, including shorter hospitalization and lower medical costs, in patients with CA-APN.
This study aimed to evaluate manufacturers’ perceptions of the decision-making process for new drug reimbursement and to draw implications for operating a health technology assessment system. In 2019, we conducted a questionnaire survey and semistructured group interviews with domestic (n = 6) and foreign (n = 9) manufacturers that had extensive experience in bringing new medicines to market through health technology assessment. Representatives of manufacturers indicated that disease severity, budget impact, existence of alternative treatment, and health-related quality of life were relevant criteria when assessing reimbursement decisions. Compared with domestic manufacturers, foreign manufacturers were risk takers in reimbursement decisions concerning the adoption of a new drug and the management of pharmaceutical expenditure. However, foreign manufacturers were risk-averse when evaluating new drugs whose uncertainties rested on real-world data, such as clinical effectiveness. Based on manufacturers’ perceptions of the decision-making process for new drug reimbursement, there is room for improvement in health technology assessment systems: explaining the underlying reasons for decisions, ensuring unbiased participation by various stakeholders, and clarifying their roles in the decision-making process need to be emphasized. However, the measures suggested in this study should be introduced with caution, because the health technology assessment process might become a target for those who would undermine the system in pursuit of private interests.
Early replacement of a new central venous catheter (CVC) may pose a risk of persistent or recurrent infection in patients with a catheter-related bloodstream infection (CRBSI). We evaluated the clinical impact of early CVC reinsertion after catheter removal in patients with CRBSIs.
Methods:
We conducted a retrospective chart review of adult patients with confirmed CRBSIs in 2 tertiary-care hospitals over a 7-year period.
Results:
To treat their infections, 316 patients with CRBSIs underwent CVC removal. Among them, 130 (41.1%) underwent early CVC reinsertion (≤3 days after CVC removal), 39 (12.4%) underwent delayed reinsertion (>3 days), and 147 (46.5%) did not undergo CVC reinsertion. There were no differences in baseline characteristics among the 3 groups, except for nontunneled CVC, presence of septic shock, and reason for CVC reinsertion. The rate of persistent CRBSI in the early CVC reinsertion group (22.3%) was higher than that in the no-reinsertion group (7.5%; P = .002) but was similar to that in the delayed reinsertion group (17.9%; P > .99). The other clinical outcomes did not differ among the 3 groups, including rates of 30-day mortality, complicated infection, and recurrence. After controlling for several confounding factors, early CVC reinsertion was not significantly associated with persistent CRBSI (OR, 1.59; P = .35) or 30-day mortality (OR, 0.81; P = .68) compared with delayed CVC reinsertion.
Conclusions:
Early CVC reinsertion in the setting of CRBSI may be safe. Replacement with a new CVC should not be delayed in patients who still require a CVC for ongoing management.
This study aimed to examine whether cognitive deficits differ between patients with early-stage Alzheimer's disease (AD) and patients with early-stage vascular dementia (VaD), using the Korean version of the CERAD neuropsychological battery (CERAD-K-N).
Methods
Patients with early-stage dementia (global Clinical Dementia Rating [CDR] 0.5 or 1) were consecutively recruited from among first visitors to a dementia clinic; 257 AD patients and 90 VaD patients completed the protocol of the Korean version of the CERAD clinical assessment battery. The CERAD-K-N was administered for a comprehensive evaluation of neuropsychological function.
Results
Of the total 347 participants, 257 (69.1%) were in the AD group (CDR 0.5 = 66.9%) and 90 (21.9%) were in the VaD group (CDR 0.5 = 40.0%). Patients with very mild AD showed poorer performance on the Boston Naming Test (BNT) (P = 0.028), word list memory test (P < 0.001), word list recall test (P < 0.001) and word list recognition test (WLRcT) (P = 0.006) than patients with very mild VaD after adjustment for the MMSE-KC T score. However, performance on Trail Making Test A (TMA) was more impaired in the VaD group than in the AD group. Within the AD group, performance on the WLRcT was the worst among the neuropsychological tests (P < 0.001), whereas within the VaD group the TMA was performed worst.
Conclusions
Patients with early-stage AD have greater cognitive deficits in memory and language, whereas patients with early-stage VaD show worse cognitive function in attention/processing speed. In addition, the first cognitive deficit to appear is memory dysfunction in AD and impaired attention/processing speed in VaD.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Serotonergic dysfunction may play an important role in motor and nonmotor symptoms of Parkinson’s disease (PD). The loudness dependence of auditory evoked potentials (LDAEP) has been used to evaluate serotonergic activity. Therefore, this study aimed to determine central serotonergic activity using LDAEP in de novo PD according to the age at onset and changes in serotonergic activity after dopaminergic treatment.
Methods:
A total of 30 patients with unmedicated PD were enrolled: 16 in the early-onset group and 14 in the late-onset group. All subjects underwent a comprehensive neurological examination, laboratory tests, the Unified Parkinson’s Disease Rating Scale, and LDAEP. The LDAEP was calculated as the slope of the N1/P2 peaks measured at the Cz electrode, first at baseline (pretreatment) and a second time after 12 weeks of dopaminergic medication (post-treatment).
Results:
The absolute values of the pretreatment N1/P2 LDAEP (early-onset vs. late-onset: 0.99 ± 0.68 vs. 1.62 ± 0.88, p = 0.035) and the post-treatment N1 LDAEP (early-onset vs. late-onset: −0.61 ± 0.61 vs. −1.26 ± 0.91, p = 0.03) were significantly lower in the early-onset group than in the late-onset group. In addition, a higher pretreatment N1/P2 LDAEP was significantly associated with late onset (coefficient = 1.204, p = 0.044). The absolute value of the N1 LDAEP decreased after 12 weeks of dopaminergic medication (pretreatment vs. post-treatment: −1.457 ± 1.078 vs. −0.904 ± 0.812, p = 0.0018).
Conclusions:
Based on the results of this study, LDAEP could be a marker for serotonergic neurotransmission in PD. Central serotonergic activity assessed by LDAEP may be better preserved in early-onset PD patients and can be altered by dopaminergic medication.
Since the significance of metacognition as the theoretical basis of a psychological intervention for schizophrenia first emerged, there have been ongoing attempts to restore or strengthen patients’ metacognitive abilities.
Aim:
A Korean version of the metacognitive training (MCT) program was developed, and its effects on theory of mind, positive and negative symptoms, and interpersonal relationships were examined in stable outpatients with schizophrenia.
Method:
A pre-test–post-test design with a control group was used. The participants were 59 outpatients (30 in the experimental group, 29 in the control group) registered at five mental health facilities in a city in South Korea. The developed MCT program was delivered in 18 sessions of 60 minutes each over a period of 14 weeks. The hinting task, false belief task, Scale for the Assessment of Positive and Negative Symptoms, and Relationship Change Scale were used to verify the effects of the program. Data were analysed with the chi-square test, t-test, and Mann–Whitney U-test using the SPSS/PASW 18.0 statistics program.
Results:
The general characteristics, intelligence, and outcome variables of the two groups were homogeneous. After the intervention, the experimental group showed significant improvements in theory of mind, positive and negative symptoms and interpersonal relationships compared with the control group.
Conclusion:
These results suggest that the MCT program can be a complementary psychotherapy that contributes to symptom relief and interpersonal functioning in patients with schizophrenia, and is effective in the Korean culture, beyond the Western context.
Refugees commonly experience difficulties with emotional processing, such as alexithymia, due to stressful or traumatic experiences. However, the functional connectivity of the amygdala, which is central to emotional processing, has yet to be assessed in refugees. Thus, the present study investigated the resting-state functional connectivity of the amygdala and its association with emotional processing in North Korean (NK) refugees.
Methods
This study included 45 NK refugees and 40 native South Koreans (SK). All participants were administered the Toronto Alexithymia Scale (TAS), Beck Depression Inventory (BDI), and Clinician-Administered PTSD Scale (CAPS), and differences between NK refugees and native SK in the resting-state functional connectivity of the amygdala were assessed. Additionally, the association between the strength of amygdala connectivity and the TAS score was examined.
Results
Resting-state connectivity values from the left amygdala to the bilateral dorsolateral prefrontal cortex (dlPFC) and dorsal anterior cingulate cortex (dACC) were higher in NK refugees than in native SK. Additionally, the strength of connectivity between the left amygdala and right dlPFC was positively associated with TAS score after controlling for the number of traumatic experiences and BDI and CAPS scores.
Conclusions
The present study found that NK refugees exhibited heightened frontal–amygdala connectivity, and that this connectivity was correlated with alexithymia. The present results suggest that increased frontal–amygdala connectivity in refugees may represent frontal down-regulation of the amygdala, which in turn may produce alexithymia.
Scholars often assume that reference groups are industry-wide, homogeneous, and stable. We examine this assumption and develop hypotheses based on managers’ motivations, such as self-enhancement and self-improvement, social identity, and affiliation-based impression management. We test hypotheses on failure-induced changes in reference groups and on their direction in terms of upward and downward comparisons. An empirical examination of changes in reference groups for firms listed on the Dow Jones Industrial Average Index between 1993 and 2008 shows that performance below social aspirations induces changes in reference groups, and changes toward upward comparisons. The results indicate that, in response to performance below social aspirations, managers can choose to change the reference group – a cognition-centered response – as an alternative to action-centered responses such as organizational search and risk-taking, and that upward comparisons may result from social performance shortfalls, serving both to convey a better impression and to improve firm performance.
Given their diverse disease courses and symptom presentations, multiple phenotype dimensions with different biological underpinnings are expected in bipolar disorders (BPs). In this study, we aimed to identify lifetime BP psychopathology dimensions. We also explored their differing associations with bipolar I (BP-I) and bipolar II (BP-II) disorders.
Methods
We included a total of 307 subjects with BPs in the analysis. For the factor analysis, we chose six variables related to clinical course, 29 indicators covering lifetime symptoms of mood episodes, and six specific comorbid conditions. To determine the relationships among the identified phenotypic dimensions and their effects in differentiating BP subtypes, we applied structural equation modeling.
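A minimal sketch of the factor-extraction step is given below. It uses hypothetical data and scikit-learn's FactorAnalysis as a stand-in for the authors' software; Velicer's minimum average partial test and the structural equation model reported in the paper are not reproduced here.

```python
# Minimal sketch (hypothetical data, not the study dataset): eigenvalues for a
# scree plot and a six-factor solution, using scikit-learn as a stand-in for
# the authors' factor-analysis software.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = pd.read_csv("bp_lifetime_indicators.csv")  # hypothetical: 41 indicators x 307 subjects

# Scree plot input: eigenvalues of the item correlation matrix, largest first
eigenvalues = np.linalg.eigvalsh(np.corrcoef(items.values, rowvar=False))[::-1]
print("leading eigenvalues:", np.round(eigenvalues[:8], 2))

# Six-factor solution with varimax rotation; inspect loadings per factor
fa = FactorAnalysis(n_components=6, rotation="varimax", random_state=0)
factor_scores = fa.fit_transform(items.values)               # per-subject scores
loadings = pd.DataFrame(fa.components_.T, index=items.columns,
                        columns=[f"F{i+1}" for i in range(6)])
print(loadings.round(2))
```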
Results
We selected a six-factor solution based on the scree plot, Velicer's minimum average partial test, and face-validity evaluations; the six factors were cyclicity, depression, atypical vegetative symptoms, elation, psychotic/irritable mania, and comorbidity. In the path analysis, the five factors other than atypical vegetative symptoms were associated with one another: cyclicity, depression, and comorbidity were positively associated and correlated negatively with psychotic/irritable mania, while elation correlated positively with cyclicity and psychotic/irritable mania. Depression, cyclicity, and comorbidity were stronger in BP-II than in BP-I, and they contributed significantly to the distinction between the two disorders.
Conclusions
We identified six phenotype dimensions; in addition to symptom features of manic and depressive episodes, various comorbidities and high cyclicity constructed separate dimensions. Except for atypical vegetative symptoms, all factors showed a complex interdependency and played roles in discriminating BP-II from BP-I.
Our objective was to evaluate long-term altered appearance, distress, and body image in posttreatment breast cancer patients and compare them with those of patients undergoing active treatment and with general population controls.
Method:
We conducted a cross-sectional survey between May and December of 2010. We studied 138 breast cancer patients undergoing active treatment and 128 posttreatment patients from 23 Korean hospitals, as well as 315 age- and area-matched subjects drawn from the general population. Breast, hair, and skin changes, distress, and body image were assessed using visual analogue scales and the EORTC BR-23. Average levels of distress were compared across groups, and linear regression was used to identify the factors associated with body image.
Results:
Compared to active-treatment patients, posttreatment patients reported similar breast changes (6.6 vs. 6.2), hair loss (7.7 vs. 6.7), and skin changes (5.8 vs. 5.4), and both groups had significantly more severe changes than the general population controls (p < 0.01). For a similar level of altered appearance, however, breast cancer patients experienced significantly higher levels of distress than the general population. In multivariate analysis, patients with high altered-appearance distress reported significantly poorer body image (–20.7, 95% CI = –28.3 to –13.1) than patients with low distress.
Significance of results:
Posttreatment breast cancer patients experienced similar levels of altered appearance, distress, and body-image disturbance relative to patients undergoing active treatment but significantly higher distress and poorer body image than members of the general population. Healthcare professionals should acknowledge the possible long-term effects of altered appearance among breast cancer survivors and help them to manage the associated distress and psychological consequences.
The aim of this study was to develop predictive models for organ-at-risk (OAR) complication level and OAR dose–volume classification, and to combine these functions with our in-house treatment decision support system.
Materials and methods
We analysed support vector machine and decision tree algorithms for predicting OAR complication level and toxicity in order to integrate this function into our in-house radiation treatment planning decision support system. A total of 12 TomoTherapy™ treatment plans for prostate cancer were established, and 100 modelled plans were generated to analyse toxicity prediction for the bladder and rectum.
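A minimal sketch of this classification step is shown below. It is not the in-house system: the dose–volume feature names and toxicity labels are hypothetical placeholders, and scikit-learn is assumed as the modelling library.

```python
# Minimal sketch (hypothetical features/labels, not the in-house system):
# an SVM and a decision tree classifying OAR toxicity from dose-volume data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

plans = pd.read_csv("modelled_plans.csv")   # hypothetical: one row per modelled plan
X = plans[["bladder_v25", "bladder_v50", "rectum_v25", "rectum_v50"]]
y = plans["toxicity_level"]                 # e.g. low vs. high complication level

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_tr, y_tr)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print("SVM accuracy: ", accuracy_score(y_te, svm.predict(X_te)))
print("tree accuracy:", accuracy_score(y_te, tree.predict(X_te)))
# The fitted tree's split thresholds play the role of the dose-volume risk
# factors (e.g. for the bladder and rectum) described in the Results.
```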
Results
The toxicity prediction algorithm achieved 91.0% accuracy in the training process. A scatter plot for the bladder and rectum was obtained from the 100 modelled plans, and a classification result was derived. OAR complication level was analysed, and risk factors for 25% bladder and 50% rectum were detected by the decision tree. These results show that complication prediction for patients using big-data-based clinical information is possible.
Conclusion
We verified the accuracy of the tested algorithm using prostate cancer cases. Side effects can be minimised by applying this predictive modelling algorithm with the planning decision support system for patient-specific radiotherapy planning.
The National Institute of Neurological Disorders and Stroke–Canadian Stroke Network (NINDS-CSN) 5-minute neuropsychology protocol consists of only verbal tasks and has been proposed as a brief screening method for vascular cognitive impairment. We evaluated its feasibility within two weeks after stroke and its ability to predict the development of post-stroke dementia (PSD) at 3 months after stroke.
Method:
We prospectively enrolled subjects with ischemic stroke within seven days of symptom onset who were consecutively admitted to 12 university hospitals. Neuropsychological assessments using the NINDS-CSN 5-minute and 60-minute neuropsychology protocols were administered within two weeks and at 3 months after stroke onset, respectively. PSD was diagnosed with reference to the American Heart Association/American Stroke Association statement, requiring deficits in at least two cognitive domains.
Results:
Of 620 patients, 512 (82.6%) were able to complete the NINDS-CSN 5-minute protocol within two weeks after stroke. The incidence of PSD was 16.2% among the 308 subjects who completed follow-up at 3 months after stroke onset. The total score of the NINDS-CSN 5-minute protocol differed significantly between those with and without PSD (4.0 ± 2.7 vs. 7.4 ± 2.7; p < 0.01). A cut-off value of 6/7 showed reasonable discriminative power (sensitivity 0.82, specificity 0.67, AUC 0.74). The NINDS-CSN 5-minute protocol score was a significant predictor of PSD (adjusted odds ratio 6.32, 95% CI 2.65–15.05).
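The cut-off statistics above can be reproduced mechanically from individual-level data; the sketch below uses toy arrays (not the cohort data) to show how the sensitivity, specificity, and AUC of a 6/7 cut-off would be computed.

```python
# Minimal sketch with toy data (not the cohort): sensitivity/specificity of a
# 6/7 cut-off on the 5-minute protocol total score, plus the AUC for PSD.
import numpy as np
from sklearn.metrics import roc_auc_score

score = np.array([3, 4, 5, 6, 7, 8, 9, 10, 5, 8])  # protocol total scores
psd   = np.array([1, 1, 1, 0, 0, 0, 0, 0,  1, 0])  # 1 = post-stroke dementia at 3 months

# Lower scores indicate worse cognition, so PSD is predicted when score <= 6
predicted = (score <= 6).astype(int)
sensitivity = (predicted[psd == 1] == 1).mean()
specificity = (predicted[psd == 0] == 0).mean()
auc = roc_auc_score(psd, -score)  # negate: lower score -> higher predicted risk

print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}  AUC={auc:.2f}")
```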
Discussion:
The NINDS-CSN 5-minute protocol is feasible for evaluating cognitive function in patients with acute ischemic stroke. It might be a useful screening method for early identification of groups at high risk of PSD.
Vertically aligned BaTiO3 nanowire (NW) arrays on a Ti substrate were adopted for use in a piezoelectric energy harvesting device that scavenges electricity from mechanical energy. BaTiO3 NWs were grown simultaneously on the top and bottom surfaces of a Ti substrate by a two-step hydrothermal process. To characterize the piezoelectric output performance of an individual NW, we transferred a single BaTiO3 NW selected from the well-aligned NW arrays onto a flexible substrate and measured the electrical signals during bending/unbending motions. To fabricate a piezoelectric energy harvester (PEH), both NW arrays were sandwiched between two transparent indium tin oxide (ITO)-coated polyethylene terephthalate (PET) plastic films and then packaged with polydimethylsiloxane (PDMS) elastomer. The lead-free BaTiO3 NW array-based PEH produced an output voltage of about 90 V and a maximum current of 1.2 μA under periodic bending motions.
Personality may predispose family caregivers to experience caregiving differently in similar situations and may influence the outcomes of caregiving. A limited body of research has examined the role of personality traits in the health-related quality of life (HRQoL) of family caregivers of persons with dementia (PWD) in relation to burden and depression.
Methods:
Data from a large clinic-based national study in South Korea, the Caregivers of Alzheimer's Disease Research (CARE), were analyzed (N = 476). Path analysis was performed to explore the association between family caregivers’ personality traits and HRQoL. With depression and burden as mediating factors, direct and indirect associations between five personality traits and HRQoL of family caregivers were examined.
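A minimal sketch of one mediation path from this analysis is shown below. It uses hypothetical variable names (neuroticism, burden, mental_hrqol) and ordinary least squares as a simplified stand-in for the full path analysis reported in the study.

```python
# Minimal sketch (hypothetical variable names, not the CARE dataset): a
# product-of-coefficients check of one path, neuroticism -> burden -> mental
# HRQoL, using OLS as a simplified stand-in for the full path analysis.
import pandas as pd
import statsmodels.formula.api as smf

care = pd.read_csv("care_caregivers.csv")  # hypothetical file

# Path a: personality trait -> mediator (caregiver burden)
model_a = smf.ols("burden ~ neuroticism", data=care).fit()
# Paths b and c': trait + mediator -> outcome (mental HRQoL)
model_b = smf.ols("mental_hrqol ~ neuroticism + burden", data=care).fit()

indirect = model_a.params["neuroticism"] * model_b.params["burden"]  # a * b
direct = model_b.params["neuroticism"]                               # c'
print(f"indirect effect = {indirect:.3f}, direct effect = {direct:.3f}")
```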
Results:
Results demonstrated the mediating role of caregiver burden and depression in linking two personality traits (neuroticism and extraversion) with HRQoL. Neuroticism and extraversion influenced the mental HRQoL of caregivers both directly and indirectly, but influenced their physical HRQoL only indirectly. Neuroticism increased caregiver depression, whereas extraversion decreased it. Only neuroticism was mediated by burden in its influence on depression and on mental and physical HRQoL.
Conclusions:
Personality traits can influence caregiving outcomes and can be viewed as an individual resource of the caregiver. A family caregiver's personality characteristics should be assessed so that support programs can be tailored to obtain optimal benefit from caregiver interventions.
The development of embryonic stem cells (ESCs) from large animal species has become an important model for therapeutic cloning using ESCs derived by somatic cell nuclear transfer (SCNT). However, poor embryo quality and blastocyst formation have been major limitations for the derivation of cloned ESCs (ntESCs). In this study, we tried to overcome these problems by treating cloned embryos with histone deacetylase inhibitors (HDACi) and by aggregating porcine embryos. First, cloned embryos were treated with Scriptaid to confirm the effect of HDACi on cloned embryo quality. The Scriptaid-treated blastocysts showed significantly higher total cell numbers (29.50 ± 2.10) than non-treated blastocysts (22.29 ± 1.50, P < 0.05). Next, cloned embryo quality and blastocyst formation were analyzed in aggregates. Three zona-free, reconstructed, four-cell-stage SCNT embryos were injected into the empty zona of hatched parthenogenetic (PA) blastocysts. Blastocyst formation and the total cell number of cloned blastocysts increased significantly for aggregates (76.4% and 83.18 ± 8.33) compared with non-aggregates (25.5% and 27.11 ± 1.67, P < 0.05). Finally, aggregated blastocysts were cultured on a feeder layer to examine the efficiency of porcine ES-like cell derivation. Aggregated blastocysts showed a higher primary colony formation rate than non-aggregated cloned blastocysts (17.6 ± 12.3% vs. 2.2 ± 1.35%, P < 0.05). In addition, the derived ES-like cells showed typical characteristics of ESCs. In conclusion, the aggregation of porcine SCNT embryos at the four-cell stage could be a useful technique for improving the developmental rate and quality of porcine cloned blastocysts and the derivation efficiency of porcine ntESCs.