This article examines the phenomenal growth of Korean cultural industries and their export to East Asia and other parts of the world. In the early years of industrialisation, culture was used by the authoritarian regime as a form of ideological support. Strict controls were exercised over cultural production and presentation. Controls were later relaxed as the regime used entertainment and sport as popular distractions. However, the increase in television ownership, the growth of domestic electronics and home appliance industries, and rising incomes (especially among the middle class) provided the material base for the growth of cultural industries following democratisation in the 1990s. Cultural industries became key drivers of economic growth, innovation and employment, and were strongly promoted and supported by government in the style of the developmental state. The result was burgeoning production and international trade across a wide spectrum of cultural industries – film, television drama, animation, video games and music. As a reflection of the increasing integration of Korea into world markets, the government also had to ensure compliance with international trade regulations and clamp down on piracy. Today, the Korean Wave of popular culture has reached consumers in all parts of the world and makes a significant contribution to Korean gross domestic product and exports.
Identifying more homogeneous subtypes of patients with obsessive–compulsive disorder (OCD) using biological evidence is critical for understanding the complexities of the disorder in this heterogeneous population. Age of onset serves as a useful scheme for dividing OCD into two subgroups in a way that aligns with neurodevelopmental perspectives. The neurobiological markers underlying these distinct neurodevelopmental differences can be identified by investigating gyrification changes, thereby establishing homogeneous subtypes grounded in biological evidence.
Methods
We compared whole-brain cortical gyrification in 84 patients with early-onset OCD, 84 patients with late-onset OCD, and 152 healthy controls (HCs) using the local gyrification index (lGI) to identify potential markers of early neurodevelopmental deficits. Then, in the early-onset OCD patients, we further examined the relationships between the lGI in clusters showing significant group differences and performance on visuospatial memory and verbal fluency tasks, which are considered trait-related neurocognitive impairments in OCD.
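As a hedged illustration of this analysis, the sketch below compares mean cluster lGI values across the three groups and correlates the lGI with a visuospatial memory score in the early-onset group; the published analysis was vertex-wise with multiple-comparison correction, and the file and column names here are hypothetical.

```python
# Hedged sketch of the group comparison and brain-behaviour correlation described
# above. Column names ("group", "cluster_lgi", "visuospatial_memory") are placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("lgi_by_subject.csv")  # hypothetical per-subject lGI summaries

early = df.loc[df["group"] == "early_onset", "cluster_lgi"]
late = df.loc[df["group"] == "late_onset", "cluster_lgi"]
hc = df.loc[df["group"] == "hc", "cluster_lgi"]

# Omnibus comparison across the three groups, then an early- vs late-onset follow-up
f_stat, p_omnibus = stats.f_oneway(early, late, hc)
t_el, p_el = stats.ttest_ind(early, late)

# Correlation between cluster lGI and visuospatial memory in early-onset patients only
sub = df[df["group"] == "early_onset"]
r, p_corr = stats.pearsonr(sub["cluster_lgi"], sub["visuospatial_memory"])
print(f"ANOVA p={p_omnibus:.3g}; early vs late p={p_el:.3g}; r={r:.2f} (p={p_corr:.3g})")
```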
Results
The early-onset OCD patients exhibited significantly greater gyrification than the late-onset OCD patients and HCs in frontoparietal and cingulate regions, including the bilateral precentral, postcentral, precuneus, paracentral, posterior cingulate, superior frontal, and caudal anterior cingulate gyri. Moreover, impaired neurocognitive functions in early-onset OCD patients were correlated with increased gyrification.
Conclusions
Our findings provide a neurobiological marker with which to divide the OCD population into more neurodevelopmentally homogeneous subtypes. This may contribute to understanding the neurodevelopmental underpinnings of early-onset OCD, consistent with the accumulated phenotypic evidence of greater neurodevelopmental deficits in early-onset than in late-onset OCD.
We perform a three-dimensional direct numerical simulation of flow over the Tacoma Narrows Bridge to understand the vertical and torsional vibrations that occurred before its collapse in 1940. Real-scale structural parameters of the bridge are used for the simulation. The Reynolds number based on the free-stream velocity and height of the deck fence is lower (${Re}=10\ 000$) than the actual one on the day of its collapse (${Re}=3.06 \times 10^{6}$), but the magnitude of a fluid property is modified to provide the real-scale aerodynamic force and moment on the deck. The vertical and torsional vibrations are simulated through two-way coupling of the fluid flow and structural motion. The vertical vibration occurs from the frequency lock-in with the vortex shedding, and its wavelength and frequency agree well with the recorded data in 1940. After saturation of the vertical vibration, a torsional vibration resulting from the aeroelastic fluttering grows exponentially in time, with its wavelength and frequency in excellent agreement with the recorded data of the incident. The critical flutter wind speed for the growth of torsional vibration is obtained as $3.56 < U_c / (f_{nat} B) \le 4$, where $U_{c}$ is the critical flutter wind speed, $f_{nat}$ is the natural frequency of the torsional vibration and $B$ is the deck width. Finally, apart from the actual vibration process in 1940, we perform more numerical simulations to investigate the roles of the free-stream velocity and vertical vibration in the growth of the torsional vibration.
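For readers who want a dimensional version of the reported flutter threshold, the short sketch below converts the nondimensional range $3.56 < U_c/(f_{nat} B) \le 4$ into a wind speed; the frequency and deck width used are illustrative placeholders, not values taken from the paper.

```python
# Hedged sketch: dimensionalizing the reported critical flutter speed range
# 3.56 < U_c / (f_nat * B) <= 4. The numbers below are illustrative placeholders.
f_nat = 0.2   # torsional natural frequency [Hz] (placeholder)
B = 11.9      # deck width [m] (placeholder)

U_c_low = 3.56 * f_nat * B   # lower bound on the critical wind speed [m/s]
U_c_high = 4.0 * f_nat * B   # upper bound on the critical wind speed [m/s]
print(f"critical flutter wind speed in ({U_c_low:.2f}, {U_c_high:.2f}] m/s")
```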
Evidence is increasing that hormonal changes during reproductive events lead to mood changes. However, studies on the severity of psychological problems according to menopausal stage are limited. Thus, this study aimed to investigate the association between menopausal stage, depression and suicidality.
Methods
A total of 45 177 women who underwent regular health check-ups between 2015 and 2018 at Kangbuk Samsung Hospital were included. Participants were stratified into four groups (pre-menopause, early transition, late transition and post-menopause) based on the Stages of Reproductive Aging Workshop Criteria. The Center for Epidemiological Studies-Depression scale (CESD) was used to evaluate depressive symptoms, and the degree of depressive symptoms was classified as moderate (CESD score 16–24) or severe (CESD score ⩾ 25). To measure suicide risk, we administered questionnaires related to suicidal ideation.
Results
Overall, the prevalence of CESD scores of 16–24 and ⩾ 25 was 7.6% and 2.8%, respectively. Menopausal stages were positively associated with depressive symptoms in a dose-dependent manner. Multivariable-adjusted prevalence ratios (PRs, 95% confidence intervals) for CESD scores of 16–24 comparing the stages of the early menopausal transition (MT), late MT and post-menopause to pre-menopause were 1.28 (1.16–1.42), 1.21 (1.05–1.38) and 1.58 (1.36–1.84), respectively. The multivariable-adjusted PRs for CESD scores ⩾ 25 comparing the stages of the early MT, late MT and post-menopause to pre-menopause were 1.31 (1.11–1.55), 1.39 (1.12–1.72) and 1.86 (1.47–2.37), respectively. In addition, the multivariable-adjusted PRs for suicidal ideation comparing the early MT, late MT and post-menopause stages to the pre-menopause stage were 1.24 (1.12–1.38), 1.07 (0.93–1.24) and 1.46 (1.25–1.70), respectively (p for trend <0.001).
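The abstract does not state the exact regression model behind these prevalence ratios; a common choice for adjusted PRs with a binary outcome is modified Poisson regression with robust variance, sketched below under that assumption with hypothetical column names and covariates.

```python
# Hedged sketch: adjusted prevalence ratios (PRs) for a binary outcome via modified
# Poisson regression with robust (sandwich) standard errors. The file name, column
# names, and covariates are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("menopause_cesd.csv")  # hypothetical analysis file

model = smf.glm(
    "cesd_16_24 ~ C(stage, Treatment('pre')) + age + bmi + smoking",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")  # robust variance so the Poisson model is valid for a binary outcome

prs = np.exp(model.params)        # prevalence ratios
ci = np.exp(model.conf_int())     # 95% confidence intervals
print(pd.concat([prs.rename("PR"), ci], axis=1))
```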
Conclusions
These findings indicate that the prevalence of depressive symptoms and suicidal ideation increases with advancing menopausal stage, beginning even before menopause.
Despite considerable advances in our understanding of the biology that underlies tumor development and progression, and the rapidly evolving field of personalized medicine, cancer is still one of the deadliest diseases. Many cancer patients have benefited from the survival improvements observed with targeted therapies, but only a small subset of patients receiving targeted drugs experience an objective response. Because cancer is a complex and heterogeneous disease, the search for effective cancer treatments will need to address not only patient-specific molecular defects but also aspects of the tumor microenvironment. Functional tumor profiling directly measures the cellular phenotype, in particular tumor growth, in response to drugs using patient-derived tumor models and might be the next step toward precision oncology. In this Element, the authors discuss personalized drug screening as a novel patient-stratification strategy for determining individualized treatment choices in oncology.
Background: Clinical guidelines recommend maintaining mean arterial pressure (MAP) at 85–90 mmHg to optimize spinal cord perfusion after spinal cord injury (SCI). Recently, there has been increased interest in spinal cord perfusion pressure as a surrogate marker for spinal cord blood flow. This study aims to determine the congruency of subdural and intramedullary spinal cord pressure measurements at the site of SCI, both rostral and caudal to the epicenter of injury. Methods: Seven Yucatan pigs underwent a T5 to L1 laminectomy with intramedullary (IM) and subdural (SD) pressure sensors placed 2 mm rostral and 2 mm caudal to the epicenter of SCI. A T10 contusion SCI was performed, followed by an 8-hour period of monitoring. Axial ultrasound images were captured at the epicenter of injury pre-SCI, post-SCI, and hourly thereafter. Results: Pigs with a pre-SCI cord-to-dural-sac ratio (CDSR) of >0.8 exhibited greater occlusion of the subdural space post-SCI, with a positive correlation between IM and SD pressure rostral to the injury and a negative correlation caudal to the epicenter. Pigs with a pre-SCI CDSR of <0.8 exhibited no correlation between IM and SD pressure. Conclusions: Congruency of IM and SD pressure depends on compartmentalization of the spinal cord, which occurs secondary to swelling that occludes the subdural space.
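A hedged sketch of the congruency analysis is given below: Pearson correlations between IM and SD pressure, computed separately for rostral and caudal sensors and stratified by the pre-injury CDSR threshold of 0.8. The data file and column names are hypothetical, and the original analysis may have used a different correlation measure.

```python
# Hedged sketch: IM vs SD pressure congruency by sensor site and CDSR group.
# Column names ("cdsr_pre", "site", "im_pressure", "sd_pressure") are placeholders.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("pig_pressures.csv")  # hypothetical long-format monitoring data

for high_cdsr, grp in df.groupby(df["cdsr_pre"] > 0.8):
    for site in ("rostral", "caudal"):
        s = grp[grp["site"] == site]
        r, p = pearsonr(s["im_pressure"], s["sd_pressure"])
        label = "CDSR>0.8" if high_cdsr else "CDSR<0.8"
        print(f"{label} {site}: r={r:.2f}, p={p:.3g}")
```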
The topographical effect on a strong wind event that occurred on 7 January 2013 at King Sejong Station (KSJ), Antarctica, was investigated using the Polar Weather Research and Forecasting (WRF) model. Numerical experiments applying three different terrain heights of the Antarctic Peninsula (AP) were performed to quantitatively estimate the topographical effect on the selected strong wind event. The experiment employing the original AP topography successfully represented the observed features of the strong wind event, both in terms of peak wind speed (~94% of the observed value; ~19.7 m/s) and the abrupt transitions in wind speed. In contrast, the experiment with a flattened terrain significantly underestimated the observed peak wind speed (by ~51%; ~10.4 m/s). Without the AP topography, the model failed to simulate both the strong discontinuity of sea-level pressure fields around the east coast of the AP and the strong south-easterly wind over the AP. As a result, the observed downslope windstorm, driven by flow overriding the barrier, did not form on the western side of the AP, and the wind at KSJ was not further enhanced. This result demonstrates that the topography of the AP played a critical role in driving the strong wind event at KSJ on 7 January 2013, accounting for ~50% of the total wind speed.
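The ~50% topographic contribution quoted above follows directly from the two simulated peak wind speeds reported in the abstract; a minimal check:

```python
# Hedged sketch: topographic contribution estimated from the two simulated peaks
# reported in the abstract (values taken from the text above).
peak_with_terrain = 19.7     # m/s, original AP topography
peak_flat_terrain = 10.4     # m/s, flattened AP topography

contribution = 1.0 - peak_flat_terrain / peak_with_terrain
print(f"topographic contribution ~{contribution:.0%}")   # ~47%, i.e. roughly 50%
```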
Accurate prognostication is important for patients and their families to prepare for the end of life. The Objective Prognostic Score (OPS) is an easy-to-use tool that does not require the clinician's prediction of survival (CPS), whereas the Palliative Prognostic Score (PaP) does; thus, inexperienced clinicians may hesitate to use the PaP. We aimed to evaluate the accuracy of the OPS compared with the PaP in inpatients in palliative care units (PCUs) in three East Asian countries.
Method
This study was a secondary analysis of a cross-cultural, multicenter cohort study. We enrolled inpatients with far-advanced cancer in PCUs in Japan, Korea, and Taiwan from 2017 to 2018. We calculated the area under the receiver operating characteristic (AUROC) curve to compare the accuracy of the OPS and the PaP.
Results
A total of 1,628 inpatients in 33 PCUs in Japan and Korea were analyzed. OPS and PaP were calculated in 71.7% of the Japanese patients and 80.0% of the Korean patients. In Taiwan, PaP was calculated for 81.6% of the patients. The AUROC for 3-week survival was 0.74 for OPS in Japan, 0.68 for OPS in Korea, 0.80 for PaP in Japan, and 0.73 for PaP in Korea. The AUROC for 30-day survival was 0.70 for OPS in Japan, 0.71 for OPS in Korea, 0.79 for PaP in Japan, and 0.74 for PaP in Korea.
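A hedged sketch of how such AUROCs are computed from the prognostic scores and observed survival is shown below; the file and column names are hypothetical placeholders, and the original analysis may have coded 3-week survival differently.

```python
# Hedged sketch: AUROC of a prognostic score for death within 3 weeks. Higher OPS
# and PaP scores indicate worse expected survival, so they serve directly as the
# decision scores. Column names are hypothetical.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("pcu_cohort.csv")

auroc_ops = roc_auc_score(df["died_within_21d"], df["ops"])
auroc_pap = roc_auc_score(df["died_within_21d"], df["pap"])
print(f"OPS AUROC = {auroc_ops:.2f}, PaP AUROC = {auroc_pap:.2f}")
```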
Significance of results
Both OPS and PaP showed good performance in Japan and Korea. Compared with PaP, OPS could be more useful for inexperienced physicians who hesitate to estimate CPS.
This study aimed to determine the effect of donor-transmitted atherosclerosis on the late aggravation of cardiac allograft vasculopathy in paediatric heart recipients aged ≥7 years.
Methods:
In total, 48 patients were included, and 23 had donor-transmitted atherosclerosis (baseline maximal intimal thickness of >0.5 mm on intravascular ultrasonography). Logistic regression analyses were performed to identify risk factors for donor-transmitted atherosclerosis. Rates of survival free from the late aggravation of cardiac allograft vasculopathy (new or worsening cardiac allograft vasculopathy on subsequent angiograms, starting 1 year after transplantation) in each patient group were estimated using the Kaplan–Meier method and compared using the log-rank test. The effect of the intravascular ultrasonography findings at 1 year after transplantation on the late aggravation of cardiac allograft vasculopathy, adjusting for possible covariates including donor-transmitted atherosclerosis, was examined using the Cox proportional hazards model.
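A hedged sketch of this survival analysis using the lifelines package is given below; the variable names ("time_to_cav", "cav_event", "dta", "delta_mit") are hypothetical placeholders for the study variables, and the published model may have included additional covariates.

```python
# Hedged sketch: Kaplan-Meier / log-rank comparison by donor-transmitted
# atherosclerosis (DTA) and a Cox model including the 1-year change in maximal
# intimal thickness. All column names are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("transplant_cohort.csv")
with_dta = df[df["dta"] == 1]
without_dta = df[df["dta"] == 0]

# Kaplan-Meier curve for the DTA group (repeat for the non-DTA group to plot both)
kmf = KaplanMeierFitter()
kmf.fit(with_dta["time_to_cav"], event_observed=with_dta["cav_event"], label="DTA")

# Log-rank test for CAV-aggravation-free survival by DTA status
result = logrank_test(
    with_dta["time_to_cav"], without_dta["time_to_cav"],
    event_observed_A=with_dta["cav_event"], event_observed_B=without_dta["cav_event"],
)
print(f"log-rank p = {result.p_value:.3f}")

# Cox proportional hazards model with DTA and the intimal-thickness change
cph = CoxPHFitter()
cph.fit(df[["time_to_cav", "cav_event", "dta", "delta_mit"]],
        duration_col="time_to_cav", event_col="cav_event")
cph.print_summary()
```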
Results:
The mean follow-up duration after transplantation was 5.97 ± 3.58 years. The log-rank test showed that patients with donor-transmitted atherosclerosis had worse survival outcomes than those without (p = 0.008). In the multivariate model that included the change in maximal intimal thickness between baseline and 1 year after transplantation (hazard ratio, 22.985; 95% confidence interval, 1.948–271.250; p = 0.013), donor-transmitted atherosclerosis remained a significant covariate (hazard ratio, 4.013; 95% confidence interval, 1.047–15.376; p = 0.043).
Conclusion:
Paediatric heart transplant recipients aged ≥7 years with donor-transmitted atherosclerosis had worse survival free from the late aggravation of cardiac allograft vasculopathy.
Two aphid-transmitted RNA viruses, broad bean wilt virus 2 (BBWV2) and cucumber mosaic virus (CMV), are the most prevalent viruses in Korean pepper fields and cause chronic damage in pepper production. In this study, we employed a screening system for pathotype-specific resistance of pepper germplasm to BBWV2 and CMV by utilizing infectious cDNA clones of different pathotypes of the viruses (two BBWV2 strains and three CMV strains). We first examined pathogenic characteristics of the BBWV2 and CMV strains in various plant species and their phylogenetic positions in the virus population structures. We then screened 34 commercial pepper cultivars and seven accessions for resistance. While 21 pepper cultivars were resistant to CMV Fny strain, only two cultivars were resistant to CMV P1 strain. We also found only one cultivar partially resistant to BBWV2 RP1 strain. However, all tested commercial pepper cultivars were susceptible to the resistance-breaking CMV strain GTN (CMV-GTN) and BBWV2 severe strain PAP1 (BBWV2-PAP1), suggesting that breeding new cultivars resistant to these virus strains is necessary. Fortunately, we identified several pepper accessions that were resistant or partially resistant to CMV-GTN and one symptomless accession despite systemic infection with BBWV2-PAP1. These genetic resources will be useful in pepper breeding programs to deploy resistance to BBWV2 and CMV.
Prognostic heterogeneity in early psychosis patients yields significant difficulties in determining the degree and duration of early intervention; this heterogeneity highlights the need for prognostic biomarkers. Although mismatch negativity (MMN) has been widely studied across early phases of psychotic disorders, its potential as a common prognostic biomarker in early periods, such as clinical high risk (CHR) for psychosis and first-episode psychosis (FEP), has not been fully studied.
Methods
A total of 104 FEP patients, 102 CHR individuals, and 107 healthy controls (HCs) participated in baseline MMN recording. Clinical outcomes were assessed: 17 FEP patients were treatment resistant, 73 FEP patients were nonresistant, 56 CHR individuals were nonremitters (15 of whom transitioned to a psychotic disorder), and 22 CHR individuals were remitters. Baseline MMN amplitudes were compared across clinical outcome groups and tested for utility as prognostic biomarkers using binary logistic regression.
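A hedged sketch of the binary logistic regression step is shown below, predicting treatment resistance from baseline MMN amplitude with statsmodels; the covariates and column names are hypothetical placeholders, and the same pattern would apply to nonremission in the CHR group.

```python
# Hedged sketch: binary logistic regression testing baseline MMN amplitude as a
# predictor of treatment resistance, reported above as Exp(beta) with 95% CI.
# Column names ("resistant", "mmn_amplitude", covariates) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

fep = pd.read_csv("fep_baseline.csv")

model = smf.logit("resistant ~ mmn_amplitude + age + sex", data=fep).fit()
exp_beta = np.exp(model.params)     # odds ratios, i.e. Exp(beta)
ci = np.exp(model.conf_int())       # 95% confidence intervals
print(pd.concat([exp_beta.rename("Exp(beta)"), ci], axis=1))
```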
Results
MMN amplitudes were greatest in HCs, intermediate in CHR individuals, and smallest in FEP patients. Across the clinical outcome groups, baseline MMN amplitudes were reduced in both the FEP and CHR patients with poor prognostic trajectories. Reduced baseline MMN amplitudes were a significant predictor of later treatment resistance in FEP patients [Exp(β) = 2.100, 95% confidence interval (CI) 1.104–3.993, p = 0.024] and of nonremission in CHR individuals [Exp(β) = 1.898, 95% CI 1.065–3.374, p = 0.030].
Conclusions
These findings suggest that MMN could be used as a common prognostic biomarker across early psychosis periods, which will aid clinical decisions for early intervention.
Background: The purpose of this study was to determine the relationship between the appropriateness of antibiotic prescription and clinical outcomes in patients with community-acquired acute pyelonephritis (CA-APN). Methods: A multicenter prospective cohort study was performed in 8 Korean hospitals from September 2017 to August 2018. All hospitalized patients aged ≥19 years diagnosed with CA-APN at admission were recruited. Pregnant women and patients with insufficient data were excluded, as were patients with prolonged hospitalization due to medical problems not associated with APN treatment. The appropriateness of empirical and definitive antibiotics was classified as “optimal,” “suboptimal,” or “inappropriate,” and optimal and suboptimal prescriptions were regarded as appropriate antibiotic use. The standard for classifying empirical antibiotics was defined according to the 2018 Korean national guideline for antibiotic use in urinary tract infections. The standard for classifying definitive antibiotics was defined according to the results of in vitro susceptibility tests of the causative organisms. Clinical outcomes, including the clinical failure (mortality or recurrence) rate, hospitalization days, and medical costs, were compared between patients who were prescribed antibiotics appropriately and those who were prescribed them inappropriately. Results: In total, 397 and 318 patients were eligible for the analysis of the appropriateness of empirical and definitive antibiotics, respectively. Of these, 10 (2.5%) and 18 (5.7%) were inappropriately prescribed empirical and definitive antibiotics, respectively, and 28 (8.8%) were prescribed either empirical or definitive antibiotics inappropriately. Patients who were prescribed empirical antibiotics appropriately showed a lower mortality rate (0% vs 10%; P = .025), shorter hospitalization (9 vs 12.5 days; P = .014), and lower medical costs (US$2,333 vs US$4,531; P = .007) than those who were prescribed empirical antibiotics inappropriately. In comparison, we detected no significant differences in clinical outcomes between patients who were prescribed definitive antibiotics appropriately and those who were prescribed them inappropriately. Patients who were prescribed both empirical and definitive antibiotics appropriately showed a lower clinical failure rate (0.3% vs 7.1%; P = .021) and shorter hospitalization (9 vs 10.5 days; P = .041) than those who were prescribed either empirical or definitive antibiotics inappropriately. Conclusions: Appropriate use of antibiotics leads to better clinical outcomes, including fewer hospitalization days and lower medical costs, in patients with CA-APN.
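The between-group comparisons described above can be illustrated with the hedged sketch below, using Fisher's exact test for the clinical failure rate and the Mann-Whitney U test for hospitalization days; the original analysis may have used different tests, and the column names are hypothetical.

```python
# Hedged sketch: outcome comparisons between appropriately and inappropriately
# prescribed groups. Column names ("empirical_appropriate", "clinical_failure",
# "hospital_days") are hypothetical placeholders.
import pandas as pd
from scipy.stats import fisher_exact, mannwhitneyu

df = pd.read_csv("ca_apn_cohort.csv")
appropriate = df[df["empirical_appropriate"] == 1]
inappropriate = df[df["empirical_appropriate"] == 0]

# 2x2 table of clinical failure (1) vs no failure (0) by prescription appropriateness
table = [
    [appropriate["clinical_failure"].sum(), (1 - appropriate["clinical_failure"]).sum()],
    [inappropriate["clinical_failure"].sum(), (1 - inappropriate["clinical_failure"]).sum()],
]
_, p_failure = fisher_exact(table)
_, p_los = mannwhitneyu(appropriate["hospital_days"], inappropriate["hospital_days"])
print(f"clinical failure p={p_failure:.3f}, hospitalization days p={p_los:.3f}")
```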
Atomically sharpened tips have attracted much interest in the imaging and manufacturing fields because of their high spatial resolution. Tungsten (W) is typically used as the tip material, but when a W tip is used in an oxygen environment its performance is limited by corrosion resulting from reaction with the oxygen gas. To solve this problem, methods of depositing onto W a metal that does not react with oxygen have been studied. In this study, we introduce a method of depositing iridium (Ir) directly onto the insulating layer remaining on the W surface, without an additional pretreatment to remove it, and forming an Ir nanopyramid structure at the apex of the W tip by field evaporation and faceting. Field ion microscopy and atom probe tomography were used to analyze the crystal structure and composition at the apex during the faceting process, and the overall change in tip shape after faceting was analyzed by comparison with transmission electron microscopy. The proposed method does not require a tip-heating step to create an atomically sharp tip, so the tip can be made easily with a simpler equipment configuration than in existing methods.
Several studies have supported the usefulness of “the surprise question” for predicting the 1-year mortality of patients. “The surprise question” requires a “Yes” or “No” answer to the question “Would I be surprised if this patient died in [specific time frame].” However, the 1-year time frame is often too long for advanced cancer patients seen by palliative care personnel, and “the surprise question” with shorter time frames is needed for decision making. We examined the accuracy of “the surprise question” for 7-day, 21-day, and 42-day survival in patients admitted to palliative care units (PCUs).
Method
This was a prospective multicenter cohort study of 130 adult patients with advanced cancer admitted to 7 hospital-based PCUs in South Korea. The accuracy of “the surprise question” was compared with that of the temporal question for clinician's prediction of survival.
Results
We analyzed 130 inpatients who died in PCUs during the study period. The median survival was 21.0 days. The sensitivity, specificity, and overall accuracy for the 7-day “the surprise question” were 46.7, 88.7, and 83.9%, respectively. The sensitivity, specificity, and overall accuracy for the 7-day temporal question were 6.7, 98.3, and 87.7%, respectively. The c-indices of the 7-day “the surprise question” and 7-day temporal question were 0.662 (95% CI: 0.539–0.785) and 0.521 (95% CI: 0.464–0.579), respectively. The c-indices of the 42-day “the surprise question” and 42-day temporal question were 0.554 (95% CI: 0.509–0.599) and 0.616 (95% CI: 0.569–0.663), respectively.
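For reference, the sketch below shows how the sensitivity, specificity, and overall accuracy of a 7-day “surprise question” can be computed from observed 7-day death and the clinicians' answers; the small input arrays are hypothetical placeholders, not study data.

```python
# Hedged sketch: sensitivity, specificity, and accuracy from a 2x2 table of
# "surprise question" answers versus observed 7-day death. Arrays are placeholders.
import numpy as np
from sklearn.metrics import confusion_matrix

died_7d = np.array([1, 0, 0, 1, 0])        # observed death within 7 days (placeholder)
sq_positive = np.array([1, 0, 1, 0, 0])    # "No, I would not be surprised" (placeholder)

tn, fp, fn, tp = confusion_matrix(died_7d, sq_positive).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}, accuracy={accuracy:.1%}")
```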
Significance of results
Surprisingly, “the surprise question” and the temporal question had similar accuracies. The high specificities of the 7-day “surprise question” and the 7- and 21-day temporal questions suggest that a positive answer may be useful to rule in impending death.
The aim of this article is to apply Floer theory to study symmetric periodic Reeb orbits. We define positive equivariant wrapped Floer homology using an (anti-)symplectic involution on a Liouville domain and investigate its algebraic properties. By a careful analysis of index iterations, we obtain a non-trivial lower bound on the minimal number of geometrically distinct symmetric periodic Reeb orbits on a certain class of real contact manifolds. This class includes non-degenerate real dynamically convex star-shaped hypersurfaces in ${\mathbb {R}}^{2n}$ that are invariant under complex conjugation. As a result, we give a partial answer to the Seifert conjecture on brake orbits in the contact setting.
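As an illustrative example (not taken from the article), an ellipsoid with real axes is a star-shaped hypersurface of this kind: it is invariant under complex conjugation, is dynamically convex because it is convex, and is non-degenerate for generic axis lengths:
$$\Sigma = \Big\{ z \in \mathbb{C}^n \cong {\mathbb {R}}^{2n} \;:\; \sum_{j=1}^{n} \frac{|z_j|^2}{a_j^2} = 1 \Big\}, \qquad a_j > 0,$$
with the anti-symplectic involution given by complex conjugation $z \mapsto \bar{z}$, which preserves $\Sigma$ since $|\bar{z}_j| = |z_j|$.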
Over the past two decades, early detection and early intervention in psychosis have become essential goals of psychiatry. However, clinical impressions are insufficient for predicting psychosis outcomes in clinical high-risk (CHR) individuals; a more rigorous and objective model is needed. This study aims to develop and internally validate a model for predicting the transition to psychosis within 10 years.
Methods
Two hundred and eight help-seeking individuals who fulfilled the CHR criteria were enrolled from the prospective, naturalistic cohort program for CHR at the Seoul Youth Clinic (SYC). Least absolute shrinkage and selection operator (LASSO)-penalized Cox regression was used to develop a predictive model for transition to psychosis. We then performed k-means clustering and survival analysis to stratify the risk of psychosis.
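A hedged sketch of this modelling pipeline with lifelines and scikit-learn is shown below; the penalty value, variable names, and three-cluster choice follow the abstract only loosely, and the concordance index computed here is in-sample rather than cross-validated as in the paper.

```python
# Hedged sketch: LASSO-penalized Cox regression followed by k-means risk
# stratification. Variable names and the penalty strength are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.cluster import KMeans

df = pd.read_csv("chr_cohort.csv")  # hypothetical baseline predictors + outcome

predictors = ["gaf_decline_1yr", "iq", "cvlt", "strange_stories", "sfs_domain1", "sfs_domain2"]

# l1_ratio=1.0 gives a pure LASSO (L1) penalty on the Cox partial likelihood
cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)
cph.fit(df[predictors + ["time_to_psychosis", "transition"]],
        duration_col="time_to_psychosis", event_col="transition")
print(f"Harrell's C (in-sample) = {cph.concordance_index_:.2f}")

# Cluster the individual risk scores into three risk strata
risk = cph.predict_partial_hazard(df[predictors])
df["risk_cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    risk.to_numpy().reshape(-1, 1)
)
```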
Results
The predictive model, which includes clinical and cognitive variables, identified the following six baseline variables as important predictors: 1-year percentage decrease in the Global Assessment of Functioning score, IQ, California Verbal Learning Test score, Strange Stories test score, and scores in two domains of the Social Functioning Scale. The predictive model showed a cross-validated Harrell's C-index of 0.78 and identified three subclusters with significantly different risk levels.
Conclusions
Overall, our predictive model showed good predictive ability and could facilitate a personalized therapeutic approach tailored to the different risk levels of high-risk individuals.
This study examined the perceived competence of Clinical Research Coordinators (CRCs) using several conceptual frameworks. Accurate self-assessment of one’s professional competence is a critical component in the career navigation process and contributes to (a) identifying and securing professional development (training), (b) leveraging professional strengths, and (c) integrating self-knowledge into a comprehensive career plan.
Method:
A survey design gathered responses from a sample of 119 CRCs in a southeastern region of the USA. Two conceptual frameworks were used to represent aspects of CRC professional competence: the eight Joint Task Force (JTF) competence domains, and perceptions of strengths and training needs from a list of 12 task categories.
Results:
The JTF domain with the lowest competence level was Development and Regulations, while the highest was Communication. Perceived competence increased incrementally with years of experience. Top strengths involved direct patient interaction and data management. Tasks in need of training included project management and reporting issues. Variations in responses were based on years of experience as a CRC.
Conclusion:
Our results demonstrate an association between the self-reported strengths and training needs of CRCs and experience. This information can contribute to the self-directed career navigation of CRCs.
Social anxiety disorder (SAD) is characterized by anxiety regarding social situations, avoidance of external social stimuli, and negative self-beliefs. Virtual reality self-training (VRS) at home may be a good interim modality for reducing social fears before formal treatment. This study aimed to find neurobiological evidence for the therapeutic effect of VRS.
Methods
Fifty-two patients with SAD were randomly assigned to a VRS or waiting list (WL) group. The VRS group received an eight-session VRS program for 2 weeks, whereas the WL group received no intervention. Clinical assessments and functional magnetic resonance imaging scanning with the distress and speech evaluation tasks were repeatedly performed at baseline and after 3 weeks.
Results
The post-VRS assessment showed significantly decreased anxiety and avoidance scores, distress index, and negative evaluation index for ‘self’, but no change in the negative evaluation index for ‘other’. Patients showed significant responses to the distress task in various regions, including the prefrontal regions on both sides, the occipital regions, insula, and thalamus, and to the speech evaluation task in the bilateral anterior cingulate cortex. Among these, significant neuronal changes after VRS were observed only in the right lingual gyrus and left thalamus.
Conclusions
The VRS-induced improvements in the ability to attend to social stimuli without avoidance, and even to modulate emotional cues positively, are based on functional changes in the visual cortices and thalamus. Given these short-term neuronal changes, VRS can be a first intervention option for individuals with SAD who avoid social situations or are reluctant to receive formal treatment.