Extended redundancy analysis (ERA), a generalized version of redundancy analysis (RA), has been proposed as a useful method for examining interrelationships among multiple sets of variables in multivariate linear regression models. As a limitation of the extant RA or ERA analyses, however, parameters are estimated by aggregating data across all observations even in a case where the study population could consist of several heterogeneous subpopulations. In this paper, we propose a Bayesian mixture extension of ERA to obtain both probabilistic classification of observations into a number of subpopulations and estimation of ERA models within each subpopulation. It specifically estimates the posterior probabilities of observations belonging to different subpopulations, subpopulation-specific residual covariance structures, component weights and regression coefficients in a unified manner. We conduct a simulation study to demonstrate the performance of the proposed method in terms of recovering parameters correctly. We also apply the approach to real data to demonstrate its empirical usefulness.
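The core of the probabilistic classification described above is the standard mixture-model posterior (responsibility) computation. The sketch below is an illustration of that generic step only, not the authors' Bayesian ERA estimator; the log-likelihoods and mixing weights in the example are hypothetical.

```python
import math

def responsibilities(log_likelihoods, weights):
    """Posterior probability that one observation belongs to each
    subpopulation, given per-component log-likelihoods and mixing
    weights (the generic mixture-model E-step quantity)."""
    logs = [math.log(w) + ll for w, ll in zip(weights, log_likelihoods)]
    m = max(logs)  # log-sum-exp trick for numerical stability
    total = m + math.log(sum(math.exp(l - m) for l in logs))
    return [math.exp(l - total) for l in logs]

# Hypothetical two-subpopulation example: equal weights,
# component 1 fits this observation better.
post = responsibilities([-10.0, -12.0], [0.5, 0.5])
```

In a full mixture ERA, these posteriors would drive both the classification of observations and the weighting of the subpopulation-specific regression estimates.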
The purpose of this study was to compare single- and multi-frequency bioimpedance (BIA) devices against dual-energy X-ray absorptiometry (DXA) for appendicular lean mass (ALM) and muscle quality index (MQI) metrics in Hispanic adults. One hundred thirty-one Hispanic adults (18–55 years) participated in this study. ALM was measured with single-frequency bioimpedance analysis (SFBIA), multi-frequency bioimpedance analysis (MFBIA) and DXA. ALMTOTAL (left arm + right arm + left leg + right leg) and ALMARMS (left arm + right arm) were computed for all three devices. Handgrip strength (HGS) was measured using a dynamometer. The average HGS was used for all MQI models (highest left hand + highest right hand)/2. MQIARMS was defined as the ratio between HGS and ALMARMS. MQITOTAL was established as the ratio between HGS and ALMTOTAL. SFBIA and MFBIA had strong correlations with DXA for all ALM and MQI metrics (Lin’s concordance correlation coefficient values ranged from 0·86 (MQIMFBIA-ARMS) to 0·97 (Arms LMSFBIA); all P < 0·001). Equivalence testing varied between methods (e.g. SFBIA v. DXA) when examining the different metrics (i.e. ALMTOTAL, ALMARMS, MQITOTAL and MQIARMS). MQIARMS was the only metric that did not differ from the line of identity and had no proportional bias when comparing all the devices against each other. The current study findings demonstrate good overall agreement between SFBIA, MFBIA and DXA for ALMTOTAL and ALMARMS in a Hispanic population. However, SFBIA and MFBIA have better agreement with DXA when used to compute MQIARMS than MQITOTAL.
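The two quantities the abstract leans on — Lin's concordance correlation coefficient and the MQI ratio — can be sketched directly from their definitions. This is a generic textbook implementation, not the study's analysis code; the example values are hypothetical.

```python
def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two
    measurement methods (penalizes location/scale shifts, so it
    measures agreement rather than mere correlation)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) / n
    sy = sum((v - my) ** 2 for v in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * sxy / (sx + sy + (mx - my) ** 2)

def mqi(hgs_left, hgs_right, alm):
    """Muscle quality index as defined in the abstract: mean of the
    best left/right handgrip strengths divided by appendicular lean mass."""
    return ((hgs_left + hgs_right) / 2) / alm
```

For identical measurements the CCC equals 1; disagreement in either correlation or calibration pulls it toward 0.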
The intensity attenuation of a high-power laser is a frequent task in the measurements of optical science. Laser intensity can be attenuated by inserting an optical element, such as a partial reflector, polarizer or absorption filter. These devices are, however, not always easily applicable, especially in the case of ultra-high-power lasers, because they can alter the characteristics of a laser beam or become easily damaged. In this study, we demonstrated that the intensity of a laser beam could be effectively attenuated using a random pinhole attenuator (RPA), a device with randomly distributed pinholes, without changing the beam properties. With this device, a multi-PW laser beam was successfully attenuated and the focused beam profile was measured without any alterations of its characteristics. In addition, it was confirmed that the temporal profile of a laser pulse, including the spectral phase, was preserved. Consequently, the RPA possesses significant potential for a wide range of applications.
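To first order, the attenuation of such a device is geometric: the transmitted fraction is roughly the open pinhole area divided by the illuminated beam area. The sketch below is only that geometric-area estimate under the assumption of non-overlapping, uniformly spread holes (real behavior also involves diffraction, which the paper addresses); all numbers are hypothetical.

```python
import math

def rpa_transmittance(n_pinholes, pinhole_diameter, beam_diameter):
    """Rough geometric transmittance of a random pinhole attenuator:
    fraction of the beam area covered by open pinholes.
    Assumes non-overlapping holes uniformly spread across the beam."""
    open_area = n_pinholes * math.pi * (pinhole_diameter / 2) ** 2
    beam_area = math.pi * (beam_diameter / 2) ** 2
    return open_area / beam_area
```

For example, 100 pinholes of 0.01 (in the same length unit as a beam diameter of 10) give a transmittance of 1e-4, i.e. four orders of magnitude of attenuation.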
Although attempts to apply virtual reality (VR) in mental healthcare are rapidly increasing, it is still unclear whether VR relaxation can reduce stress more than conventional biofeedback.
Methods:
Participants consisted of 83 healthy adult volunteers with high stress, defined as a score of 20 or more on the Perceived Stress Scale-10 (PSS-10). This study used an open, randomized, crossover design with baseline, stress, and relaxation phases. During the stress phase, participants experienced an intentionally generated shaking VR environment and serial-7 subtraction. For the relaxation phase, participants underwent a randomly assigned relaxation session (either VR relaxation or biofeedback) on day 1, and the other type of relaxation session on day 2. We compared the State-Trait Anxiety Inventory-X1 (STAI-X1), STAI-X2, the Numeric Rating Scale (NRS), and physiological parameters including heart rate variability (HRV) indexes between the stress and relaxation phases.
Results:
A total of 74 participants were included in the analyses. The median age of participants was 39 years; at baseline, STAI-X1 was 47.27 (SD = 9.92) and NRS was 55.51 (SD = 24.48). VR and biofeedback significantly decreased STAI-X1 and NRS from the stress phase to the relaxation phase, while the difference in effect between VR and biofeedback was not significant. However, there was a significant difference in electromyography, LF/HF ratio, LF total, and NN50 between VR relaxation and biofeedback.
Conclusion:
VR relaxation was effective in reducing subjectively reported stress in individuals with high stress.
Critical congenital heart disease (CCHD) refers to a group of heart defects that cause serious, life-threatening symptoms in the neonatal period and require timely surgical or catheter interventions. We explored the current status of the CCHD burden and the effect of early diagnosis of CCHD on mortality using Korean national health insurance (NHI) data.
Methods
We analyzed the national health insurance (NHI) data from 2014 to 2018. We identified CCHD patients using diagnosis codes and intervention codes from the claims data, and the prevalence, mortality, and medical expenditure of CCHD were analyzed. We linked neonatal data with the mothers' medical claims data and developed a retrospective cohort data set for analyzing the effect of early diagnosis on mortality and related outcomes of CCHD treatment.
Results
The annual prevalence of neonatal CCHD in Korea was 0.144%. Of a total of 2,241 CCHD neonates, 1,546 (69.0%) underwent cardiac ultrasound within three days after birth, and mothers of 419 neonates (18.7%) had a record of prenatal fetal ultrasound. In our comparison of neonates diagnosed with CCHD within three days of birth with those diagnosed on or after day 4, the probability of early diagnosis was higher for preterm infants and infants with low birth weight. Regarding mortality, most types of CCHD showed a significantly higher mortality rate in the early diagnosis group.
Conclusions
The high mortality rate despite a high early diagnosis rate is attributable to the high percentage of patients with severe conditions that cause serious symptoms within three days of birth. More than half of the neonates with CCHD were found not to have undergone a prenatal fetal ultrasound, rendering this an important policy target.
Critical congenital heart disease (CCHD) refers to a group of heart defects that cause serious, life-threatening symptoms in the neonatal period and require timely surgical or catheter interventions. We reviewed evidence for incorporating a mandatory neonatal CCHD screening test as a national public health project for all neonates born in Korea by analyzing the validity and cost-effectiveness of neonatal CCHD screening using pulse oximetry.
Methods
We performed a rapid literature review to establish models for the diagnostic accuracy and economic evaluation of pulse oximetry. We also analyzed the prevalence, mortality, and medical expenditure for different types of CCHD using the national health insurance (NHI) data. We analyzed the cost-effectiveness of pulse oximetry by comparing the group of neonates who received a combination of a physical examination and pulse oximetry with the group of neonates who only received a physical examination. For the cost-effectiveness analysis of the CCHD screening test in this study, we used a time horizon of one year, diagnostic accuracy as the clinical endpoint, and life-years gained (LYG) as the effectiveness indicator.
Results
Based on a recent systematic review, the pooled sensitivity can be enhanced from 76.5% (pulse oximetry alone) to 92% (combined with physical examination). We used data from a total of 2,334 neonates with CCHD for the economic model. Our analysis revealed that adding pulse oximetry to the routine neonatal physical examination leads to 2.34 LYG and a cost difference of USD 1,080,602, showing an ICER of KRW 610,063,240 (USD 461,857)/LYG.
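The ICER reported above is simple arithmetic: incremental cost divided by incremental effectiveness. The sketch below reproduces the calculation from the abstract's own figures; the small difference from the reported USD 461,857/LYG presumably reflects rounding in the published inputs.

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per unit
    of extra effectiveness (here, per life-year gained)."""
    return delta_cost / delta_effect

# Figures from the abstract: USD 1,080,602 extra cost, 2.34 LYG
ratio = icer(1_080_602, 2.34)  # ~USD 461,796 per LYG
```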
Conclusions
Considering the LYG benefit and the reduced cost of complications and after-effects among newborns with CCHD who survive thanks to early diagnosis, a mandatory screening test is considered worthwhile in Korea.
This study aimed to identify the roles of community pharmacists (CPs) during the coronavirus disease 2019 (COVID-19) pandemic, the differences in their role performance compared with their perceived importance, and limiting factors.
Methods:
A cross-sectional online survey of CPs was conducted. The CPs self-measured the importance and performance of each role during the pandemic using a 5-point Likert scale. A paired t-test was used to compare each role’s importance and performance scores. A logistic regression analysis of the roles with low performance scores, despite their level of importance, was conducted to determine the factors affecting performance. The limiting factors were also surveyed.
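The importance-versus-performance comparison described above rests on the paired t statistic, since each pharmacist rated both scores. The sketch below is a generic pure-Python implementation of that statistic, not the study's analysis code; the example scores are hypothetical.

```python
import math

def paired_t(x, y):
    """Paired t statistic: mean of the per-respondent differences
    divided by the standard error of that mean."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical importance vs performance ratings from 4 respondents
t = paired_t([5, 4, 5, 4], [3, 3, 4, 2])
```

A large positive t here indicates importance ratings systematically exceeding performance ratings, matching the pattern the abstract reports for 15 of the 17 roles.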
Results:
A total of 436 questionnaire responses were analyzed. The performance scores were significantly lower than the perceived importance scores for 15 of the 17 roles. The source and update frequency of COVID-19 information and participation in outreach pharmaceutical services were associated with low performance scores. Insufficient economic compensation, the lack of communication channels, and legal limitations were the limiting factors in performing the CPs' roles.
Conclusions:
Participation in outreach pharmaceutical services, economic compensation, and communication channels should be improved to motivate CPs to perform their roles.
In this review, we introduce our recent applications of deep learning to solar and space weather data. We have successfully applied novel deep learning methods to the following applications: (1) generation of solar farside/backside magnetograms and global field extrapolation based on them, (2) generation of solar UV/EUV images from other UV/EUV images and magnetograms, (3) denoising solar magnetograms using supervised learning, (4) generation of UV/EUV images and magnetograms from Galileo sunspot drawings, (5) improvement of global IRI TEC maps using IGS TEC ones, (6) one-day forecasting of global TEC maps through image translation, (7) generation of high-resolution magnetograms from Ca II K images, (8) super-resolution of solar magnetograms, (9) flare classification by CNN and visual explanation by attribution methods, and (10) forecasting GOES solar X-ray profiles. We present major results and discuss them. We also present future plans for integrated space weather models based on deep learning.
Nosocomial transmission of COVID-19 among immunocompromised hosts can have a serious impact on COVID-19 severity, underlying disease progression and SARS-CoV-2 transmission to other patients and healthcare workers within hospitals. We experienced a nosocomial outbreak of COVID-19 in the setting of a daycare unit for paediatric and young adult cancer patients. Between 9 and 18 November 2020, 473 individuals (181 patients, 247 caregivers/siblings and 45 staff members) were exposed to the index case, who was a member of the nursing staff. Among them, three patients and four caregivers were infected. Two 5-year-old cancer patients with COVID-19 were not severely ill, but a 25-year-old cancer patient showed prolonged shedding of SARS-CoV-2 RNA for at least 12 weeks, which probably infected his mother at home approximately 7–8 weeks after the initial diagnosis. Except for this case, no secondary transmission was observed from the confirmed cases in either the hospital or the community. To conclude, in the daycare setting of immunocompromised children and young adults, the rate of in-hospital transmission of SARS-CoV-2 was 1.6% when applying a stringent policy of infection prevention and control, including universal mask application and rapid and extensive contact investigation. Severely immunocompromised children/young adults with COVID-19 would have to be carefully managed after the mandatory isolation period while keeping the possibility of prolonged shedding of live virus in mind.
There are growing concerns about the impact of the COVID-19 pandemic on the mental health of older adults. We examined the effect of the pandemic on the risk of depression in older adults.
Methods
We analyzed data from a prospective cohort study of Korean older adults, which has been followed up every 2 years. Among the 2308 participants who completed both the third and fourth follow-up assessments, 58.4% completed their fourth follow-up before the outbreak of COVID-19 and the rest completed it during the pandemic. We conducted face-to-face diagnostic interviews using the Mini International Neuropsychiatric Interview and used the Geriatric Depression Scale. We performed generalized estimating equations and logistic regression analyses.
Results
The COVID-19 pandemic was associated with increased depressive symptoms in older adults [b (standard error) = 0.42 (0.20), p = 0.040] and a doubling of the risk for incident depressive disorder even in euthymic older adults without a history of depression (odds ratio = 2.44, 95% confidence interval 1.18–5.02, p = 0.016). Less social activity, which was associated with the risk of depressive disorder before the pandemic, was not associated with this risk during the pandemic. However, fewer family gatherings, which were not associated with the risk of depressive disorder before the pandemic, were associated with a doubled risk of depressive disorder during the pandemic.
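An odds ratio with a Wald confidence interval, as reported above, can be computed directly from a 2×2 exposure-by-outcome table. The sketch below is the generic formula, not the study's model (which additionally used GEE and covariate adjustment); the cell counts in the example are hypothetical.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = exposed cases/non-cases, c/d = unexposed cases/non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 10/90 cases among exposed, 5/95 among unexposed
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
```

An interval whose lower bound exceeds 1 (as in the abstract's 1.18–5.02) indicates a statistically significant increase in odds.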
Conclusions
The COVID-19 pandemic significantly influences the risk of late-life depression in the community. Older adults with a lack of family gatherings may be particularly vulnerable.
Accumulating evidence suggests that alterations in inflammatory biomarkers are important in depression. However, previous meta-analyses disagree on these associations, and errors in data extraction may account for these discrepancies.
Methods
PubMed/MEDLINE, Embase, PsycINFO, and the Cochrane Library were searched from database inception to 14 January 2020. Meta-analyses of observational studies examining the association between depression and levels of tumor necrosis factor-α (TNF-α), interleukin 1-β (IL-1β), interleukin-6 (IL-6), and C-reactive protein (CRP) were eligible. Errors were classified as follows: incorrect sample sizes, incorrectly used standard deviation, incorrect participant inclusion, calculation error, or analysis with insufficient data. We determined their impact on the results after correction thereof.
Results
Errors were noted in 14 of the 15 meta-analyses included. Across 521 primary studies, 118 (22.6%) showed the following errors: incorrect sample sizes (20 studies, 16.9%), incorrect use of standard deviation (35 studies, 29.7%), incorrect participant inclusion (7 studies, 5.9%), calculation errors (33 studies, 28.0%), and analysis with insufficient data (23 studies, 19.5%). After correcting these errors, 11 (29.7%) of 37 pooled effect sizes changed by a magnitude of more than 0.1, ranging from 0.11 to 1.15. The updated meta-analyses showed that elevated levels of TNF-α, IL-6, and CRP, but not IL-1β, are associated with depression.
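Many of the errors catalogued above (wrong sample sizes, wrong standard deviations) propagate into the standardized mean difference through its pooled-SD denominator. The sketch below is the textbook Cohen's d with a correctly pooled SD, for illustration only; the example values are hypothetical.

```python
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference between two groups, using the
    pooled standard deviation -- the denominator most often
    miscomputed in the meta-analyses audited here."""
    pooled = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                       / (n1 + n2 - 2))
    return (m1 - m2) / pooled
```

Using a wrong n or substituting a standard error for a standard deviation in this formula shifts the effect size, which is exactly the kind of >0.1 change the re-analysis documents.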
Conclusions
These findings show that data extraction errors in meta-analyses can impact findings. Efforts to reduce such errors are important in studies of the association between depression and peripheral inflammatory biomarkers, for which high heterogeneity and conflicting results have been continuously reported.
This study aimed to evaluate manufacturers' perceptions of the decision-making process for new drug reimbursement and to formulate implications for operating a health technology assessment system. In 2019, we conducted a questionnaire survey and a semi-structured group interview for domestic (n = 6) and foreign manufacturers (n = 9) who had vast experience in introducing new medicines into the market through a health technology assessment. Representatives of manufacturers indicated that disease severity, budget impact, existence of alternative treatment, and health-related quality of life were relevant criteria when assessing reimbursement decisions. Compared with domestic manufacturers, foreign manufacturers were risk takers when making reimbursement decisions in terms of adopting a new drug and managing pharmaceutical expenditure. However, foreign manufacturers were risk-averse when evaluating new drugs with uncertainties based on real-world data, such as clinical effectiveness. Based on manufacturers' perceptions of the decision-making process for new drug reimbursement, there is room for improvement in health technology assessment systems. Explaining the underlying reasons behind decisions, ensuring unbiased participation by various stakeholders, and embedding their roles in the decision-making process need to be emphasized. However, the measures suggested in this study should be introduced with caution. The process of health technology assessment might be a target for those who undermine the system in pursuit of their private interests.
Background: The purpose of this study was to determine the relationship between the appropriateness of antibiotic prescription and clinical outcomes in patients with community-acquired acute pyelonephritis (CA-APN). Methods: A multicenter prospective cohort study was performed in 8 Korean hospitals from September 2017 to August 2018. All hospitalized patients aged ≥19 years diagnosed with CA-APN at admission were recruited. Pregnant women and patients with insufficient data were excluded. In addition, patients with prolonged hospitalization due to medical problems that were not associated with APN treatment were excluded. The appropriateness of empirical and definitive antibiotics was classified as "optimal," "suboptimal," or "inappropriate," and optimal and suboptimal were regarded as appropriate antibiotic use. The standard for the classification of empirical antibiotics was defined based on the 2018 Korean national guideline for antibiotic use in urinary tract infections. The standards for the classification of definitive antibiotics were defined according to the results of in vitro susceptibility tests of causative organisms. Clinical outcomes, including clinical failure (mortality or recurrence) rate, hospitalization days, and medical costs, were compared between patients who were prescribed antibiotics appropriately and those who were prescribed them inappropriately. Results: In total, 397 and 318 patients were eligible for the analysis of the appropriateness of empirical and definitive antibiotics, respectively. Of these, 10 (2.5%) and 18 (5.7%) were inappropriately prescribed empirical and definitive antibiotics, respectively, and 28 (8.8%) were prescribed either empirical or definitive antibiotics inappropriately.
Patients who were prescribed empirical antibiotics appropriately showed a lower mortality rate (0% vs 10%; P = .025), shorter hospitalization (9 vs 12.5 days; P = .014), and lower medical costs (US$2,333 vs US$4,531; P = .007) compared to those who were prescribed empirical antibiotics inappropriately. In comparison, we detected no significant differences in clinical outcomes between patients who were prescribed definitive antibiotics appropriately and those who were prescribed definitive antibiotics inappropriately. Patients who were prescribed both empirical and definitive antibiotics appropriately showed a lower clinical failure rate (0.3% vs 7.1%; P = .021) and shorter hospitalization (9 vs 10.5 days; P = .041) compared to those who were prescribed either empirical or definitive antibiotics inappropriately. Conclusions: Appropriate use of antibiotics leads to better clinical outcomes in patients with CA-APN, including fewer hospitalization days and lower medical costs.
Early replacement with a new central venous catheter (CVC) may pose a risk of persistent or recurrent infection in patients with a catheter-related bloodstream infection (CRBSI). We evaluated the clinical impact of early CVC reinsertion after catheter removal in patients with CRBSIs.
Methods:
We conducted a retrospective chart review of adult patients with confirmed CRBSIs in 2 tertiary-care hospitals over a 7-year period.
Results:
To treat their infections, 316 patients with CRBSIs underwent CVC removal. Among them, 130 (41.1%) underwent early CVC reinsertion (≤3 days after CVC removal), 39 (12.4%) underwent delayed reinsertion (>3 days), and 147 (46.5%) did not undergo CVC reinsertion. There were no differences in baseline characteristics among the 3 groups, except for nontunneled CVC, presence of septic shock, and reason for CVC reinsertion. The rate of persistent CRBSI in the early CVC reinsertion group (22.3%) was higher than that in the no CVC reinsertion group (7.5%; P = .002) but was similar to that in the delayed CVC reinsertion group (17.9%; P > .99). The other clinical outcomes did not differ among the 3 groups, including rates of 30-day mortality, complicated infection, and recurrence. After controlling for several confounding factors, early CVC reinsertion was not significantly associated with persistent CRBSI (OR, 1.59; P = .35) or 30-day mortality compared with delayed CVC reinsertion (OR, 0.81; P = .68).
Conclusions:
Early CVC reinsertion in the setting of CRBSI may be safe. Replacement with a new CVC should not be delayed in patients who still require a CVC for ongoing management.
The study aims to examine whether cognitive deficits differ between patients with early-stage Alzheimer's disease (AD) and patients with early-stage vascular dementia (VaD), using the Korean version of the CERAD neuropsychological battery (CERAD-K-N).
Methods
Patients with early-stage dementia (global Clinical Dementia Rating [CDR] 0.5 or 1) were consecutively recruited among first visitors to a dementia clinic; 257 AD patients and 90 VaD patients completed the protocol of the Korean version of the CERAD clinical assessment battery. The CERAD-K-N was administered for a comprehensive evaluation of neuropsychological function.
Results
Of the total 347 participants, 257 (69.1%) were in the AD group (CDR 0.5 = 66.9%) and 90 (21.9%) were in the VaD group (CDR 0.5 = 40.0%). Patients with very mild AD showed poorer performance on the Boston Naming Test (BNT) (P = 0.028), word list memory test (P < 0.001), word list recall test (P < 0.001), and word list recognition test (WLRcT) (P = 0.006) than patients with very mild VaD, after adjustment for the T score of the MMSE-KC. However, performance on trail making A (TMA) was more impaired in the VaD group than in the AD group. Performance on the WLRcT (P < 0.001) was the worst among the neuropsychological tests within the AD group, whereas the TMA was performed worst within the VaD group.
Conclusions
Patients with early-stage AD have more cognitive deficits in memory and language, while patients with early-stage VaD show worse cognitive function in attention/processing speed. In addition, the first cognitive deficit to appear is memory dysfunction in AD and an attention/processing-speed deficit in VaD.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Serotonergic dysfunction may play an important role in motor and nonmotor symptoms of Parkinson’s disease (PD). The loudness dependence of auditory evoked potentials (LDAEP) has been used to evaluate serotonergic activity. Therefore, this study aimed to determine central serotonergic activity using LDAEP in de novo PD according to the age at onset and changes in serotonergic activity after dopaminergic treatment.
Methods:
A total of 30 patients with unmedicated PD, 16 in the early-onset and 14 in the late-onset groups, were enrolled. All subjects underwent comprehensive neurological examination, laboratory tests, the Unified Parkinson’s Disease Rating Scale, and LDAEP. The LDAEP was calculated as the slope of the two N1/P2 peaks measured at the Cz electrode, first at baseline conditions (pretreatment) and a second time after 12 weeks (post-treatment) following dopaminergic medications.
Results:
The absolute values of pretreatment N1/P2 LDAEP (early-onset: late-onset, 0.99 ± 0.68: 1.62 ± 0.88, p = 0.035) and post-treatment N1 LDAEP (early-onset: late-onset, −0.61 ± 0.61: −1.26 ± 0.91, p = 0.03) were significantly lower in the early-onset group compared with those of the late-onset group. In addition, a higher value of pretreatment N1/P2 LDAEP was significantly correlated with the late-onset group (coefficient = 1.204, p = 0.044). The absolute value of the N1 LDAEP decreased after 12 weeks of taking dopaminergic medication (pretreatment: post-treatment, −1.457 ± 1.078: −0.904 ± 0.812, p = 0.0018).
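The LDAEP values compared above are, by the definition in the Methods, slopes of evoked-potential amplitude against stimulus loudness. The sketch below is the generic ordinary-least-squares slope that such a calculation reduces to, not the study's EEG pipeline; the intensity/amplitude values in the example are hypothetical.

```python
def ldaep_slope(intensities, amplitudes):
    """Ordinary least-squares slope of evoked-potential amplitude
    against stimulus loudness -- the LDAEP summary statistic."""
    n = len(intensities)
    mx = sum(intensities) / n
    my = sum(amplitudes) / n
    num = sum((x - mx) * (y - my) for x, y in zip(intensities, amplitudes))
    den = sum((x - mx) ** 2 for x in intensities)
    return num / den

# Hypothetical N1/P2 amplitudes (uV) at 60-100 dB stimuli
slope = ldaep_slope([60, 70, 80, 90, 100], [2, 3, 4, 5, 6])
```

A steeper slope is conventionally interpreted as lower central serotonergic activity, which is why the abstract reads lower absolute LDAEP in the early-onset group as relatively preserved serotonergic function.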
Conclusions:
Based on the results of this study, LDAEP could be a marker for serotonergic neurotransmission in PD. Central serotonergic activity assessed by LDAEP may be more preserved in early-onset PD patients and can be altered with dopaminergic medication.
Since the significance of metacognition as the theoretical basis of a psychological intervention for schizophrenia first emerged, there have been ongoing attempts to restore or strengthen patients’ metacognitive abilities.
Aim:
A Korean version of the metacognitive training (MCT) program was developed, and its effects on theory of mind, positive and negative symptoms, and interpersonal relationships were examined in stable outpatients with schizophrenia.
Method:
A pre-test–post-test design with a control group was used. The participants were 59 outpatients (30 in the experimental group, 29 in the control group) registered at five mental health facilities in a city in South Korea. The developed MCT program was applied for a total of 18 sessions, 60 min per session, over a period of 14 weeks. The hinting task, false belief task, Scale for the Assessment of Positive and Negative Symptoms, and Relationship Change Scale were used to verify the effects of this program. Data were analysed by the chi-square test, t-test, and Mann–Whitney U-test using the SPSS/PASW 18.0 statistics program.
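Of the tests listed above, the Mann–Whitney U-test is the nonparametric between-group comparison. The sketch below is the generic pairwise-counting definition of the U statistic, for illustration only (the study used SPSS); the scores in the example are hypothetical.

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for group x: the number of (x, y)
    pairs with x > y, counting tied pairs as half."""
    u = 0.0
    for a in x:
        for b in y:
            if a > b:
                u += 1
            elif a == b:
                u += 0.5
    return u

# Hypothetical outcome scores for experimental vs control participants
u = mann_whitney_u([3, 4, 5], [1, 2, 3])
```

When U is near its maximum (len(x) * len(y)), one group's scores dominate the other's, which is the pattern a significant between-group improvement would show.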
Results:
The general characteristics, intelligence, and outcome variables of the two groups were homogeneous. After the intervention, the experimental group showed significant improvements in theory of mind, positive and negative symptoms, and interpersonal relationships compared with the control group.
Conclusion:
These results suggest that the MCT program can be a complementary psychotherapy that contributes to symptom relief and interpersonal functioning in patients with schizophrenia, and is effective in the Korean culture, beyond the Western context.