There are growing concerns about the impact of the COVID-19 pandemic on the mental health of older adults. We examined the effect of the pandemic on the risk of depression in older adults.
We analyzed data from the prospective cohort study of Korean older adults, which has been followed up every 2 years. Among the 2308 participants who completed both the third and fourth follow-up assessments, 58.4% completed their fourth follow-up before the outbreak of COVID-19 and the rest completed it during the pandemic. We conducted face-to-face diagnostic interviews using the Mini International Neuropsychiatric Interview and administered the Geriatric Depression Scale. We performed generalized estimating equation and logistic regression analyses.
The COVID-19 pandemic was associated with increased depressive symptoms in older adults [b (standard error) = 0.42 (0.20), p = 0.040] and a doubling of the risk for incident depressive disorder, even in euthymic older adults without a history of depression (odds ratio = 2.44, 95% confidence interval 1.18–5.02, p = 0.016). Less frequent social activity, which was associated with the risk of depressive disorder before the pandemic, was not associated with that risk during the pandemic. However, less frequent family gatherings, which were not associated with the risk of depressive disorder before the pandemic, were associated with a doubled risk of depressive disorder during the pandemic.
The COVID-19 pandemic significantly influenced the risk of late-life depression in the community. Older adults who lack family gatherings may be particularly vulnerable.
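The odds ratio and confidence interval reported above follow from the standard back-transformation of a logistic-regression coefficient, OR = exp(b) with 95% CI exp(b ± 1.96·SE). A minimal sketch of that transformation; the coefficient/SE pair below is hypothetical, chosen only because it reproduces an OR of roughly the reported magnitude, and is not taken from the study:

```python
import math

def odds_ratio_ci(b: float, se: float, z: float = 1.96):
    """Back-transform a log-odds coefficient into (CI lower, OR, CI upper)."""
    return tuple(math.exp(b + k * z * se) for k in (-1, 0, 1))

# Hypothetical coefficient b = 0.89 with SE = 0.37
lo, or_, hi = odds_ratio_ci(0.89, 0.37)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```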
Large herbivores can disperse seeds over long distances through endozoochory. The Korean water deer (Hydropotes inermis argyropus), an internationally vulnerable species but locally considered vermin, is a potential endozoochorous seed dispersal vector. In this study, feeding experiments were conducted to test the efficiency of seed dispersal through gut ingestion by the Korean water deer, its temporal pattern and the effect of gut passage on seed recovery and germination rate. Eight plant species, including species that had previously germinated from its faeces, were used to feed three Korean water deer. Once the deer had consumed all the provided seeds, their faeces were collected after 24, 48, 72 and 96 h. The collected faeces were air-dried, and the number of seeds retrieved from the faeces was counted for each 24 h interval (0–24, 24–48, 48–72 and 72–96 h). Among the eight plant species, six were retrieved with intact seeds. Panicum bisulcatum had the highest recovery rate of 33.7%, followed by Amaranthus mangostanus (24.5%) and Chenopodium album (14.4%). Most of the seeds were recovered within the 24–48 h interval. Germination tests were conducted on the ingested and uningested seeds of the four species with a sufficient recovery rate. The effects of gut passage on seed germination differed among plant species, and the germination rate substantially decreased after gut passage. The results suggest that the Korean water deer can disperse seeds, potentially over long distances, albeit at the high cost of low seed recovery and germination rates.
The network approach has been applied to a wide variety of psychiatric disorders. The aim of the present study was to identify the network structures of remitters and non-remitters among patients with first-episode psychosis (FEP) at baseline and at the 6-month follow-up.
Participants (n = 252) from the Korean Early Psychosis Study (KEPS) were enrolled. They were classified as remitters or non-remitters using Andreasen's criteria. We estimated network structure with 10 symptoms (three symptoms from the Positive and Negative Syndrome Scale, one depressive symptom, and six symptoms related to schema and rumination) as nodes using a Gaussian graphical model. Global and local network metrics were compared within and between the networks over time.
Global network metrics did not differ between the remitters and non-remitters at baseline or 6 months. However, the network structure and nodal strengths associated with positive-self and positive-others scores changed significantly in the remitters over time. Unique central symptoms for remitters and non-remitters were cognitive brooding and negative-self, respectively. The correlation stability coefficients for nodal strength were within the acceptable range.
Our findings indicate that network structure and some nodal strengths were more flexible in remitters. Negative-self could be an important target for therapeutic intervention.
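In a Gaussian graphical model like the one estimated above, each edge is a partial correlation: the association between two symptoms after conditioning on the remaining nodes. A minimal three-node sketch using the standard first-order partial-correlation formula; the symptom names and correlation values are made up for illustration and are not the study's estimates:

```python
import math

def partial_corr(r_xy: float, r_xz: float, r_yz: float) -> float:
    """Correlation between x and y after controlling for z."""
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Hypothetical symptom correlations:
#   rumination-negative_self = 0.6, rumination-depressed_mood = 0.5,
#   negative_self-depressed_mood = 0.4
edge = partial_corr(0.6, 0.5, 0.4)
print(round(edge, 3))  # the network edge weight after conditioning
```

With 10 nodes the same idea is applied via the inverse covariance (precision) matrix rather than one variable at a time, typically with regularization given the sample size.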
We calculated the human resources required for an antimicrobial stewardship program (ASP) in Korean hospitals.
Multicenter retrospective study.
Eight Korean hospitals ranging in size from 295 to 1,337 beds.
The time required to perform ASP activities for all hospitalized patients under antibiotic therapy was estimated and converted into hours per week. The actual time spent on patient reviews for each ASP activity was measured in a small number of cases, and the total time was then estimated by applying the measured times to a larger number of cases. Full-time equivalents (FTEs) were calculated according to labor law in Korea (52 hours per week).
In total, 225 cases were reviewed to measure time spent on patient reviews. The median time spent per patient review for ASP activities ranged from 10 to 16 minutes. The total time spent on reviews for all hospitalized patients was estimated using the observed number of ASP activities for 1,534 patients who underwent antibiotic therapy on surveillance days. The most commonly observed ASP activity was ‘review of surgical prophylactic antibiotics’ (32.7%), followed by ‘appropriate antibiotics recommendations for patients with suspected infection without a proven site of infection but without causative pathogens’ (28.6%). The personnel requirement was calculated as 1.20 FTEs (interquartile range [IQR], 1.02–1.38) per 100 beds and 2.28 FTEs (IQR, 1.93–2.62) per 100 patients who underwent antibiotic therapy.
The estimated time required for human resources performing extensive ASP activities on all hospitalized patients undergoing antibiotic therapy in Korean hospitals was ~1.20 FTEs (IQR, 1.02–1.38) per 100 beds.
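The conversion from review workload to FTEs described above is simple arithmetic against the 52-hour statutory work week. A minimal sketch; the weekly review count is a hypothetical placeholder, and the 13-minute figure is simply the midpoint of the reported 10–16 minute range, not a study parameter:

```python
WORK_WEEK_HOURS = 52  # Korean labor-law ceiling used in the study

def fte_required(reviews_per_week: int, minutes_per_review: float) -> float:
    """Convert a weekly patient-review workload into full-time equivalents."""
    hours_per_week = reviews_per_week * minutes_per_review / 60
    return hours_per_week / WORK_WEEK_HOURS

# Hypothetical example: 300 patient reviews per week at 13 minutes each
fte = fte_required(300, 13)
print(round(fte, 2))  # 300 * 13 / 60 = 65 h/week -> 65 / 52 = 1.25 FTEs
```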
Several studies on the treatment of coronavirus disease 2019 (COVID-19) are being conducted, and various drugs are being tried; however, the results have not been uniform. Steroids have been widely used in the treatment of COVID-19, but their effects are controversial. As immunosuppressive and anti-inflammatory agents, steroids are considered to reduce lung damage by regulating various inflammatory responses. We report a case of severe acute respiratory syndrome coronavirus-2 pneumonia manifesting as a cryptogenic organizing pneumonia-like reaction and discuss its treatment, clinical course, and favorable outcomes after steroid administration.
The experiments reported in this research paper aimed to determine the effect of supplementing different forms of L-methionine (L-Met) and acetate on protein synthesis in an immortalized bovine mammary epithelial cell line (MAC-T cells). Treatments were Control, L-Met, conjugated L-Met and acetate (CMA), and non-conjugated L-Met and acetate (NMA). The protein synthesis mechanism was examined using omics methods. The NMA group had the highest protein content in the media and the highest CSN2 mRNA expression levels (P < 0.05). Of 448 proteins detected, 39 were upregulated and 77 downregulated in the L-Met group, 62 and 80 in the CMA group, and 50 and 81 in the NMA group, respectively (P < 0.05). The L-Met, NMA and CMA treatments stimulated pathways related to protein and energy metabolism (P < 0.05). Metabolomic analysis also revealed that the L-Met, CMA and NMA treatments increased several metabolites (P < 0.05). In conclusion, the NMA treatment increased protein concentration and the expression level of CSN2 mRNA in MAC-T cells compared with the control, L-Met and CMA treatments, through increased expression of milk protein synthesis-related genes and production of the proteins and metabolites involved in energy and protein synthesis pathways.
Somatization is known to be more prevalent in Asian than in Western populations. Using a South Korean adolescent and young adult twin sample (N = 1754; 367 monozygotic male, 173 dizygotic male, 681 monozygotic female, 274 dizygotic female and 259 opposite-sex dizygotic twins), the present study aimed to estimate heritability of somatization and to determine common genetic and environmental influences on somatization and hwabyung (HB: anger syndrome). Twins completed self-report questionnaires of the HB symptoms scale and the somatization scale via a telephone interview. The results of the general sex-limitation model showed that 43% (95% CI [36, 50]) of the total variance of somatization was attributable to additive genetic factors, with the remaining variance, 57% (95% CI [50, 64]), being due to individual-specific environmental influences, including measurement error. These estimates were not significantly different between the two sexes. The phenotypic correlation between HB and somatization was .53 (p < .001). The bivariate model-fitting analyses revealed that the genetic correlation between the two symptoms was .68 (95% CI [.59, .77]), while the individual-specific environmental correlation, including correlated measurement error, was .41 (95% CI [.34, .48]). Of the 43% of variance in somatization attributable to additive genetic factors, approximately half (20%) was shared with genetic factors related to HB, with the remainder being due to genes unique to somatization. A substantial part (48%) of individual environmental variance in somatization was unrelated to HB; only 9% of the environmental variance was shared with HB. Our findings suggest that HB and somatization have a shared genetic etiology, but environmental factors that precipitate the development of HB and somatization may be largely independent of each other.
The present study aimed to estimate heritability of Hwabyung (HB) symptoms in adolescent and young adult twins in South Korea. The sample included 1,601 twins consisting of 143 pairs of monozygotic male (MZM), 67 pairs of dizygotic male (DZM), 295 pairs of monozygotic female (MZF), 114 pairs of dizygotic female (DZF), and 117 pairs of opposite-sex dizygotic (OSDZ) twins, plus 129 twins with non-participating co-twins (mean age = 19.1 ± 3.1 years; range: 12–29 years). An HB symptom questionnaire was administered to twins via a telephone interview. Consistent with the HB literature, the mean level of HB was significantly higher in females than in males. Maximum likelihood twin correlations for HB were 0.31 (95% CI [0.16, 0.45]) for MZM, 0.19 (95% CI [-0.05, 0.41]) for DZM, 0.50 (95% CI [0.41, 0.58]) for MZF, 0.28 (95% CI [0.11, 0.44]) for DZF, and 0.23 (95% CI [0.05, 0.40]) for OSDZ twins. These patterns of twin correlations suggested the presence of additive genetic influences on HB. Model-fitting analysis showed that additive genetic and individual-specific environmental influences on HB were 44% (95% CI [37, 51]) and 56% (95% CI [49, 63]), respectively. Shared environmental influences were not significant. These parameter estimates were not significantly different between the two sexes, and did not change significantly with age in the present sample, suggesting that genetic and environmental influences on HB in both sexes are stable across adolescence and young adulthood.
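The logic linking twin correlations to heritability can be seen in Falconer's back-of-envelope decomposition, which splits variance into additive genetic (a2), shared-environment (c2), and unique-environment (e2) components. This is only a rough approximation of the maximum-likelihood model fitting the study actually used, shown here for intuition; applied to the reported female correlations it lands close to the modeled 44% estimate:

```python
def falconer(r_mz: float, r_dz: float) -> dict:
    """Falconer's approximate ACE decomposition from MZ and DZ correlations."""
    a2 = 2 * (r_mz - r_dz)   # additive genetic variance
    c2 = 2 * r_dz - r_mz     # shared (family) environment
    e2 = 1 - r_mz            # unique environment, incl. measurement error
    return {"a2": a2, "c2": c2, "e2": e2}

# Female twin correlations reported above: MZF = 0.50, DZF = 0.28
est = falconer(0.50, 0.28)
print({k: round(v, 2) for k, v in est.items()})
```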
In this study the authors focus on reviewing imaging studies that used resting-state functional magnetic resonance imaging in individuals with a history of heroin use. This review compiled existing research addressing the effect of heroin use on decision making by reviewing available functional neuroimaging data. A systematic review of the literature was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist. Eligible articles were retrieved through a computer-based MEDLINE and PsycINFO search from 1960 to December 2015 using the major medical subject headings “heroin, fMRI” (all fields). Only English-language articles were included. Thirty-seven articles were initially included in the review. Sixteen were excluded because they did not meet the inclusion criteria. The results of the 21 articles that met all the inclusion criteria are presented. Based on the 21 studies included in the current review, there is evidence that heroin use may have a direct and damaging effect on certain brain functions and that these changes may be associated with impulsive and unhealthy decision making. From the review of these studies, the authors conclude that a longer duration of heroin use may be associated with more damaging effects on brain functions, and that these brain changes could last long after abstinence, which may increase the risk of relapse to heroin use. More research is needed to create a biomarker map for patients with heroin use disorder that can be used to guide and assess response to treatment.
A lack of understanding of the effects of single- and multiple-weed interference on soybean yield has led to inadequate weed management in Primorsky Krai, resulting in a much lower average yield than in neighboring regions. A 2 yr field experiment was conducted in a soybean field located in Bogatyrka (43.82°N, 131.6°E), Primorsky Krai, Russia, in 2013 and 2014 to investigate the effects of single- and multiple-weed interference caused by naturally established weeds on soybean yield and to model these effects. Aboveground dry weight was the trait most negatively affected by weed interference, followed by number of pods and seeds. Soybean yield under single-weed interference was best described by a rectangular hyperbolic model, showing that common ragweed and barnyardgrass were the most competitive weed species, followed by annual sowthistle, American sloughgrass, and common lambsquarters. In the case of multiple-weed interference, soybean yield loss was accurately described by a multivariate rectangular hyperbolic model, with total density equivalent as the independent variable. Parameter estimates indicated that weed-free soybean yields were similar in 2013 and 2014, estimated as 1.72 and 1.75 t ha−1, respectively, and that the competitiveness of each weed species was not significantly different between the two years. Economic thresholds for single-weed interference were 0.74, 0.66, 1.15, 1.23, and 1.45 plants m−2 for common ragweed, barnyardgrass, annual sowthistle, American sloughgrass, and common lambsquarters, respectively. The economic threshold for multiple-weed interference was 0.70 density equivalent m−2. These results, including the model, can thus be applied to a decision support system for weed management in soybean cultivation under single- and multiple-weed interference in Primorsky Krai and neighboring regions of Russia.
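The rectangular hyperbolic model referenced above relates percent yield loss YL to weed density D via YL = I·D / (1 + I·D/A), where I is the initial slope (% loss per weed at low density) and A the asymptotic maximum loss. A minimal sketch under assumed parameters; the I and A values below are hypothetical illustrations, not the paper's fitted estimates (only the 1.72 t/ha weed-free yield comes from the abstract):

```python
def yield_loss(density: float, i: float, a: float) -> float:
    """Percent yield loss at weed density `density` (plants per m^2).

    i: initial slope (% loss per weed at low density)
    a: asymptotic maximum yield loss (%)
    """
    return i * density / (1 + i * density / a)

def yield_t_ha(density: float, ywf: float, i: float, a: float) -> float:
    """Realized yield (t/ha) given the weed-free yield `ywf`."""
    return ywf * (1 - yield_loss(density, i, a) / 100)

# Hypothetical highly competitive weed: i = 30 %/plant, a = 80 % maximum loss,
# with the 2013 weed-free yield estimate of 1.72 t/ha
print(round(yield_t_ha(1.0, 1.72, 30, 80), 3))
```

An economic threshold then falls out of this curve as the density at which the value of the avoided yield loss equals the cost of control.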
Hyperlipidaemia is a major cause of atherosclerosis and related CVD and can be prevented with natural substances. Previously, we reported that a novel Bacillus-fermented green tea (FGT) exerts anti-obesity and hypolipidaemic effects. This study further investigated the hypotriglyceridaemic and anti-obesogenic effects of FGT and its underlying mechanisms. FGT effectively inhibited pancreatic lipase activity in vitro (IC50, 0·48 mg/ml) and ameliorated postprandial lipaemia in rats (26 % reduction with 500 mg/kg FGT). In hypertriglyceridaemic hamsters, FGT administration significantly reduced plasma TAG levels. In mice, FGT administration (500 mg/kg) for 2 weeks augmented energy expenditure by 22 % through the induction of plasma serotonin, a neurotransmitter that modulates energy expenditure and mRNA expressions of lipid metabolism genes in peripheral tissues. Analysis of the gut microbiota showed that FGT reduced the proportion of the phylum Firmicutes in hamsters, which could further contribute to its anti-obesity effects. Collectively, these data demonstrate that FGT decreases plasma TAG levels via multiple mechanisms including inhibition of pancreatic lipase, augmentation of energy expenditure, induction of serotonin secretion and alteration of gut microbiota. These results suggest that FGT may be a useful natural agent for preventing hypertriglyceridaemia and obesity.
A life-threatening cardiopulmonary resuscitation (CPR)-related injury can cause recurrent arrest after return of circulation. Such injuries are difficult to identify during resuscitation, and their contribution to failed resuscitation can be missed given the limitations of conventional CPR. Extracorporeal cardiopulmonary resuscitation (ECPR), increasingly being considered for selected patients with potentially reversible etiology of arrest, may identify previously occult CPR-related injuries by restoring arterial pressure and flow. Herein, we describe two cases of severe CPR-related injuries contributing to recurrent arrest. Each case had ECPR implemented within 60 minutes of the start of CPR. After the presumed cardiac etiology had been addressed with percutaneous coronary intervention, life-threatening cardiovascular injuries with recurrent arrest were noted, and resuscitative thoracotomy was performed under ECPR. One patient survived to hospital discharge.
ECPR may provide an opportunity to identify and correct severe resuscitation-related injuries causing recurrent arrest. Chest compression depth >6 cm, especially in older women, may contribute to these injuries.
Steel coils coated with a Zn–Mg alloy of high Mg content develop dark rust when exposed to an extremely limited amount of aqueous environment. To understand the nature of the dark rust and its formation mechanism, the steel was evaluated by an immersion test and a high temperature–humidity test, followed by critical evaluation with transmission electron microscopy for cross-sectional observation, field-emission scanning electron microscopy for surface morphology observation, and Auger electron spectroscopy and glow discharge spectroscopy for identification of chemical composition as a function of depth. The results indicate that the dark rust is formed by precipitation of a Mg-based corrosion product on the outermost surface when the steel is exposed to an aqueous environment at high temperature. This is due mainly to preferential dissolution of the Mg phases through galvanic action with the MgZn2 and Mg2Zn11 phases that compose the coating layer, and to the ready precipitation of Mg2+ ions as Mg(OH)2 in the limited volume of the condensed water film on the surface.
We report on the formation of highly flexible and transparent TiO2/Ag/ITO multilayer films deposited on polyethylene terephthalate substrates. The optical and electrical properties of the multilayer films were investigated as a function of oxide thickness. The transmission window gradually shifted toward lower energies with increasing oxide thickness. The TiO2 (40 nm)/Ag (18 nm)/ITO (40 nm) films gave a transmittance of 93.1% at 560 nm. The relationship between transmittance and oxide thickness was simulated using the scattering matrix method to understand the high transmittance. As the oxide thickness increased from 20 to 50 nm, the carrier concentration gradually decreased from 1.08 × 1022 to 6.66 × 1021 cm−3, while the sheet resistance varied from 5.8 to 6.1 Ω/sq. Haacke's figure of merit reached a maximum at 40 nm and then decreased with increasing oxide thickness. The change in resistance of the 60 nm-thick ITO single film increased rapidly with increasing bending cycles, while that of the TiO2/Ag/ITO (40 nm/18 nm/40 nm) film remained virtually unchanged during the bending test.
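Haacke's figure of merit, cited above, is Φ = T^10 / R_s, which rewards transmittance heavily relative to sheet resistance. A minimal sketch; the pairing of the 93.1% transmittance with a 6.0 Ω/sq sheet resistance below is an assumption for illustration (the abstract reports R_s only as a 5.8–6.1 Ω/sq range across thicknesses):

```python
def haacke_fom(transmittance: float, sheet_resistance_ohm_sq: float) -> float:
    """Haacke's figure of merit (ohm^-1) for a transparent conductor."""
    return transmittance ** 10 / sheet_resistance_ohm_sq

# T = 0.931 at 560 nm (reported); R_s = 6.0 ohm/sq assumed within the
# reported 5.8-6.1 ohm/sq range
fom = haacke_fom(0.931, 6.0)
print(f"{fom:.4f} ohm^-1")
```

The tenth power on T explains why the figure of merit peaks at an intermediate oxide thickness: small transmittance gains outweigh small resistance penalties.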
To determine the influence of early pain relief for patients with suspected appendicitis on the diagnostic performance of surgical residents.
A prospective randomized, double-blind, placebo-controlled trial was conducted for patients with suspected appendicitis. The patients were randomized to receive placebo (normal saline intravenous [IV]) infusions over 5 minutes or the study drug (morphine 5 mg IV). All of the clinical evaluations by surgical residents were performed 30 minutes after administration of the study drug or placebo. After obtaining the clinical probability of appendicitis, as determined by the surgical residents, abdominal computed tomography was performed. The primary objective was to compare the influence of IV morphine on the ability of surgical residents to diagnose appendicitis.
A total of 213 patients with suspected appendicitis were enrolled. Of these, 107 patients received morphine and 106 received placebo saline. The negative appendectomy percentages in the two groups were similar (3.8% in the placebo group and 3.2% in the pain control group, p=0.62), as were the perforation rates (18.9% in the placebo group and 14.3% in the pain control group, p=0.75). Receiver operating characteristic analysis revealed that the overall diagnostic accuracy in the two groups was similar (area under the curve 0.63 for the placebo group v. 0.61 for the pain control group, p=0.81).
Early pain control in patients with suspected appendicitis does not affect the diagnostic performance of surgical residents.
This study examined changes in health-related quality of life (HRQoL) and quality of care (QoC) as perceived by terminally ill cancer patients and a stratified set of HRQoL or QoC factors that are most likely to influence survival at the end of life (EoL).
We administered questionnaires to 619 consecutive patients immediately after they were diagnosed with terminal cancer by physicians at 11 university hospitals and at the National Cancer Center in Korea. Subjects were followed up over 161.2 person-years until their deaths. We measured HRQoL using the core 30-item European Organization for Research and Treatment of Cancer Quality of Life Questionnaire, and QoC using the Quality Care Questionnaire–End of Life (QCQ–EoL). We evaluated changes in HRQoL and QoC issues during the first three months after enrollment, performing sensitivity analyses using data generated via four methods (complete case analysis, available case analysis, last observation carried forward, and multiple imputation).
Emotional and cognitive functioning decreased significantly over time, while dyspnea, constipation, and pain increased significantly. Dignity-conserving care, care by healthcare professionals, family relationships, and QCQ–EoL total score decreased significantly. Global QoL, appetite loss, and Eastern Cooperative Oncology Group Performance Status (ECOG–PS) scores were significantly associated with survival.
Significance of results:
Future standardization of palliative care should focus on assessment of these deteriorating aspects of quality. Accurate estimates of the length of life remaining for terminally ill cancer patients, informed by such EoL factors as global QoL, appetite loss, and ECOG–PS, are needed to help patients experience a dignified and comfortable death.
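Of the four missing-data strategies named in the methods above, last observation carried forward (LOCF) is the simplest: a missing assessment is filled with the patient's most recent observed value. A minimal stdlib sketch; the score values are hypothetical:

```python
def locf(series):
    """Fill None entries with the most recent observed value (LOCF)."""
    filled, last = [], None
    for v in series:
        if v is not None:
            last = v
        filled.append(last)
    return filled

# Hypothetical monthly QoL scores with a missed final assessment
print(locf([66.7, 58.3, None]))  # -> [66.7, 58.3, 58.3]
```

LOCF assumes the outcome is stable after dropout, which tends to understate decline at the end of life; that bias is exactly why the study cross-checks it against multiple imputation and the case analyses.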