Objectives: Central-line–associated bloodstream infection (CLABSI) has been the leading cause of healthcare-associated infections (HAIs) in the intensive care unit (ICU) setting. Previous studies have shown that a care bundle is effective in reducing CLABSI rates; however, data on the long-term sustainability and cost savings of bundled care are limited. Methods: From January 2011 to December 2020, prospective surveillance was performed to monitor CLABSI at a university hospital in northern Taiwan. To reduce the CLABSI rate, a hospital-wide bundled care program for CLABSI prevention was implemented in 2013. We evaluated the long-term effect of the care bundle on CLABSI incidence and length of stay in the ICU. Results: During the study period, the overall CLABSI incidence decreased from 8.22 per 1,000 catheter days before the care bundle was implemented to 6.33 per 1,000 catheter days in 2020 (P for trend < .01). The most common pathogens causing CLABSI were gut organisms (1,420 of 2,363, 60.1%), followed by environmental organisms (734 of 2,363, 31.1%) and skin organisms (177 of 2,363, 7.5%). The decreasing trend was statistically significant for CLABSI caused by skin organisms (P for trend < .01), but not for CLABSI caused by environmental organisms (P for trend = .86) or gut organisms (P for trend = .06). In the multivariable analysis, implementation of this care bundle was independently associated with a decrease in the CLABSI rate (RR, 0.77; 95% CI, 0.66–0.88). Compared with patients without CLABSI, patients with CLABSI had a longer average ICU length of stay (27 vs 17 days). Conclusions: A sustainable reduction in the incidence of CLABSI caused by common commensals could be achieved through a cost-saving bundled care program.
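The incidence figures above are exposure-adjusted rates per 1,000 catheter days. A minimal sketch of the calculation (the infection and catheter-day counts below are hypothetical, chosen only to reproduce the pre-bundle figure):

```python
def clabsi_rate(infections: int, catheter_days: int) -> float:
    """CLABSI incidence per 1,000 catheter (central-line) days."""
    return infections / catheter_days * 1000

# Hypothetical counts that reproduce the pre-bundle rate of 8.22
print(round(clabsi_rate(822, 100_000), 2))  # 8.22
```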
One of the most common harmful mites in edible fungi is Histiostoma feroniarum Dufour (Acaridida: Histiostomatidae), a fungivorous astigmatid mite that feeds on hyphae and fruiting bodies, thereby transmitting pathogens. This study examined the effects of seven constant temperatures and 10 mushroom types on the growth and development of H. feroniarum, as well as its host preference. Developmental time for the total immature stages was significantly affected by mushroom species, ranging from 4.3 ± 0.4 days (reared on Pleurotus eryngii var. tuoliensis Mou at 28°C) to 17.1 ± 2.3 days (reared on Auricularia polytricha Sacc. at 19°C). Temperature was a major factor in the formation of facultative heteromorphic deutonymphs (hypopi): the mite entered the hypopus stage when the temperature dropped to 16°C or rose above 31°C. Growth and development were significantly influenced by both mushroom species and variety. Moreover, the mite preferred to feed on the ‘Wuxiang No. 1’ strain of Lentinula edodes (Berk.) Pegler and the ‘Gaowenxiu’ strain of P. pulmonarius (Fr.) Quél., developing faster on these strains than on others. These results quantify the effects of host type and temperature on the growth and development rates of this fungivorous astigmatid mite and provide a reference for applying mushroom cultivar resistance to biological pest control.
Coastal eutrophication and hypoxia remain a persistent environmental crisis despite great efforts to reduce nutrient loading and mitigate the associated environmental damage. Symptoms of this crisis have spread rapidly, reaching developing countries in Asia and emerging in South America and Africa. The pace of these changes and their underlying drivers remain unclear. To address this gap, we review the current status and mechanisms of eutrophication and hypoxia in the global coastal ocean, and examine the trajectories of change over 40 years or longer in six model coastal systems with varying socio-economic development statuses and different levels and histories of eutrophication. Although these coastal systems share common features of eutrophication, site-specific characteristics are also substantial, depending on the regional environmental setting and the level of socio-economic development, along with policy implementation and management. Nevertheless, ecosystem recovery generally requires greater reductions in pressures than those that initiated degradation, and recovery to past norms becomes less feasible the longer anthropogenic pressures on the ecosystem persist. While the qualitative causality between drivers and consequences is well established, quantitative attribution of these drivers to eutrophication and hypoxia remains difficult, especially for socio-economic drivers, because changes in coastal ecosystems are subject to multiple influences and the cause–effect relationships are often non-linear. Such relationships are further complicated by climate change, which has been accelerating over the past few decades. We identify the knowledge gaps that limit our quantitative and mechanistic understanding of the human–coastal ocean nexus, which is essential for science-based policy making.
Recognizing lessons from past management practices, we advocate for a better, more efficient indexing system for coastal eutrophication and an advanced regional earth system modeling framework with optimal modules of human dimensions to facilitate the development and evaluation of effective policy and restoration actions.
Slowed information processing speed (IPS) is the core contributor to cognitive impairment in patients with late-life depression (LLD). The hippocampus is an important link between depression and dementia, and it may be involved in IPS slowing in LLD. However, the relationship between a slowed IPS and the dynamic activity and connectivity of hippocampal subregions in patients with LLD remains unclear.
One hundred thirty-four patients with LLD and 89 healthy controls were recruited. Sliding-window analysis was used to assess whole-brain dynamic functional connectivity (dFC), dynamic fractional amplitude of low-frequency fluctuations (dfALFF) and dynamic regional homogeneity (dReHo) for each hippocampal subregion seed.
Cognitive impairment (global cognition, verbal memory, language, visual–spatial skill, executive function and working memory) in patients with LLD was mediated by their slowed IPS. Compared with the controls, patients with LLD exhibited decreased dFC between various hippocampal subregions and the frontal cortex and decreased dReHo in the left rostral hippocampus. Additionally, most of the dFCs were negatively associated with the severity of depressive symptoms and positively associated with various domains of cognitive function. Moreover, the dFC between the left rostral hippocampus and the middle frontal gyrus exhibited a partial mediation effect on the relationship between depressive symptom scores and IPS.
Patients with LLD exhibited decreased dFC between the hippocampus and frontal cortex, and the decreased dFC between the left rostral hippocampus and right middle frontal gyrus was involved in the underlying neural substrate of the slowed IPS.
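Sliding-window dFC, as used in the analysis above, amounts to correlating a seed time series (e.g. a hippocampal subregion) with a target time series (e.g. a frontal region) within short overlapping windows. A minimal sketch on synthetic data (the window length, step size and signals are illustrative assumptions, not the study's parameters):

```python
import numpy as np

def sliding_window_dfc(seed, target, win=30, step=1):
    """Pearson correlation between two time series in overlapping windows."""
    corrs = []
    for start in range(0, len(seed) - win + 1, step):
        window_seed = seed[start:start + win]
        window_target = target[start:start + win]
        corrs.append(np.corrcoef(window_seed, window_target)[0, 1])
    return np.array(corrs)

rng = np.random.default_rng(0)
seed = rng.standard_normal(200)                 # stand-in seed signal
target = seed + 0.8 * rng.standard_normal(200)  # correlated target signal
dfc = sliding_window_dfc(seed, target)
print(len(dfc))             # 171 windows (200 - 30 + 1)
print(float(dfc.std()))     # variability of windowed connectivity
```

The standard deviation of the windowed correlations is one common summary of how "dynamic" the connectivity is.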
Although attentional bias modification training (ABM) and cognitive behavioural therapy (CBT) are two effective methods for decreasing the symptoms of generalized anxiety disorder (GAD), to date no randomized controlled trial has evaluated the effectiveness of an intervention combining internet-based cognitive behavioural therapy (ICBT) and ABM for adults with GAD.
This study aimed to investigate the effectiveness of an intervention combining ICBT and ABM for adults with GAD.
Sixty-three participants diagnosed with GAD were randomly assigned to the treatment group (ICBT with ABM; 31 participants) or the control group (ICBT with ABM placebo; 32 participants), and received 8 weeks of treatment with three evaluations. The CBT, ABM and ABM-placebo training were conducted via the internet. The evaluations were conducted at baseline, after the 8-week treatment and at 1-month follow-up.
Both the treatment and control groups reported significantly reduced anxiety symptoms and attentional bias, with no clear superiority of either intervention. However, the treatment group showed a greater reduction in negative automatic thoughts than the control group after treatment and at 1-month follow-up (η2 = 0.123).
The results suggest that, although the two interventions did not differ in overall therapeutic efficacy, the intervention combining ICBT and ABM is superior to the intervention combining ICBT and ABM-placebo in reducing negative automatic thoughts. ABM may be a useful augmentation of ICBT for reducing anxiety symptoms.
The relationship between a diet low in fibre and mortality has not been evaluated. This study aims to assess the burden of non-communicable chronic diseases (NCD) attributable to a diet low in fibre globally from 1990 to 2019.
All data were from the Global Burden of Disease (GBD) Study 2019, in which mortality, disability-adjusted life-years (DALY) and years lived with disability (YLD) were estimated with Bayesian geospatial regression using data at the global, regional and country levels acquired from an extensive systematic review.
All data sourced from the GBD Study 2019.
All age groups for both sexes.
The age-standardised mortality rates (ASMR) declined in most GBD regions; however, in Southern sub-Saharan Africa, the ASMR increased from 4·07 (95 % uncertainty interval (UI) (2·08, 6·34)) to 4·60 (95 % UI (2·59, 6·90)), and in Central sub-Saharan Africa, the ASMR increased from 7·46 (95 % UI (3·64, 11·90)) to 9·34 (95 % UI (4·69, 15·25)). Uptrends were observed in the age-standardised YLD rates attributable to a diet low in fibre in a number of GBD regions. The burden caused by diabetes mellitus increased in Central Asia, Southern sub-Saharan Africa and Eastern Europe.
The burdens of disease attributable to a diet low in fibre in Southern sub-Saharan Africa and Central sub-Saharan Africa and the age-standardised YLD rates in a number of GBD regions increased from 1990 to 2019. Therefore, greater efforts are needed to reduce the disease burden caused by a diet low in fibre.
This paper examines whether changes in US presidential administration and central bank turnover during the period 1976–2016 caused regime shifts in Taylor rule deviations. Using a dynamic stochastic general equilibrium model to construct the welfare-maximizing policy rule and deviations from the optimal rule, we find evidence that politics indeed play a key role in explaining these deviations. In addition to politics, unemployment rates and the interest rate spread significantly account for regime shifts in Taylor rule deviations.
Maternal gestational weight gain (GWG) is an important determinant of infant birth weight, and having adequate total GWG has been widely recommended. However, the association of timing of GWG with birth weight remains controversial. We aimed to evaluate this association, especially among women with adequate total GWG. In a prospective cohort study, pregnant women’s weight was routinely measured during pregnancy, and their GWG was calculated for the ten intervals: the first 13, 14–18, 19–23, 24–28, 29–30, 31–32, 33–34, 35–36, 37–38 and 39–40 weeks. Birth weight was measured, and small-for-gestational-age (SGA) and large-for-gestational-age were assessed. Generalized linear and Poisson models were used to evaluate the associations of GWG with birth weight and its outcomes after multivariate adjustment, respectively. Of the 5049 women, increased GWG in the first 30 weeks was associated with increased birth weight for male infants, and increased GWG in the first 28 weeks was associated with increased birth weight for females. Among 1713 women with adequate total GWG, increased GWG percent between 14 and 23 weeks was associated with increased birth weight. Moreover, inadequate GWG between 14 and 23 weeks, compared with the adequate GWG, was associated with an increased risk of SGA (43 (13·7 %) v. 42 (7·2 %); relative risk 1·83, 95 % CI 1·21, 2·76). Timing of GWG may influence infant birth weight differentially, and women with inadequate GWG between 14 and 23 weeks may be at higher risk of delivering SGA infants, despite having adequate total GWG.
During the first postnatal week in rodents, cholinergic retinal waves initiate in starburst amacrine cells (SACs), propagating to retinal ganglion cells (RGCs) and visual centers, and are essential for visual circuit refinement. By modulating exocytosis in SACs, dynamic changes in protein kinase A (PKA) activity can regulate the spatiotemporal patterns of cholinergic waves. Previously, cysteine string protein-α (CSPα) was found to interact with the core exocytotic machinery via PKA-mediated phosphorylation at serine 10 (S10). However, whether PKA-mediated CSPα phosphorylation regulates cholinergic waves via SACs remains unknown. Here, we examined how CSPα phosphorylation in SACs regulates cholinergic waves. First, we identified that CSPα1 is the major isoform in developing rat SACs and the inner plexiform layer during the first postnatal week. Using SAC-specific expression, we found that the CSPα1 PKA-phosphodeficient mutant (CSP-S10A) decreased wave frequency but did not alter the wave spatial correlation compared with control, wild-type CSPα1 (CSP-WT), or two PKA-phosphomimetic mutants (CSP-S10D and CSP-S10E). These results suggest that CSPα-S10 phosphodeficiency in SACs dampens the frequency of cholinergic waves. Moreover, the level of phospho-PKA substrates was significantly reduced in SACs overexpressing CSP-S10A compared with control or CSP-WT, suggesting that the dampened wave frequency is correlated with decreased PKA activity. Further, compared with control or CSP-WT, CSP-S10A in SACs reduced the periodicity of wave-associated postsynaptic currents (PSCs) in neighboring RGCs, suggesting that these RGCs received weakened synaptic inputs from SACs overexpressing CSP-S10A. Finally, CSP-S10A in SACs decreased the PSC amplitude and the slope to peak PSC compared with control or CSP-WT, suggesting that CSPα-S10 phosphodeficiency may dampen the speed of SAC-RGC transmission.
Thus, via PKA-mediated phosphorylation, CSPα in SACs may facilitate the SAC-RGC transmission, contributing to the robust frequency of cholinergic waves.
To evaluate the association of water intake and hydration status with nephrolithiasis risk at the population level.
A cross-sectional study in which daily total plain water intake and total fluid intake were estimated together with blood osmolality, urine creatinine, urine osmolality, urine flow rate (UFR), free water clearance (FWC) and the urine/blood osmolality ratio (Uosm:Bosm). The associations of fluid intake and hydration markers with nephrolithiasis were evaluated using multivariable logistic regression.
General US population.
A total of 8195 adults aged 20 years or older from the National Health and Nutritional Examination Survey 2009–2012 cycles.
The population medians (interquartile ranges, IQR) for daily total plain water intake and total fluid intake were 807 (336–1481) and 2761 (2107–3577) ml/d, respectively. The adjusted OR of nephrolithiasis for each IQR increase in total plain water intake and total fluid intake were 0·92 (95 % CI 0·79, 1·06) and 0·84 (95 % CI 0·72, 0·97), respectively. The corresponding OR of nephrolithiasis for UFR, blood osmolality, Uosm:Bosm and urine creatinine were 0·87 (95 % CI 0·76, 0·99), 1·18 (95 % CI 1·06, 1·32), 1·38 (95 % CI 1·17, 1·63) and 1·27 (95 % CI 1·11, 1·45), respectively. A linear protective relationship of fluid intake, UFR and FWC with nephrolithiasis risk was observed. Similarly, positive dose–response associations of nephrolithiasis risk with markers of insufficient hydration were identified. A daily water intake of >2500 ml and a urine output of about 2 l/d were associated with a lower prevalence of nephrolithiasis.
This study verified the beneficial role of general water intake recommendations in nephrolithiasis prevention in the general US population.
A proportion of patients with bipolar disorder (BD) manifests with only unipolar mania (UM). This study examined relevant clinical features and psychosocial characteristics in UM compared with depressive-manic (D-M) subgroups. Moreover, comorbidity patterns of physical conditions and psychiatric disorders were evaluated between the UM and D-M groups.
This clinical retrospective study (N = 1015) analyzed cases with an average of 10 years of illness duration and a nationwide population-based cohort (N = 8343) followed up for 10 years in the Taiwanese population. UM was defined as patients who did not experience depressive episodes and were not prescribed adequate antidepressant treatment during the disease course of BD. Logistic regression models adjusted for relevant covariates were used to evaluate the characteristics and lifetime comorbidities in the two groups.
The proportion of UM ranged from 12.91% to 14.87% across the two datasets. Compared with the D-M group, the UM group had more psychotic symptoms, fewer suicidal behaviors, a higher proportion of morningness chronotype, better sleep quality, higher extraversion, lower neuroticism and lower harm avoidance. Substantially different lifetime comorbidity patterns were observed between the two groups.
Patients with UM exhibited distinct clinical and psychosocial features compared with patients with the D-M subtype. In particular, a higher risk of comorbid cardiovascular diseases and anxiety disorders is apparent in patients with D-M. Further studies are warranted to investigate the underlying mechanisms for diverse presentations in subgroups of BDs.
Rheumatoid arthritis (RA) is a heterogeneous autoimmune disorder that leads to severe joint deformities, negatively affecting the patient's quality of life. Extracellular vesicles (EVs), which include exosomes and ectosomes, act as intercellular communication mediators in several physiological and pathological processes in various diseases including RA. In contrast, EVs secreted by mesenchymal stem cells perform an immunomodulatory function and stimulate cartilage repair, showing promising therapeutic results in animal models of RA. EVs from other sources, including dendritic cells, neutrophils and myeloid-derived suppressor cells, also influence the biological function of immune and joint cells. This review describes the role of EVs in the pathogenesis of RA and presents evidence supporting future studies on the therapeutic potential of EVs from different sources. This information will contribute to a better understanding of RA development, as well as a starting point for exploring cell-free-based therapies for RA.
Listeriosis is a rare but serious foodborne disease caused by Listeria monocytogenes. This matched case–control study (1:1 ratio) aimed to identify the risk factors associated with food consumption and food-handling habits for the occurrence of sporadic listeriosis in Beijing, China. Cases were defined as patients from whom Listeria was isolated, in addition to the presence of symptoms, including fever, bacteraemia, sepsis and other clinical manifestations corresponding to listeriosis, which were reported via the Beijing Foodborne Disease Surveillance System. Basic patient information and possible risk factors associated with food consumption and food-handling habits were collected through face-to-face interviews. One hundred and six cases were enrolled from 1 January 2018 to 31 December 2020, including 52 perinatal cases and 54 non-perinatal cases. In the non-perinatal group, the consumption of Chinese cold dishes increased the risk of infection by 3.43-fold (95% confidence interval 1.27–9.25, χ2 = 5.92, P = 0.02). In the perinatal group, the risk of infection reduced by 95.2% when raw and cooked foods were well-separated (χ2 = 5.11, P = 0.02). These findings provide important scientific evidence for preventing infection by L. monocytogenes and improving the dissemination of advice regarding food safety for vulnerable populations.
Previous analyses of grey and white matter volumes have reported that schizophrenia is associated with structural changes. Deep learning is a data-driven approach that can capture highly compact hierarchical non-linear relationships among high-dimensional features, and therefore can facilitate the development of clinical tools for making a more accurate and earlier diagnosis of schizophrenia.
To identify consistent grey matter abnormalities in patients with schizophrenia, 662 people with schizophrenia and 613 healthy controls were recruited from eight centres across China, and the data from these independent sites were used to validate deep-learning classifiers.
We used a prospective image-based meta-analysis of whole-brain voxel-based morphometry. We also automatically differentiated patients with schizophrenia from healthy controls using combined grey matter, white matter and cerebrospinal fluid volumetric features, incorporated a deep neural network approach on an individual basis, and tested the generalisability of the classification models using independent validation sites.
We found that statistically reliable schizophrenia-related grey matter abnormalities primarily occurred in regions that included the superior temporal gyrus extending to the temporal pole, insular cortex, orbital and middle frontal cortices, middle cingulum and thalamus. Evaluated using leave-one-site-out cross-validation, the performance of the classification of schizophrenia achieved by our findings from eight independent research sites were: accuracy, 77.19–85.74%; sensitivity, 75.31–89.29% and area under the receiver operating characteristic curve, 0.797–0.909.
These results suggest that, by using deep-learning techniques, multidimensional neuroanatomical changes in schizophrenia are capable of robustly discriminating patients with schizophrenia from healthy controls, findings which could facilitate clinical diagnosis and treatment in schizophrenia.
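Leave-one-site-out cross-validation, as used above, trains on all sites but one and evaluates on the held-out site, yielding one performance estimate per site. A minimal sketch with synthetic features and a logistic-regression stand-in for the deep network (scikit-learn is assumed; the subject counts, feature dimension and site labels are illustrative, not the study's):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(42)
n_subjects, n_features, n_sites = 400, 20, 8
X = rng.standard_normal((n_subjects, n_features))  # stand-in volumetric features
y = (X[:, 0] + 0.5 * rng.standard_normal(n_subjects) > 0).astype(int)
site = np.arange(n_subjects) % n_sites             # site label per subject

accuracies = []
for train, test in LeaveOneGroupOut().split(X, y, groups=site):
    model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    accuracies.append(model.score(X[test], y[test]))

print(len(accuracies))  # one accuracy per held-out site: 8
```

The spread of per-site scores (here, accuracies; the study also reports sensitivity and AUC) indicates how well the classifier generalises across acquisition sites.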
To compare the prevalence of overweight or obesity (ow/ob) based on WHO BMI cut-off points, International Obesity Task Force (IOTF) cut-off points and Chinese BMI criteria, and to examine potential associated factors among preschool children in Hunan Province.
A cross-sectional survey including anthropometric measurements and questionnaires covering children’s characteristics, caregivers’ socio-demographic characteristics and maternal characteristics. χ2 tests and univariate and multivariate binary logistic regression were performed to evaluate possible factors associated with ow/ob.
Hunan, China, from September to October 2019.
In total, 7664 children 2 to 6 years of age.
According to Chinese BMI criteria, about 1 in 7–8 children aged 2–6 years had ow/ob in Hunan, China. The overall estimated prevalence of ow/ob among 2- to 6-year-old children was significantly higher when based on the Chinese BMI criteria compared with the WHO BMI cut-off points and IOTF cut-off points. According to Chinese BMI criteria, ow/ob was associated with residing in urban areas, older age, male sex, eating snacking food more frequently, macrosomia delivery, caesarean birth, heavier maternal prepregnancy weight and pre-delivery weight.
The prevalence of ow/ob in preschool children in Hunan Province remains high. More ow/ob children could be screened out according to Chinese BMI cut-offs compared with WHO and IOTF BMI criteria. In the future, targeted intervention studies with matched controls will be needed to assess the long-term effects of intervention measures to provide more information for childhood obesity prevention and treatment.
The Brain Health Test-7 (BHT-7) is a revised version of the original BHT, containing more tests of frontal lobe function. It was developed with the aim of identifying patients with mild cognitive impairment (MCI) and early dementia.
Here we report the validity of the BHT-7 versus the Mini-Mental State Examination (MMSE) and Montreal Cognitive Assessment (MoCA) in different psychiatry or neurology clinics.
Patients with memory complaints were recruited from the outpatient clinics of psychiatry or neurology in 3 different kinds of hospitals. All patients underwent evaluation with the BHT-7, MMSE, MoCA, and Clinical Dementia Rating (CDR). The clinical diagnosis (normal, MCI, dementia) was made by consensus meeting, taking into account all available data.
Demographic data and the scores of the MMSE, MoCA, and BHT-7 between groups were compared. Logistic regression was adopted for analysis of optimal cutoff values, sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), receiver operating characteristic (ROC) curve, and the area under the ROC curve (AUC).
We enrolled a total of 1090 subjects (normal 402, MCI 317, dementia 371); of them, 705 (64.7%) were female. There was a statistically significant difference in age, years of education, and the 3 cognitive test scores among the 3 groups.
Compared with the MMSE and MoCA, the BHT-7 performed slightly better in differentiating MCI or dementia from the normal controls (Table 1). For the BHT-7, the cutoff point was 17 between normal and MCI, and 14 between normal and dementia. These cutoff points for the BHT-7 were consistent across the 3 different clinical settings, but those for the MMSE and MoCA were not. The testing time for the BHT-7 was about 5-7 minutes, shorter than that of the MMSE and MoCA.
Compared with the MMSE and MoCA, the BHT-7 showed slightly better performance in differentiating normal subjects from those with MCI or dementia. The testing time for the BHT-7 was shorter, and its cutoff points were consistent across different outpatient clinic settings. These results support the BHT-7 as a useful cognitive screening tool for MCI or early dementia in various hospital settings.
Table 1. Comparisons of the performance of the BHT-7, MMSE and MoCA
The Nutritional Risk Screening index is a standard tool to assess nutritional risk, but epidemiological data are scarce on the controlling nutritional status (CONUT) score as a prognostic marker in acute haemorrhagic stroke (AHS). We aimed to explore whether the CONUT score can predict the 3-month functional outcome in AHS. In total, 349 Chinese patients with incident AHS were consecutively recruited, and their malnutrition risks were determined using a high CONUT score of ≥ 2. The cohort was divided into high-CONUT (≥ 2) and low-CONUT (< 2) groups, and the primary outcome was a poor functional prognosis, defined as a modified Rankin Scale (mRS) score of ≥ 3 at 3 months post-discharge. Odds ratios (OR) with 95 % confidence intervals (CI) for a poor functional prognosis at post-discharge were estimated using logistic analysis with additional adjustment for unbalanced variables between the high-CONUT and low-CONUT groups. A total of 328 patients (60·38 ± 12·83 years; 66·77 % male) completed the mRS assessment at 3 months post-discharge, with 172 patients at malnutrition risk at admission and 104 patients with a poor prognosis. Total cholesterol levels and total lymphocyte counts were significantly lower in high-CONUT than in low-CONUT patients (P = 0·012 and < 0·001, respectively). At 3 months post-discharge, the risk of a poor outcome was greater in high-CONUT than in low-CONUT patients at admission (OR: 2·32, 95 % CI: 1·28, 4·17). High CONUT scores independently predict a poor 3-month prognosis in AHS, which helps to identify those who need additional nutritional management.
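The reported OR is covariate-adjusted, but the underlying crude calculation from a 2×2 outcome table is straightforward. A minimal sketch (the cell counts below are hypothetical — a plausible split of the 172 high-CONUT and 156 low-CONUT patients — and will not reproduce the adjusted OR of 2·32):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a 95% CI (Woolf/log method) from a 2x2 table:
    a = exposed/poor outcome,   b = exposed/good outcome,
    c = unexposed/poor outcome, d = unexposed/good outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: 70/172 high-CONUT vs 34/156 low-CONUT with poor outcome
or_, lo, hi = odds_ratio_ci(70, 102, 34, 122)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```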
The staining procedure is critical for investigating the intra- and extracellular ultrastructure of microorganisms by transmission electron microscopy (TEM). Here, we propose a new ultra-low lead staining (ULLS) technique for preparing ultrathin sections for TEM analysis. Sections of Enterobacter sp. (bacteria), Aspergillus niger (filamentous fungi), Rhodotorula mucilaginosa (fungi), and Chlamydomonas reinhardtii (microalgae) were tested. Compared with sections prepared by the typical double-staining technique, ULLS-based sections showed evident advantages: (i) the staining process required only the addition of Pb(NO3)2; (ii) the Pb level during incubation was as low as 1 mg/L, which has negligible toxicity to most microbial cells; (iii) the Pb cations were added during microbial culture, avoiding the complicated sample preparation required in typical double staining. Taking C. reinhardtii as an example, the ULLS technique allowed fine investigation of microbial ultrastructure, e.g., starch granules, mitochondria, the Golgi apparatus, vacuoles, and vesicles. Meanwhile, physiological processes of the cells such as cell lysis and exocytosis were successfully captured with relatively high contrast. This study thus demonstrates a promising route to preparing high-quality ultrathin sections of microbial cells by the ULLS technique.
The present study evaluated whether fat mass assessment using the triceps skinfold (TSF) thickness provides additional prognostic value to the Global Leadership Initiative on Malnutrition (GLIM) framework in patients with lung cancer (LC). We performed an observational cohort study including 2672 LC patients in China. Comprehensive demographic, disease and nutritional characteristics were collected. Malnutrition was retrospectively defined using the GLIM criteria, and optimal stratification was used to determine the best thresholds for the TSF. The associations of malnutrition and TSF categories with survival were estimated independently and jointly by calculating multivariable-adjusted hazard ratios (HR). Malnutrition was identified in 808 (30·2 %) patients, and the best TSF thresholds were 9·5 mm in men and 12 mm in women. Accordingly, 496 (18·6 %) patients were identified as having a low TSF. Patients with concurrent malnutrition and a low TSF had a 54 % (HR = 1·54, 95 % CI = 1·25, 1·88) greater death hazard compared with well-nourished individuals, which was also greater compared with malnourished patients with a normal TSF (HR = 1·23, 95 % CI = 1·06, 1·43) or malnourished patients without TSF assessment (HR = 1·31, 95 % CI = 1·14, 1·50). These associations were concentrated among those patients with adequate muscle mass (as indicated by the calf circumference). Additional fat mass assessment using the TSF enhances the prognostic value of the GLIM criteria. Using the population-derived thresholds for the TSF may provide significant prognostic value when used in combination with the GLIM criteria to guide strategies to optimise the long-term outcomes in patients with LC.