Sichuan cuisine was previously adapted for the Chinese Heart-Healthy Diet (CHH) trial to verify its antihypertensive effect, but whether the modified Sichuan diet lessens cardiovascular disease (CVD) risk has not been fully explored. We aimed to estimate the effects of the Sichuan version of the CHH diet (CHH diet-SC) on the 10-year risk of CVD and on vascular age. A single-blinded randomised controlled feeding trial was conducted, and the general CVD prediction model was applied to both the intention-to-treat and per-protocol sets. After a 7-d run-in period, fifty-three participants with pre- and grade I hypertension from local communities were randomised to receive either the CHH diet-SC (n 27) or a control diet (n 26) for 4 weeks. Mean absolute and relative estimated CVD risks were reduced by 4·5 % and 27·9 % in the CHH diet-SC group, and the between-group relative risk reduction was 19·5 % (P < 0·001) in linear mixed-effects models. Sensitivity analyses across datasets and models showed consistent results, and pre-specified factors were not associated with the intervention effects. The vascular age of the CHH diet-SC group was theoretically 4·4 years younger than that of the control group after the intervention. Compared with a typical diet, adopting the CHH diet-SC for 1 month significantly reduced the 10-year CVD risk and vascular age among local adults with mild hypertension.
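The risk-reduction figures above follow the standard definitions of absolute and relative risk reduction; a minimal sketch with purely hypothetical risk estimates (not the trial's data) illustrates the arithmetic:

```python
# Hypothetical 10-year CVD risk estimates (fractions), not the trial's data.
baseline_risk = 0.150   # assumed estimated risk before intervention
followup_risk = 0.105   # assumed estimated risk after intervention

# Absolute risk reduction: difference in estimated risk.
arr = baseline_risk - followup_risk

# Relative risk reduction: the change expressed as a share of baseline risk.
rrr = arr / baseline_risk

print(f"ARR = {arr:.1%}, RRR = {rrr:.1%}")  # → ARR = 4.5%, RRR = 30.0%
```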
Galls provide shelter for gall inducers, guarding them against their natural enemies. Previous research has illuminated the interactions between galls, gall inducers, and their corresponding parasitoids within various caltrop plants. However, less is known about these relationships within Nitraria sibirica, particularly regarding the efficacy of parasitism. Therefore, this study aimed to identify the morphometric relationships among the swollen galls, gall inducers, and their parasitoids. Two species of gall inducers and three species of parasitoids were obtained from the swollen galls of N. sibirica. Correlations among the parasitization indexes, the lifespan of gall inhabitants, and temperature, as well as the morphometric relationships between the galls and their inhabitants, were analyzed. The dominant gall inducer identified was Contarinia sp. (Diptera: Cecidomyiidae). Furthermore, three solitary parasitoids were observed to attack Contarinia sp. in the swollen galls, with only Eupelmus gelechiphagus acting as an idiobiont ectoparasitoid. The dominant parasitoids were Platygaster sp. and Cheiloneurus elegans at sites 1 and 2, respectively, with Platygaster sp. displaying greater abundance than C. elegans in the swollen galls. The lifespan of the gall inhabitants shortened gradually as the temperature increased. Moreover, the number of gall chambers at which fitness was maximized ranged from two to four per swollen gall, which can be considered the optimal population density for the gall inducer Contarinia sp. Morphometric analysis revealed a strong linear correlation between gall size and chamber number or the number of gall inhabitants, as well as a weak correlation between gall size and the body size of the primary inhabitants of swollen galls.
Our results highlight the importance of the biological investigation of parasitoids and gall inducers living in closed galls with multiple chambers and may pave the way for potential application in biological control.
To report the processes used to design and implement an assessment tool to inform funding decisions for competing health innovations in a tertiary hospital.
We designed an assessment tool for health innovation proposals with three components: “value to the institution,” “novelty,” and “potential for adoption and scaling.” The “value to the institution” component consisted of twelve weighted value attributes identified from the host institution’s annual report; weights were allocated based on a survey of the hospital’s leaders. The second and third components consisted of open-ended questions on “novelty” and “barriers to implementation” to support further dialogue. A purposive literature review was performed independently by two researchers for each assessment. The assessment tool was piloted during an institutional health innovation funding cycle.
We took 17 days to evaluate ten proposals. The completed assessments were shared with an independent group of panellists, who selected five projects for funding. Proposals with the lowest scores for “value to the institution” had less perceived impact on the patient-related value attributes of “access,” “patient centeredness,” “health outcomes,” “prevention,” and “safety.” Similar innovations were reported in the literature for seven proposals; potential barriers to implementation were identified in six proposals. We included a worked example to illustrate the assessment process.
We developed an assessment tool that is aligned with local institutional priorities. Our tool can augment the decision-making process when funding health innovation projects. The tool can be adapted by others facing similar challenges of trying to choose the best health innovations to fund.
The current study examined the effects of a 16-week creative expression program on brain activity during a story-creating task and on resting-state functional network connectivity in adults with mild cognitive impairment (MCI).
Thirty-six adults with MCI were allocated to either the creative expression program (CrExp, n = 18) or the control group (CG, n = 18). Before and after the intervention, all participants underwent functional magnetic resonance imaging (fMRI) during story-creating task performance and in the resting state. Between-group comparisons of blood oxygenation level-dependent (BOLD) signal changes were calculated for each cluster to investigate differences in fMRI activation and functional connectivity (FC) between the two groups.
Task activation analyses showed increased activation in the right anterior cingulate gyrus (ACG), right medial frontal gyrus (MFG), right lentiform nucleus (LN), left hippocampus (HIP), left middle occipital gyrus (MOG), and left cerebellum posterior lobe (CPL) (p < 0.05). Improvements in story-creating performance were associated with greater activation in the left HIP region. Resting-state functional connectivity (FC) between the left HIP and certain other brain areas showed a significant interaction between the creative expression and control groups. Moreover, connectivity among the right angular gyrus (ANG), right inferior temporal gyrus (ITG), right superior occipital gyrus (SOG), left ANG, and left MFG was related to improved cognitive performance (p < 0.05).
These data extend current knowledge by indicating that the creative expression program can improve cognitive activation in MCI, and these enhancements may be related to the neurocognitive network plasticity changes induced by creative expression training.
Understanding factors associated with post-discharge sleep quality among COVID-19 survivors is important for intervention development.
This study investigated sleep quality and its correlates among COVID-19 patients 6 months after their most recent hospital discharge.
Healthcare providers at hospitals located in five different Chinese cities contacted adult COVID-19 patients discharged between 1 February and 30 March 2020. A total of 199 eligible patients provided verbal informed consent and completed the interview. Multiple linear regression models were fitted using the score on the single-item Sleep Quality Scale as the dependent variable.
Among all participants, 10.1% reported terrible or poor sleep quality, 26.6% reported fair sleep quality, 26.1% reported worse sleep quality than before COVID-19, and 33.7% were bothered by a sleeping disorder in the past 2 weeks. After adjusting for significant background characteristics, factors associated with sleep quality included witnessing the suffering (adjusted B = −1.15, 95% CI = −1.70, −0.33) or death (adjusted B = −1.55, 95% CI = −2.62, −0.49) of other COVID-19 patients during hospital stay, depressive symptoms (adjusted B = −0.26, 95% CI = −0.31, −0.20), anxiety symptoms (adjusted B = −0.25, 95% CI = −0.33, −0.17), post-traumatic stress disorders (adjusted B = −0.16, 95% CI = −0.22, −0.10) and social support (adjusted B = 0.07, 95% CI = 0.04, 0.10).
COVID-19 survivors reported poor sleep quality. Interventions and support services to improve sleep quality should be provided to COVID-19 survivors during their hospital stay and after hospital discharge.
This article outlines the research progress on radiocarbon (14C) dating of the Erlitou site. The Erlitou site, a Bronze Age site located in Yanshi, Henan Province, China, was discovered by archaeologists in 1959 while they were investigating remains of the Xia people in the area where, according to ancient documents, the Xia people lived. Since then, there has been a standing debate about whether the site belongs to the Xia or Shang dynasty. By the mid-1990s, several hundred articles discussing the issue had been published, but the question was still unresolved. Therefore, chronological evidence has attracted a great amount of attention. Dating of the Erlitou site began in the 1970s. Since the Xia-Shang-Zhou Chronology Project began in the mid-1990s, the application of wiggle-matching has improved dating accuracy, and the date of the Erlitou site has gradually become clear, providing a basis for archaeological research on the Xia and Shang dynasties.
This study investigates the mechanism by which maternal protein restriction induces hepatic autophagy-related gene expression in the offspring of rats. Pregnant Sprague-Dawley rats were fed either a control diet (C, 18 % energy from protein) or a low-protein diet (LP, 8·5 % energy from protein) during gestation, followed by the control diet during lactation and post-weaning. Liver tissue was collected from the offspring at postnatal day 38 and divided into four groups according to sex and maternal diet (F-C, F-LP, M-C and M-LP) for further analysis. Autophagy-related mRNA and protein levels were determined by real-time PCR and Western blotting, respectively. In addition, chromatin immunoprecipitation (ChIP) was performed to investigate the interactions between transcription factors and autophagy-related genes. Protein levels of phosphorylated eukaryotic translation initiation factor 2α (p-eIF2α) and activating transcription factor 4 (ATF4) were increased only in the female offspring born to dams fed the LP diet. Correspondingly, the mRNA expression of hepatic autophagy-related genes including Map1lc3b, P62/Sqstm1, Becn1, Atg3, Atg7 and Atg10 was significantly greater in the F-LP group than in the F-C group. Furthermore, ChIP results showed greater ATF4 and C/EBP homologous protein (CHOP) binding at the regions of a set of autophagy-related genes in the F-LP group than in the F-C group. Our data demonstrated that a maternal LP diet transcriptionally programmed hepatic autophagy-related gene expression only in female rat offspring. This transcriptional programme involved the activation of the eIF2α/ATF4 pathway and intricate regulation by the transcription factors ATF4 and CHOP.
Fruit intake may influence gestational diabetes mellitus (GDM) risk. However, prospective evidence remains controversial and limited. The current study aimed to investigate whether total fruit and specific fruit intake influence GDM risk.
A prospective cohort study was conducted. Dietary information was collected by a 3-d 24-h dietary recall. All participants underwent a standard 75-g oral glucose tolerance test at 24–28 gestational weeks. Log-binomial models were used to estimate the association between fruit intake and GDM risk, and the results are presented as relative risks (RR) and 95 % CI.
In total, 1453 healthy pregnant women were enrolled in 2017.
Total fruit intake was not associated with GDM risk (RR = 1·03; 95 % CI 0·83, 1·27; Ptrend = 0·789). The RR of GDM was 0·73 for the highest anthocyanin-rich fruit intake quartile compared with the lowest quartile (95 % CI 0·56, 0·93; Ptrend = 0·015). A higher grape intake had a linear inverse association with GDM risk (Q4 v. Q1: RR = 0·65; 95 % CI 0·43, 0·98; Ptrend = 0·044), and after further adjustment for anthocyanin intake, the inverse association tended to be non-linear (Q4 v. Q1: RR = 0·65; 95 % CI 0·44, 0·98; Ptrend = 0·079). However, we did not find an association between glycaemic index-grouped fruit, glycaemic load-grouped fruit or other fruit subtype intake and GDM risk.
In conclusion, specific fruit intake (particularly anthocyanin-rich fruit and grapes) but not total fruit intake was inversely associated with GDM risk.
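The log-binomial models above estimate adjusted relative risks directly; their unadjusted analogue is the simple risk ratio from a 2×2 exposure-by-outcome table. A minimal sketch with hypothetical counts (not this cohort's data):

```python
import math

# Hypothetical exposure-by-outcome counts, not this cohort's data.
exposed_cases, exposed_total = 40, 360      # e.g. highest intake quartile
unexposed_cases, unexposed_total = 55, 363  # e.g. lowest intake quartile

risk_exposed = exposed_cases / exposed_total
risk_unexposed = unexposed_cases / unexposed_total
rr = risk_exposed / risk_unexposed  # relative risk

# Large-sample 95% CI for the RR, computed on the log scale.
se_log_rr = math.sqrt(1 / exposed_cases - 1 / exposed_total
                      + 1 / unexposed_cases - 1 / unexposed_total)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

An adjusted analysis additionally conditions on confounders, which is what the log-binomial regression in the study provides.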
To establish optimal gestational weight gain (GWG) in Chinese pregnant women by Chinese-specific BMI categories and compare the new recommendations with the Institute of Medicine (IOM) 2009 guidelines.
Multicentre, prospective cohort study. Unconditional logistic regression analysis was used to evaluate ORs, 95 % CIs and the predicted probabilities of adverse pregnancy outcomes. The optimal GWG range was defined as the range over which the predicted probability did not exceed a 1 % increase from the lowest predicted probability in each pre-pregnancy BMI group.
From nine cities in mainland China.
A total of 3731 women with singleton pregnancy were recruited from April 2013 to December 2014.
The optimal GWG (ranges) by Chinese-specific BMI was 15·0 (12·8–17·1), 14·2 (12·1–16·4) and 12·6 (10·4–14·9) kg for underweight, normal weight and overweight pregnant women, respectively. Inappropriate GWG was associated with several adverse pregnancy outcomes. Compared with women gaining weight within our proposed recommendations, women with excessive GWG had higher risk for macrosomia, large for gestational age and caesarean section, whereas those with inadequate GWG had higher risk for low birth weight, small for gestational age and preterm delivery. The comparison between our proposed recommendations and IOM 2009 guidelines showed that our recommendations were comparable with the IOM 2009 guidelines and could well predict the risk of several adverse pregnancy outcomes.
Inappropriate GWG was associated with higher risk of several adverse pregnancy outcomes. Optimal GWG recommendations proposed in the present study could be applied to Chinese pregnant women.
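The "within 1 % of the lowest predicted probability" rule used above can be sketched as follows. The risk curve and GWG grid are illustrative assumptions (not the study's fitted model), and "1 % increase" is read here as one percentage point:

```python
# Hypothetical predicted probabilities of an adverse outcome over a GWG
# grid (kg); the U-shaped curve is an illustrative assumption, not the
# study's fitted regression.
gwg = [8 + 0.5 * i for i in range(25)]                # 8.0 .. 20.0 kg
prob = [0.10 + 0.004 * (g - 14.2) ** 2 for g in gwg]  # U-shaped risk

# Optimal range: GWG values whose predicted probability does not exceed
# the lowest predicted probability by more than 1 percentage point.
p_min = min(prob)
optimal = [g for g, p in zip(gwg, prob) if p <= p_min + 0.01]
print(f"optimal GWG range: {min(optimal):.1f}-{max(optimal):.1f} kg")
```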
We aimed to examine the association between low-carbohydrate diet (LCD) scores during the first trimester and gestational diabetes mellitus (GDM) risk in a Chinese population. A total of 1455 women were included in 2017. Dietary information during the first trimester was collected by 24-h dietary recalls for 3 d. The overall, animal and plant LCD scores, which indicated adherence to different low-carbohydrate dietary patterns, were calculated. GDM was diagnosed based on the results of a 75-g, 2-h oral glucose tolerance test at 24–28 weeks gestation. Log-binomial models were used to estimate relative risks (RR) and 95 % CI. The results showed that the multivariable-adjusted RR of GDM from the lowest to the highest quartiles of the overall LCD score were 1·00 (reference), 1·15 (95 % CI 0·92, 1·42), 1·30 (95 % CI 1·06, 1·60) and 1·24 (95 % CI 1·01, 1·52) (P = 0·026 for trend). Multivariable-adjusted RR (95 % CI) of GDM from the lowest to the highest quartiles of the animal LCD score were 1·00 (reference), 1·20 (95 % CI 0·96, 1·50), 1·41 (95 % CI 1·14, 1·73) and 1·29 (95 % CI 1·04, 1·59) (P = 0·002 for trend). After additional adjustment for gestational weight gain before GDM diagnosis, the association of the overall LCD score with GDM risk was non-significant, while the association of animal LCD score with GDM risk remained significant. In conclusion, a low-carbohydrate dietary pattern characterised by high animal fat and protein during the first trimester is associated with an increased risk of GDM in Chinese women.
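LCD scores of the kind described above are typically built by ranking percent-of-energy contributions into strata, scoring carbohydrate in reverse so that a high score marks a lower-carbohydrate pattern. The cut-points below are illustrative assumptions, not the study's cohort-specific strata:

```python
# Simplified low-carbohydrate-diet (LCD) scoring sketch. Real LCD scores
# rank percent-of-energy values into cohort-specific strata; the strata
# here are hypothetical, not the study's method.

def stratum(value, cutpoints):
    """0-based stratum index of value among ascending cut-points."""
    return sum(value > c for c in cutpoints)

def lcd_score(carb_pct, fat_pct, protein_pct, cutpoints):
    # Higher fat/protein energy shares add points; a higher carbohydrate
    # share subtracts them, so a high score = lower-carbohydrate pattern.
    n = len(cutpoints["carb"])
    carb_points = n - stratum(carb_pct, cutpoints["carb"])
    fat_points = stratum(fat_pct, cutpoints["fat"])
    protein_points = stratum(protein_pct, cutpoints["protein"])
    return carb_points + fat_points + protein_points

cuts = {  # hypothetical cut-points (% of energy)
    "carb": [45, 52, 58],
    "fat": [25, 30, 35],
    "protein": [12, 15, 18],
}
print(lcd_score(carb_pct=40, fat_pct=38, protein_pct=20, cutpoints=cuts))
```

Separate "animal" and "plant" scores substitute animal or plant fat and protein shares in place of the totals.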
Drug-induced liver injury (DILI) is a common adverse drug reaction leading to the interruption of tuberculosis (TB) therapy. We aimed to identify whether hepatitis B virus (HBV) infection increases the risk of DILI during first-line TB treatment. A meta-analysis was conducted of cohort studies identified in PubMed, Web of Science and China National Knowledge Infrastructure. Effect sizes were reported as risk ratios (RRs) with 95% confidence intervals (CIs) and calculated with R software. Sixteen studies with 3960 TB patients were eligible for analysis. The risk of DILI was higher in TB patients co-infected with HBV (RR 2.66; 95% CI 2.13–3.32) than in those without HBV infection. Moreover, patients with positive hepatitis B e antigen (HBeAg) were more likely to develop DILI (RR 3.42; 95% CI 1.95–5.98) than those with negative HBeAg (RR 2.30; 95% CI 1.66–3.18). Co-infection with HBV was not associated with a higher rate of anti-TB DILI in latent TB patients (RR 4.48; 95% CI 0.80–24.99). The aggravating effect of HBV infection on anti-TB DILI was independent of whether study participants were newly diagnosed with TB. In addition, patients co-infected with TB and HBV had a longer duration of recovery from DILI than non-co-infected patients (SMD 2.26; 95% CI 1.87–2.66). In conclusion, the results demonstrate that HBV infection increases the risk of DILI during TB therapy, especially in patients with positive HBeAg, and close liver function monitoring is needed for patients co-infected with TB and HBV.
One of the leading challenges in the chemical sciences is the separation of complex mixtures. This is of vital importance for areas such as commodity chemical production, where high-purity chemical streams are required. Consequently, there has been a strong push toward the investigation of new materials capable of achieving chemoselective separation, with self-assembled materials having shown a great deal of promise for such separations. Many self-assembled materials are desirable candidates due to their low-cost synthesis, structural self-regulation, tunable properties, and an overall ease of composite material preparation. In this article, we aim to introduce examples of novel self-assembled materials and their practical usage in chemical separations. The specific approaches to fabricating these materials, as well as the strengths and shortcomings associated with their structures, will also be described. The strategies presented here will emphasize the production and employment of nonconventional self-assembled materials that exhibit a high potential for the advancement of the science of chemical separations.
The aim of this study was to explore the application of the flipped classroom approach in training undergraduate medical students in Mass Casualty Triage (MCT).
In this study, 103 fourth-year medical students were randomly divided into a Flipped Classroom (FC) group (n = 51) and a Traditional Lecture-based Classroom (TLC) group (n = 52). A post-class quiz, simulated field triage (SFT) and feedback questionnaires were performed to assess both groups of students for their learning of the course.
In the post-class quiz, the median (IQR) scores achieved by students in the FC and TLC groups were 42 (5) and 39 (5.5), respectively, a significant difference between the two groups. In the SFT, overall triage accuracy was 67.06% for FC students and 64.23% for TLC students. Over-triage and under-triage errors occurred in 18.43% and 14.50% of the FC group, respectively; the TLC group showed a similar pattern, with 20.77% over-triage and 15.0% under-triage errors. No significant differences were found between the two groups in overall triage accuracy or triage errors.
The FC approach could enhance course grades reflected in the post-quiz and improve students’ satisfaction with the class. However, there was no significant difference of competency between the two groups demonstrated in the SFT exercise.
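Triage accuracy, over-triage and under-triage rates of the kind reported above are simple classification rates over pairs of true and assigned priorities; a sketch with hypothetical data:

```python
# Hypothetical (true, assigned) triage priority pairs, where priorities
# are ordered categories (1 = highest acuity). Not the study's data.
pairs = [(1, 1), (1, 2), (2, 2), (2, 1), (3, 3), (3, 3), (2, 3), (1, 1)]

correct = sum(t == a for t, a in pairs)
over = sum(a < t for t, a in pairs)   # assigned higher acuity than true
under = sum(a > t for t, a in pairs)  # assigned lower acuity than true

n = len(pairs)
print(f"accuracy {correct / n:.1%}, over-triage {over / n:.1%}, "
      f"under-triage {under / n:.1%}")
```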
The effect of dietary vitamin D, calcium and dairy products intake on colorectal cancer risk is controversial. This study aims to investigate the associations between dietary vitamin D, calcium and dairy products intake and colorectal cancer risk in a Chinese population.
During July 2010 to December 2018, 2380 incident, first primary, histologically confirmed colorectal cancer cases and 2389 sex and age-matched (5-year interval) controls were recruited. Dietary intake information was collected by face-to-face interviews using a validated food frequency questionnaire. Energy and other nutrient intakes such as dietary calcium were computed on the basis of the 2002 Chinese Food Composition Table, and the dietary vitamin D intake was calculated according to the United States Department of Agriculture Food Composition Database. Unconditional multivariable logistic regression models were used to calculate the odds ratios (ORs) and 95% confidence interval (CI) after adjusting for various confounders, including socio-demographic characteristics, lifestyle factors, BMI, family history of cancer, energy intake and several nutrient intakes.
The energy-adjusted mean dietary vitamin D, calcium and total dairy products intakes were 5.69 μg/d, 406.94 mg/d and 4.02 g/d for cases and 6.81 μg/d, 468.21 mg/d and 9.50 g/d for controls. Compared with the controls, cases had a lower intake of dietary vitamin D, calcium and total dairy (P < 0.001). Higher intakes of dietary vitamin D and calcium were associated with 43% and 51% reductions in colorectal cancer risk, respectively. The ORs for the highest quartile of intake compared with the lowest were 0.57 (95% CI: 0.46, 0.70, Ptrend < 0.001) for dietary vitamin D and 0.49 (95% CI: 0.39, 0.61, Ptrend < 0.001) for dietary calcium. We also observed a statistically significant inverse association between dairy products intake and colorectal cancer risk: compared with the lowest tertile, the adjusted OR for the highest tertile of total dairy was 0.31 (95% CI: 0.26, 0.38, Ptrend < 0.001). The inverse associations of dietary vitamin D, calcium and dairy products intakes with colorectal cancer risk were observed in both men and women and for both colon and rectal cancer.
Our study indicated that higher dietary vitamin D, calcium and dairy products intakes were associated with a lower colorectal cancer risk.
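Energy-adjusted intakes of the kind reported above are commonly produced with the residual method: regress nutrient intake on total energy intake and add each residual back to the intake expected at the mean energy level. A stdlib-only sketch with hypothetical values; the study's exact adjustment may differ:

```python
# Residual-method energy adjustment, sketched with hypothetical intakes.

def energy_adjust(nutrient, energy):
    n = len(nutrient)
    mean_e = sum(energy) / n
    mean_n = sum(nutrient) / n
    # Simple linear regression of nutrient intake on energy intake.
    cov = sum((e - mean_e) * (x - mean_n) for e, x in zip(energy, nutrient))
    var = sum((e - mean_e) ** 2 for e in energy)
    slope = cov / var
    # Adjusted intake = residual + expected intake at mean energy.
    return [x - slope * (e - mean_e) for e, x in zip(energy, nutrient)]

energy = [1800, 2000, 2200, 2400]  # kcal/d, hypothetical
vit_d = [4.0, 5.5, 6.5, 8.0]       # µg/d, hypothetical
adjusted = energy_adjust(vit_d, energy)
print([round(a, 2) for a in adjusted])  # → [5.95, 6.15, 5.85, 6.05]
```

The adjusted values keep each subject's deviation from the intake predicted by energy, removing confounding by total energy intake.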
Hypertension is a common comorbidity in COVID-19 patients; however, its association with the severity and fatality of COVID-19 remains unclear. In the present meta-analysis, relevant studies reporting the impact of hypertension on SARS-CoV-2 infection were identified by searching PubMed, Elsevier ScienceDirect, Web of Science, Wiley Online Library, Embase and CNKI up to 20 March 2020. Twelve publications with 2389 COVID-19 patients (674 severe cases) were included in the analysis of disease severity. The severity rate of COVID-19 in hypertensive patients was much higher than in non-hypertensive cases (37.58% vs 19.73%; pooled OR: 2.27, 95% CI: 1.80–2.86). Moreover, the pooled ORs of COVID-19 severity for hypertension vs. non-hypertension were 2.21 (95% CI: 1.58–3.10) and 2.32 (95% CI: 1.70–3.17) in patients aged <50 years and ⩾50 years, respectively. Additionally, six studies with 151 deaths among 2116 COVID-19 cases were included in the analysis of disease fatality. The results showed that hypertensive patients carried a nearly 3.48-fold higher risk of dying from COVID-19 (95% CI: 1.72–7.08). Meanwhile, the pooled ORs of COVID-19 fatality for hypertension vs. non-hypertension were 6.43 (95% CI: 3.40–12.17) and 2.66 (95% CI: 1.27–5.57) in patients aged <50 years and ⩾50 years, respectively. Neither considerable heterogeneity nor publication bias was observed in the present analysis. Therefore, our results provide further evidence that hypertension significantly increases the risks of severity and fatality of SARS-CoV-2 infection.
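Pooled ORs of this kind are usually obtained by inverse-variance weighting of per-study log odds ratios. A fixed-effect sketch that backs out each standard error from the reported 95% CI; the study inputs are hypothetical, not the meta-analysis data:

```python
import math

# Hypothetical per-study odds ratios with 95% CIs: (OR, lower, upper).
studies = [(2.1, 1.4, 3.2), (2.6, 1.7, 4.0), (1.9, 1.1, 3.3)]

weights, log_ors = [], []
for or_, lo, hi in studies:
    # Back out the standard error from the CI width on the log scale.
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    weights.append(1 / se ** 2)  # inverse-variance weight
    log_ors.append(math.log(or_))

pooled_log = sum(w * x for w, x in zip(weights, log_ors)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
pooled = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se),
      math.exp(pooled_log + 1.96 * pooled_se))
print(f"pooled OR = {pooled:.2f} (95% CI {ci[0]:.2f}, {ci[1]:.2f})")
```

A random-effects model would additionally add a between-study variance term to each weight; the fixed-effect version above shows the core arithmetic.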
The South Altyn Orogenic Belt (SAOB) is one of the most important orogenic belts in NW China, consisting of the South Altyn Continental Block and the Apa–Mangya Ophiolitic Mélange Belt. However, its Palaeozoic tectonic evolution is still controversial. Here, we present petrological, geochemical, zircon U–Pb and Lu–Hf isotopic data for the Mangya plutons with the aim of establishing the Palaeozoic tectonic evolution. We divide the Early Palaeozoic magmatism in the Apa–Mangya Ophiolitic Mélange Belt into four episodes and propose a plate tectonic model for the formation of these rocks. During 511–494 Ma, the South Altyn Ocean (SAO) was in a spreading stage, and some shoshonite series, I-type granitic rocks were generated. From 484 to 458 Ma, the oceanic crust of the SAO subducted northward, accompanied by large-scale magmatic events resulting in the generation of vast high-K calc-alkaline series, I-type granitic rocks. During 450–433 Ma, the SAO closed, and break-off of the subducted oceanic slab occurred, with the generation of some high-K calc-alkaline series, I–S transitional type granites. The SAOB was in a post-orogenic extensional environment from 419 to 404 Ma, and many A-type granites were generated.