In this work, a confined-doped fiber with a core/inner-cladding diameter of 40/250 μm and a relative doping ratio of 0.75 is fabricated through a modified chemical vapor deposition method combined with the chelate gas deposition technique, and subsequently applied in a tandem-pumped fiber amplifier for high-power operation and transverse mode instability (TMI) mitigation. Notably, the impacts of the seed laser power and mode purity are preliminarily investigated through comparative experiments. It is found that the TMI threshold can be significantly affected by the seed laser mode purity. A possible mechanism behind this phenomenon is proposed and examined through comprehensive comparative experiments and theoretical analysis. Finally, a maximum output power of 7.49 kW is obtained with a beam quality factor of approximately 1.83, which is the highest output power ever reported in a forward tandem-pumped confined-doped fiber amplifier. This work provides a useful reference and a practical route to improving the TMI threshold and realizing high-power, high-brightness fiber lasers.
The sequence in which the three smooth muscle layers (SML) of the gut appear in human embryos and fetuses is not known. Here, we investigated the process of gut SML development in human embryos and fetuses and compared the morphology of the SML in fetuses and neonates. H&E staining, Masson trichrome staining and immunohistochemistry were performed on the gut of human embryos and fetuses at 6–12 weeks of gestation and on normal neonatal intestine. We showed that no lumen was present in the embryonic gut at gestational weeks 6–7, and neither the gut wall nor the SML had developed in this period. In the embryonic and fetal gut at gestational weeks 8–9, a primitive inner circular SML (IC-SML) was identified around a narrow and discontinuous gut lumen containing some vacuoles. In the fetal gut at gestational week 10, the outer longitudinal SML (OL-SML) in the gut wall was clearly identifiable, and both the inner and outer SML expressed α-SMA. In the fetal gut at gestational weeks 11–12, in addition to the IC-SML and OL-SML, the muscularis mucosae started to develop, as revealed by α-SMA immunoreactivity beneath the developing mucosal epithelial layer. Compared with the gut of fetuses at 11–12 weeks of gestation, the muscularis mucosae, IC-SML and OL-SML of the neonatal intestine displayed different morphology, including branching into the glands of the lamina propria in the mucosa and increased thickness. In conclusion, in the developing human gut between weeks 8 and 12 of gestation, the IC-SML develops and forms at week 8, followed by the formation of the OL-SML at week 10, and the muscularis mucosae develops and forms last, at week 12.
The study investigated the strategies used by Chinese students in inferring the meanings of unfamiliar words and the factors influencing successful use of different lexical inferencing strategies. A total of 104 fourth graders inferred 36 unfamiliar semitransparent compound words in three conditions: word in isolation, contextual information only, and both word and context. Results revealed that students were more likely to arrive at the correct meaning of words when both morphological information and contextual information were available. The likelihood of using a morpheme-based or context-based lexical inferencing strategy was strongly influenced by the presentation condition of target words and precursors. Students with higher vocabulary knowledge and reading comprehension ability were more sensitive to morphological and contextual information and were able to synthesize multiple sources of information, whereas children with lower vocabulary knowledge and reading comprehension ability showed difficulties in integration and tended to rely excessively on morphological information. The findings reveal the interactions between available source information and individual differences in vocabulary knowledge and reading comprehension in predicting lexical inferencing, and have implications for vocabulary and reading instruction.
The selection of high-quality sperm is critical to intracytoplasmic sperm injection, which accounts for 70–80% of in vitro fertilization (IVF) treatments. So far, sperm screening has usually been performed manually by clinicians. However, the performance of manual screening is limited in its objectivity, consistency, and efficiency. To overcome these limitations, we have developed a fast and noninvasive three-stage method to characterize the morphology of freely swimming human sperm in bright-field microscopy images using deep learning models. Specifically, we use an object detection model to identify sperm heads, a classification model to select in-focus images, and a segmentation model to extract the geometry of sperm heads and vacuoles. The models achieve an F1-score of 0.951 in sperm head detection, a z-position estimation error within ±1.5 μm in in-focus image selection, and a Dice score of 0.948 in sperm head segmentation, respectively. Customized lightweight architectures are used for the models to achieve real-time analysis at 200 frames per second. Comprehensive morphological parameters are calculated from the sperm head geometry extracted by image segmentation. Overall, our method provides a reliable and efficient tool to assist clinicians in selecting high-quality sperm for successful IVF. It also demonstrates the effectiveness of deep learning in real-time analysis of live bright-field microscopy images.
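The Dice score reported for sperm-head segmentation is a standard overlap metric between a predicted and a ground-truth binary mask. A minimal sketch of how it is computed, assuming NumPy arrays rather than the authors' actual pipeline:

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice coefficient: 2|A∩B| / (|A| + |B|) for binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * intersection / total

# Toy example: two overlapping 4x4 masks
a = np.zeros((4, 4), dtype=bool); a[1:3, 1:3] = True   # 4 pixels
b = np.zeros((4, 4), dtype=bool); b[1:3, 1:4] = True   # 6 pixels
# intersection = 4 pixels, so Dice = 2*4 / (4+6) = 0.8
print(round(dice_score(a, b), 3))  # → 0.8
```

A score of 0.948, as reported, means the predicted and annotated head masks overlap almost completely.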
The horse played a crucial role in China through the first millennium BC, used both for military advantage and, through incorporation into elite burials, to express social status. Details of how horses were integrated into mortuary contexts during the Qin Empire, however, are poorly understood. Here, the authors present new zooarchaeological data for 24 horses from an accessory pit in Qin Shihuang's mausoleum, indicating that the horses chosen were tall, adult males. These findings provide insights into the selection criteria for animals to be included in the emperor's tomb and invite consideration of questions concerning horse breeds, husbandry practices, and the military and symbolic importance of horses in early imperial China.
In the past 10–15 years, the government of China has made various efforts to tackle excessive antibiotic use. Yet, little is known about their effects in rural primary care settings. This study aimed to determine the impact of government policies and the COVID-19 pandemic on antibiotic prescribing practices in such settings, utilizing data from separate studies carried out before and during the pandemic, in 2016 and 2021 in Anhui province, China, using identical sampling and survey approaches. Data on antibiotics prescribed, diagnoses and socio-demographic characteristics were obtained through non-participative observation and a structured exit survey. Data analysis comprised mainly descriptive comparisons of 1153 and 762 patients with respiratory infections recruited in 2016 and 2021, respectively. The overall antibiotic prescription rate decreased from 89.6% in 2016 to 69.1% in 2021, and the proportion of prescriptions for two or more classes of antibiotics was estimated at 35.9% in 2016 and 11.0% in 2021. There was a statistically significant decrease in the number of days from symptom onset to clinic visit between the year groups. In conclusion, measures to constrain excessive prescription of antibiotics have led to some improvements at the rural primary care level, and the COVID-19 pandemic has had varying effects on antibiotic use.
Previous analyses of grey and white matter volumes have reported that schizophrenia is associated with structural changes. Deep learning is a data-driven approach that can capture highly compact hierarchical non-linear relationships among high-dimensional features, and therefore can facilitate the development of clinical tools for making a more accurate and earlier diagnosis of schizophrenia.
To identify consistent grey matter abnormalities in patients with schizophrenia, 662 people with schizophrenia and 613 healthy controls were recruited from eight centres across China, and the data from these independent sites were used to validate deep-learning classifiers.
We used a prospective image-based meta-analysis of whole-brain voxel-based morphometry. We also automatically differentiated patients with schizophrenia from healthy controls using combined grey matter, white matter and cerebrospinal fluid volumetric features, incorporated a deep neural network approach on an individual basis, and tested the generalisability of the classification models using independent validation sites.
We found that statistically reliable schizophrenia-related grey matter abnormalities primarily occurred in regions that included the superior temporal gyrus extending to the temporal pole, insular cortex, orbital and middle frontal cortices, middle cingulum and thalamus. Evaluated using leave-one-site-out cross-validation, the classification performance achieved across the eight independent research sites was: accuracy, 77.19–85.74%; sensitivity, 75.31–89.29%; and area under the receiver operating characteristic curve, 0.797–0.909.
These results suggest that deep-learning techniques applied to multidimensional neuroanatomical changes can robustly discriminate patients with schizophrenia from healthy controls, findings which could facilitate clinical diagnosis and treatment in schizophrenia.
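Leave-one-site-out cross-validation, as used above, trains on all sites but one and tests on the held-out site, so each reported accuracy reflects generalisation to an unseen centre. A minimal sketch with scikit-learn's `LeaveOneGroupOut`; the synthetic features and the logistic-regression stand-in for the paper's deep neural network are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n_subjects, n_features, n_sites = 240, 20, 8
X = rng.normal(size=(n_subjects, n_features))       # stand-in volumetric features
y = rng.integers(0, 2, size=n_subjects)             # 0 = control, 1 = patient
site = rng.integers(0, n_sites, size=n_subjects)    # acquisition site per subject

# One fold per site: train on 7 sites, test on the held-out one
accuracies = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=site):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    accuracies.append(clf.score(X[test_idx], y[test_idx]))

# One accuracy per held-out site; the paper reports the resulting range
print(len(accuracies), round(min(accuracies), 2), round(max(accuracies), 2))
```

With random features the accuracies hover around chance; the point is the fold structure, in which no subject from the test site ever appears in training.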
Jinghu is the lead accompaniment instrument in the Peking opera ensemble. Qinshi are the accompanists who play jinghu. Traditionally, jinghu was regarded as a male instrument; female qinshi first appeared in educational institutions subsidised by the People's Republic of China in the 1950s. Nevertheless, to this day, the all-male history of jinghu performance and the standards shaped by its virile ethos continue to shape contemporary female qinshi's performances and the recognition of their musicality and contributions to jinghu performance. In this article, I explore the rise of female qinshi, their challenges to jinghu performance conventions, and their contributions to contemporary jinghu performance.
We aimed to examine the relations of several folate forms (5-methyltetrahydrofolate (5-mTHF), unmetabolised folic acid (UMFA) and MeFox) with kidney function and albuminuria, which have remained uncertain. This cross-sectional study included 18 757 participants from the National Health and Nutrition Examination Survey 2011–2018. The kidney outcomes were reduced estimated glomerular filtration rate (eGFR) (<60 ml/min/1·73 m²), microalbuminuria (albumin:creatinine ratio (ACR) of 30–299 mg/g) and macroalbuminuria (ACR ≥ 300 mg/g). Overall, there were significant inverse associations between serum 5-mTHF and kidney outcomes, with significantly lower prevalence of reduced eGFR (OR, 0·71; 95 % CI 0·57, 0·87) and macroalbuminuria (OR, 0·65; 95 % CI 0·46, 0·91) in participants in quartiles 3–4 (v. quartiles 1–2; both P for trend across quartiles <0·05). In contrast, there were significant positive relationships between serum UMFA and kidney outcomes, with significantly higher prevalence of reduced eGFR in participants in quartiles 2–4 (v. quartile 1; OR, 2·12; 95 % CI 1·45, 3·12; P for trend <0·001) and higher prevalence of macroalbuminuria in participants in quartile 4 (v. quartiles 1–3; OR, 1·46; 95 % CI 1·06, 2·01; P for trend <0·001). However, there were no significant associations of 5-mTHF and UMFA with microalbuminuria. In addition, there were significant positive relationships of serum MeFox with reduced eGFR, microalbuminuria and macroalbuminuria (all P for trend <0·01). In conclusion, a higher 5-mTHF level, along with lower UMFA and MeFox levels, was associated with a lower prevalence of adverse kidney outcomes, findings which may help inform future clinical trials and nutritional guidelines regarding folate supplementation.
Salicylic acid (SA), a phytohormone, is considered a key regulator mediating plant defence against pathogens. However, it remains unclear how SA activates plant defence against herbivores such as chewing and sucking pests. Here, we used an aphid-susceptible wheat variety to investigate the response of Sitobion avenae to SA-induced wheat plants, and the effects of exogenous SA on some defence enzymes and phenolics in the plant immune system. In SA-treated wheat seedlings, the intrinsic rate of natural increase (rm), fecundity and apterous rate of S. avenae were 0.25, 31.4 nymphs/female and 64.4%, respectively, significantly lower than those in the controls (P < 0.05). Moreover, the increased activities of phenylalanine ammonia-lyase, polyphenol oxidase (PPO) and peroxidase in the SA-induced seedlings clearly depended on the sampling time, whereas activities of catalase and 4-coumarate:CoA ligase were suppressed significantly at 24, 48 and 72 h in comparison with the control. Levels of p-coumaric acid at 96 h, caffeic acid at 24 and 72 h and chlorogenic acid at 24, 48 and 96 h in wheat plants were significantly upregulated by exogenous SA application. Nevertheless, only caffeic acid content was positively correlated with PPO activity in SA-treated wheat seedlings (P = 0.031). These findings indicate that exogenous SA significantly enhanced the defence of an aphid-susceptible wheat variety against aphids by regulating the plant immune system, and suggest a potential application of SA in aphid control.
Manganese (Mn) oxides have been prevalent on Earth since before the Great Oxidation Event, and the Mn cycle is one of the most important biogeochemical processes on the Earth's surface. In sunlit natural environments, the photochemistry of Mn oxides has been found to enable solar energy harvesting and conversion in both geological and biological systems. One of the most widespread Mn oxides is birnessite, a semiconducting layered mineral that actively drives Mn photochemical cycling in Nature. The oxygen-evolving centre in biological photosystem II (PSII) is also a Mn cluster, Mn4CaO5, which transforms into a birnessite-like structure during the photocatalytic oxygen evolution process. This phenomenon suggests a parallel between Mn-mediated photoreactions in the organic and inorganic worlds. Mn photoredox cycling involves both the photo-oxidation of Mn(II) and the photoreductive dissolution of Mn(IV/III) oxides. In Nature, the occurrence of Mn(IV/III) photoreduction is usually accompanied by the oxidative degradation of natural organics. For Mn(II) oxidation into Mn oxides, both mechanisms of biological catalysis mediated by microorganisms (such as Pseudomonas putida and Bacillus species) and abiotic photoreactions by semiconducting minerals or reactive oxygen species have been proposed. In particular, anaerobic Mn(II) photo-oxidation processes have been demonstrated experimentally, which sheds light on the emergence of Mn oxides before atmospheric oxygenation on Earth. This review provides a comprehensive and up-to-date elaboration of Mn oxide photoredox cycling in Nature, and offers new insight into the photochemical properties of the semiconducting Mn oxides widespread on the Earth's surface.
We aimed to examine whether baseline neutrophil counts affected the risk of new-onset proteinuria in hypertensive patients, and, if so, whether folic acid treatment is particularly effective for proteinuria prevention in such a setting. A total of 8208 eligible participants without proteinuria at baseline were analysed from the renal substudy of the China Stroke Primary Prevention Trial. Participants were randomised to receive a double-blind daily treatment of 10 mg of enalapril and 0·8 mg of folic acid (n 4101) or 10 mg of enalapril only (n 4107). The primary outcome was new-onset proteinuria, defined as a urine dipstick reading of ≥1+ at the exit visit. The mean age of the participants was 59·5 (sd 7·4) years, and 3088 (37·6 %) were male. The median treatment duration was 4·4 years. In the enalapril-only group, a significantly higher risk of new-onset proteinuria was found among participants with higher neutrophil counts (quintile 5; ≥4·8 × 10⁹/l; OR 1·44; 95 % CI 1·00, 2·06), compared with those in quintiles 1–4. With enalapril and folic acid treatment, compared with the enalapril-only group, the new-onset proteinuria risk was reduced from 5·2 to 2·8 % (OR 0·49; 95 % CI 0·29, 0·82) among participants with higher neutrophil counts (≥4·8 × 10⁹/l), whereas there was no significant effect among those with neutrophil counts <4·8 × 10⁹/l. In summary, among hypertensive patients, those with higher neutrophil counts had an increased risk of new-onset proteinuria, and this risk was reduced by 51 % with folic acid treatment.
The association between dietary Fe intake and diabetes risk remains inconsistent. We aimed to explore the association between dietary Fe intake and type 2 diabetes mellitus (T2DM) risk in middle-aged and older adults in urban China. This study used data from the Guangzhou Nutrition and Health Study, an ongoing community-based prospective cohort study. Participants were recruited from 2008 to 2013 in the Guangzhou community. A total of 2696 participants aged 40–75 years without T2DM at baseline were included in the data analyses, with a median of 5·6 (interquartile range 4·1–5·9) years of follow-up. T2DM was identified by self-reported diagnosis, fasting glucose ≥ 7·0 mmol/l or glycosylated Hb ≥ 6·5 %. Cox proportional hazards models were used to estimate hazard ratios (HR) and 95 % CI. We ascertained 205 incident T2DM cases during 13 476 person-years. The adjusted HR for T2DM risk in the fourth quartile of haem Fe intake was 1·92 (95 % CI 1·07, 3·46; P trend = 0·010), compared with the first quartile. Significant associations were found for haem Fe intake from total meat (HR 2·74; 95 % CI 1·22, 6·15; P trend = 0·011) and haem Fe intake from red meat (HR 1·86; 95 % CI 1·01, 3·44; P trend = 0·034), but not for haem Fe intake from processed meat, poultry or fish/shellfish. Neither total Fe nor non-haem Fe intake was significantly associated with T2DM risk. Our findings suggest that higher dietary intake of haem Fe (especially from red meat), but not total Fe or non-haem Fe, was associated with greater T2DM risk in middle-aged and older adults.
Eating disorders have increasingly become a public health concern globally. This study aimed to reveal the burden of eating disorders at the global, regional and national levels using the Global Burden of Disease (GBD) Study 2017 data.
We extracted the age-standardised rates (ASRs) of prevalence and disability-adjusted life years (DALYs) and their 95% uncertainty intervals (UIs) of eating disorders, including anorexia nervosa and bulimia nervosa, between 1990 and 2017 from the GBD 2017 data. The estimated annual percentage changes (EAPCs) were calculated to quantify the secular trends of the burden of eating disorders.
The ASRs of prevalence and DALYs of eating disorders continuously increased worldwide from 1990 to 2017, by an average of 0.65 (95% UI: 0.59–0.71) and 0.66 (95% UI: 0.60–0.72) per year, respectively. The burden of eating disorders was higher in females than in males, but the increase in ASRs over time was greater in males than in females. In 2017, the highest burden of eating disorders was observed in the high sociodemographic index (SDI) regions, especially Australasia (ASR of prevalence = 807.13, 95% UI: 664.20–982.30; ASR of DALYs = 170.74, 95% UI: 113.43–244.14, per 100 000 population), Western Europe and high-income North America. However, the largest increase in the burden of eating disorders was observed in East Asia (EAPC for prevalence = 2.23, 95% UI: 2.14–2.32; EAPC for DALYs = 2.22, 95% UI: 2.13–2.31), followed by South Asia. An increasing trend in the burden of eating disorders at the national level was observed in most countries or territories. The three countries with the largest increasing trends were Equatorial Guinea, Bosnia and Herzegovina, and China. Positive associations were found between the burden estimates and SDI levels in almost all geographic regions during the observed 28-year period. We also found that the human development indices in 2017 were positively correlated with the EAPCs of the ASRs of prevalence (ρ = 0.222, P = 0.002) and DALYs (ρ = 0.208, P = 0.003).
The highest burden of eating disorders remains in the high-income western countries, but an increasing trend was observed globally and in all SDI quintiles, especially in highly populous Asian regions. These results could help governments worldwide formulate suitable medical and health policies for the prevention and early intervention of eating disorders.
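The EAPC used above is conventionally obtained by regressing the natural log of the age-standardised rate on calendar year; if β is the fitted slope, EAPC = 100 × (exp(β) − 1). A minimal sketch of that calculation, on synthetic rates rather than GBD data:

```python
import math

def eapc(years, rates):
    """EAPC from a log-linear regression of rate on year:
    fit ln(rate) = a + b*year by least squares, return 100*(e^b - 1)."""
    n = len(years)
    ln_r = [math.log(r) for r in rates]
    mean_y = sum(years) / n
    mean_l = sum(ln_r) / n
    beta = sum((y - mean_y) * (l - mean_l) for y, l in zip(years, ln_r)) \
         / sum((y - mean_y) ** 2 for y in years)
    return 100.0 * (math.exp(beta) - 1.0)

# A rate growing exactly 2% per year recovers EAPC = 2
years = list(range(1990, 2018))
rates = [100.0 * 1.02 ** (y - 1990) for y in years]
print(round(eapc(years, rates), 3))  # → 2.0
```

In practice the GBD analyses also derive a confidence interval for the EAPC from the standard error of β; this sketch shows only the point estimate.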
Some studies have suggested that the Toll-like receptor 9 polymorphism (TLR9 rs352140) is closely related to the risk of bacterial meningitis (BM), but this remains controversial. This study set out to assess whether the TLR9 rs352140 polymorphism confers an altered risk of BM. Relevant literature databases were searched, including PubMed, Embase, the Cochrane Library and the China National Knowledge Infrastructure (CNKI), up to August 2020. Seven case-control studies from four publications were included in the present meta-analysis. Odds ratios (OR) and 95% confidence intervals (CI) were calculated to estimate associations between BM risk and the target polymorphism. Significant associations were identified under the allele contrast (A vs. G: OR 0.66, 95% CI 0.59–0.75, P < 0.001), homozygote comparison (AA vs. GG: OR 0.62, 95% CI 0.49–0.78, P < 0.001), heterozygote comparison (AG vs. GG: OR 0.74, 95% CI 0.61–0.91, P = 0.005), recessive genetic model (AA vs. AG/GG: OR 0.78, 95% CI 0.65–0.93, P = 0.006) and dominant genetic model (AA/AG vs. GG: OR 0.70, 95% CI 0.57–0.85, P < 0.001). The findings indicate that, in contrast to some studies, the TLR9 rs352140 polymorphism is associated with a decreased risk of BM.
The associations between sugar-sweetened beverage (SSB) and artificially sweetened beverage (ASB) consumption and the risk of metabolic syndrome (MetS) remain controversial. A quantitative assessment of dose–response associations has not been reported. This study aims to assess the associations between the risk of MetS and SSB, ASB, and total sweetened beverage (TSB, the combination of SSB and ASB) consumption by reviewing population-based epidemiological studies.
We searched PubMed, Embase and Web of Science databases prior to 4 November 2019, for relevant studies investigating the SSB–MetS and ASB–MetS associations. A random effects model was used to estimate pooled relative risks (RR) and 95 % CI. Dose–response association was assessed using a restricted cubic splines model.
We identified seventeen articles (twenty-four studies, including 93 095 participants and 20 749 MetS patients).
The pooled RRs for the risk of MetS were 1·51 (95 % CI 1·34, 1·69), 1·56 (1·32, 1·83) and 1·44 (1·19, 1·75) in the high-consumption groups of TSB, SSB and ASB, respectively, and 1·20 (1·13, 1·28), 1·19 (1·11, 1·28) and 1·31 (1·05, 1·65) per 250 ml/d increase in TSB, SSB and ASB consumption, respectively. Additionally, we found evidence of non-linear TSB–MetS and SSB–MetS dose–response associations and a linear ASB–MetS dose–response association.
TSB, SSB and ASB consumption was associated with the risk of MetS. The present findings provide evidence that supports reducing intake of these beverages to lower the TSB-, SSB- and ASB-related risk of MetS.
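The pooled RRs above come from a random-effects model, which weights each study's log relative risk by the inverse of its within-study variance plus a between-study variance τ². A minimal sketch of the classic DerSimonian–Laird estimator; the study values below are made up for illustration, not taken from the meta-analysis:

```python
import math

def dersimonian_laird(log_rr, se):
    """Pool log relative risks with the DerSimonian-Laird random-effects model."""
    w = [1.0 / s ** 2 for s in se]                     # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_star = [1.0 / (s ** 2 + tau2) for s in se]       # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rr)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se_pooled),
                              math.exp(pooled + 1.96 * se_pooled))

# Hypothetical studies: log-RR and its standard error per study
log_rr = [math.log(1.4), math.log(1.6), math.log(1.3)]
se = [0.10, 0.15, 0.12]
rr, (lo, hi) = dersimonian_laird(log_rr, se)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

The pooled RR necessarily falls between the smallest and largest study estimates, and τ² widens the confidence interval when the studies disagree more than sampling error alone would explain.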
The risk of environmental contamination by severe acute respiratory coronavirus virus 2 (SARS-CoV-2) in the intensive care unit (ICU) is unclear. We evaluated the extent of environmental contamination in the ICU and correlated this with patient and disease factors, including the impact of different ventilatory modalities.
In this observational study, surface environmental samples collected from ICU patient rooms and common areas were tested for SARS-CoV-2 by polymerase chain reaction (PCR). Select samples from the common area were tested by cell culture. Clinical data were collected and correlated to the presence of environmental contamination. Results were compared to historical data from a previous study in general wards.
In total, 200 samples from 20 patient rooms and 75 samples from common areas and the staff pantry were tested. The results showed that 14 rooms had at least 1 site contaminated, with an overall contamination rate of 14% (28 of 200 samples). Environmental contamination was not associated with day of illness, ventilatory mode, aerosol-generating procedures, or viral load. The frequency of environmental contamination was lower in the ICU than in general ward rooms. Eight samples from the common area were positive, though all were negative on cell culture.
Environmental contamination in the ICU was lower than in the general wards. The use of mechanical ventilation or high-flow nasal oxygen was not associated with greater surface contamination, supporting their use and safety from an infection control perspective. Transmission risk via environmental surfaces in the ICUs is likely to be low. Nonetheless, infection control practices should be strictly reinforced, and transmission risk via droplet or airborne spread remains.
The outbreak of COVID-19 generated severe emotional reactions, and restricted mobility was a crucial measure to reduce the spread of the virus. This study describes the changes in public emotional reactions and mobility patterns in the Chinese population during the COVID-19 outbreak.
We collected data on public emotional reactions in response to the outbreak through Weibo, a Chinese Twitter-like platform, between 1st January and 31st March 2020. Using anonymized location-tracking information, we analyzed the daily mobility patterns of approximately 90% of Sichuan residents.
There were three distinct phases of the emotional and behavioral reactions to the COVID-19 outbreak. The alarm phase (19th–26th January) was a restriction-free period, characterized by few new daily cases but a large amount of negative public emotion [the number of negative comments per Weibo post increased by 246.9 per day, 95% confidence interval (CI) 122.5–371.3], and a substantial increase in self-limiting mobility (from 45.6% to 54.5%, changing by 1.5% per day, 95% CI 0.7%–2.3%). The epidemic phase (27th January–15th February) exhibited rapidly increasing numbers of new daily cases, decreasing expression of negative emotions (a decrease of 27.3 negative comments per post per day, 95% CI −40.4 to −14.2), and a stabilized level of self-limiting mobility. The relief phase (16th February–31st March) showed a steady decline in new daily cases and decreasing levels of negative emotion and self-limiting mobility.
During the COVID-19 outbreak in China, the public's emotional reaction was strongest before the actual peak of the outbreak and declined thereafter. The change in human mobility patterns occurred before the implementation of restriction orders, suggesting a possible link between emotion and behavior.
The European Society for Clinical Nutrition and Metabolism (ESPEN) guidelines recommend the Royal Free Hospital-Nutritional Prioritizing Tool (RFH-NPT) to identify malnutrition risk in patients with liver disease. However, little is known about the application of the RFH-NPT for screening malnutrition risk in China, where patients primarily suffer from hepatitis virus-related cirrhosis. A total of 155 cirrhosis patients without liver cancer or uncontrolled co-morbid illness were enrolled in this prospective study. We administered the Nutritional Risk Screening 2002 (NRS-2002), RFH-NPT, Malnutrition Universal Screening Tool (MUST) and Liver Disease Undernutrition Screening Tool (LDUST) to the patients within 24 h of admission and performed follow-up observations for 1·5 years. The RFH-NPT and NRS-2002 had higher sensitivities (64·8 and 52·4 %) and specificities (60 and 70 %) than the other tools for screening malnutrition risk in cirrhotic patients. The prevalence of nutritional risk was higher with the RFH-NPT than with the NRS-2002 (63 v. 51 %). The RFH-NPT was more likely than the NRS-2002 to detect malnutrition risk in patients with advanced Child–Pugh classes (B and C) and lower Model for End-stage Liver Disease scores (<15). The RFH-NPT score was an independent predictive factor for mortality. Patients identified as being at high malnutrition risk with the RFH-NPT had a higher mortality rate than those at low risk; the same result was not obtained with the NRS-2002. Therefore, we suggest that using the RFH-NPT improves the ability of clinicians to predict malnutrition risk at an earlier stage in patients with cirrhosis primarily caused by hepatitis virus infection.