An outbreak of acute gastroenteritis (AGE) caused by norovirus occurred at a hospital in Shanghai, China, and was studied for molecular epidemiology, host susceptibility and the serological role of specific antibody. Rectal and environmental swabs, paired serum samples and saliva specimens were collected. Pathogens were detected by real-time polymerase chain reaction and DNA sequencing. Histo-blood group antigen (HBGA) phenotypes of saliva samples and their binding to norovirus protruding proteins were determined by enzyme-linked immunosorbent assay. The HBGA-binding interfaces and the surrounding region were analysed with the MegAlign program of DNAstar 7.1. Twenty-seven individuals in two care units developed AGE, giving attack rates of 9.02% and 11.68%. Eighteen (78.2%) symptomatic and five (38.4%) asymptomatic individuals were positive for GII.6/b norovirus. Saliva-based HBGA phenotyping showed that all symptomatic and asymptomatic cases were A, B, AB or O secretors. Only four (16.7%) of the 24 tested serum samples showed low blockade activity against HBGA–norovirus binding in the acute phase, whereas 11 (45.8%) samples at the convalescent stage showed seroconversion of such blockade. Specific blockade antibody in the population played an essential role in this norovirus epidemic. The wide HBGA-binding spectrum of GII.6 supports the need for continued health attention and surveillance in different settings.
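As a minimal illustration of how the per-unit attack rates in the preceding abstract are obtained (attack rate = cases / residents at risk), the sketch below uses hypothetical denominators and a hypothetical split of the 27 cases between the two care units; the abstract reports only the total case count and the two rates, so these numbers are chosen merely to be consistent with them.

```python
# Hypothetical worked example: attack rate = cases / population at risk.
# The per-unit case split and unit censuses are placeholders chosen to be
# consistent with the 27 total cases and the reported 9.02% and 11.68%.
cases = {"unit_A": 11, "unit_B": 16}       # hypothetical split of the 27 cases
at_risk = {"unit_A": 122, "unit_B": 137}   # hypothetical residents per unit

for unit in cases:
    rate = cases[unit] / at_risk[unit] * 100
    print(f"{unit}: attack rate = {rate:.2f}%")
```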
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected event rate for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allows for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Objective
To investigate the value of narrow-band imaging training for differentiating between benign and malignant vocal fold leukoplakia.
Method
Thirty cases of vocal fold leukoplakia were selected.
Results
Narrow-band imaging endoscopy training had a significant positive effect on the specificity of the differential diagnosis of vocal fold leukoplakia. In addition, the consistency of diagnostic typing of vocal fold leukoplakia by narrow-band imaging improved to ‘moderate agreement’ following the combination of types I and II and the combination of types IV, V and VI in the typing of vocal fold leukoplakia.
Conclusion
The narrow-band imaging training course may improve the ability of laryngologists to diagnose vocal fold leukoplakia. The new endoscopic diagnostic classification by narrow-band imaging needs to be further simplified to facilitate clinical application.
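The 'moderate agreement' reported above refers to a kappa statistic for diagnostic agreement (conventionally about 0.41–0.60). A minimal sketch of how such agreement can be computed before and after collapsing the narrow-band imaging types is shown below; the rating vectors are hypothetical, and scikit-learn's cohen_kappa_score is assumed to be available.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical NBI type assignments (types I-VI) from two rating rounds;
# the study's actual ratings are not given in the abstract.
nbi_rater_a = [1, 2, 3, 4, 5, 6, 2, 3, 4, 5]
nbi_rater_b = [2, 1, 3, 5, 4, 6, 2, 3, 5, 4]

def collapse(types):
    """Merge types I+II and IV+V+VI, as described in the abstract."""
    merged = {1: "I-II", 2: "I-II", 3: "III", 4: "IV-VI", 5: "IV-VI", 6: "IV-VI"}
    return [merged[t] for t in types]

print("kappa, original 6 types:", cohen_kappa_score(nbi_rater_a, nbi_rater_b))
print("kappa, collapsed types :", cohen_kappa_score(collapse(nbi_rater_a), collapse(nbi_rater_b)))
```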
Accretionary orogens contain key evidence for the conversion of oceanic to continental crust. The late tectonic history and closure time of the Palaeo-Asian Ocean are recorded in the Mazongshan subduction–accretion complex in the southern Beishan margin of the Central Asian Orogenic Belt. We present new data on the structure, petrology, geochemistry and zircon U–Pb isotope ages of the Mazongshan subduction–accretion complex, which is a tectonic mélange with a block-in-matrix structure. The blocks are of serpentinized peridotite, basalt, gabbro, basaltic andesite, chert and seamount sediments within a matrix that is mainly composed of fore-arc-trench turbidites. U–Pb zircon ages of two gabbros are 454.6 ± 2.5 Ma and 434.1 ± 3.6 Ma, an andesite has a U–Pb zircon age of 451.3 ± 3.5 Ma and a tuffaceous slate has the youngest U–Pb zircon age of 353.6 ± 5.1 Ma. These new isotopic ages, combined with published data on ophiolitic mélanges from central Beishan, indicate that the subduction–accretion of Beishan in the southernmost Central Asian Orogenic Belt lasted until Late Ordovician – Early Carboniferous time. Structure and age data demonstrate that the younging direction of accretion was southwards and that the subduction zone dipped continuously to the north. Accordingly, these results record the conversion of oceanic to continental crust in the southern Beishan accretionary collage.
Currently available antidepressants exhibit low remission rates and a long response lag time. Growing evidence has demonstrated that an acute sub-anesthetic dose of ketamine exerts rapid, robust and lasting antidepressant effects. However, long-term use of ketamine tends to elicit adverse reactions. The present study aimed to investigate the antidepressant-like effects of intermittent and consecutive administrations of ketamine in chronic unpredictable mild stress (CUMS) rats, and to determine whether ketamine can shorten the response lag time of classic antidepressants. Behavioral responses were assessed with the sucrose preference test, forced swimming test and open field test. In the first stage of the experiments, all four ketamine treatment regimens (10 mg/kg i.p., once daily for 3 or 7 consecutive days, or once every 7 or 3 days, over a total of 21 days) showed robust antidepressant-like effects, with no significant influence on locomotor activity or stereotyped behavior in the CUMS rats. The intermittent regimens produced longer-lasting antidepressant-like effects than the consecutive regimens, and administration every 7 days produced antidepressant-like effects similar to administration every 3 days with fewer doses. In the second stage, the combination of ketamine (10 mg/kg i.p., once every 7 days) and citalopram (20 mg/kg p.o., once daily) for 21 days produced more rapid and sustained antidepressant-like effects than citalopram alone. In summary, repeated sub-anesthetic doses of ketamine can shorten the lag time of the antidepressant-like effects of citalopram, suggesting that the combination of ketamine with classic antidepressants is a promising regimen for depression, with a quick onset and stable, lasting effects.
There are strong links between circadian disturbance and some of the most characteristic symptoms of clinical major depressive disorder (MDD). However, there are no published studies of changes in the expression of clock genes, or of other neuropeptides related to circadian-rhythm regulation, that may influence susceptibility to recurrence after antidepressant treatment in MDD.
Methods
Blood samples were collected at 4-hour intervals over 24 hours from twelve healthy controls and twelve male patients with major depressive disorder before and after eight weeks of treatment with escitalopram. Outcome measures were the relative mRNA expression of clock genes (hPERIOD1, hPERIOD2, hPERIOD3, hCRY1, hBMAL1, hNPAS2 and hGSK-3beta) and the serum levels of melatonin, vasoactive intestinal peptide (VIP), cortisol, adrenocorticotropic hormone (ACTH), insulin-like growth factor-1 (IGF-1) and growth hormone (GH).
Results
Compared with healthy controls, MDD patients showed disruptions in the diurnal rhythms of expression of hPERIOD1, hPERIOD2, hCRY1, hBMAL1, hNPAS2 and hGSK-3beta, along with disruptions in the diurnal rhythms of release of melatonin, VIP, cortisol, ACTH, IGF-1 and GH. Several of these disruptions (hPERIOD1, hCRY1, melatonin, VIP, cortisol, ACTH and IGF-1) persisted after eight weeks of escitalopram treatment, as did elevation of 24-hour levels of VIP and decreases in 24-hour levels of cortisol and ACTH.
Conclusion
These persistent neurobiological changes may play a role in the MDD symptoms thought to contribute to vulnerability to recurrence, and may be relevant to long-term maintenance therapy.
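The abstract does not state how diurnal rhythmicity was quantified; a common approach for 4-hourly sampling over 24 hours is single-cosinor fitting, in which a cosine with a fixed 24-h period is fitted to each analyte and the fitted amplitude and acrophase are compared between groups. The sketch below is a minimal, hypothetical illustration of that approach using scipy, not the study's actual analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def cosinor(t, mesor, amplitude, acrophase):
    """Single-component cosinor with a fixed 24-h period."""
    return mesor + amplitude * np.cos(2 * np.pi * t / 24 - acrophase)

# Hypothetical 4-hourly relative mRNA expression over 24 h (6 time points).
t = np.array([0, 4, 8, 12, 16, 20], dtype=float)
expr = np.array([1.0, 1.4, 1.9, 1.6, 1.1, 0.8])

params, _ = curve_fit(cosinor, t, expr, p0=[expr.mean(), 0.5, 0.0])
mesor, amplitude, acrophase = params
print(f"MESOR={mesor:.2f}, amplitude={amplitude:.2f}, "
      f"acrophase={np.degrees(acrophase) % 360:.1f} deg")
```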
Schizophrenia is one of the most severe and chronic forms of mental illness. The quantum resonance spectrometer (QRS) test may be useful as a biological marker for the clinical diagnosis of schizophrenia.
Objectives
To evaluate the reliability and clinical value of the QRS in the detection of thought disorder.
Methods
We studied 1014 patients with schizophrenia, 155 patients with bipolar disorder and 100 normal controls. Thought disorder symptoms of the same subjects obtained from the QRS test and from psychiatrists' diagnoses were compared. In addition, thought disorder symptoms of a renumbered sample of 65 schizophrenia patients and 100 normal controls were discriminated using the QRS test.
Results
Kappa values between QRS detection of thought disorders and psychiatrists' diagnoses were more than 65% for 6 of 9 schizophrenia symptoms and more than 74% for all 3 bipolar disorder symptoms. Similar consistency was also seen in Pearson's r values and ROC AUCs. In the discriminant analysis, the sensitivity, specificity, positive predictive value and negative predictive value of QRS detection of delusions, looseness of thought and paralogical thinking all agreed with psychiatrists' diagnoses at levels above 70%.
Conclusions
QRS-based thought disorder detection appears to have predictive value in schizophrenia and bipolar disorder, could become an objective instrument for identification and diagnosis, and might support psychiatric clinical diagnosis.
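As a reminder of how the diagnostic metrics quoted above are derived from a 2 × 2 table (QRS detection vs. psychiatrists' diagnosis), the short sketch below uses hypothetical counts for a single symptom; none of these numbers appear in the abstract.

```python
# Hypothetical 2x2 table for one symptom (e.g. delusions):
# rows = QRS detection result, columns = psychiatrists' diagnosis.
tp, fp = 52, 13   # QRS positive: true positives, false positives
fn, tn = 18, 82   # QRS negative: false negatives, true negatives

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
      f"PPV={ppv:.2f}, NPV={npv:.2f}")
```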
The aim of this study was to develop and externally validate a simple-to-use nomogram for predicting the survival of hospitalised human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) patients (hospitalised people living with HIV/AIDS (PLWHAs)). Hospitalised PLWHAs (n = 3724) between January 2012 and December 2014 were enrolled in the training cohort. HIV-infected inpatients (n = 1987) admitted in 2015 were included as the external-validation cohort. The least absolute shrinkage and selection operator (LASSO) method was used to perform data dimension reduction and select the optimal predictors. The nomogram incorporated 11 independent predictors: occupation, antiretroviral therapy, pneumonia, tuberculosis, Talaromyces marneffei, hypertension, septicemia, anaemia, respiratory failure, hypoproteinemia and electrolyte disturbances. The likelihood χ2 statistic of the model was 516.30 (P < 0.001). The integrated Brier score was 0.076, and the Brier scores of the nomogram at the 10-day and 20-day time points were 0.046 and 0.071, respectively. The areas under the receiver operating characteristic curves were 0.819 and 0.828, and the areas under the precision-recall curves were 0.242 and 0.378, at the two time points. Calibration plots and decision curve analysis in the two sets showed good performance and a high net benefit of the nomogram. In conclusion, the nomogram developed in the current study has relatively high calibration and is clinically useful. It provides a convenient and useful tool for timely clinical decision-making and the risk management of hospitalised PLWHAs.
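The sketch below is not the authors' pipeline but a simplified stand-in for the workflow the abstract describes: L1-penalised (LASSO-style) logistic regression for predictor selection, evaluated with an AUC and a Brier score, treating death within 20 days as a binary outcome at a single time point rather than as a survival model. The file name, DataFrame columns and penalty strength are hypothetical placeholders.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, brier_score_loss

df = pd.read_csv("plwha_inpatients.csv")               # hypothetical file
features = ["art", "pneumonia", "tuberculosis", "anaemia",
            "respiratory_failure", "hypoproteinemia"]  # illustrative subset only
X, y = df[features], df["death_within_20d"]            # hypothetical binary outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X_tr, y_tr)

# Predictors whose coefficients were not shrunk to zero are the "selected" ones.
selected = [f for f, c in zip(features, model.coef_[0]) if c != 0]
prob = model.predict_proba(X_te)[:, 1]
print("selected predictors:", selected)
print("AUC  :", round(roc_auc_score(y_te, prob), 3))
print("Brier:", round(brier_score_loss(y_te, prob), 3))
```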
Many schizophrenia patients experience residual symptoms even after treatment. Electroconvulsive therapy (ECT) is often used in medication-resistant schizophrenia patients when pharmacologic interventions have failed; however, the mechanism of action is unclear. Brain-derived neurotrophic factor (BDNF) levels are reduced in drug-naive, first-episode schizophrenia and are increased by antipsychotic treatment. We tested the hypothesis that ECT increases serum BDNF levels by measuring BDNF concentrations in schizophrenia patients before and after they received ECT.
Methods
A total of 160 patients with schizophrenia were examined. The ECT group (n = 80) was treated with antipsychotics and ECT (eight to 10 sessions administered every other day). The drug therapy group (n = 80) received only antipsychotic treatment. A control group (n = 77) was recruited that served as the baseline for comparison.
Results
Baseline serum BDNF level in ECT group was lower than in controls (9.7 ± 2.1 vs. 12.4 ± 3.2 ng/ml; P < 0.001), but increased after ECT, such that there was no difference between the two groups (11.9 ± 3.3 vs. 12.4 ± 3.2 ng/ml; P = 0.362). There was no correlation between patients’ Positive and Negative Syndrome Scale (PANSS) score and serum BDNF level before ECT; however, a negative correlation was observed after ECT (total: r = −0.692; P < 0.01). From baseline to remission after ECT, serum BDNF level increased (P < 0.001) and their PANSS score decreased (P < 0.001). Changes in BDNF level (2.21 ± 4.10 ng/ml) and PANSS score (28.69 ± 14.96) were positively correlated in the ECT group (r = 0.630; P < 0.01).
Conclusions
BDNF level was lower in schizophrenia patients relative to healthy controls before ECT and medication. BDNF level increased after ECT and medication, and its longitudinal change was associated with changes in patients’ psychotic symptoms. These results indicate that BDNF mediates the antipsychotic effects of ECT.
Chlamydia trachomatis (CT) infection has been a major public health threat globally. Monitoring and prediction of CT epidemic status and trends are important for programme planning, allocating resources and assessing impact; however, such activities are limited in China. In this study, we aimed to apply a seasonal autoregressive integrated moving average (SARIMA) model to predict the incidence of CT infection in Shenzhen city, China. The monthly incidence of CT between January 2008 and June 2019 in Shenzhen was used to fit and validate the SARIMA model. A seasonal fluctuation and a slightly increasing pattern of a long-term trend were revealed in the time series of CT incidence. The monthly CT incidence ranged from 4.80/100 000 to 21.56/100 000. The mean absolute percentage error value of the optimal model was 8.08%. The SARIMA model could be applied to effectively predict the short-term CT incidence in Shenzhen and provide support for the development of interventions for disease control and prevention.
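A minimal sketch of fitting a SARIMA model to a monthly incidence series with statsmodels is shown below. The simulated series and the (p,d,q)(P,D,Q,12) order are hypothetical placeholders, not the order selected in the study; in practice the order is chosen by information criteria and residual diagnostics, and forecast accuracy is summarised by MAPE on a hold-out period as in the abstract.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly incidence per 100 000 (Jan 2008 - Jun 2019 = 138 months),
# simulated with a seasonal component and a slight upward trend.
rng = np.random.default_rng(0)
months = pd.date_range("2008-01", periods=138, freq="MS")
incidence = pd.Series(12 + 4 * np.sin(2 * np.pi * np.arange(138) / 12)
                      + 0.02 * np.arange(138) + rng.normal(0, 1, 138), index=months)

train, test = incidence[:-12], incidence[-12:]

# Example order only; not the order reported by the study.
model = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12)).fit(disp=False)
forecast = model.get_forecast(steps=12).predicted_mean

mape = np.mean(np.abs((test.values - forecast.values) / test.values)) * 100
print(f"hold-out MAPE = {mape:.2f}%")
```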
Cytomegalovirus (CMV) enters latency after primary infection and can reactivate periodically, with virus excreted in body fluids; this is called shedding. CMV shedding during the early stage of pregnancy is associated with adverse pregnancy outcomes. The shedding pattern in healthy seropositive women who plan to have babies has not been well characterised. Vaginal swabs, urine and blood were collected from 1262 CMV IgG-positive women who intended to have babies and tested for CMV DNA by a fluorogenic quantitative PCR method. Serum IgM was also measured. The association between sociodemographic characteristics and CMV shedding prevalence was analysed. Among the 1262 seropositive women, 12.8% (161/1262) were positive for CMV DNA in at least one body fluid. CMV DNA was detected more frequently, and at higher viral loads, in vaginal secretions (10.5%) than in urine (3.2%) or blood (0.6%) (P < 0.001). CMV shedding was more likely to be detected in IgM-positive than in IgM-negative women (29.5% (13/44) vs. 12.2% (148/1218); OR 3.03, 95% CI 1.55–5.93; P = 0.001). CMV shedding in vaginal secretions was correlated with shedding in urine, IgM status, adverse pregnancy history and younger age. CMV shedding was more commonly detected, and at higher viral loads, in vaginal secretions than in urine or blood among healthy seropositive women of reproductive age. Further studies are needed to determine whether shedding is occasional or continuous and whether it is associated with adverse pregnancy outcomes.
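The odds ratio and confidence interval quoted above for IgM status can be reproduced directly from the counts given in the abstract (13/44 shedders among IgM-positive women vs. 148/1218 among IgM-negative women) using the standard log-odds-ratio formula; the short sketch below does exactly that.

```python
import math

# Counts from the abstract: shedders / total, by serum IgM status.
a, b = 13, 44 - 13        # IgM-positive: shedding, not shedding
c, d = 148, 1218 - 148    # IgM-negative: shedding, not shedding

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
# -> OR = 3.03 (95% CI 1.55-5.93), matching the values reported above.
```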
Tuberculosis (TB) is the leading cause of death among infectious diseases. China has a high burden of TB and accounts for almost 13% of the world's cases of multidrug-resistant (MDR) TB. Spinal TB is one reason for the resurgence of TB in China. Few large case studies of MDR spinal TB in China have been conducted. The aim of this research was to describe the epidemiological characteristics of inpatients with MDR spinal TB in six provinces and cities of China from 1999 to 2015. This is a multicentre retrospective observational study. Patients' information was collected from the disease control centres and the infectious disease databases of hospitals in six provinces and cities in China. A total of 3137 patients with spinal TB and 272 patients with MDR spinal TB were analysed. The results showed that MDR spinal TB remains a public health concern and most commonly affects patients 15–30 years of age (34.19%). The most common lesions involved the thoracolumbar spine (35.66%). Local pain was the most common symptom (98.53%). Logistic analysis showed that, among spinal TB patients, rural residence (OR 1.79), older age (OR 1.92) and higher educational level (OR 2.22) were independent risk factors for the development of MDR spinal TB, whereas female sex was associated with a lower risk (OR 0.48). The most common resistance among first-line and second-line drugs was to isoniazid (68.75%) and levofloxacin (29.04%), respectively. The use of molecular diagnostics resulted in noteworthy clinical advances, including earlier initiation of MDR spinal TB treatment, improved infection control and better clinical outcomes. Chemotherapy and surgery can yield satisfactory outcomes with timely diagnosis and long-term treatment. These results enable a better understanding of MDR spinal TB in China among the general public.
The coronavirus disease 2019 (COVID-19) pandemic is a major public health concern all over the world. Little is known about the impact of the COVID-19 pandemic on mental health in the general population. This study aimed to assess mental health problems and associated factors among a large sample of college students during the COVID-19 outbreak in China.
Methods
This cross-sectional, nation-wide survey of college students was conducted in China from 3 to 10 February 2020. A self-administered questionnaire was used to assess psychosocial factors, COVID-19 epidemic-related factors and mental health problems. Acute stress, depressive and anxiety symptoms were measured with the Chinese versions of the Impact of Event Scale-6, Patient Health Questionnaire-9 and Generalized Anxiety Disorder-7, respectively. Univariate and hierarchical logistic regression analyses were performed to examine factors associated with mental health problems.
Results
Among 821 218 students who participated in the survey, 746 217 (90.9%) were included in the analysis. In total, 414 604 (55.6%) of the students were female. About 45% of the participants had mental health problems. The prevalence rates of probable acute stress, depressive and anxiety symptoms were 34.9%, 21.1% and 11.0%, respectively. The epidemic-related factor associated with an increased risk of mental health problems was having relatives or friends infected with COVID-19 (adjusted odds ratios 1.72–2.33). Students exposed to media coverage of COVID-19 for ≥3 h/day were 2.13 times more likely than students with media exposure of <1 h/day to have acute stress symptoms. Individuals with low perceived social support were 4.84–5.98 times more likely than individuals with high perceived social support to have anxiety and depressive symptoms. In addition, senior year and prior mental health problems were also significantly associated with anxiety and/or depressive symptoms.
Conclusions
In this large-scale survey of college students in China, acute stress, anxiety and depressive symptoms were prevalent during the COVID-19 pandemic. Multiple epidemic-related and psychosocial factors, such as family members being infected, heavy media exposure, low social support, senior year and prior mental health problems, were associated with an increased risk of mental health problems. Psychosocial support and mental health services should be provided to students at risk.
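The 'hierarchical logistic regression' mentioned in the preceding abstract refers to entering predictor blocks sequentially (for example, psychosocial factors first, then epidemic-related factors) and examining the adjusted odds ratios at each step. A minimal sketch of that blockwise approach with statsmodels is shown below; the data file and variable names are hypothetical placeholders, not the study's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

survey = pd.read_csv("student_survey.csv")   # hypothetical file
block1 = ["low_social_support", "prior_mental_health_problem", "senior_year"]
block2 = block1 + ["relative_infected", "media_exposure_3h_plus"]

y = survey["acute_stress_symptoms"]          # hypothetical binary outcome

for name, cols in [("psychosocial block", block1), ("+ epidemic block", block2)]:
    X = sm.add_constant(survey[cols])
    fit = sm.Logit(y, X).fit(disp=False)
    odds_ratios = np.exp(fit.params.drop("const"))  # adjusted ORs for this block
    print(name, odds_ratios.round(2).to_dict())
```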
Enhancing the supply of arginine (Arg), a semi-essential amino acid, has positive effects on immune function in dairy cattle experiencing metabolic stress during early lactation. Our objective was to determine the effects of Arg supplementation on biomarkers of liver damage and inflammation in cows during early lactation. Six Chinese Holstein lactating cows with similar BW (508 ± 14 kg), body condition score (3.0), parity (4.0 ± 0), milk yield (30.6 ± 1.8 kg) and days in milk (20 ± days) were randomly assigned to three treatments in a replicated 3 × 3 Latin square design balanced for carryover effects. Each period was 21 days with 7 days for infusion and 14 days for washout. Treatments were (1) Control: saline; (2) Arg group: saline + 0.216 mol/day l-Arg; and (3) Alanine (Ala) group: saline + 0.868 mol/day l-Ala (iso-nitrogenous to the Arg group). Blood and milk samples from the experimental cows were collected on the last day of each infusion period and analyzed for indices of liver damage and inflammation, and the count and composition of somatic cells in milk. Compared with the Control, the infusion of Arg led to greater concentrations of total protein, immunoglobulin M and high density lipoprotein cholesterol coupled with lower concentrations of haptoglobin and tumor necrosis factor-α, and activity of aspartate aminotransferase in serum. Infusion of Ala had no effect on those biomarkers compared with the Control. Although milk somatic cell count was not affected, the concentration of granulocytes was lower in response to Arg infusion compared with the Control or Ala group. Overall, the biomarker analyses indicated that the supplementation of Arg via the jugular vein during early lactation alleviated inflammation and metabolic stress.
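In the replicated 3 × 3 Latin square design described above, each cow receives each of the three infusions (Control, Arg, Ala) in a different period, so that period and animal effects are balanced across treatments. The sketch below is only an illustration of such an assignment for six cows; the actual randomisation used in the study is not given in the abstract.

```python
# Illustrative replicated 3x3 Latin square: 6 cows, 3 periods, 3 treatments.
treatments = ["Control", "Arg", "Ala"]

def latin_square(seq):
    """Cyclic 3x3 Latin square: each treatment appears once per row and column."""
    return [seq[i:] + seq[:i] for i in range(3)]

square_1 = latin_square(treatments)         # cows 1-3
square_2 = latin_square(treatments[::-1])   # cows 4-6 (replicate square)

for cow, sequence in enumerate(square_1 + square_2, start=1):
    print(f"cow {cow}: periods 1-3 -> {sequence}")
```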
Seasonal influenza virus epidemics have a major impact on healthcare systems. Data on population susceptibility to emerging influenza virus strains during the interepidemic period can guide planning for resource allocation in an upcoming influenza season. This study sought to assess the population susceptibility to representative emerging influenza virus strains collected during the interepidemic period. The microneutralisation antibody titers (MN titers) of a human serum panel against representative emerging influenza strains collected during the interepidemic period before the 2018/2019 winter influenza season (H1N1-inter and H3N2-inter) were compared with those against influenza strains representative of previous epidemics (H1N1-pre and H3N2-pre). A multifaceted approach, incorporating both genetic and antigenic data, was used to select these representative influenza virus strains for the MN assay. A significantly higher proportion of individuals had a ≥4-fold reduction in MN titers between H1N1-inter and H1N1-pre than between H3N2-inter and H3N2-pre (28.5% (127/445) vs. 4.9% (22/445), P < 0.001). The geometric mean titer (GMT) against H1N1-inter was significantly lower than that against H1N1-pre (381 (95% CI 339–428) vs. 713 (95% CI 641–792), P < 0.001), while there was no significant difference in GMT between H3N2-inter and H3N2-pre. Since A(H1N1) predominated in the 2018–2019 winter influenza epidemic, our results were consistent with the observed epidemic subtype.
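The geometric mean titer and the ≥4-fold-reduction criterion used above can be computed directly from paired MN titers: the GMT is the exponential of the mean log titer, and a serum shows a ≥4-fold reduction when titer(previous-epidemic strain) / titer(interepidemic strain) ≥ 4. The sketch below, with hypothetical titers, illustrates both calculations.

```python
import numpy as np

# Hypothetical MN titers of the same sera against the two H1N1 strains.
titer_pre   = np.array([640, 1280, 320, 2560, 640, 160])   # previous-epidemic strain
titer_inter = np.array([320,  320, 160,  640, 320,  20])   # interepidemic strain

gmt_pre = np.exp(np.mean(np.log(titer_pre)))
gmt_inter = np.exp(np.mean(np.log(titer_inter)))
fourfold_drop = np.mean(titer_pre / titer_inter >= 4) * 100

print(f"GMT, previous-epidemic strain: {gmt_pre:.0f}")
print(f"GMT, interepidemic strain    : {gmt_inter:.0f}")
print(f"sera with >=4-fold reduction : {fourfold_drop:.1f}%")
```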
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
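In the classical twin design underlying the heritability estimates above, a quick approximation is Falconer's formula, h² ≈ 2(r_MZ − r_DZ), where r_MZ and r_DZ are the intraclass correlations of monozygotic and dizygotic pairs. The CODATwins analyses themselves rely on full variance-component (ACE) models; the sketch below, with hypothetical correlations, is only an illustrative shortcut.

```python
# Illustrative Falconer decomposition from hypothetical twin correlations;
# the project itself fits full ACE variance-component models.
r_mz, r_dz = 0.86, 0.48          # hypothetical intraclass correlations for height

h2 = 2 * (r_mz - r_dz)           # additive genetic variance (A)
c2 = 2 * r_dz - r_mz             # shared environment (C)
e2 = 1 - r_mz                    # non-shared environment (E)

print(f"h^2 = {h2:.2f}, c^2 = {c2:.2f}, e^2 = {e2:.2f}")
```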
Optimizing the dietary calcium (Ca) level is essential to maximize eggshell quality, egg production and bone formation in poultry. This study aimed to establish the Ca requirements of egg-type duck breeders from 23 to 57 weeks of age based on egg production, eggshell, incubation, tibial, plasma and ovary-related indices, as well as the expression of matrix protein-related genes. In total, 450 Longyan duck breeders aged 21 weeks were allotted randomly to five treatments, each with six replicates of 15 individually caged birds. Data collection started at 23 weeks of age and continued over the following 35 weeks. The five groups corresponded to five dietary treatments containing 2.8%, 3.2%, 3.6%, 4.0% or 4.4% Ca. The tested dietary Ca levels increased (linear, P < 0.01) egg production and egg mass, and linearly improved (P < 0.01) the feed conversion ratio (FCR). Increasing the dietary Ca level from 2.8% to 4.4% increased (P < 0.01) eggshell thickness and eggshell content. The tested Ca levels showed a quadratic effect on eggshell thickness and ovarian weight (P < 0.01); the highest values were obtained with Ca levels of 4.0% and 3.6%, respectively. Dietary Ca levels affected the number of small yellow follicles (SYF) and the SYF weight/ovarian weight ratio, with a significant linear response (P < 0.01) for SYF number. In addition, dietary Ca levels increased (P < 0.05) tibial dry weight, breaking strength, mineral density and ash content. Plasma and tibial phosphorus concentrations exhibited a quadratic (P < 0.01) response to dietary Ca levels. Plasma calcitonin concentration increased linearly (P < 0.01) as dietary Ca levels increased. The relative expression of carbonic anhydrase 2 in the uterus rose (P < 0.01) with increasing dietary Ca levels, and the highest value was obtained with 3.2% Ca. In conclusion, Longyan duck breeders fed a diet with 4.0% Ca had superior eggshell and tibial quality, while those fed a diet with 3.6% Ca had the heaviest ovarian weights. The regression model indicated that dietary Ca levels of 3.86%, 3.48% and 4.00% are optimal for maximum eggshell thickness, ovarian weight and tibial mineral density, respectively.
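The optimal Ca levels quoted at the end of the preceding abstract come from fitting a regression model of each response against dietary Ca and locating its maximum; for a quadratic fit y = a + bx + cx², the optimum lies at x = −b/(2c). The sketch below illustrates this with hypothetical treatment means, not the study's data, so it will not reproduce the reported optima.

```python
import numpy as np

# Hypothetical treatment means: eggshell thickness (mm) at each dietary Ca level.
ca_level = np.array([2.8, 3.2, 3.6, 4.0, 4.4])
thickness = np.array([0.330, 0.345, 0.356, 0.360, 0.357])

# Quadratic fit y = c*x^2 + b*x + a; the optimum Ca is at -b / (2c) when c < 0.
c, b, a = np.polyfit(ca_level, thickness, deg=2)
optimal_ca = -b / (2 * c)
print(f"estimated optimal dietary Ca = {optimal_ca:.2f}%")
```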
Co-receptor tropism correlates with HIV-1 transmission and disease progression in patients. A molecular epidemiological investigation of co-receptor tropism is important for clinical practice and the effective control of HIV-1. In this study, we investigated the co-receptor tropism of HIV-1 variants from 85 antiretroviral-naive patients with the Geno2pheno algorithm at a false-positive rate of 10%. Our data showed that the majority of the subjects harboured CCR5-tropic virus (81.2%, 69/85). No significant differences in gender, age, baseline CD4+ T-cell counts or transmission routes were observed between subjects infected with CXCR4-tropic and CCR5-tropic virus. Co-receptor tropism appeared to be associated with the virus genotype: significantly more CXCR4 use was predicted in CRF01_AE infections, whereas all CRF07_BC and CRF08_BC strains were predicted to use the CCR5 co-receptor. Sequence analysis of V3 revealed a higher median net charge in the CXCR4 viruses than in the CCR5 viruses (4.0 vs. 3.0, P < 0.05). The predicted N-linked glycosylation site between amino acids 6 and 8 of the V3 region was conserved in CCR5 viruses, but not in CXCR4 viruses. In addition, variable crown motifs were observed in both CCR5 and CXCR4 viruses, of which the most prevalent motif, GPGQ, was present in both viral tropisms and in almost all genotypes identified in this study except subtype B. These findings may have important implications for clinical practice and enhance our understanding of HIV-1 biology.
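The V3 net charge compared above is conventionally computed as the number of positively charged residues (Arg, Lys) minus the number of negatively charged residues (Asp, Glu) in the V3 loop. A minimal sketch is shown below; the consensus-like sequence is only an example with a GPGQ crown motif, not one of the study's sequences.

```python
def v3_net_charge(v3_sequence: str) -> int:
    """Net charge of a V3 loop: (#Arg + #Lys) - (#Asp + #Glu)."""
    positive = sum(v3_sequence.count(aa) for aa in "RK")
    negative = sum(v3_sequence.count(aa) for aa in "DE")
    return positive - negative

# Hypothetical consensus-like 35-residue V3 sequence with a GPGQ crown motif.
example_v3 = "CTRPNNNTRKSIRIGPGQAFYATGDIIGDIRQAHC"
print("net charge:", v3_net_charge(example_v3))
```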
The aim of the study was to investigate any association between extrauterine growth restriction (EUGR) and the intestinal flora of preterm infants of <30 weeks' gestation. A total of 59 preterm infants were assigned to EUGR (n = 23) and non-EUGR (n = 36) groups. Intestinal bacteria were compared using high-throughput sequencing of bacterial rRNA. The total abundance of bacteria in 344 genera (7568 vs. 13 760; P < 0.0001) and 456 species (10 032 vs. 18 240; P < 0.0001) was significantly decreased in the EUGR group compared with the non-EUGR group. After application of a multivariate logistic model, adjustment for potential confounding factors and false-discovery-rate correction, we found four bacterial genera with higher and one bacterial genus with lower abundance in the EUGR group compared with the control group. In addition, the EUGR group showed significantly increased abundances of six species (Streptococcus parasanguinis, Bacterium RB5FF6, two Klebsiella species and Microbacterium), but decreased abundances of three species (one Acinetobacter species, Endosymbiont_of_Sphenophorus_lev and one Enterobacter species), compared with the non-EUGR group. Taken together, there were significant changes in the intestinal microflora of preterm infants with EUGR compared with preterm infants without EUGR.
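The per-taxon comparisons with false-discovery-rate correction described above can be illustrated with a generic workflow: test each genus between groups, then apply Benjamini–Hochberg correction to the resulting P values. The sketch below assumes a Mann–Whitney U test per genus, which is one common choice; the study's exact test statistic is not specified in the abstract, and the abundance matrices are simulated placeholders.

```python
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
# Hypothetical abundance matrices: rows = infants, columns = genera.
eugr = rng.poisson(lam=20, size=(23, 50))       # 23 EUGR infants
non_eugr = rng.poisson(lam=25, size=(36, 50))   # 36 non-EUGR infants

# One test per genus, then Benjamini-Hochberg false-discovery-rate correction.
p_values = [mannwhitneyu(eugr[:, g], non_eugr[:, g]).pvalue for g in range(50)]
rejected, q_values, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")

print("genera significant after FDR correction:", int(rejected.sum()))
```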
Rabies is one of the major public health problems in China, and the mortality rate of rabies remains the highest among all notifiable infectious diseases. A meta-analysis was conducted to investigate the post-exposure prophylaxis (PEP) vaccination rate and risk factors for human rabies in mainland China. The PubMed, Web of Science, Chinese National Knowledge Infrastructure, Chinese Science and Technology Periodical and Wanfang databases were searched for articles on rabies vaccination status published between 2007 and 2017. In total, 10 174 human rabies cases from 136 studies were included in this meta-analysis. Approximately 97.2% (95% confidence interval (CI) 95.1–98.7%) of rabies cases occurred in rural areas and 72.6% (95% CI 70.0–75.1%) occurred in farmers. Overall, the vaccination rate among the reported human rabies cases was 15.4% (95% CI 13.7–17.4%). However, among vaccinated individuals, 85.5% (95% CI 79.8–83.4%) did not complete the vaccination regimen. In a subgroup analysis, the PEP vaccination rate in the eastern region (18.8%, 95% CI 15.9–22.1%) was higher than that in the western region (13.3%, 95% CI 11.1–15.8%), and this rate decreased after 2007. Approximately 68.9% (95% CI 63.6–73.8%) of rabies cases involved category-III exposures, but the PEP vaccination rate in these cases was 27.0% (95% CI 14.4–44.9%) and only 6.1% (95% CI 4.4–8.4%) received rabies immunoglobulin. Together, these results suggest that the PEP vaccination rate among human rabies cases was low in mainland China. Therefore, standardised treatment and vaccination programmes for dog bites need to be further strengthened, particularly in rural areas.
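The pooled rates above come from a meta-analysis of proportions. One standard approach is to logit-transform each study's proportion, pool the transformed values with inverse-variance weights under a DerSimonian–Laird random-effects model, and back-transform the pooled estimate; the abstract does not state which software or transformation was used, so the compact sketch below, with hypothetical per-study counts, is only a generic illustration of that procedure.

```python
import numpy as np

# Hypothetical per-study counts: vaccinated cases / total rabies cases.
events = np.array([12, 30, 8, 25, 18])
totals = np.array([80, 150, 60, 200, 100])

p = events / totals
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (totals - events)   # approximate variance of each logit

# DerSimonian-Laird between-study variance (tau^2).
w = 1 / var
fixed = np.sum(w * logit) / np.sum(w)
q = np.sum(w * (logit - fixed) ** 2)
tau2 = max(0.0, (q - (len(p) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects pooling and back-transformation to a proportion.
w_star = 1 / (var + tau2)
pooled_logit = np.sum(w_star * logit) / np.sum(w_star)
se = np.sqrt(1 / np.sum(w_star))
to_prop = lambda x: 1 / (1 + np.exp(-x))

print(f"pooled rate = {to_prop(pooled_logit):.3f} "
      f"(95% CI {to_prop(pooled_logit - 1.96 * se):.3f}-"
      f"{to_prop(pooled_logit + 1.96 * se):.3f})")
```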