Background: Despite a higher prevalence of traumatic spinal cord injury (TSCI) amongst Canadian Indigenous peoples, there is a paucity of studies focused on Indigenous TSCI. We present the first Canada-wide study comparing TSCI amongst Canadian Indigenous and non-Indigenous peoples. Methods: This study is a retrospective analysis of prospectively collected TSCI data from the Rick Hansen Spinal Cord Injury Registry (RHSCIR) from 2004 to 2019. We divided participants into Indigenous and non-Indigenous cohorts and compared them with respect to demographics, injury mechanism, level, severity, and outcomes. Results: Compared with non-Indigenous patients, Indigenous patients were younger, more likely to be female, less likely to have higher education, and less likely to be employed. The mechanism of injury was more likely to be assault or transportation-related trauma in the Indigenous group. The length of stay for Indigenous patients was longer. Indigenous patients were more likely to be discharged to a rural setting, less likely to be discharged home, and more likely to be unemployed following injury. Conclusions: Our results suggest that more resources need to be dedicated to transitioning Indigenous patients sustaining a TSCI to community living and to supporting these patients in their home communities. Engagement with Indigenous communities is needed to focus resources and infrastructure on the needs of Indigenous patients.
To study the effectiveness of unilateral cochlear implantation, binaural-bimodal hearing devices, and bilateral cochlear implantation in children with inner-ear malformation.
This study comprised 261 patients who were allocated to inner-ear malformation or control groups. Twenty-four months after surgery, aided sound-field thresholds were tested, and the Meaningful Auditory Integration Scale, Infant-Toddler Meaningful Auditory Integration Scale, Meaningful Use of Speech Scale, Categories of Auditory Performance scale and Speech Intelligibility Rating test were completed.
Aided sound-field thresholds were significantly better for bilateral cochlear implantation patients than for unilateral cochlear implantation or binaural-bimodal hearing device patients. There was no significant difference in Meaningful Auditory Integration Scale, Infant-Toddler Meaningful Auditory Integration Scale, or Categories of Auditory Performance scores among the three groups. The binaural-bimodal hearing device patients outperformed unilateral cochlear implantation patients on both Meaningful Use of Speech Scale and Speech Intelligibility Rating scores. No statistical difference was observed between the two subgroups.
Children who received bilateral cochlear implants had the best auditory awareness in a quiet environment. Children with binaural-bimodal hearing devices had better voice control and verbal skills than unilateral cochlear implantation patients, and their speech was more readily understood by others. Children with inner-ear malformations benefit from cochlear implantation.
This study investigated the characteristics and prognosis of the feeling of ear fullness in patients with unilateral all-frequency sudden sensorineural hearing loss.
Our study included 56 patients with a diagnosis of unilateral all-frequency sudden sensorineural hearing loss accompanied by a feeling of ear fullness and 48 patients without a feeling of ear fullness. The condition of these patients was prospectively observed.
Positive correlations were observed between grading of feeling of ear fullness and hearing loss in patients with a feeling of ear fullness (r = 0.599, p < 0.001). No significant differences were observed in the total effective rate of hearing recovery between patients with and without a feeling of ear fullness after one month of treatment (Z = −0.641, p = 0.521). Eighty-six per cent of patients (48 out of 56) showed complete recovery from the feeling of ear fullness. There was no correlation between feeling of ear fullness recovery and hearing recovery (r = 0.040, p = 0.769).
The prognosis of feeling of ear fullness is good. There was no correlation between feeling of ear fullness recovery and hearing recovery for all-frequency sudden sensorineural hearing loss patients.
The aim of this study was to explore the frequency and distribution of gene mutations related to isoniazid (INH) and rifampin (RIF) resistance in multidrug-resistant Mycobacterium tuberculosis (MDR-TB) strains in Beijing, China. In this retrospective study, the genotypes of 173 MDR-TB strains were analysed by spoligotyping. The katG and inhA genes and the promoter region of inhA, in which mutations confer INH resistance, and the rpoB gene, in which mutations confer RIF resistance, were sequenced. The percentage of resistance-associated nucleotide alterations among the strains of different genotypes was also analysed. In total, 90.8% (157/173) of the MDR strains belonged to the Beijing genotype. Population characteristics were not significantly different among the strains of different genotypes. In total, 50.3% (87/173) of strains had mutations at codon S315T of katG; 16.8% (29/173) of strains had mutations in the inhA promoter region, of them, 5.5% (15/173) had point mutations at the −15 base (C→T) of the inhA promoter region. In total, 86.7% (150/173) of strains had mutations in the rpoB gene; of them, 40% (69/173) had mutations at codon S531L of rpoB. The frequency of mutations was not significantly higher in Beijing genotype MDR strains than in non-Beijing genotypes. Beijing genotype MDR-TB strains are spreading in Beijing and present a major challenge to TB control in this region. A high prevalence of katG Ser315Thr, inhA promoter region (−15C→T) and rpoB (S531L) mutations was observed. Molecular diagnostics based on gene mutations is a useful method for rapid detection of MDR-TB in Beijing, China.
Lifestyle interventions are an important and viable approach for preventing cognitive deficits. However, the results of studies on alcohol, coffee and tea consumption in relation to cognitive decline have been divergent, likely due to confounds from dose–response effects. This meta-analysis aimed to find the dose–response relationship between alcohol, coffee or tea consumption and cognitive deficits.
Prospective cohort studies or nested case-control studies within a cohort investigating the risk factors of cognitive deficits were searched in PubMed, Embase, the Cochrane Library and Web of Science up to 4 June 2020. Two authors searched the databases and extracted the data independently. We also assessed the quality of the studies with the Newcastle-Ottawa Scale. Stata 15.0 software was used to perform model estimation and plot the linear or nonlinear dose–response relationship graphs.
The search identified 29 prospective studies from America, Japan, China and some European countries. The dose–response relationships showed that compared to non-drinkers, low consumption (<11 g/day) of alcohol could reduce the risk of cognitive deficits or only dementias, but there was no significant effect of heavier drinking (>11 g/day). Low consumption of coffee reduced the risk of any cognitive deficit (<2.8 cups/day) or dementia (<2.3 cups/day). Green tea consumption was a significant protective factor for cognitive health (relative risk, 0.94; 95% confidence interval, 0.92–0.97), with each cup of tea per day bringing a 6% reduction in the risk of cognitive deficits.
Light consumption of alcohol (<11 g/day) and coffee (<2.8 cups/day) was associated with reduced risk of cognitive deficits. Cognitive benefits of green tea consumption increased with the daily consumption.
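The reported per-cup relative risk can be translated into risks at other intakes if one assumes a log-linear dose-response model (an assumption for illustration; the meta-analysis fitted its own linear or nonlinear curves). A minimal sketch:

```python
# Sketch: under an assumed log-linear dose-response model, a per-cup
# relative risk (RR) of 0.94 compounds multiplicatively with daily cups.
def cumulative_rr(per_unit_rr: float, units: float) -> float:
    """RR at a given daily consumption, assuming log-linearity."""
    return per_unit_rr ** units

# One cup/day: the reported ~6% risk reduction.
print(round(cumulative_rr(0.94, 1), 2))  # 0.94
# Three cups/day under the same (assumed) model:
print(round(cumulative_rr(0.94, 3), 2))  # 0.83
```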
This study aimed to evaluate the benefits of betahistine or vestibular rehabilitation (Tetrax biofeedback) on the quality of life and fall risk in patients with Ménière's disease.
Sixty-six patients with Ménière's disease were randomly divided into three groups: betahistine, Tetrax and control groups. Patients’ Dizziness Handicap Index and Tetrax fall index scores were obtained before and after treatment.
Patients in the betahistine and Tetrax groups showed significant improvements in Dizziness Handicap Index and fall index scores after treatment versus before treatment (p < 0.05). The improvements in the Tetrax group were significantly greater than those in the betahistine group (p < 0.05).
Betahistine and vestibular rehabilitation (Tetrax biofeedback) improve the quality of life and reduce the risk of falling in patients with Ménière's disease. Vestibular rehabilitation (Tetrax biofeedback) is an effective management method for Ménière's disease.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, which will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allows for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
A disruption database characterizing the current quench of disruptions with an ITER-like tungsten divertor has been developed on EAST. It provides a large number of plasma parameters describing the pre-disruptive plasma, current quench time, eddy current, and mitigation by massive impurity injection, and shows that the current quench time strongly depends on magnetic energy and post-disruption electron temperature. Furthermore, the energy balance and magnetic energy dissipation during the current quench phase have been analysed in detail. Magnetic energy is shown to be dissipated mainly by ohmic reheating and inductive coupling, and both channels strongly affect the current quench time. Massive gas injection is also an efficient method to speed up the current quench and increase the fraction of impurity radiation.
Leg weakness (LW) is a major concern for the pig breeding industry and also has a serious impact on animal welfare. To dissect the genetic architecture of limb-and-hoof firmness in commercial pigs, a genome-wide association study was conducted on bone mineral density (BMD) in three sow populations: Duroc, Landrace and Yorkshire. The BMD data were obtained by ultrasound from 812 pigs (115 Duroc, 243 Landrace and 454 Yorkshire). In addition, all pigs were genotyped using genotyping-by-sequencing, and a total of 224 162 single-nucleotide polymorphisms (SNPs) were obtained. After quality control, 218 141 SNPs were used for the genome-wide association analysis. Nine significant associations were identified on chromosomes 3, 5, 6, 7, 9, 10, 12 and 18 that passed the Bonferroni correction threshold of 0.05/(total SNP number). The most significant locus associated with BMD (P value = 1.92e−14) was detected at approximately 41.7 Mb on SSC6 (SSC stands for Sus scrofa chromosome). CUL7, PTK7, SRF, VEGFA, RHEB, PRKAR1A and TPO, which are located near the lead SNPs of the significant loci, were highlighted as functionally plausible candidate genes for sow limb-and-hoof firmness. Moreover, we also applied a new ultrasound-based method to measure BMD in pigs. The results provide insight into the genetic architecture of LW and can also help to improve animal welfare in pigs.
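The Bonferroni threshold used in the study is simple arithmetic on the post-quality-control SNP count; a minimal sketch:

```python
# Sketch of the Bonferroni significance threshold described above:
# alpha divided by the number of SNPs tested after quality control.
alpha = 0.05
n_snps = 218_141  # SNPs retained after quality control

threshold = alpha / n_snps
print(f"{threshold:.3e}")  # 2.292e-07

# The reported lead SNP clears this threshold easily:
lead_p = 1.92e-14
print(lead_p < threshold)  # True
```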
Our research group previously demonstrated that vitamin A restriction affected the meat quality of Angus cross and Simmental steers. Therefore, the aim of this study was to characterize genotype differences in response to dietary vitamin A levels. Commercial Angus and Simmental steers (n = 32 per breed; initial BW = 337.2 ± 5.9 kg; ~8 months of age) were fed a low-vitamin A (LVA) (1017 IU/kg DM) backgrounding diet for 95 days to reduce hepatic vitamin A stores. During finishing, steers were randomly assigned to treatments in a 2 × 2 factorial arrangement of genotype × dietary vitamin A concentration. The LVA treatment was a finishing diet with no supplemental vitamin A (723 IU vitamin A/kg DM); the control (CON) was the LVA diet plus supplementation with 2200 IU vitamin A/kg DM. Blood samples were collected at three time points throughout the study to analyze serum retinol concentration. At the completion of finishing, steers were slaughtered at a commercial abattoir. Meat characteristics assessed were intramuscular fat concentration, color, Warner-Bratzler shear force, cook loss and pH. Camera image analysis was used for determination of marbling, 12th-rib back fat and longissimus muscle area (LMA). The LVA steers had lower (P < 0.001) serum retinol concentration than CON steers. The LVA treatment resulted in greater (P = 0.03) average daily gain than the CON treatment (1.52 and 1.44 ± 0.03 kg/day, respectively); however, there was no effect of treatment on final BW, DM intake or feed efficiency. Cooking loss and yield grade were greater and LMA was smaller in LVA steers (P < 0.05). There was an interaction between breed and treatment for marbling score (P = 0.01) and percentage of carcasses grading United States Department of Agriculture (USDA) Prime (P = 0.02). For Angus steers, LVA treatment resulted in a 16% greater marbling score than CON (683 and 570 ± 40, respectively), and 27% of LVA Angus steers graded USDA Prime compared with 0% for CON.
Conversely, there was no difference in marbling score or USDA Quality Grades between LVA and CON for Simmental steers. In conclusion, feeding a LVA diet during finishing increased marbling in Angus but not in Simmental steers. Reducing the vitamin A level of finishing diets fed to cattle with a high propensity to marble, such as Angus, has the potential to increase economically important traits such as marbling and quality grade without negatively impacting gain : feed or yield grade.
This study aimed to investigate the clinical characteristics and analyse the epidemiological features of coronavirus disease 2019 (COVID-19) patients during convalescence. We enrolled 71 confirmed cases of COVID-19 who were discharged from hospital and transferred to isolation wards from 6 February to 26 March 2020. They were all employees of Zhongnan Hospital of Wuhan University or their family members, of whom three were <18 years of age. Clinical data were collected and analysed statistically. Forty-one cases (41/71, 57.7%) were medical staff, and young and middle-aged patients (aged ⩽60 years) accounted for 81.7% (58/71). The average isolation time for all adult patients was 13.8 ± 6.1 days. During convalescence, RNA detection results of 35.2% of patients (25/71) reverted from negative to positive; the longest time to RNA re-positivity was 7 days. In all, 52.9% of adult patients (36/68) had no obvious clinical symptoms, and the remaining ones had mild and non-specific clinical symptoms (e.g. cough, sputum, sore throat and disorders of the gastrointestinal tract). Chest CT signs in 89.7% of adult patients (61/68) gradually improved; in the others, the lesions were eventually absorbed and improved after short-term repeated progression. The main chest CT manifestations of adult patients were normal findings, ground-glass opacity (GGO) or fibre streak shadows, and six patients (8.8%) had extrapulmonary manifestations, but there was no significant correlation with RNA detection results (r = −0.008, P > 0.05). The drug treatment was mainly symptomatic supportive therapy, and antibiotics and antiviral drugs were ineffective. It is necessary to re-evaluate the isolation time and the standard for terminating isolation of discharged COVID-19 patients.
Previous studies have shown that African American youth are over-represented in the Criminal Justice System (CJS). Substance use problems are common among those with CJS involvement. However, less is known regarding racial disparities, among youth with CJS involvement, in receiving substance use treatment services.
To examine racial disparities in receiving treatment services for substance use related problems among youth with CJS involvement.
Data were obtained from the 2006–2008 United States National Survey on Drug Use and Health (NSDUH). Among White and African American adolescents (ages 12–17) with recent CJS involvement who met criteria for alcohol or illicit drug abuse or dependence (N = 602), racial differences in receiving treatment services for substance use problems were examined. Multiple logistic regression analyses were performed to identify predictors of service access among the adolescents, and to assess whether the racial disparity could be explained by individual-level, family-level and criminal justice system involvement factors.
While 31.2% of White adolescent substance abusers with CJS involvement had received treatment for substance use related problems, only 11.6% of their African American counterparts had received such treatment (P = 0.0005). Multiple logistic regression analyses showed that access to treatment services can be predicted by substance use related delinquent behaviors, but that racial disparities in treatment still exist after adjusting for these factors (AOR = 0.24, 95%CI = (0.09,0.59), P = 0.0027).
There is an urgent need to reduce racial disparities in receiving substance use treatment among U.S. youth with CJS involvement.
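The unadjusted disparity can be reproduced from the two reported treatment rates alone; a minimal sketch (note that the reported AOR = 0.24 additionally adjusts for individual-, family- and CJS-level covariates, so it differs from this crude value):

```python
# Sketch: unadjusted odds ratio implied by the reported treatment rates
# (31.2% of White vs 11.6% of African American adolescents treated).
def odds(p: float) -> float:
    return p / (1 - p)

or_unadjusted = odds(0.116) / odds(0.312)
print(round(or_unadjusted, 2))  # 0.29
```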
The relative effect of the atypical antipsychotic drugs and conventional agents on neurocognition in patients with early-stage schizophrenia has not been comprehensively determined.
The present study aimed to assess the effects of atypical and conventional antipsychotic drugs on neurocognition under naturalistic treatment conditions.
In a 12-month open-label, multicenter study, 698 patients with early-stage schizophrenia (<5 years of illness) received monotherapy with chlorpromazine, sulpiride, clozapine, risperidone, olanzapine, quetiapine or aripiprazole. The Wechsler Memory Scale–Revised Visual Reproduction Test, the Wechsler Adult Intelligence Scale–Revised Digit Symbol Test and Digit-Span Task, Trail Making Tests Part A and Part B, and the Wisconsin Card Sorting Test were administered at baseline and at the 12-month follow-up evaluation. The primary outcome was change in a cognitive composite score after 12 months of treatment.
Compared with scores at baseline, the composite cognitive test scores and individual test scores improved significantly in all seven treatment groups at the 12-month follow-up evaluation (all p-values ≤ 0.013). However, olanzapine and quetiapine provided greater improvement than chlorpromazine and sulpiride in the composite score, processing speed and executive function (all p-values ≤ 0.045).
Long-term maintenance treatment with both conventional and atypical antipsychotic medication can benefit cognitive function in patients with early-stage schizophrenia, but olanzapine and quetiapine may be superior to chlorpromazine and sulpiride in improving some areas of neurocognitive function.
The aim of this study was to develop and externally validate a simple-to-use nomogram for predicting the survival of hospitalised human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) patients (hospitalised people living with HIV/AIDS (PLWHAs)). Hospitalised PLWHAs (n = 3724) between January 2012 and December 2014 were enrolled in the training cohort. HIV-infected inpatients (n = 1987) admitted in 2015 were included as the external-validation cohort. The least absolute shrinkage and selection operator (LASSO) method was used to perform data dimension reduction and select the optimal predictors. The nomogram incorporated 11 independent predictors: occupation, antiretroviral therapy, pneumonia, tuberculosis, Talaromyces marneffei, hypertension, septicaemia, anaemia, respiratory failure, hypoproteinaemia and electrolyte disturbances. The likelihood χ2 statistic of the model was 516.30 (P < 0.001). The integrated Brier score was 0.076, and the Brier scores of the nomogram at the 10-day and 20-day time points were 0.046 and 0.071, respectively. The areas under the receiver operating characteristic curves were 0.819 and 0.828, and the areas under the precision-recall curves were 0.242 and 0.378, at the two time points. Calibration plots and decision curve analysis in the two sets showed good performance and a high net benefit of the nomogram. In conclusion, the nomogram developed in the current study is well calibrated and clinically useful. It provides a convenient tool for timely clinical decision-making and the risk management of hospitalised PLWHAs.
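The Brier scores reported for the nomogram are mean squared differences between predicted probabilities and observed outcomes; a minimal sketch with hypothetical values (not the study's data):

```python
# Sketch: the Brier score is the mean squared difference between
# predicted survival probabilities and observed binary outcomes
# (0 = perfect prediction; values below are illustrative only).
def brier_score(pred_probs, outcomes):
    """Mean squared error of probabilistic predictions."""
    return sum((p - y) ** 2 for p, y in zip(pred_probs, outcomes)) / len(outcomes)

preds = [0.9, 0.8, 0.3, 0.1]  # hypothetical predicted survival probabilities
obs = [1, 1, 0, 0]            # hypothetical observed survival at a time point
print(round(brier_score(preds, obs), 4))  # 0.0375
```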
Recently, a triple-network model suggested that abnormal interactions between the executive-control network (ECN), default-mode network (DMN) and salience network (SN) are important characteristics of addiction, with the SN playing a critical role in allocating attentional resources toward the ECN and DMN. Although an increasing number of studies have reported dysfunctions in these brain networks in Internet gaming disorder (IGD), interactions between these networks, particularly in the context of the triple-network model, have not been investigated in IGD. Thus, we aimed to assess alterations in the inter-network interactions of these large-scale networks in IGD, and to associate the alterations with IGD-related behaviors.
The DMN, ECN and SN were identified using group-level independent component analysis (gICA) in 39 individuals with IGD and 34 age- and gender-matched healthy controls (HCs). Alterations in SN-ECN and SN-DMN connectivity, as well as in the modulation of ECN versus DMN by the SN, were then assessed using a resource allocation index (RAI) developed and validated previously in nicotine addiction. Further, associations between these altered network couplings and clinical assessments were also examined.
Compared with HCs, individuals with IGD had significantly increased SN-DMN connectivity and decreased RAI in the right hemisphere (rRAI), and the rRAI in IGD was negatively associated with craving scores.
These findings suggest that deficient modulation of the ECN versus the DMN by the SN might provide a mechanistic framework for better understanding the neural basis of IGD, and offer novel evidence for the triple-network model in IGD.
Triptorelin (TRI), a gonadotropin-releasing hormone agonist that allows ovulation synchronization in pigs, is indispensable for fixed-time artificial insemination (FTAI) protocols. However, the effect of FTAI using TRI (FTAI-TRI) on reproductive performance is controversial. We performed a meta-analysis to determine whether FTAI-TRI affects the reproductive performance of pigs, including pregnancy rate (PR), number of pigs born alive per litter (NBA), farrowing rate (FR) and total number of pigs born per litter (TNB). A total of 37 trials from 15 studies were extracted and analysed in Stata. A weighted mean difference (WMD) with 95% confidence interval (CI) was calculated for NBA and TNB, and a risk ratio (RR) with 95% CI was calculated for PR and FR. PR, TNB and NBA data were analysed with a fixed-effect model, and FR data with a random-effect model. We found that for weaned sows, the FTAI-TRI group had reproductive performance comparable to that of the group receiving artificial insemination (AI) following oestrus detection (EDAI). Fixed-time AI has many advantages, including the elimination of the need to heat-check twice daily, so FTAI-TRI is a good substitute for EDAI. Subgroup analysis indicated that the optimal timing of triptorelin treatment was 96 h after weaning, which gave significant positive effects on PR (RR = 1.08, P < 0.001) and non-significant positive effects on TNB (WMD = 0.12, P = 0.452). Triptorelin at a dose of 100 μg showed better effects than 200 μg, with significant positive effects on PR (RR = 1.09, P = 0.005) and FR (RR = 1.06, P = 0.036), so a single 100-μg dose is recommended. The optimal protocol was insemination at 24 h and again at 48 h after triptorelin administration if sows remained in standing oestrus, which increased NBA by 0.59 (WMD = 0.59, P = 0.013).
For gilts, the FTAI-TRI group showed non-significantly decreased PR (RR = 0.96, P = 0.127) and significantly decreased FR (RR = 0.93, P = 0.013), TNB (WMD = −0.85, P = 0.006) and NBA (WMD = −0.98, P < 0.001), all inferior to the EDAI group. In conclusion, the effects of FTAI-TRI on the reproductive performance of pigs were parity-, treatment timing-, insemination timing- and dosage-dependent. Fixed-time AI using triptorelin could effectively replace the EDAI protocol for sows, but not for gilts.
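The fixed-effect pooling used for PR, TNB and NBA can be sketched as standard inverse-variance weighting; the trial values below are illustrative, not the study's data:

```python
import math

# Sketch of inverse-variance fixed-effect pooling: each trial estimate
# is weighted by 1/SE^2, and the pooled SE is sqrt(1 / sum of weights).
def fixed_effect_pool(estimates, std_errors):
    """Return (pooled estimate, 95% CI) under a fixed-effect model."""
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Three hypothetical trials reporting mean differences in NBA:
est, ci = fixed_effect_pool([0.4, 0.7, 0.5], [0.2, 0.3, 0.25])
print(round(est, 3), tuple(round(x, 3) for x in ci))
```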
Latrophilin (LPH) is an adhesion G-protein-coupled receptor involved in multiple physiological processes in organisms. Previous studies showed that lph not only influences susceptibility to anticholinesterase insecticides but also affects fecundity in Tribolium castaneum. However, its regulatory mechanisms in these biological processes are still unclear. Here, we identified two potential downstream carboxylesterase (cce) genes of Tclph, esterase4 and esterase6, and further characterized their interactions with Tclph. After treatment of T. castaneum larvae with carbofuran or dichlorvos insecticides, the transcript levels of Tcest4 and Tcest6 were significantly induced from 12 to 72 h. RNAi against Tcest4 or Tcest6 led to higher mortality compared with the controls after insecticide treatment, suggesting that these two genes play a vital role in the detoxification of insecticides in T. castaneum. Furthermore, when Tclph-knockdown beetles were exposed to insecticides, the expression of Tcest4 was upregulated but that of Tcest6 was downregulated, indicating a compensatory response of the beetles against the insecticides. Additionally, RNAi of Tcest6 resulted in a 43% reduction in female egg laying and completely inhibited egg hatching, a phenotype similar to that of Tclph knockdown. These results indicate that Tclph affects fecundity by positively regulating Tcest6 expression. Our findings provide new insight into the molecular mechanisms by which Tclph contributes to physiological functions in T. castaneum.
Starch digestion in the small intestine of the dairy cow is low, largely due to insufficient synthesis of α-amylase. One strategy to improve this situation is to enhance α-amylase synthesis. The mammalian target of rapamycin (mTOR) signalling pathway, which acts as a central regulator of protein synthesis, can be activated by leucine. Our objectives were to investigate the effects of leucine on the mTOR signalling pathway and to define the associations between these signalling activities and the synthesis of pancreatic enzymes using an in vitro model of cultured Holstein dairy calf pancreatic tissue. The pancreatic tissue was incubated in culture medium containing l-leucine for 3 h, and samples were collected hourly; a control without l-leucine was included. Leucine supplementation increased α-amylase and trypsin activities and the messenger RNA expression of their coding genes (P < 0.05), and it enhanced mTOR synthesis and the phosphorylation of mTOR, ribosomal protein S6 kinase 1 and eukaryotic initiation factor 4E-binding protein 1 (P < 0.05). In addition, rapamycin inhibited the mTOR signalling pathway factors during leucine treatment. In summary, leucine regulates α-amylase and trypsin synthesis in dairy calves through regulation of the mTOR signalling pathway.
Starchy grain is usually added to diets containing low-quality forage to provide sufficient energy for ruminant animals. Ruminal degradation of grain starch mainly depends on the hydrolysis of the endosperm, which may vary among grain sources. This study was conducted to investigate the influence of the endosperm structure of wheat and corn on in vitro rumen fermentation and nitrogen (N) utilization of rice straw. The 3×4 factorial design included three ratios of concentrate to forage (35:65, 50:50 and 65:35) and four ratios of wheat to corn starch (20:80, 40:60, 60:40 and 80:20). The endosperm structure was examined by scanning electron microscopy and confocal laser scanning microscopy. An in vitro gas test was performed to evaluate the rumen fermentation characteristics and N utilization. Starch granules were embedded in the starch–protein matrix in corn, but more granules were separated from the matrix in the wheat endosperm. With an increasing ratio of wheat, the rate and extent of gas production, total volatile fatty acids, and ammonia N increased linearly (P<0.01), but microbial protein concentration decreased (quadratic, P<0.01), with the maximum value at a ratio of 40% wheat. The efficiency of N utilization decreased linearly (P<0.01). Rumen fermentation and N utilization were significantly affected by the concentrate-to-forage ratio (P<0.01). Significant interactions between the concentrate-to-forage ratio and the wheat-to-corn ratio were detected in total volatile fatty acids and the efficiency of N utilization (P<0.01). In summary, the starch–protein matrix and starch granules in the wheat and corn endosperm mixture play an important role in the regulation of rumen fermentation and N utilization under low-quality forage.
Plant nitrogen (N) is linked with many physiological processes of crop growth and yield formation. Accurate simulation of plant N is key to predicting crop growth and yield correctly. The aim of the current study was to improve the estimation of N uptake and translocation processes in the whole rice plant, as well as within plant organs, in the RiceGrow model by using plant and organ maximum, critical and minimum N dilution curves. The maximum and critical N (Nc) demand (obtained from the maximum and critical curves) of shoot and root and the Nc demand of organs (leaf, stem and panicle) are calculated from N concentration and biomass. Nitrogen distribution among organs is computed differently pre- and post-anthesis. Pre-anthesis distribution is determined by maximum N demand with no priority among organs. In post-anthesis distribution, panicle demand is met first and the remaining N is then allocated to the other organs without priority. The amount of plant N uptake depends on plant N demand and the N supplied by the soil. Calibration and validation of the established model were performed on field experiments conducted in China and the Philippines with varied N rates and N split applications; the results showed that the improved model simulates the processes of N uptake and translocation well.
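The post-anthesis distribution rule can be sketched as a small allocation routine. This is a minimal illustration, not RiceGrow code: allocating the remainder in proportion to organ demand is one plausible reading of "without priority", and the organ names and demand values below are hypothetical.

```python
# Minimal sketch of the post-anthesis N distribution rule described
# above: panicle demand is met first, then remaining N is shared among
# the other organs in proportion to their demand (an assumed reading
# of "without priority"; values are illustrative, not model output).
def distribute_n_post_anthesis(uptake, panicle_demand, other_demands):
    """Return (panicle_N, {organ: N}) for one time step."""
    panicle_n = min(uptake, panicle_demand)
    remaining = uptake - panicle_n
    total_other = sum(other_demands.values())
    shares = {
        organ: round(remaining * d / total_other, 6) if total_other else 0.0
        for organ, d in other_demands.items()
    }
    return panicle_n, shares

panicle, organs = distribute_n_post_anthesis(
    uptake=1.0, panicle_demand=0.6, other_demands={"leaf": 0.3, "stem": 0.1}
)
print(panicle, organs)  # 0.6 {'leaf': 0.3, 'stem': 0.1}
```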