Glutamine synthetase (GS) and glutamate synthase (GOGAT) play a central role in plant nitrogen (N) metabolism. To study the effect of powdery mildew (Blumeria graminis f. sp. tritici, Bgt) on N metabolism, field experiments were carried out to evaluate GS and GOGAT activity, GS expression and grain protein content (GPC) in susceptible (Xi'nong 979) and resistant (Zhengmai 103) wheat cultivars under three treatments: no inoculation (CK), a single inoculation with Bgt (MP) and nine inoculations with Bgt (HP). For Xi'nong 979, the activities of GS and GOGAT in grains, as well as GS activity in flag leaves, increased at 10–15 days after anthesis (DAA) and decreased significantly at 15 or 20–30 DAA under HP and MP. However, GS activity in grains decreased from 20 DAA, later than in flag leaves (15 DAA). At the same time, GS expression in grains was up-regulated at an early stage, GS1 at 10 DAA and GS2 at 15 DAA, followed by continuous down-regulation. These results indicated that powdery mildew inhibited GS and GOGAT activity as well as GS expression, and hence that N metabolism in grains was suppressed at 20–30 DAA. The study also found that the yield of the susceptible cultivar decreased significantly under HP while its GPC increased markedly, showing that the increase in GPC was not due to enhanced N metabolism but was a passive consequence of yield reduction.
Conservation tillage adoption continues to be threatened by glyphosate- and acetolactate synthase-resistant Palmer amaranth and other troublesome weeds. Field experiments were conducted from autumn 2010 through crop harvest in 2013 at two locations in Alabama to evaluate the effect of integrated management practices on weed control and seed cotton yield in glyphosate-resistant cotton. The effects of a cereal rye cover crop with high or low biomass residue, followed by wide or narrow within-row strip-tillage, and three PRE herbicide regimes were evaluated. The three PRE regimes were: 1) pendimethalin at 0.84 kg ae ha⁻¹ plus fomesafen at 0.28 kg ai ha⁻¹ applied broadcast, 2) pendimethalin plus fomesafen applied banded on the row, or 3) no PRE. Each PRE treatment was followed by (fb) glyphosate (1.12 kg ae ha⁻¹) applied POST fb a LAYBY application of diuron (1.12 kg ai ha⁻¹) plus MSMA (2.24 kg ai ha⁻¹). Low-residue plots ranged in biomass from 85 to 464 kg ha⁻¹, while high-biomass plots ranged from 3119 to 6929 kg ha⁻¹. In most comparisons, surface disturbance width, residue amount and soil-applied herbicide placement did not influence within-row weed control; however, broadcast PRE improved control of carpetweed, large crabgrass, Palmer amaranth, tall morningglory and yellow nutsedge in row middles compared with banded PRE. In addition, high residue improved control of carpetweed, common purslane, large crabgrass, Palmer amaranth, sicklepod and tall morningglory between rows. Banded PRE herbicides produced yield and revenue equivalent to broadcast PRE application in four of six comparisons; however, banding would likely allow many between-row weed escapes. Thus, conservation tillage cotton would benefit from broadcast soil-applied herbicide applications, regardless of residue amount and tillage width, when infested with Palmer amaranth and other troublesome weed species.
Lifestyle interventions are an important and viable approach for preventing cognitive deficits. However, studies of alcohol, coffee and tea consumption in relation to cognitive decline have produced divergent results, likely because of confounding dose–response effects. This meta-analysis aimed to determine the dose–response relationships between alcohol, coffee or tea consumption and cognitive deficits.
We searched PubMed, Embase, the Cochrane Library and Web of Science up to 4 June 2020 for prospective cohort studies, or nested case-control studies within a cohort, investigating risk factors for cognitive deficits. Two authors searched the databases and extracted the data independently, and we assessed study quality with the Newcastle-Ottawa scale. Stata 15.0 software was used to perform model estimation and to plot linear or nonlinear dose–response relationship graphs.
The search identified 29 prospective studies from America, Japan, China and several European countries. The dose–response relationships showed that, compared with non-drinkers, low consumption (<11 g/day) of alcohol could reduce the risk of cognitive deficits or of dementia alone, whereas heavier drinking (>11 g/day) had no significant effect. Low consumption of coffee reduced the risk of any cognitive deficit (<2.8 cups/day) or dementia (<2.3 cups/day). Green tea consumption was a significant protective factor for cognitive health (relative risk, 0.94; 95% confidence interval, 0.92–0.97), with each daily cup of tea bringing a 6% reduction in the risk of cognitive deficits.
Light consumption of alcohol (<11 g/day) and coffee (<2.8 cups/day) was associated with reduced risk of cognitive deficits. The cognitive benefits of green tea increased with the amount consumed daily.
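As a quick worked illustration of the green tea estimate above: a relative risk of 0.94 per daily cup corresponds to a 6% risk reduction (1 − 0.94 = 0.06). If one further assumes the per-cup effect multiplies on the risk scale (a log-linear dose–response, which is our simplifying assumption, not a finding of the meta-analysis), the implied risk for k cups is 0.94^k:

```python
# Implied relative risk for k daily cups of green tea, assuming the
# per-cup RR of 0.94 multiplies across cups (log-linear dose-response).
rr_per_cup = 0.94

for cups in (1, 2, 3):
    rr = rr_per_cup ** cups
    print(f"{cups} cup(s)/day: RR = {rr:.3f} -> {100 * (1 - rr):.0f}% risk reduction")
# 1 cup(s)/day: RR = 0.940 -> 6% risk reduction
# 2 cup(s)/day: RR = 0.884 -> 12% risk reduction
# 3 cup(s)/day: RR = 0.831 -> 17% risk reduction
```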
The coronavirus disease 2019 (COVID-19) pandemic represents an unprecedented threat to mental health. Herein, we assessed the impact of COVID-19 on subthreshold depressive symptoms and identified potential mitigating factors.
Participants were from the Depression Cohort in China (ChiCTR registry number 1900022145). Adults (n = 1722) with subthreshold depressive symptoms were enrolled between March and October 2019 in a 6-month, community-based interventional study that aimed to prevent clinical depression using psychoeducation. A total of 1506 participants completed the study in Shenzhen, China: 726 participants, who completed the study between March 2019 and January 2020 (i.e. before COVID-19), comprised the ‘wave 1’ group; 780 participants, who were enrolled before COVID-19 and completed the 6-month endpoint assessment during COVID-19, comprised the ‘wave 2’ group. Symptoms of depression, anxiety and insomnia were assessed at baseline and endpoint (i.e. 6-month follow-up) using the Patient Health Questionnaire-9 (PHQ-9), Generalised Anxiety Disorder-7 (GAD-7) and Insomnia Severity Index (ISI), respectively. Measures of resilience and regular exercise were assessed at baseline. We compared mental health outcomes between the wave 1 and wave 2 groups, and additionally investigated how these outcomes changed across disparate stages of the COVID-19 pandemic in China: peak (7–13 February), post-peak (14–27 February) and remission plateau (28 February–present).
COVID-19 increased the risk for three mental health outcomes: (1) depression (odds ratio [OR] = 1.30, 95% confidence interval [CI]: 1.04–1.62); (2) anxiety (OR = 1.47, 95% CI: 1.16–1.88) and (3) insomnia (OR = 1.37, 95% CI: 1.07–1.77). The highest proportions of probable depression and anxiety were observed post-peak, at 52.9% and 41.4%, respectively. Greater baseline resilience scores had a protective effect on the three main outcomes (depression: OR = 0.26, 95% CI: 0.19–0.37; anxiety: OR = 0.22, 95% CI: 0.14–0.33 and insomnia: OR = 0.18, 95% CI: 0.11–0.28). Furthermore, regular physical activity mitigated the risk for depression (OR = 0.79, 95% CI: 0.79–0.99).
The COVID-19 pandemic exerted a significant negative impact on symptoms of depression, anxiety and insomnia. Mental health outcomes fluctuated as a function of the duration of the pandemic and were alleviated to some extent by the observed decline in community-based transmission. Augmenting resilience and encouraging regular exercise provide an opportunity to mitigate the risk for mental health symptoms during this severe public health crisis.
Previous studies have revealed associations between meteorological factors and tuberculosis (TB) cases. However, few studies have examined their lag effects on TB cases. This study aimed to analyse the nonlinear lag effects of meteorological factors on the number of TB notifications in Hong Kong. Using 22 years of consecutive surveillance data from Hong Kong, we examined the association of monthly average temperature and relative humidity with the temporal dynamics of the monthly number of TB notifications using a distributed lag nonlinear model combined with Poisson regression. The relative risks (RRs) of TB notifications were >1.15 at monthly average temperatures between 16.3 and 17.3 °C at lags of 13–15 months, reaching the peak risk of 1.18 (95% confidence interval (CI) 1.02–1.35) at 16.8 °C with a lag of 14 months. The RRs of TB notifications were >1.05 at relative humidities of 60.0–63.6% at lags of 9–11 months, widening to 68.0–71.0% at lags of 12–17 months, and reached the highest risk of 1.06 (95% CI 1.01–1.11) at 69.0% with a lag of 13 months. These nonlinear and delayed effects of average temperature and relative humidity on the TB epidemic may provide a practical reference for improving the TB warning system.
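To make the modelling idea concrete, the sketch below fits a Poisson regression of monthly case counts on lagged temperature using synthetic data. It is a simplified stand-in: a true distributed lag nonlinear model (e.g. the cross-basis splines of the R dlnm package) replaces the plain lagged columns with a smooth basis over exposure and lag, and all variable names and values here are illustrative.

```python
# Sketch: Poisson regression of monthly TB counts on lagged temperature.
# Plain lagged columns stand in for a DLNM cross-basis; data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 264  # 22 years of monthly observations
df = pd.DataFrame({
    "temp": 23 + 6 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 1, n),
    "cases": rng.poisson(500, n),
})
for lag in range(1, 16):                       # lagged temperature, 1-15 months
    df[f"temp_lag{lag}"] = df["temp"].shift(lag)
df = df.dropna()

X = sm.add_constant(df.drop(columns="cases"))
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()

print(np.exp(fit.params["temp_lag14"]))        # RR per 1 degree C at lag 14
```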
Determination of antibodies against ToRCH antigens at the beginning of pregnancy allows assessment of both the maternal immune status and the risk of an adverse pregnancy outcome. Age-standardised seroprevalences were determined in sera from 1009 women of childbearing age residing in Mexico, Brazil, Germany, Poland, Turkey or China using a multiparametric immunoblot containing antigen substrates for antibodies against Toxoplasma gondii, rubella virus, cytomegalovirus (CMV), herpes simplex viruses (HSV-1, HSV-2), Bordetella pertussis, Chlamydia trachomatis, parvovirus B19, Treponema pallidum and varicella zoster virus (VZV). Seroprevalences of antibodies against HSV-1 were >90% in samples from Brazil and Turkey, whereas the other four countries showed lower mean age-adjusted seroprevalences (range: 62.5–87.9%). Samples from Brazilian women showed elevated seroprevalences of antibodies against HSV-2 (40.1%), C. trachomatis (46.8%) and B. pertussis (56.6%) compared with the other five countries. Seroprevalences of anti-T. gondii antibodies (0.5%) and anti-parvovirus B19 antibodies (7.5%) were low in samples from Chinese women compared with the other five countries. Samples from German women revealed a low age-standardised seroprevalence of anti-CMV antibodies (28.8%) compared with the other five countries. These global differences in the immune status of women of childbearing age argue for country-specific prophylaxis strategies to avoid infection with ToRCH pathogens.
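For readers unfamiliar with the age standardisation used above, a minimal sketch of the direct method follows; the age bands, counts and standard-population weights are entirely hypothetical, not the study's data:

```python
# Direct age standardisation of a seroprevalence estimate.
# (band, positives, tested, standard-population weight) - all hypothetical.
bands = [
    ("18-24", 12, 80, 0.20),
    ("25-31", 25, 110, 0.30),
    ("32-38", 30, 95, 0.30),
    ("39-45", 22, 60, 0.20),
]

crude = sum(p for _, p, _, _ in bands) / sum(n for _, _, n, _ in bands)
standardised = sum(w * p / n for _, p, n, w in bands)  # weights sum to 1

print(f"crude = {crude:.1%}, age-standardised = {standardised:.1%}")
```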
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes, and urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Using different published criteria designed to guide decisions on initiating treatment of UTIs (ie, symptomatic, catheter-associated, and uncomplicated cystitis), our objective was to assess the appropriateness of antibiotic prescribing among nursing home residents.
Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information as well as infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and during the 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) an adaptation of the Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria.
Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years (IQR, 19), 63% were female, and 35% were admitted for postacute care. The percentage of residents whose antibiotic initiation was classified as appropriate varied substantially by criteria, ranging from 8% for the cystitis consensus, to 27% for Loeb, to 33% for the UTI SBAR tool, to 51% for Crnich & Drinka (Fig. 2).
Conclusions: Appropriate initiation of UTI treatment among nursing home residents was low regardless of the criteria used; at best, only half of antibiotic treatments met published prescribing criteria. Although insufficient documentation of infection signs, symptoms, and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
We report key learning from the public health management of the first two confirmed cases of COVID-19 identified in the UK. The first case was imported; the second was associated with probable person-to-person transmission within the UK. Contact tracing was complex and fast-moving. Potential exposures for both cases were reviewed, and 52 contacts were identified. No further confirmed COVID-19 cases have been linked epidemiologically to these two cases. As steps are taken to enhance contact tracing across the UK, the lessons learned from earlier contact tracing during the country's containment phase are particularly important and timely.
To perform a validation assessment of a novel porcine ex vivo model for otoplasty training.
A total of nine otolaryngology trainees performed a standard-approach otoplasty on a porcine ear. They completed a series of tasks including posterior skin incision, anterior scoring, Mustardé suture placement and concha–mastoid suture placement. Trainees completed a post-task questionnaire assessing face validity, global content validity and task-specific content validity.
Trainees’ median scores for the porcine model were: 4 for face validity (interquartile range, 3–4), 5 for global content validity (interquartile range, 4–5) and 4 for task-specific content validity (interquartile range, 4–4).
This study is the first to formally validate the ex vivo porcine auricular model as a useful tool for otoplasty training. The model should be incorporated into simulation training for otoplasty to improve learning, enable acquisition of specific surgical skills and improve operative outcomes.
Previous work led to the proposal that precision feeding of a high-concentrate diet may be a potential method for enhancing feed efficiency (FE) when rearing dairy heifers. However, the physiological and metabolic mechanisms underlying this approach remain unclear. This study used metabolomics analysis to investigate changes in the plasma metabolites of heifers precision-fed diets containing a wide range of forage-to-concentrate ratios. Twenty-four half-sib Holstein heifers with similar body condition were randomly assigned to four groups and precision-fed diets containing different proportions of concentrate (20%, 40%, 60% and 80% on a DM basis). After 28 days of feeding, blood samples were collected 6 h after morning feeding and gas chromatography time-of-flight/MS was used to analyze the plasma samples. Parameters of oxidative status were also determined in the plasma. FE (after correction for gut fill) increased linearly (P < 0.01) with increasing level of dietary concentrate. Significant changes were identified for 38 metabolites in the plasma of heifers fed the different forage-to-concentrate ratios. The main pathways showing alterations clustered into those relating to carbohydrate and amino acid metabolism, all of which have previously been associated with FE changes in ruminants. Heifers fed a high-concentrate diet had higher (P < 0.01) plasma total antioxidant capacity and superoxide dismutase but lower (P ≤ 0.02) hydroxyl radical and hydrogen peroxide than heifers fed a low-concentrate diet, which might indicate a lower plasma oxidative status in heifers fed a high-concentrate diet. Thus, heifers fed a high-concentrate diet had higher FE and antioxidant capacity but a lower plasma oxidative status, as well as altered carbohydrate and amino acid metabolism. Our findings provide a better understanding of how forage-to-concentrate ratios affect FE and metabolism in precision-fed growing heifers.
Guanidinoacetic acid (GAA) can improve the growth performance of bulls. This study investigated the influence of GAA addition on growth, nutrient digestion, ruminal fermentation and serum metabolites in bulls. Forty-eight Angus bulls were randomly allocated to experimental treatments, namely control, low-GAA (LGAA), medium-GAA (MGAA) and high-GAA (HGAA), with GAA supplementation at 0, 0.3, 0.6 and 0.9 g/kg DM, respectively. Bulls were fed a basal diet containing 500 g/kg DM concentrate and 500 g/kg DM roughage. The experimental period was 104 days, with 14 days for adaptation and 90 days for data collection. Bulls in the MGAA and HGAA groups had higher DM intake and average daily gain than bulls in the LGAA and control groups. The feed conversion ratio was lowest in MGAA and highest in the control. Bulls receiving 0.9 g/kg DM GAA had higher digestibility of DM, organic matter, NDF and ADF than bulls in the other groups. The digestibility of CP was higher for HGAA than for LGAA and control. Ruminal pH was lower for MGAA, and the total volatile fatty acid concentration was greater for MGAA and HGAA, than for the control. The acetate proportion and acetate-to-propionate ratio were lower for MGAA than for LGAA and control. The propionate proportion was higher for MGAA than for control. Bulls receiving GAA showed decreased ruminal ammonia N. Bulls in MGAA and HGAA had higher cellobiase, pectinase and protease activities and larger Butyrivibrio fibrisolvens, Prevotella ruminicola and Ruminobacter amylophilus populations than bulls in LGAA and control. However, the total protozoan population was lower for MGAA and HGAA than for LGAA and control. The total bacterial and Ruminococcus flavefaciens populations increased with GAA addition. The blood level of creatine was higher for HGAA, and the activity of L-arginine:glycine amidinotransferase was lower for MGAA and HGAA, than for control. The blood activity of guanidinoacetate N-methyltransferase and the level of folate decreased in the GAA addition groups. The results indicated that dietary addition of 0.6 or 0.9 g/kg DM GAA improved growth performance, nutrient digestion and ruminal fermentation in bulls.
The meat quality of chicken is an important factor affecting consumer health. It was hypothesized that n-3 polyunsaturated fatty acids (n-3 PUFA) could be effectively deposited in chicken meat when combined with the antioxidant action of soybean isoflavone (SI), thereby improving the quality of chicken meat for human health. The effects of partial or complete dietary substitution of lard (LA) with linseed oil (LO), with or without SI, on growth performance, biochemical indicators, meat quality, fatty acid profiles, lipid-related health indicators and gene expression in breast muscle were examined in chickens. A total of 900 males were fed a corn–soybean meal diet supplemented with 4% LA, 2% LA + 2% LO or 4% LO, the latter two also provided with 30 mg SI/kg (2% LA + 2% LO + SI and 4% LO + SI), from 29 to 66 days of age; each of the five dietary treatments included six replicates of 30 birds. Compared with the 4% LA diet, the 4% LO diet significantly increased feed efficiency and had no negative effect on objective indices of meat quality; LO significantly decreased plasma triglycerides and total cholesterol (TCH), and abdominal fat percentage was significantly decreased in birds fed the 4% LO and 4% LO + SI diets. The LO diets produced higher contents of α-linolenic acid (C18:3n-3), EPA (C20:5n-3) and total n-3 PUFA, together with lower contents of palmitic acid (C16:0), lignoceric acid (C24:0) and saturated fatty acids and a lower n-6:n-3 ratio, in breast muscle compared with the 4% LA diet (P < 0.05); they also significantly decreased the atherogenic and thrombogenic indices and increased the hypocholesterolemic-to-hypercholesterolemic ratio. Adding SI to the LO diets enhanced the contents of EPA and DHA (C22:6n-3), plasma total superoxide dismutase, the reduced glutathione (GSH)/oxidized glutathione ratio and muscle GSH content, while decreasing plasma triglyceride and TCH and the malondialdehyde content of plasma and breast muscle, compared with its absence (P < 0.05). Expression in breast muscle of the fatty acid desaturase 1 (FADS1), FADS2, elongase 2 (ELOVL2) and ELOVL5 genes was significantly higher with the LO diets including SI than with the 4% LA diet. Significant interactions existed between LO level and inclusion of SI for EPA and TCH contents. These findings indicate that a diet supplemented with LO combined with SI is an effective alternative when optimizing the nutritional value of chicken meat for human consumers.
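The atherogenic and thrombogenic indices mentioned above are commonly computed from the fatty acid profile using the Ulbricht & Southgate (1991) formulas; the sketch below uses those conventional definitions with made-up fatty acid values, since the abstract does not restate the exact formulas used:

```python
# Atherogenic (AI) and thrombogenic (TI) indices as conventionally
# defined (Ulbricht & Southgate, 1991). Fatty acid values (g/100 g FA)
# are hypothetical; the authors' exact formulas may differ in detail.
fa = {"C12:0": 0.1, "C14:0": 0.6, "C16:0": 22.0, "C18:0": 7.0,
      "MUFA": 35.0, "n6": 25.0, "n3": 4.0}

ai = (fa["C12:0"] + 4 * fa["C14:0"] + fa["C16:0"]) / (fa["MUFA"] + fa["n6"] + fa["n3"])
ti = (fa["C14:0"] + fa["C16:0"] + fa["C18:0"]) / (
    0.5 * fa["MUFA"] + 0.5 * fa["n6"] + 3 * fa["n3"] + fa["n3"] / fa["n6"])

print(f"AI = {ai:.2f}, TI = {ti:.2f}")
```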
Porphyromonas gingivalis has been linked to the development and progression of oesophageal squamous cell carcinoma (ESCC) and is considered a high-risk factor for ESCC. Currently, the commonly used methods for P. gingivalis detection are culture- or DNA extraction-based, which are time- and labour-intensive, especially for high-throughput applications. We aimed to establish and evaluate a rapid and sensitive direct quantitative polymerase chain reaction (qPCR) protocol for the detection of P. gingivalis without DNA extraction, suitable for large-scale epidemiological studies. Paired gingival swab samples from 192 subjects undergoing general medical examinations were analysed using two direct and one extraction-based qPCR assays for P. gingivalis. Tris-EDTA buffer-based direct qPCR (TE-direct qPCR), lysis-based direct qPCR (lysis-direct qPCR) and DNA extraction-based qPCR (kit-qPCR) were used, respectively, in 192, 132 and 60 of these samples to quantify P. gingivalis. The sensitivity and specificity of TE-direct qPCR were 95.24% and 100% relative to lysis-direct qPCR, and 100% and 97.30% relative to kit-qPCR; TE-direct qPCR had an almost perfect agreement with lysis-direct qPCR (κ = 0.954) and kit-qPCR (κ = 0.965). Moreover, the total assay time for TE-direct qPCR was 1.5 h. In conclusion, the TE-direct qPCR assay is a simple and efficient method for quantifying oral P. gingivalis, showing high sensitivity and specificity compared with routine qPCR.
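As background to the agreement statistics quoted above, sensitivity, specificity and Cohen's kappa can all be computed from a 2×2 table of paired positive/negative calls. The sketch below uses hypothetical counts, not the study's data:

```python
# Sensitivity, specificity and Cohen's kappa from a 2x2 agreement table.
# Counts are hypothetical, not the study's data.
tp, fp, fn, tn = 40, 0, 2, 150  # test method vs. reference method calls

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

n = tp + fp + fn + tn
po = (tp + tn) / n                                            # observed agreement
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
kappa = (po - pe) / (1 - pe)

print(f"sensitivity={sensitivity:.4f}, specificity={specificity:.4f}, kappa={kappa:.3f}")
```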
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
To provide information about psychiatric comorbidity and suicidal behavior in people with epilepsy compared to those without epilepsy from a community sample in Brazil.
An attempt was made to evaluate all 174 subjects with epilepsy (cases) identified in a previous community survey. For every case identified, an individual without epilepsy (control), matched by sex and age, was selected in the same neighborhood. A structured interview with validated psychiatric scales was performed. In total, 153 cases and 154 controls were enrolled in the study.
People with epilepsy more frequently had anxiety (39.4% versus 23.8%, OR 2.1 [95% CI 1.2–3.5]; p=0.006), depression (24.4% versus 14.7%, OR 1.9 [95% CI 1.01–3.5]; p=0.04) and anger (55.6% versus 39.7%, OR 1.9 [95% CI 1.2–3.1]; p=0.008). They also reported more lifetime suicidal thoughts (36.7% versus 23.8%, OR 1.8 [95% CI 1.1–3.1]; p=0.02), plans (18.2% versus 3.3%, OR 2.0 [95% CI 1.0–4.0]; p=0.04) and attempts (12.1% versus 5.3%, OR 2.4 [95% CI 1.1–3.2]; p=0.04) than controls.
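As a worked illustration of how odds ratios of this kind and their 95% confidence intervals are obtained from a 2×2 table (using the standard Woolf/logit method; the counts below are hypothetical, chosen only to roughly match the reported anxiety percentages):

```python
# Odds ratio and 95% CI from a 2x2 table (Woolf/logit method).
# Hypothetical counts: exposed = with epilepsy, outcome = anxiety.
import math

a, b = 60, 93    # cases: outcome present / absent  (60/153 = 39.2%)
c, d = 37, 117   # controls: outcome present / absent (37/154 = 24.0%)

or_ = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)

print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR = 2.04, 95% CI 1.25-3.34
```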
These findings call attention to the psychiatric comorbidity and suicidal behavior associated with epilepsy. Suicide risk assessment, mental health evaluation and treatment may improve quality of life in epilepsy and ultimately prevent suicide.
Attention, working memory (WM), information processing and memory deficits are important features of schizophrenia. WM functions appear to be mediated by the dorsolateral prefrontal cortex (DLPFC). Functional imaging studies have shown a failure to activate the DLPFC during WM tasks in patients with chronic schizophrenia. The primary aim of this study is to determine whether there are brain activation changes in the DLPFC as a result of engaging in a randomized, controlled 12-week course of cognitive remediation therapy (CRT) in inpatients with chronic schizophrenia.
Patients with DSM-IV schizophrenia are randomized to a 12-week trial of cognitive remediation (CR) using a computerized CR program (COGPACK) or to a 12-week control condition. At baseline and endpoint, patients receive an fMRI scan with a cognitive task (the N-back task), a neuropsychological test battery (MATRICS) and functional and symptom assessments.
Preliminary results of this ongoing study show that, after 12 weeks of CR, patients showed (1) significantly more improvement in WM functions than patients in the control group and (2) improvement in accuracy on the verbal letter 2-back task during the fMRI scan. The signal difference between the 2-back and 0-back conditions was absent or minimal at baseline (pre-CR); at endpoint (post-CR), a signal difference was present, corresponding to increased activation in areas of the DLPFC. This change in activation pattern may reflect the effects of exposure to the CR intervention.
Although brain volume deficits in sporadic and familial first-episode schizophrenia patients (FEP) have been described, differences in brain asymmetry between them remain unidentified.
To assess the potential differences in volumetric asymmetries of gray matter (GM) and white matter (WM) between groups.
To identify the differing patterns of alteration in sporadic FEP and familial FEP.
Forty-two sporadic and 30 familial drug-naïve FEP and 72 matched normal controls (NC) were recruited. Participants were assessed with neuropsychological tests and scanned on a 3.0 T MRI scanner to obtain T1-weighted and DTI images. Lateralization distribution maps of GM and WM volume were generated using optimized voxel-based morphometry. Asymmetries were analyzed by calculating a laterality index (LI) voxel by voxel.
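The abstract does not give the LI formula; a common convention, assumed here for illustration, contrasts homologous left/right voxel values as LI = (L − R)/(L + R):

```python
# Voxel-wise laterality index using the common convention
# LI = (L - R) / (L + R); positive values indicate leftward asymmetry.
# This exact formula is an assumption; the authors' version may differ.
import numpy as np

def laterality_index(left: np.ndarray, right: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Return voxel-wise LI for matched left/right hemisphere volumes."""
    return (left - right) / (left + right + eps)

# Example: random GM volumes for 1000 homologous voxel pairs
rng = np.random.default_rng(1)
li = laterality_index(rng.uniform(0.2, 1.0, 1000), rng.uniform(0.2, 1.0, 1000))
print(li.mean(), li.std())
```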
All three groups showed similar overall brain torque. Familial FEP had more extensive regional GM asymmetry alterations than sporadic FEP, and no regional alteration was shared between the two groups. LIGM and LIWM in the right superior temporal region were negatively correlated. Significant negative correlations were also found between the LIGM of the left superior parietal lobule and the LIWM of the right superior parietal lobule, and between the LIGM of the right inferior parietal lobule and the LIWM of the left inferior parietal lobule. Asymmetry in distinct brain regions was related to cognitive deficits, especially in the domains of language and memory.
The two patient groups showed different alterations in brain asymmetry. Familial FEP showed more extensive GM asymmetry, which may correlate with their higher genetic burden.
There are strong links between circadian disturbance and some of the most characteristic symptoms of clinical major depressive disorder (MDD). However, there are no published studies of changes in the expression of clock genes, or of other neuropeptides related to circadian-rhythm regulation, that may influence susceptibility to recurrence after antidepressant treatment in MDD.
Blood samples were collected from twelve healthy controls and twelve male patients with major depression, before and after treatment with escitalopram for eight weeks, at 4-hour intervals over 24 hours. Outcome measures were the relative mRNA expression of clock genes (hPERIOD1, hPERIOD2, hPERIOD3, hCRY1, hBMAL1, hNPAS2 and hGSK-3beta) and the serum levels of melatonin, vasoactive intestinal peptide (VIP), cortisol, adrenocorticotropic hormone (ACTH), insulin-like growth factor-1 (IGF-1) and growth hormone (GH).
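With samples taken at 4-hour intervals over 24 hours, a standard way to quantify a diurnal rhythm is cosinor regression; the abstract does not name the authors' exact analysis, so the sketch below (with made-up expression values) should be read as one conventional approach, not the study's method:

```python
# Cosinor fit for a 24-h rhythm in 4-hourly expression data.
# Expression values are made up; the method is a common convention.
import numpy as np

t = np.array([0, 4, 8, 12, 16, 20], dtype=float)   # sampling times (h)
y = np.array([1.0, 1.6, 2.1, 1.8, 1.2, 0.7])       # relative mRNA level

# Linearise y = mesor + A*cos(2*pi*t/24 - phi) via cos/sin regressors
X = np.column_stack([np.ones_like(t),
                     np.cos(2 * np.pi * t / 24),
                     np.sin(2 * np.pi * t / 24)])
mesor, beta, gamma = np.linalg.lstsq(X, y, rcond=None)[0]

amplitude = np.hypot(beta, gamma)                  # A = sqrt(beta^2 + gamma^2)
acrophase_h = (np.arctan2(gamma, beta) * 24 / (2 * np.pi)) % 24
print(f"mesor={mesor:.2f}, amplitude={amplitude:.2f}, acrophase={acrophase_h:.1f} h")
```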
Compared with healthy controls, MDD patients showed disruptions in the diurnal rhythms of expression of hPERIOD1, hPERIOD2, hCRY1, hBMAL1, hNPAS2 and hGSK-3beta, along with disruptions in the diurnal rhythms of release of melatonin, VIP, cortisol, ACTH, IGF-1 and GH. Several of these disruptions (hPER1, hCRY1, melatonin, VIP, cortisol, ACTH and IGF-1) persisted after eight weeks of escitalopram treatment, as did the elevation of 24-hour levels of VIP and the decreases in 24-hour levels of cortisol and ACTH.
These persistent neurobiological changes may play a role in the MDD symptoms that are thought to contribute to recurrence vulnerability, and may be relevant to long-term maintenance therapy.
Cognitive impairment is central to many psychiatric conditions and is a determinant of functioning. The evaluation of cognition is time-consuming, and recourse to it is limited by cost, accessibility of expertise and, in the case of computerized batteries, equipment. The Screen for Cognitive Impairment in Psychiatry (SCIP) is a 15-minute paper-and-pencil evaluation of cognitive function that can be integrated into clinical practice. It is thus a tool that can help determine which patients require a more extensive evaluation and can inform the elaboration of a personalized treatment plan. Our group (Groupe Comorbidité psychiatrique et Dimensions) has validated a French translation of the SCIP and is testing the acceptability of its integration into clinical practice in selected clinical populations. We will present preliminary data regarding the use of the SCIP in adult attention deficit disorder. Forty adult patients with attention deficit disorder were invited to participate in the study. To keep the sample representative of clinical practice, the only exclusion criteria were inability to speak French and inability to give informed consent. Demographic characteristics were collected, a multiaxial DSM-IV diagnosis was determined by the treating physician, and the SCIP was administered. The time to administer the SCIP was recorded, and a qualitative questionnaire of patient impressions was completed. We will present preliminary results of this study.
The present study compared expression profiles in leukocytes, using whole-genome cRNA microarrays, among patients with SSD, patients with major depressive disorder (MDD) and healthy controls, and used these profiles for classification.
Gene expression profiling was conducted in peripheral blood leucocytes from drug-free first-episode subjects with SSD or MDD and matched controls (8 subjects in each group) using global mRNA expression arrays. Support vector machines (SVMs) were used for training and testing on candidate signature expression profiles from the signature selection step.
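A minimal sketch of the SVM train/test approach described above, with a synthetic matrix standing in for the microarray data; the 48-feature dimension echoes the signature count reported in the results, but everything else (labels, values, cross-validation scheme) is illustrative:

```python
# Sketch: SVM classification of expression signatures (synthetic data).
# 24 subjects (8 per group), 48 selected gene signatures as features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.normal(size=(24, 48))                 # expression of 48 signatures
y = np.repeat(["SSD", "MDD", "control"], 8)   # group labels

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=3)     # 3-fold cross-validation
print("accuracy per fold:", scores)
```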
We identified SSD and MDD gene signatures from blood-based gene expression profiles and built an SSD–MDD disorder model with high predictive power. Firstly, we identified 63 differentially expressed SSD signatures relative to controls (P ≤ 5.0E-4) and 30 differentially expressed MDD signatures relative to controls. Then, 123 gene signatures were identified with significantly different expression levels between SSD and MDD. Secondly, to prioritize biomarkers for SSD and MDD jointly, we selected the top gene signatures from each pair-wise comparison and merged them to generate profiles that could clearly classify the SSD and MDD sets simultaneously. In detail, we tried different combinations of signatures from the three pair-wise comparison results and finally determined 48 gene expression signatures achieving 100% accuracy.
Blood cell-derived RNA may have significant value for diagnostic purposes and for identifying disease biomarkers in SSD and MDD. This 48-gene model could classify SSD, MDD and healthy controls.