Pulmonary aspergillosis associated with cyanotic congenital heart disease is a rare condition, which is known to have a poor prognosis. We report a case of a 21-year-old woman with truncus arteriosus and major aortopulmonary collateral arteries who underwent a primary Rastelli procedure after thoracoscopic lobectomy for the management of progressive pulmonary aspergillosis.
Differences in individual eating habits may be influenced by genetic factors, in addition to cultural, social or environmental factors. Previous studies suggested that genetic variants within the sweet taste receptor gene family were associated with sweet taste perception and the intake of sweet foods. The aim of this study was to conduct a genome-wide association study (GWAS) to find genetic variations that affect confection consumption in a Japanese population. We analysed GWAS data on confection consumption from 14 073 participants in the Japan Multi-Institutional Collaborative Cohort study. Food intake was estimated using a previously validated semi-quantitative FFQ. Association of the imputed variants with confection consumption was assessed by linear regression analysis with adjustment for age, sex, total energy intake and principal component analysis components 1–3. The analysis was then repeated with additional adjustment for alcohol intake (g/d). We found 418 SNP located in 12q24 that were associated with confection consumption. The SNP with the ten lowest P-values were located on nine genes, including the BRAP, ACAD10 and aldehyde dehydrogenase 2 regions on 12q24.12-13. After adjustment for alcohol intake, no variant was associated with confection intake at genome-wide significance. In conclusion, we found many SNP on 12q24 genes that were associated with confection intake before adjustment for alcohol intake; however, all of them lost statistical significance after adjustment for alcohol intake.
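The disappearance of the 12q24 signal after alcohol adjustment is a classic confounding pattern: variants in this region (e.g. near aldehyde dehydrogenase 2) strongly influence alcohol intake, which in turn correlates with confection consumption. A minimal pure-Python sketch of how adjusting for such a confounder removes a crude association — all effect sizes and noise levels here are invented for illustration, not estimates from the study:

```python
import random
import statistics

def ols_slope(x, y):
    """Simple-regression slope of y on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def residualize(y, x):
    """Residuals of y after regressing out x (Frisch-Waugh partialling)."""
    b = ols_slope(x, y)
    my, mx = statistics.mean(y), statistics.mean(x)
    return [yi - (my + b * (xi - mx)) for xi, yi in zip(x, y)]

random.seed(0)
n = 5000
# hypothetical variant: allele count drives alcohol intake, which drives
# the phenotype; the variant has NO direct effect on confection intake
g = [random.choice([0, 1, 2]) for _ in range(n)]
alcohol = [10 * gi + random.gauss(0, 5) for gi in g]
confection = [-0.5 * ai + random.gauss(0, 5) for ai in alcohol]

crude = ols_slope(g, confection)  # confounded: strongly negative
adj = ols_slope(residualize(g, alcohol),
                residualize(confection, alcohol))  # near zero
```

Residualizing both the genotype and the phenotype on the covariate yields the same coefficient as including the covariate in a multiple regression, which is why the adjusted slope collapses toward zero.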
To develop a fully automated algorithm using data from the Veterans’ Affairs (VA) electronic medical record (EMR) to identify deep-incisional surgical site infections (SSIs) after cardiac surgeries and total joint arthroplasties (TJAs) for use in research studies.
Retrospective cohort study.
This study was conducted in 11 VA hospitals.
Patients who underwent coronary artery bypass grafting or valve replacement between January 1, 2010, and March 31, 2018 (cardiac cohort) and patients who underwent total hip arthroplasty or total knee arthroplasty between January 1, 2007, and March 31, 2018 (TJA cohort).
Relevant clinical information and administrative code data were extracted from the EMR. The outcomes of interest were mediastinitis, endocarditis, or deep-incisional or organ-space SSI within 30 days after surgery. Multiple logistic regression analysis with a repeated regular bootstrap procedure was used to select variables and to assign points in the models. Sensitivities, specificities, positive predictive values (PPVs) and negative predictive values were calculated by comparison with outcomes collected by the Veterans’ Affairs Surgical Quality Improvement Program (VASQIP).
Overall, 49 (0.5%) of the 13,341 cardiac surgeries were classified as mediastinitis or endocarditis, and 83 (0.6%) of the 12,992 TJAs were classified as deep-incisional or organ-space SSIs. With at least 60% sensitivity, the PPVs of the SSI detection algorithms after cardiac surgeries and TJAs were 52.5% and 62.0%, respectively.
Considering the low prevalence of SSIs, our algorithms successfully identified a majority of patients with a true SSI while simultaneously reducing false-positive cases. As a next step, these algorithms will need to be validated in different hospital systems with EMRs.
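The accuracy measures reported above (sensitivity, specificity, PPV, NPV) all derive from the 2×2 table of algorithm flags against the chart-reviewed reference standard. A minimal sketch in Python — the toy flag/label vectors are invented, not study data:

```python
def detection_metrics(flags, labels):
    """Accuracy of a binary detection algorithm (flags) against a
    reference standard (labels), e.g. chart-reviewed SSI outcomes."""
    tp = sum(1 for f, l in zip(flags, labels) if f and l)
    fp = sum(1 for f, l in zip(flags, labels) if f and not l)
    fn = sum(1 for f, l in zip(flags, labels) if not f and l)
    tn = sum(1 for f, l in zip(flags, labels) if not f and not l)
    return {
        "sensitivity": tp / (tp + fn),  # flagged among true cases
        "specificity": tn / (tn + fp),  # unflagged among non-cases
        "ppv": tp / (tp + fp),          # true cases among flagged
        "npv": tn / (tn + fn),          # non-cases among unflagged
    }

# toy example: 10 surgeries, 3 true SSIs, algorithm flags 4 patients
flags = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
labels = [1, 1, 0, 0, 1, 0, 0, 0, 0, 0]
m = detection_metrics(flags, labels)
```

With rare outcomes, even a highly specific algorithm can have a modest PPV, which is why the abstract reports PPV at a fixed sensitivity.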
The frequent prescription of antimicrobials, such as at discharge from the emergency department, calls for optimizing this practice through modifying physicians’ prescribing behavior. A 1-year, multifaceted intervention implemented in an emergency department decreased the mean monthly antimicrobial prescription rate at discharge and increased the proportion of appropriate prescriptions.
By analysing a database (Lozano-Durán & Jiménez, Phys. Fluids, vol. 26, 2014, 011702) of fully developed turbulent channel flow at the friction Reynolds number $Re_\tau =4179$, we investigate the sustaining mechanism of a hierarchy of coherent structures in the turbulence. For this purpose, we decompose the turbulent fields into different scales by band-pass filters and quantify the real-space energy transfer. Visualizations of the hierarchy of vortices and velocity in the filtered fields show that the largest-scale structures at each distance from the wall are composed of quasi-streamwise vortices and low-speed streaks. These are similar to the well-known coherent structures in the buffer layer, and they are maintained by a hierarchical self-sustaining process. Quantitatively, however, the energy production rate of the largest-scale structures is different in the log and buffer layers. This difference explains the change of the scaling of the Reynolds stress as a function of the Reynolds number. In contrast to the largest-scale structures, vortices smaller than the distance from the wall are distributed isotropically, and they are generated by an energy cascading process. The energy of these small-scale structures is transferred predominantly from twice-larger-scale structures and reduced by half-scale ones through vortex stretching and contraction, respectively. Turbulent advection from the wall hardly contributes to the maintenance of small-scale structures in the log layer.
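A sketch of the scale-decomposition idea: a sharp spectral band-pass filter splits a signal into wavenumber bands that sum back to the original field. This 1-D pure-Python toy stands in for the paper's 3-D filtered velocity fields (the band edges and signal are invented for illustration):

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(n^2), fine for a toy)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def band_pass(x, kmin, kmax):
    """Keep only Fourier modes with kmin <= |k| < kmax."""
    n = len(x)
    X = dft(x)
    for k in range(n):
        kk = min(k, n - k)  # wavenumber magnitude
        if not (kmin <= kk < kmax):
            X[k] = 0
    return idft(X)

# toy zero-mean 'velocity' signal with energy at three scales
n = 64
u = [math.sin(2 * math.pi * 2 * t / n) + 0.5 * math.sin(2 * math.pi * 9 * t / n)
     + 0.2 * math.sin(2 * math.pi * 20 * t / n) for t in range(n)]
# the bands partition wavenumbers 1..n/2, so the filtered fields
# reconstruct the original signal when summed
bands = [band_pass(u, 1, 4), band_pass(u, 4, 16), band_pass(u, 16, 33)]
```

Inter-scale energy transfer in the paper is then quantified from products of such filtered fields; the sharp filter here is only one possible choice of band-pass kernel.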
Coronavirus disease 2019 (COVID-19) remains a serious threat for long-term care facilities, and frequent screening of employees and residents places a substantial burden on those facilities. We report our successful multimodal prevention measures without frequent testing, which resulted in no cases within 20 nursing home units over the first 6 months of the pandemic.
This paper describes an automatic singing transcription (AST) method that estimates a human-readable musical score of a sung melody from an input music signal. Because of the considerable pitch and temporal variation of a singing voice, a naive cascading approach that estimates an F0 contour and quantizes it with estimated tatum times cannot avoid many pitch and rhythm errors. To solve this problem, we formulate a unified generative model of a music signal that consists of a semi-Markov language model representing the generative process of latent musical notes conditioned on musical keys and an acoustic model based on a convolutional recurrent neural network (CRNN) representing the generative process of an observed music signal from the notes. The resulting CRNN-HSMM hybrid model enables us to estimate the most-likely musical notes from a music signal with the Viterbi algorithm, while leveraging both the grammatical knowledge about musical notes and the expressive power of the CRNN. The experimental results showed that the proposed method outperformed the conventional state-of-the-art method and the integration of the musical language model with the acoustic model has a positive effect on the AST performance.
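The decoding step described above — finding the most likely note sequence with the Viterbi algorithm — can be sketched on a toy hidden Markov model. The actual model is a CRNN-HSMM with explicit note-duration modelling, which this sketch omits; all states and probabilities below are invented:

```python
import math

def viterbi(obs, states, init, trans, emit):
    """Most likely hidden-state sequence by dynamic programming
    (log domain to avoid underflow on long sequences)."""
    V = [{s: math.log(init[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []
    for o in obs[1:]:
        prev, col, ptr = V[-1], {}, {}
        for s in states:
            # best predecessor for state s at this step
            p, r = max((prev[r] + math.log(trans[r][s]), r) for r in states)
            col[s] = p + math.log(emit[s][o])
            ptr[s] = r
        V.append(col)
        back.append(ptr)
    state = max(V[-1], key=V[-1].get)
    path = [state]
    for ptr in reversed(back):  # backtrack
        path.append(ptr[path[-1]])
    return path[::-1]

# toy example: two 'note' states emitting coarse pitch observations
states = ["C4", "D4"]
init = {"C4": 0.6, "D4": 0.4}
trans = {"C4": {"C4": 0.7, "D4": 0.3}, "D4": {"C4": 0.4, "D4": 0.6}}
emit = {"C4": {"low": 0.8, "high": 0.2}, "D4": {"low": 0.3, "high": 0.7}}
path = viterbi(["low", "low", "high"], states, init, trans, emit)
```

In the paper, the language model supplies the transition structure (conditioned on key) and the CRNN supplies the emission scores; the dynamic-programming recursion is the same idea extended with state durations.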
Demonstrating the value of medicines through health technology assessment (HTA) systems is becoming increasingly complex. Innovative therapies – such as immuno-oncology (IO) agents – are testing the limits of methodological approaches in markets with established HTA systems. The objective of this study is to understand how requirements, approaches, and decision-making differ between select HTA agencies with a focus on specific PD-1/PD-L1 (programmed death receptor-1/programmed death-ligand 1) agents and cancer indications, and to describe how this variation impacts patient access. To achieve this objective, we conducted a detailed HTA dossier review for several recently launched IO products across Australia (AU), Canada (CA), France (FR), and the United Kingdom (UK).
Content experts reviewed HTA dossiers for pembrolizumab, nivolumab, and atezolizumab for non-small cell lung cancer (NSCLC) first-line monotherapy, NSCLC combination therapy, and adjuvant melanoma. A systematic analytic framework was developed to understand best-practice methodology across systems. Information on submitted data, patient/expert input, and access decisions were extracted; key themes were identified and refined through workshop discussion, and probed further through blinded primary research with eight individuals with current or recent experience of HTA systems.
We identified six major elements of variation impacting decision-making: evidentiary expectations for biomarkers; use/impact of patient-centered data; use/impact of real-world data; acceptance of surrogate endpoints; approaches for clinical data extrapolation; and accepted time horizons. Considerable variation in time to access was observed; for pembrolizumab (NSCLC first-line monotherapy), time from product registration to HTA decision ranged from 42 (CA) to 487 (AU) days; time from registration to listing ranged from 189 (CA) to 605 (AU) days.
Evaluated HTA systems demonstrate a large degree of variability in approaches to decision-making for novel IO medicines; resultant access decisions and time to access are also highly variable. Inconsistency between systems and duplication of effort when assessing similar clinical/economic data could be contributing to limited or delayed patient access; the relationship merits further exploration. Assessed HTA systems are currently undergoing process revisions but expert input suggests that this is not expected to reduce variation, and could further increase complexity. The influence of parallel scientific advice programs between HTA agencies and regulatory bodies in reducing variation must also be determined.
We performed a systematic literature review and meta-analysis measuring the burden of antibiotic use during end-of-life (EOL) care.
We searched PubMed, CINAHL (EBSCO platform), and Embase (Elsevier platform) through July 2019 for studies meeting the following inclusion criteria in the initial analysis: antibiotic use in EOL care patients (advanced dementia, cancer, organ failure, frailty, or multimorbidity). If the number of patients in palliative care consultation (PCC) was available, antibiotic use data were pooled to compare the proportion of patients who received antibiotics under PCC with the proportion among those not receiving PCC. Random-effects models were used to obtain pooled mean differences, and heterogeneity was assessed using the I2 value.
Overall, 72 studies met the inclusion criteria and were included in the final review: 22 EOL studies included only patients with cancer; 17 studies included only patients with advanced dementia; and 33 studies included “mixed populations” of EOL patients. Although few studies reported antibiotic use in standard metrics (eg, days of therapy), 48 of 72 studies (66.7%) reported antibiotic use in >50% of all patients. When the 3 studies that evaluated antibiotic use under PCC were pooled, patients under PCC were more likely to receive antibiotics than patients not under PCC (pooled odds ratio, 1.73; 95% CI, 1.02–2.93).
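Pooling an odds ratio across studies under a random-effects model can be sketched as follows, using the common DerSimonian–Laird estimator on the log-OR scale. The study inputs below are illustrative, not the three PCC studies' actual data:

```python
import math

def pooled_or_random_effects(studies):
    """DerSimonian-Laird random-effects pooling of odds ratios.

    studies: list of (odds_ratio, ci_lower_95, ci_upper_95) tuples;
    pooling uses inverse-variance weights on the log-OR scale."""
    y = [math.log(o) for o, lo, hi in studies]
    # per-study SE recovered from the 95% CI width on the log scale
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for o, lo, hi in studies]
    w = [1.0 / s ** 2 for s in se]
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)  # between-study variance
    w_star = [1.0 / (s ** 2 + tau2) for s in se]
    mu = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_mu = math.sqrt(1.0 / sum(w_star))
    return (math.exp(mu),
            math.exp(mu - 1.96 * se_mu),
            math.exp(mu + 1.96 * se_mu))

# three hypothetical studies, each reporting OR 2.0 (95% CI, 1.0-4.0)
pooled, lo, hi = pooled_or_random_effects([(2.0, 1.0, 4.0)] * 3)
```

When the studies agree (Q below its degrees of freedom), the between-study variance estimate is zero and the result coincides with fixed-effect inverse-variance pooling.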
Future studies are needed to evaluate the benefits and harms of using antibiotics for patients during EOL care in diverse patient populations.
Background: Daptomycin is considered an effective alternative to vancomycin in patients with methicillin-resistant Staphylococcus aureus bloodstream infection (MRSA BSI). Objective: We investigated the real-world effectiveness of recommended daptomycin doses compared with vancomycin. Methods: This nationwide retrospective cohort study included patients from 124 Veterans’ Affairs hospitals who had an MRSA BSI and were initially treated with vancomycin during 2007–2014. Patients were categorized into 3 groups by daptomycin dose calculated using adjusted body weight: low (<6 mg/kg/day), standard (6–8 mg/kg/day), and high (≥8 mg/kg/day). International Classification of Diseases, Ninth Revision (ICD-9) diagnosis codes were used to identify other prior or concurrent infections and comorbidities. Multivariable Cox regression was used to compare 30-day all-cause mortality, the primary outcome, between patients on low-dose, standard-dose, or high-dose daptomycin and those on vancomycin. Hazard ratios (HRs) and 95% confidence intervals (CIs) were reported. Results: Of the 7,518 patients in the cohort, 683 (9.1%) were switched to daptomycin after initial treatment with vancomycin for their MRSA BSI episode. A low dose of daptomycin was administered to 181 patients (26.5%), a standard dose was given to 377 patients (55.2%), and a high dose was administered to 125 patients (18.3%). Dose groups differed significantly in body mass index (BMI), presence of an osteomyelitis diagnosis, and diagnosis of diabetes. Thirty-day mortality was significantly lower in daptomycin patients than in those given vancomycin (11.3% vs 17.6%; P < .0001).
Treatment with daptomycin was associated with improved 30-day survival compared with vancomycin (HR, 0.66; 95% CI, 0.53–0.84), after adjusting for age, BMI, diagnosis of endovascular infection, skin and soft-tissue infection and osteomyelitis, hospitalization in the prior year, immunosuppression, diagnosis of diabetes, and vancomycin minimum inhibitory concentration (MIC). Treatment with a standard dose of daptomycin was associated with lower mortality compared with vancomycin (HR, 0.63; 95% CI, 0.46–0.86). High and low daptomycin dose groups had a trend toward improved 30-day survival compared with vancomycin (Fig. 1). In 2 separate sensitivity analyses excluding vancomycin patients, there was no difference in 30-day mortality between a standard dose and a high dose (HR, 1.01; 95% CI, 0.51–1.97). However, we detected a trend toward poor survival with a low dose compared with a standard dose (HR, 1.21; 95% CI, 0.73–2.02). Conclusions: A standard dose of daptomycin was significantly associated with lower 30-day mortality compared with continued vancomycin treatment. Accurate dosage of daptomycin and avoidance of low-dose daptomycin should be a part of good antibiotic stewardship practice.
Background: Studies of interventions to decrease rates of surgical site infections (SSIs) must include thousands of patients to be statistically powered to demonstrate a significant reduction. Therefore, it is important to develop methodology to extract data available in the electronic medical record (EMR) to accurately measure SSI rates. Prior studies have created tools that optimize sensitivity to prioritize chart review for infection control purposes. However, for research studies, positive predictive value (PPV) with reasonable sensitivity is preferred to limit the impact of false-positive results on the assessment of intervention effectiveness. Using information from the prior tools, we aimed to determine whether an algorithm using data available in the Veterans Affairs (VA) EMR could accurately and efficiently identify deep incisional or organ-space SSIs found in the VA Surgical Quality Improvement Program (VASQIP) data set for cardiac and orthopedic surgery patients. Methods: We conducted a retrospective cohort study of patients who underwent cardiac surgery or total joint arthroplasty (TJA) at 11 VA hospitals between January 1, 2007, and April 30, 2017. We used EMR data that were recorded in the 30 days after surgery on inflammatory markers; microbiology; antibiotics prescribed after surgery; International Classification of Diseases (ICD) and current procedural terminology (CPT) codes for reoperation for an infection-related purpose; and ICD codes for mediastinitis, prosthetic joint infection, and other SSIs. These metrics were used in an algorithm to determine whether a patient had a deep or organ-space SSI. Sensitivity, specificity, PPV, and negative predictive value (NPV) were calculated to assess the accuracy of the algorithm by comparison with 30-day SSI outcomes collected by nurse chart review in the VASQIP data set. Results: Among the 11 VA hospitals, there were 18,224 cardiac surgeries and 16,592 TJAs during the study period.
Of these, 20,043 were evaluated by VASQIP nurses and were included in our final cohort. Of the 8,803 cardiac surgeries included, manual review identified 44 (0.50%) mediastinitis cases. Of the 11,240 TJAs, manual review identified 71 (0.63%) deep or organ-space SSIs. Our algorithm identified 32 of the mediastinitis cases (73%) and 58 of the deep or organ-space SSI cases (82%). Sensitivity, specificity, PPV, and NPV are shown in Table 1. Of the patients that our algorithm identified as having a deep or organ-space SSI, only 21% (PPV) actually had an SSI after cardiac surgery or TJA. Conclusions: Use of the algorithm can identify most complex SSIs (73%–82%), but other data are necessary to separate false-positive from true-positive cases and to improve the efficiency of case detection to support research questions.
Background: When control mechanisms such as water temperature and biocide level are insufficient, Legionella, the causative bacteria of Legionnaires’ disease, can proliferate in water distribution systems in buildings. Guidance and oversight bodies are increasingly prioritizing water safety programs in healthcare facilities to limit Legionella growth. However, ensuring optimal implementation in large buildings is challenging. Much is unknown, and sometimes assumed, about whether building and campus characteristics influence Legionella growth. We used an extensive real-world environmental Legionella data set in the Veterans Health Administration (VHA) healthcare system to examine infrastructure characteristics and Legionella positivity. Methods: VHA medical facilities across the country perform quarterly potable water sampling of healthcare buildings for Legionella detection as part of a comprehensive water safety program. Results are reported to a standardized national database. We performed an exploratory univariate analysis of facility-reported Legionella data from routine potable water samples taken from 2015 to 2018, in conjunction with infrastructure characteristics available in a separate national data set. This review examined the following characteristics: building height (number of floors), building age (reported construction year), and campus acreage. Results: The final data set included 201,936 water samples from 819 buildings. Buildings with 1–5 floors (n = 634) had a Legionella positivity rate of 5.3%, 6–10 floors (n = 104) had a rate of 6.4%, 11–15 floors (n = 36) had a rate of 8.1%, and 16–22 floors (n = 9) had a rate of 8.8%. All rates were significantly different from each other except 11–15 floors and 16–22 floors (P < .05, χ2). The oldest buildings (1800s) had significantly lower Legionella positivity (P < .05, χ2) than those built between 1900 and 1939 and between 1940 and 1979, but they were no different from the newest buildings (Fig. 1).
In newer buildings (1980–2019), all decades had buildings with Legionella positivity (Fig. 1 inset). Campus acreage varied from ~3 acres to almost 500 acres. Although significant differences were found in Legionella positivity for different campus sizes, there was no clear trend and campus acreage may not be a suitable proxy for the extent or complexity of water systems feeding buildings. Conclusions: The analysis of this large, real-world data set supports an assumption that taller buildings are more likely to be associated with Legionella detection, perhaps a result of more extensive piping. In contrast, the assumption that newer buildings are less associated with Legionella was not fully supported. These results demonstrate the variability in Legionella positivity in buildings, and they also provide evidence that can inform implementation of water safety programs.
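The floor-count comparisons above rest on Pearson χ2 tests of two positivity proportions. A small pure-Python sketch — the counts below are shaped like the reported rates, but the sample sizes are invented, not the study's:

```python
def chi2_positivity(pos_a, n_a, pos_b, n_b):
    """Pearson chi-square statistic (1 df) for a 2x2 table comparing
    two positivity proportions (pos/total per group)."""
    total = n_a + n_b
    p_pos = (pos_a + pos_b) / total  # pooled positivity under H0
    stat = 0.0
    for obs, n in ((pos_a, n_a), (pos_b, n_b)):
        for o, e in ((obs, n * p_pos), (n - obs, n * (1 - p_pos))):
            stat += (o - e) ** 2 / e  # (observed - expected)^2 / expected
    return stat

# illustrative: 5.3% vs 6.4% positivity in two hypothetical groups
stat = chi2_positivity(530, 10000, 640, 10000)
significant = stat > 3.841  # chi-square critical value, 1 df, alpha = .05
```

With large sample counts even a one-percentage-point difference in positivity clears the critical value, which is consistent with the study finding significant differences between most floor-count bands.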
Disclosures: Chetan Jinadatha reports research funding as principal investigator/co-investigator from NIH/NINR, AHRQ, and NSF, and as principal investigator from Xenex Healthcare Services (funds provided to the institution); he is an inventor of “Methods for organizing the disinfection of one or more items contaminated with biological agents,” owned by the Department of Veterans Affairs and licensed to Xenex Disinfection System, San Antonio, TX.
Background: Enhanced terminal room cleaning with ultraviolet-C (UV-C) disinfection has become more commonly used as a strategy to reduce the transmission of important nosocomial pathogens, including Clostridioides difficile, but its real-world effectiveness remains unclear. Objectives: We aimed to assess the association of UV-C disinfection during terminal cleaning with the incidence of healthcare-associated C. difficile infection and positive test results for C. difficile within the nationwide Veterans Health Administration (VHA) system. Methods: Using a nationwide survey of VHA acute-care hospitals, information on UV-C system utilization and date of implementation was obtained. Hospital-level incidence rates of clinically confirmed hospital-onset C. difficile infection (HO-CDI) and positive test results with recent healthcare exposure (both hospital-onset [HO-LabID] and community-onset healthcare-associated [CO-HA-LabID]) at acute-care units between January 2010 and December 2018 were obtained through routine surveillance, with bed days of care (BDOC) as the denominator. We analyzed the association of UV-C disinfection with incidence rates of HO-CDI, HO-LabID, and CO-HA-LabID in a nonrandomized stepped-wedge design, using a negative binomial regression model with a hospital-specific random intercept and with the presence or absence of UV-C disinfection in each month, baseline trend, and seasonality as explanatory variables. Results: Among 143 VHA acute-care hospitals, 129 hospitals (90.2%) responded to the survey and were included in the analysis. UV-C use was reported by 42 hospitals, with various implementation start dates (range, June 2010 through June 2017). We identified 23,021 positive C. difficile test results (HO-LabID: 5,014) with 16,213 HO-CDI and 24,083,252 BDOC from the 129 hospitals during the study period. There were declining baseline trends nationwide (mean, −0.6% per month) for HO-CDI.
The use of UV-C had no statistically significant association with incidence rates of HO-CDI (incidence rate ratio [IRR], 1.032; 95% CI, 0.963–1.106; P = .65) or with incidence rates of healthcare-associated positive C. difficile test results (HO-LabID). Conclusions: In this large quasi-experimental analysis within the VHA system, enhanced terminal room cleaning with UV-C disinfection was not associated with a change in incidence rates of clinically confirmed hospital-onset CDI or positive test results with recent healthcare exposure. Further research is needed to understand the reasons for this lack of effectiveness, including barriers to utilization.
To examine whether the issue and dissemination of national guidelines in the Manual of Antimicrobial Stewardship had an impact on reducing antibiotic use for acute respiratory tract infection (ARTI) and gastroenteritis.
An interrupted time-series analysis was performed using a large nationwide database from June 2016 to June 2018. Outpatients with ARTI or gastroenteritis aged ≥6 years were identified. The outcome measures were any antibiotic use and broad-spectrum antibiotic use. The season-adjusted changes in the rate of antibiotic prescriptions for 2 periods before and after the guideline issue date were examined.
There were 13,177,735 patients with ARTI and 300,565 patients with gastroenteritis during the study period. Among patients with ARTI, there was a significant downward trend in antibiotic use during the 2-year study period (−0.06% per week; 95% CI, −0.07% to −0.04%). However, there was no significant change in trends of antibiotic use between the pre-issue period and post-issue period (trend difference, −0.01% per week; 95% CI, −0.10% to 0.07%). Similarly, for patients with gastroenteritis, there was no significant change in the trends of antibiotic use between the pre-issue period and post-issue period (trend difference, −0.02% per week; 95% CI, −0.04% to 0.01%). Similar associations were observed in analyses for broad-spectrum antibiotic use.
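The season-adjusted trend comparison described above is an interrupted time-series (segmented) regression: a level, a baseline trend, a level change at the guideline issue date, and a post-issue trend change. A pure-Python OLS sketch on noise-free synthetic weekly data — all numbers are invented, and the study's seasonal adjustment is omitted here:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def segmented_ols(y, t0):
    """OLS for level, baseline trend, level change at t0, and
    post-interruption trend change (the core ITS contrasts)."""
    X = [[1.0, float(t), float(t >= t0), float(max(0, t - t0))]
         for t in range(len(y))]
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

# synthetic weekly prescription rate: drops 2 points at week 20 and its
# slope changes by -0.3 thereafter (illustrative numbers only)
t0 = 20
y = [10 + 0.5 * t - (2.0 if t >= t0 else 0.0)
     - (0.3 * (t - t0) if t >= t0 else 0.0) for t in range(40)]
beta = segmented_ols(y, t0)  # [level, trend, level change, trend change]
```

The "trend difference" the abstract reports corresponds to the last coefficient; a CI straddling zero, as found here for both ARTI and gastroenteritis, means no detectable change in slope after the guideline issue.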
Despite the issue of national guidelines to promote the appropriate use of antibiotics, there were no significant changes in trends of antibiotic use for outpatients with ARTI or gastroenteritis between the pre-issue and post-issue periods.
English as a foreign language education in East Asia has received repeated criticism for its lack of success in developing sufficient English oral proficiency among its students (Muller et al., 2014). In response to the criticism, the governments of China, Japan and South Korea attempted to include assessment of students’ speaking abilities as part of their high-stakes college entrance exams, hoping for positive washback effects on both primary- and secondary-school English education as well as on shadow education (i.e., non-formal private-sector education). These attempts often failed. In South Korea, a new test called the National English Ability Test (NEAT), which included direct assessment of students’ speaking skills among other skills, was developed in 2012. However, the government's plan to use NEAT to replace the current exam – the Korean College Scholastic Aptitude Test (KCSAT) – was quickly dropped before its implementation. In China, the government has tried to promote more communicative methods of English education by incorporating English speaking tests in high-stakes tests such as the Gaokao – the college admission test – in addition to reducing the weight of English in the traditional paper-based exams. However, the policies have received heavy resistance at the regional level and have not been implemented at the national level. In Japan, the government asked universities to accept designated external proficiency tests as part of the Common Test, the existing college entrance exam, in order to make up for the exam's missing speaking component. After a mountain of criticism from test users, implementation of the plan is still pending. In this light, the aim of this paper is to discuss why these policy attempts failed. Although these policy attempts occurred in three different contexts, they share striking underlying commonalities.
We argue that these policy attempts were based on a set of beliefs separate from the realities of the stakeholders (e.g., students, parents and teachers). More specifically, the failures can be largely attributed to the governments’ monolithic view of the English language and their insufficient consideration of equity rather than equality.
We evaluated the relationship between local MRSA prevalence rates and antibiotic use across 122 VHA hospitals in 2016. Higher hospital-level MRSA prevalence was associated with significantly higher rates of antibiotic use, even after adjusting for case mix and stewardship strategies. Benchmarking anti-MRSA antibiotic use may need to adjust for MRSA prevalence.
In the absence of pyuria, positive urine cultures are unlikely to represent infection. Conditional urine reflex culture policies have the potential to limit unnecessary urine culturing. We evaluated the impact of this diagnostic stewardship intervention.
We conducted a retrospective, quasi-experimental (nonrandomized) study, with interrupted time series, from August 2013 to January 2018 to examine rates of urine cultures before versus after the policy intervention. We compared 3 intervention sites to 3 control sites in an aggregated series using segmented negative binomial regression.
The study included 6 acute-care hospitals within the Veterans’ Health Administration across the United States.
Adult patients with at least 1 urinalysis ordered during acute-care admission, excluding pregnant patients or those undergoing urological procedures, were included.
At the intervention sites, urine cultures were performed if a preceding urinalysis met prespecified criteria. No such restrictions occurred at the control sites. The primary outcome was the rate of urine cultures performed per 1,000 patient days. The safety outcome was the rate of gram-negative bloodstream infection per 1,000 patient days.
The study included 224,573 urine cultures from 50,901 admissions in 24,759 unique patients. Among the intervention sites, the overall average number of urine cultures performed did not significantly decrease relative to the preintervention period (5.9% decrease; P = .8) but did decrease by 21% relative to control sites (P < .01). We detected no significant difference in the rates of gram-negative bloodstream infection among intervention or control sites (P = .49).
Conditional urine reflex culture policies were associated with a decrease in urine culturing without a change in the incidence of gram-negative bloodstream infection.
The purpose of this paper is, as part of the stratification of Cohen–Macaulay rings, to investigate the question of when fiber products are almost Gorenstein rings. We show that the fiber product $R \times _T S$ of Cohen–Macaulay local rings $R$, $S$ of the same dimension over a regular local ring $T$ with is an almost Gorenstein ring if and only if so are $R$ and $S$. In addition, other generalizations of Gorenstein properties are also explored.
N-acetylaspartate (NAA) levels and serum brain-derived neurotrophic factor (BDNF) levels in patients with first-episode schizophrenia psychosis and age- and sex-matched healthy control subjects were investigated. In addition, plasma levels of homovanillic acid (HVA) and 3-methoxy-4-hydroxyphenylglycol (MHPG) were compared between the two groups.
Eighteen patients (nine males, nine females; age range: 13–52 years) were enrolled in the study, and 18 volunteers (nine males, nine females; age range: 15–49 years) with no current or past psychiatric history were also studied by magnetic resonance spectroscopy (MRS) as sex- and age-matched controls.
Levels of NAA/Cr in the left basal ganglia (p = 0.0065) and parieto-occipital lobe (p = 0.00498), but not in the frontal lobe, were significantly lower in patients with first-episode schizophrenia psychosis than in control subjects. No difference was observed between the serum BDNF levels of patients with first-episode schizophrenia psychosis and control subjects. Regarding plasma levels of catecholamine metabolites, plasma MHPG, but not HVA, was significantly lower in patients with first-episode psychosis than in control subjects. In addition, a significant positive correlation was observed between NAA/Cr levels in the left basal ganglia and plasma MHPG in all subjects.
These results suggest that brain NAA levels in the left basal ganglia and plasma MHPG levels were significantly reduced at the first episode of schizophrenia psychosis, indicating that neurodegeneration via noradrenergic neurons might be associated with the initial progression of the disease.
Previous studies have shown that the function of the hypothalamic–pituitary–adrenal (HPA) axis is related to personality traits. FK506-binding protein 51 (FKBP51 or FKBP5) is a co-chaperone of heat-shock protein 90 and plays an important role in the negative feedback regulation of HPA axis function. It has been reported that a C/T single nucleotide polymorphism in intron 2 of the FKBP5 gene (rs1360780) affects FKBP5 protein levels and the cortisol response to dexamethasone and psychological stress tests. We therefore hypothesized that this FKBP5 polymorphism affects personality traits, and in the present study we examined the association between this polymorphism and personality traits in healthy subjects.
Subjects were 826 healthy Japanese volunteers. Personality traits were assessed with the Temperament and Character Inventory (TCI), and the FKBP5 genotype was determined by real-time PCR with cycling probe technology for SNP typing.
In the total sample, the group with the T allele, which is predictive of impaired negative feedback regulation of the HPA axis, had higher harm avoidance scores (p = 0.043) and lower cooperativeness scores (p = 0.019) than the group without the T allele. The T allele was associated with higher harm avoidance scores in females (p = 0.020) and lower cooperativeness scores in males (p = 0.015).
The present study thus suggests that the FKBP5 polymorphism affects harm avoidance and cooperativeness in healthy subjects, with gender specificity.