Archaeologists working in eastern North America typically refer to precontact and early postcontact Native American maize-based agriculture as shifting or swidden. Based on a comparison with European agriculture, it is generally posited that the lack of plows, draft animals, and animal manure fertilization resulted in the rapid depletion of soil nitrogen. This required Indigenous farmers to move their fields frequently. In Northern Iroquoia, depletion of soil fertility is frequently cited as one reason why villages were moved to new locations every 20 to 40 years. Recent analysis of δ15N ratios of maize macrobotanical remains from Northern Iroquoia, however, suggests that Iroquoian farmers were able to maintain soil nitrogen in their maize fields. An expanded analysis of maize kernel δ15N ratios from three ancestral Mohawk villages indicates that farmers from those villages maintained soil nitrogen throughout the occupational spans of their villages. It further suggests that precontact Iroquoian agronomy was consistent with contemporary conservation agriculture practices.
Attention-deficit/hyperactivity disorder (ADHD) is a clinically heterogeneous neurodevelopmental disorder defined by characteristic behavioral and cognitive features. Abnormal brain dynamic functional connectivity (dFC) has been associated with the disorder. The full spectrum of ADHD-related variation of brain dynamics and its association with behavioral and cognitive features remain to be established.
We sought to identify patterns of brain dynamics linked to specific behavioral and cognitive dimensions using sparse canonical correlation analysis across a cohort of children with and without ADHD (122 children in total, 63 with ADHD). Then, using mediation analysis, we tested the hypothesis that cognitive deficits mediate the relationship between brain dynamics and ADHD-associated behaviors.
We identified four distinct patterns of dFC, each corresponding to a specific dimension of behavioral or cognitive function (r = 0.811–0.879). Specifically, the inattention/hyperactivity dimension was positively associated with dFC within the default mode network (DMN) and negatively associated with dFC between DMN and the sensorimotor network (SMN); the somatization dimension was positively associated with dFC within DMN and SMN; the inhibition/flexibility and fluency/memory dimensions were both positively associated with dFC within DMN and between DMN and SMN, and negatively associated with dFC between DMN and the fronto-parietal network. Furthermore, we observed that cognitive functions of inhibition and flexibility mediated the relationship between brain dynamics and behavioral manifestations of inattention and hyperactivity.
These findings document the importance of distinct patterns of dynamic functional brain activity for different cardinal behavioral and cognitive features related to ADHD.
The emerald ash borer, Agrilus planipennis Fairmaire (Coleoptera: Buprestidae), is a serious pest of ash (Fraxinus spp.) (Oleaceae) in North America. Control of emerald ash borer is difficult in natural forest settings; therefore, a classical biological control programme is the most feasible management option for this invasive, nonnative insect. Here, we report the first Canadian release and establishment of the parasitoids Tetrastichus planipennisi Yang (Hymenoptera: Eulophidae), Oobius agrili Zhang and Huang (Hymenoptera: Encyrtidae), and Spathius galinae Belokobylskij and Strazanac (Hymenoptera: Braconidae) in natural forests in Ontario, Quebec, and New Brunswick, Canada, for the control of emerald ash borer. Releases of T. planipennisi were made from 2013 to 2019, O. agrili from 2015 to 2019, and S. galinae from 2017 to 2019. Trees from release sites were destructively sampled to rear out adult emerald ash borers and parasitoids 1–3 years after parasitoid release. Recoveries of T. planipennisi were made at 81% of release sites (13 of 16) 1–2 years after release, and O. agrili was recovered from 29% of release sites (4 of 14) 1–3 years after release. Spathius galinae was not recovered. These data provide important information for the development and deployment of a successful biological control programme for the management of emerald ash borer in Canada.
Persistent psychological distress associated with the coronavirus disease 2019 (COVID-19) pandemic has been well documented. This study aimed to identify pre-COVID brain functional connectome features that predict pandemic-related distress symptoms among young adults.
Baseline neuroimaging studies and assessment of general distress using the Depression, Anxiety and Stress Scale were performed with 100 healthy individuals prior to wide recognition of the health risks associated with the emergence of COVID-19. They were recontacted for the Impact of Event Scale-Revised and the Posttraumatic Stress Disorder Checklist in the period of community-level outbreaks, and for follow-up distress evaluation again 1 year later. We employed the network-based statistic approach to identify connectome features that predicted increases in distress, based on a 136-region parcellation with assigned network membership. Predictive performance of connectome features and causal relations were examined by cross-validation and mediation analyses.
The connectome features that predicted emergence of distress after COVID contained 70 neural connections. Most within-network connections were located in the default mode network (DMN), and affective network-DMN and dorsal attention network-DMN links largely constituted between-network pairs. The hippocampus emerged as the most critical hub region. Predictive models of the connectome remained robust in cross-validation. Mediation analyses demonstrated that COVID-related posttraumatic stress partially explained the correlation of connectome to the development of general distress.
The brain functional connectome may fingerprint individuals vulnerable to psychological distress associated with the COVID pandemic. Individuals carrying these neuromarkers may benefit from corresponding interventions to reduce the risk or severity of distress related to fear of COVID-related challenges.
Identification of treatment-specific predictors of drug therapies for bipolar disorder (BD) is important because only about half of individuals respond to any specific medication. However, medication response in pediatric BD is variable and not well predicted by clinical characteristics.
A total of 121 youth with early course BD (acute manic/mixed episode) were prospectively recruited and randomized to 6 weeks of double-blind treatment with quetiapine (n = 71) or lithium (n = 50). Participants completed structural magnetic resonance imaging (MRI) at baseline before treatment and 1 week after treatment initiation, and brain morphometric features were extracted for each individual based on MRI scans. Positive antimanic treatment response at week 6 was defined as an over 50% reduction of Young Mania Rating Scale scores from baseline. A two-stage deep learning prediction model was established to distinguish responders and non-responders based on different feature sets.
Pre-treatment morphometry and morphometric changes occurring during the first week can both independently predict treatment outcome of quetiapine and lithium with balanced accuracy over 75% (all p < 0.05). Combining brain morphometry at baseline and week 1 allows prediction with the highest balanced accuracy (quetiapine: 83.2% and lithium: 83.5%). Predictions in the quetiapine and lithium group were found to be driven by different morphometric patterns.
These findings demonstrate that pre-treatment morphometric measures and acute brain morphometric changes can serve as medication response predictors in pediatric BD. Brain morphometric features may provide promising biomarkers for developing biologically-informed treatment outcome prediction and patient stratification tools for BD treatment development.
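The responder definition and the balanced-accuracy metric used in the study above can be sketched in a few lines of Python. The YMRS scores below are invented for illustration; they are not data from the study, and the study's actual classifier is a two-stage deep learning model, not shown here.

```python
def is_responder(ymrs_baseline, ymrs_week6):
    """Positive antimanic response: >50% reduction in YMRS score from baseline."""
    return (ymrs_baseline - ymrs_week6) / ymrs_baseline > 0.5

def balanced_accuracy(y_true, y_pred):
    """Mean of sensitivity (recall on responders) and specificity (recall on non-responders).

    Unlike plain accuracy, this is not inflated when one class dominates,
    which matters here since roughly half of patients respond to a given drug.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t and p)        # responders caught
    tn = sum(1 for t, p in zip(y_true, y_pred) if not t and not p)  # non-responders caught
    pos = sum(y_true)
    neg = len(y_true) - pos
    return 0.5 * (tp / pos + tn / neg)

# Illustrative (hypothetical) baseline and week-6 YMRS scores:
baseline = [30, 28, 24, 32, 26, 20]
week6    = [10, 20,  8, 30, 12, 18]
truth = [is_responder(b, w) for b, w in zip(baseline, week6)]
# truth -> [True, False, True, False, True, False]
pred  = [True, False, True, True, True, False]   # hypothetical model output
print(balanced_accuracy(truth, pred))  # ≈ 0.833
```

With one false positive among three non-responders, sensitivity is 3/3 and specificity is 2/3, giving the ~0.83 balanced accuracy printed above.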
Major depressive disorder (MDD) is a clinically and biologically heterogeneous syndrome. Identifying discrete subtypes of illness with distinguishing neurobiological substrates and clinical features is a promising strategy for guiding personalised therapeutics.
This study aimed to identify depression subtypes with correlated patterns of functional network connectivity and clinical symptoms by clustering patients according to a weighted linear combination of both features in a relatively large, medication-naïve depression sample.
We recruited 115 medication-naïve adults with MDD and 129 matched healthy controls, and evaluated all participants with magnetic resonance imaging. We used regularised canonical correlation analysis to identify component mapping relationships between functional network connectivity and symptom profiles, and K-means clustering was used to define distinct subtypes of patients.
Two subtypes of MDD were identified: insomnia-dominated subtype 1 and anhedonia-dominated subtype 2. Subtype 1 was characterised by abnormal hyperconnectivity within the ventral attention network and sleep maintenance insomnia. Subtype 2 was characterised by abnormal hypoconnectivity in the subcortical and dorsal attention networks, and prominent anhedonia symptoms.
Our study identified two distinct subtypes of patients with specific neurobiological and clinical symptom profiles. These findings advance understanding of the biological and clinical heterogeneity of MDD, offering a pathway for defining categorical subtypes of illness via consideration of both biological and clinical features.
Antisaccade tasks can be used to index cognitive control processes, e.g. attention, behavioral inhibition, working memory, and goal maintenance in people with brain disorders. Though diagnoses of schizophrenia (SZ), schizoaffective disorder (SAD), and bipolar I disorder with psychosis (BDP) are typically considered to be distinct entities, previous work shows patterns of cognitive deficits differing in degree, rather than in kind, across these syndromes.
Large samples of individuals with psychotic disorders were recruited through the Bipolar-Schizophrenia Network on Intermediate Phenotypes 2 (B-SNIP2) study. Anti- and pro-saccade task performances were evaluated in 189 people with SZ, 185 people with SAD, 96 people with BDP, and 279 healthy comparison participants. Logistic functions were fitted to each group's antisaccade speed-performance tradeoff patterns.
Psychosis groups had higher antisaccade error rates than the healthy group, with SZ and SAD participants committing 2 times as many errors, and BDP participants committing 1.5 times as many errors. Latencies on correctly performed antisaccade trials in SZ and SAD were longer than in healthy participants, although error trial latencies were preserved. Parameters of speed-performance tradeoff functions indicated that compared to the healthy group, SZ and SAD groups had optimal performance characterized by more errors, as well as less benefit from prolonged response latencies. Prosaccade metrics did not differ between groups.
With basic prosaccade mechanisms intact, the higher speed-performance tradeoff cost for antisaccade performance in psychosis cases indicates a deficit that is specific to the higher-order cognitive aspects of saccade generation.
There is increasing evidence that blood oxygenation level-dependent signaling in white matter (WM) reflects WM functional activity. Whether this activity is altered in schizophrenia remains uncertain, as does whether it is related to established alterations of gray matter (GM) or the microstructure of WM tracts.
A total of 153 antipsychotic-naïve schizophrenia patients and 153 healthy comparison subjects were assessed by resting-state functional magnetic resonance imaging, diffusion tensor imaging, and high-resolution T1-weighted imaging. We tested for case–control differences in the functional activity of WM, and examined their relation to the functional activity of GM and WM microstructure. The relations between fractional anisotropy (FA) in WM and GM–WM functional synchrony were investigated as well. Then, we examined the associations of identified abnormalities to age, duration of untreated psychosis (DUP), and symptom severity.
Schizophrenia patients displayed reductions of the amplitude of low-frequency fluctuations (ALFF), GM–WM functional synchrony, and FA in widespread regions. Specifically, the genu of the corpus callosum showed not only weakened synchrony of functional activity but also reduced ALFF and FA. Positive associations were also found between FA and functional synchrony in the genu of the corpus callosum. No significant associations were found between the identified abnormalities and either DUP or symptom severity.
The widespread weakening in the synchrony of functional activity of GM and WM provides novel evidence for functional alterations in schizophrenia. Regarding WM function as a component of brain systems and investigating its alteration represents a promising direction for future research.
Anxiety disorders are common in autism spectrum disorder (ASD) and associated with social–communication impairment and repetitive behavior symptoms. The neurobiology of anxiety in ASD is unknown, but amygdala dysfunction has been implicated in both ASD and anxiety disorders. Using resting-state functional magnetic resonance imaging, we compared amygdala–prefrontal and amygdala–striatal connections across three demographically matched groups studied in the Autism Brain Imaging Data Exchange (ABIDE): ASD with a comorbid anxiety disorder (N = 25; ASD + Anxiety), ASD without a comorbid disorder (N = 68; ASD-NoAnx), and typically developing controls (N = 139; TD). Relative to ASD-NoAnx and TD controls, ASD + Anxiety individuals had decreased connectivity between the amygdala and dorsal/rostral anterior cingulate cortex (dACC/rACC). The functional connectivity of these connections was not affected in ASD-NoAnx, and amygdala connectivity with ventral ACC/medial prefrontal cortex (mPFC) circuits was not different in ASD + Anxiety or ASD-NoAnx relative to TD. Decreased amygdala–dorsomedial prefrontal cortex (dmPFC)/rACC connectivity was associated with more severe social impairment in ASD + Anxiety; amygdala–striatal connectivity was associated with restricted, repetitive behavior (RRB) symptom severity in ASD-NoAnx individuals. These findings suggest comorbid anxiety in ASD is associated with disrupted emotion-monitoring processes supported by amygdala–dACC/mPFC pathways, whereas emotion regulation systems involving amygdala–ventromedial prefrontal cortex (vmPFC) are relatively spared. Our results highlight the importance of accounting for comorbid anxiety for parsing ASD neurobiological heterogeneity.
Background: Prevention of central-line–associated bloodstream infections (CLABSIs) and methicillin-resistant Staphylococcus aureus (MRSA) infections requires a multifaceted approach including strategies to decrease cutaneous bacterial colonization. Prior studies have shown benefit from chlorhexidine gluconate (CHG) skin application on CLABSI and MRSA infection rates in intensive care units (ICUs); however, the use of CHG in the non-ICU population has not been well studied. Methods: We performed a quasi-experimental before-and-after study to evaluate the use of daily 2% CHG wipes in non-ICU patients at a 1,000-bed acute-care teaching hospital beginning in November 2017. The study population included adult and pediatric patients with central venous catheters on non-ICU units, excluding patients on the following units: stem cell transplant and hematologic malignancy (these units had already established use of CHG skin application as a standard prior to the intervention), labor and delivery, and psychiatry. CHG was applied according to the manufacturer’s instructions by nurses or nurse aides, and random monthly auditing of compliance was performed. NHSN CLABSI, hospital-onset MRSA bacteremia, and hospital-onset MRSA LabID rates were compared for the period 24 months before the intervention (November 1, 2015, through October 31, 2017) to the 24-month period after the intervention (November 1, 2017, through October 31, 2019) using a paired t test. Notably, the health system also discontinued the use of contact precautions for patients with MRSA (excluding MRSA from open, draining wounds) 11 months prior to the onset of this intervention. Results: The CLABSI rate decreased by 26% from 0.594 events per 1,000 central-line days (n = 50) before the intervention to 0.438 events per 1,000 central-line days (n = 38) after the intervention (P = 0.19). The number of CLABSIs with gram-positive organisms also decreased by 29%.
MRSA LabID rates decreased by 37% from 0.301 events per 1,000 patient days (n = 119) to 0.189 events per 1,000 patient days (n = 75) (P = 0.01). MRSA bacteremia rates decreased by 79% from 0.058 events per 1,000 patient days (n = 23) to 0.012 events per 1,000 patient days (n = 5) (P < 0.01). Compliance with the intervention was 83% (n = 225). Conclusions: Daily CHG skin application in non-ICU patients with central venous catheters is an effective strategy to prevent CLABSIs and MRSA infections. We observed a decrease in MRSA LabID and bacteremia rates despite discontinuation of contact precautions. These findings suggest that a horizontal prevention approach of daily CHG skin application may be an effective alternative to contact isolation to interrupt transmission of MRSA in hospitalized patients outside the ICU setting.
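The percentage reductions reported in this abstract follow directly from the event rates per 1,000 days at risk. A quick sketch of that arithmetic, using the figures quoted above (the central-line-day denominator is back-calculated from the reported rate and event count, so treat it as approximate):

```python
def rate_per_1000(events, days):
    """Events per 1,000 days at risk (central-line days or patient days)."""
    return 1000 * events / days

def percent_decrease(before, after):
    """Relative reduction between two rates, as a percentage."""
    return 100 * (before - after) / before

# 50 pre-intervention CLABSIs at 0.594/1,000 implies ~84,175 central-line days.
print(round(rate_per_1000(50, 84175), 3))          # 0.594

# Reductions quoted in the abstract, reproduced from the reported rates:
print(round(percent_decrease(0.594, 0.438)))       # 26  (CLABSI)
print(round(percent_decrease(0.301, 0.189)))       # 37  (MRSA LabID)
print(round(percent_decrease(0.058, 0.012)))       # 79  (MRSA bacteremia)
```

Note this reproduces only the descriptive percentages; the P values in the abstract come from the paired t test, which is not shown here.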
To evaluate whether incorporating mandatory prior authorization for Clostridioides difficile testing into antimicrobial stewardship pharmacist workflow could reduce testing in patients with alternative etiologies for diarrhea.
Single center, quasi-experimental before-and-after study.
Tertiary-care, academic medical center in Ann Arbor, Michigan.
Adult and pediatric patients admitted between September 11, 2019 and December 10, 2019 were included if they had an order placed for 1 of the following: (1) C. difficile enzyme immunoassay (EIA) in patients hospitalized >72 hours and received laxatives, oral contrast, or initiated tube feeds within the prior 48 hours, (2) repeat molecular multiplex gastrointestinal pathogen panel (GIPAN) testing, or (3) GIPAN testing in patients hospitalized >72 hours.
A best-practice alert prompting prior authorization by the antimicrobial stewardship program (ASP) for EIA or GIPAN testing was implemented. Approval required the provider to page the ASP pharmacist and discuss rationale for testing. The provider could not proceed with the order if ASP approval was not obtained.
An average of 2.5 requests per day were received over the 3-month intervention period. The weekly rate of EIA and GIPAN orders per 1,000 patient days decreased significantly from 6.05 ± 0.94 to 4.87 ± 0.78 (IRR, 0.72; 95% CI, 0.56–0.93; P = .010) and from 1.72 ± 0.37 to 0.89 ± 0.29 (IRR, 0.53; 95% CI, 0.37–0.77; P = .001), respectively.
We identified an efficient, effective C. difficile and GIPAN diagnostic stewardship approval model.
OBJECTIVES/SPECIFIC AIMS: Abnormalities in sensorimotor behavior are present in the majority of individuals with ASD and associated with core symptoms. Cortico-cerebellar networks that control sensorimotor behavior have been implicated in ASD, but little is known about their function during sensorimotor actions. The purpose of this functional magnetic resonance imaging (fMRI) study was to examine cortical-cerebellar function during feedback-guided motor behavior in ASD. METHODS/STUDY POPULATION: Individuals with ASD (11–30 years; N = 18) and age-matched controls (N = 15) completed a visuomotor task of feedback-guided precision gripping during fMRI. Participants pressed with their right thumb and forefinger on a force transducer while viewing a green FORCE bar on a screen that moved upwards with increased force toward a fixed white TARGET bar. Individuals were instructed to maintain the FORCE bar at the level of the TARGET bar for 24 seconds. Target force levels were set at 20% and 60% of each participant’s maximum voluntary contraction (MVC). Force variability was characterized as the coefficient of variation (i.e., standard deviation of the force time series / mean force output; CoV). RESULTS/ANTICIPATED RESULTS: Mean force did not differ between groups, indicating participants were able to follow task demands. Participants with ASD showed increased force variability (F(1,30) = 5.214, p = 0.03) at both 20% (d = .45) and 60% (d = .77) MVC compared to controls. Compared to controls, individuals with ASD showed decreased activation in the left angular gyrus (AG) during the visuomotor task compared to rest (maximum t = 4.31). Individuals with ASD also showed greater visuomotor activation compared to controls in ipsilateral ventral M1, extending anteriorly into posterior ventral pre-motor cortex (PMv; maximum t = −4.06, cluster size = 38 voxels).
This difference reflected the finding that control participants showed a selective deactivation of ipsilateral M1/PMv during visuomotor behavior, whereas individuals with ASD did not show this pattern. A significant group x force interaction was observed for contralateral Crus I activation (maximum t = −2.42) that was driven by an increase in activity during 60% compared to 20% MVC in control participants, while individuals with ASD showed no change in Crus I activation between force levels. DISCUSSION/SIGNIFICANCE OF IMPACT: Increased force variability in individuals with ASD suggests impaired processing of sensory feedback to guide precision motor behaviors. Individuals with ASD did not show deactivation of right motor cortex during visuomotor behavior relative to rest, suggesting reduced ability to selectively modulate motor cortical output. Reduced activation in left AG may reflect an inability to integrate visual, haptic, and proprioceptive inputs to reactively adjust ongoing motor output. Failure to show force-dependent scaling of Crus I in ASD suggests lateral cerebellar circuits do not adapt sensory prediction and error processes to maintain precision motor output during more demanding conditions. Together, our results demonstrate multiple cortical-cerebellar mechanisms associated with sensorimotor imprecision in ASD.
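The force-variability measure defined above (CoV = standard deviation of the force time series divided by mean force output) is straightforward to compute. A minimal sketch with a synthetic force trace; the MVC value, sample count, and oscillation are illustrative stand-ins, not the study's acquisition parameters:

```python
import math
from statistics import mean, pstdev

def coefficient_of_variation(force_trace):
    """CoV = standard deviation of the force time series / mean force output."""
    return pstdev(force_trace) / mean(force_trace)

# Hypothetical 24 s trace: holding ~20% MVC with a small sinusoidal wobble.
mvc = 50.0                       # maximum voluntary contraction, arbitrary units
target = 0.20 * mvc              # 20% MVC target force
trace = [target + 0.5 * math.sin(0.5 * t) for t in range(240)]
print(coefficient_of_variation(trace))
```

Because CoV normalizes by mean force, it allows variability to be compared across the 20% and 60% MVC conditions even though the absolute force levels differ.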
The Kulshan caldera formed at ∼1.15 Ma on the present-day site of Mt. Baker, Washington State, northwest USA, and erupted a compositionally zoned (dacite-rhyolite) magma and a correlative eruptive, the Lake Tapps tephra. This tephra has previously been described, but only from the Puget Lowland of NW Washington. Here an occurrence of a Kulshan caldera correlative tephra is described from the Quaternary Palouse loess at the Washtucna site (WA-3). Site WA-3 is located in east-central Washington, ∼340 km southeast of the Kulshan caldera and ∼300 km east-southeast of the Lake Tapps occurrence in the Puget Lowland. Major- and trace-element chemistry and the location of the deposit at Washtucna within reversed-polarity sediments indicate that it is not correlative with the Mesa Falls, Rockland, Bishop Ash, Lava Creek B or Huckleberry Ridge tephras. Instead, the Washtucna deposit is related to the Lake Tapps tephra by fractional crystallisation, but is chemically distinct, a consequence of its eruption from a compositionally zoned magma chamber. The correlation of the Washtucna occurrence to the Kulshan caldera-forming eruption indicates that it had an eruptive volume exceeding 100 km³, and that its tephra could provide a valuable early-Pleistocene chronostratigraphic marker in the Pacific Northwest.
We investigated the potential for human-mediated range expansion of an exotic beech leaf-mining weevil, Orchestes fagi (Linnaeus) (Coleoptera: Curculionidae: Curculioninae: Rhamphini) (formerly known as Rhynchaenus fagi) on timber or firewood, which for eight to nine months of the year may harbour adults in diapause. In both relatively low-density and high-density populations, adults were found on the base, middle, and upper boles of the primary host, American beech (Fagus grandifolia Ehrhart; Fagaceae), as well as red maple (Acer rubrum Linnaeus; Sapindaceae) and red spruce (Picea rubens Sargent; Pinaceae) in the vicinity. Comparatively few individuals were found on tree branches, or in the moss, duff, or soil collected beneath beech trees. Overwintering adults appeared to favour parts of trees with relatively high bark roughness. Our study suggests that, from July through May, any woody stems near areas with O. fagi outbreaks are likely to harbour adults. Moreover, as all of the trees studied are common sources of timber or firewood, the harvest and transport of wood from these areas may facilitate outbreak spread; this may explain the multiple, distantly distributed populations of O. fagi that have been reported in eastern Nova Scotia, Canada, in recent years.
Individuals with schizophrenia are known to demonstrate cognitive and behavioral difficulties, particularly alterations in executive functions, including working memory. It is unclear whether these deficits reflect trait-related vulnerability indicators for schizophrenia, which can be assessed by studying nonpsychotic relatives of patients with schizophrenia. In this study, we used an oculomotor delayed response (ODR) paradigm to examine spatial working memory in 37 “high-risk” child and adolescent offspring and siblings (age range=6–25 years) of patients with schizophrenia or schizoaffective disorder. Compared with 37 age- and sex-matched healthy controls (age range=6–23 years), high-risk subjects showed nonsignificantly greater errors in the ODR task at longer delay intervals. Statistical analyses suggested that performance improved with age in healthy control subjects, whereas it worsened with age in high-risk subjects. In both groups, the ODR errors were generally associated with poorer sustained attention (Continuous Performance Test: visuospatial d prime), somewhat poorer executive function (Wisconsin Card Sorting Test), and elevated Heinrichs-Buchanan neurological soft signs scores. These findings indicate the presence of spatial working memory abnormalities in individuals at risk for schizophrenia. Furthermore, these abnormalities may be progressive in nature. The ODR task is a valuable indicator of prefrontal cortical function and spatial working memory and may serve as a marker for familial risk of schizophrenia.
The current study examines the impact of a nutrition rating system on consumers’ food purchases in supermarkets.
Aggregate sales data for 102 categories of food (over 60 000 brands) on a weekly basis for 2005–2007 from a supermarket chain of over 150 stores are analysed. Change in weekly sales of nutritious and less nutritious foods, after the introduction of a nutrition rating system on store shelves, is calculated, controlling for seasonality and time trends in sales.
One hundred and sixty-eight supermarket stores in the north-east USA, from January 2005 to December 2007.
Consumers purchasing goods at the supermarket chain during the study period.
After the introduction of the nutrition ratings, overall weekly food sales declined by an average of 3637 units per category (95 % CI –5961, –1313; P<0·01). Sales of less nutritious foods fell by 8·31 % (95 % CI –13·50, –2·80 %; P=0·004), while sales of nutritious foods did not change significantly (P=0·21); as a result, the percentage of food purchases rated as nutritious rose by 1·39 % (95 % CI 0·58, 2·20 %; P<0·01). The decrease in sales of less nutritious foods was greatest in the categories of canned meat and fish, soda pop, bakery and canned vegetables.
The introduction of the nutrition ratings led shoppers to buy a more nutritious mix of products. Interestingly, it did so by reducing purchases of less nutritious foods rather than by increasing purchases of nutritious foods. In evaluating nutrition information systems, researchers should focus on the entire market basket, not just sales of nutritious foods.
Phytase (PHY) improves growth performance, nutrient digestibility and bone structure in pigs; however, little is known about its effects on intestinal nutrient transporter gene expression. In the present study, a 44 d experiment was carried out using forty-eight pigs (11·76 (sem 0·75) kg) assigned to one of three dietary treatment groups to measure growth performance, coefficient of apparent ileal digestibility (CAID), coefficient of apparent total tract nutrient digestibility (CATTD) and intestinal nutrient transporter gene expression. Dietary treatments during the experimental period were as follows: (1) a high-P (HP) diet containing 3·4 g/kg available P and 7·0 g/kg Ca; (2) a low-P (LP) diet containing 1·9 g/kg available P and 5·9 g/kg Ca; (3) a PHY diet containing LP diet ingredients+1000 phytase units (FTU)/kg of PHY. The PHY diet increased the average daily gain (P< 0·05) and final body weight (P< 0·01) and decreased the feed conversion ratio (P< 0·05) compared with the LP diet. Pigs fed the PHY diet had a higher CAID of gross energy compared with those fed the HP and LP diets (P< 0·001). Pigs fed the PHY diet had increased CAID of P (P< 0·01) and CATTD of Ca and P (P< 0·001) compared with those fed the LP diet. The PHY diet increased the gene expression of the peptide transporter 1 (PEPT1/SLC15A1) (P< 0·05) in the ileum compared with the LP diet. The LP diet decreased the gene expression of the sodium–glucose-linked transporter 1 (SGLT1/SLC5A1) and GLUT2/SLC2A2 (P< 0·05) and increased the expression of membrane Ca channel (TRPV6) and calbindin compared with the HP diet (P< 0·001). In conclusion, feeding a diet supplemented with PHY improves growth performance and nutrient digestibility as well as increases the gene expression of the peptide transporter PEPT1.
Olfactory identification deficits (OIDs) are seen in schizophrenia, but it is unclear whether they are state- or trait-related.
We examined the prevalence of OIDs, as assessed by the University of Pennsylvania Smell Identification Test (UPSIT), and their correlations with prodromal symptoms in young relatives at risk for schizophrenia or schizoaffective disorder (HR-S).
UPSIT scores were lower in HR-S than in healthy controls, but the difference was non-significant after covarying out the effects of age, gender and IQ. OIDs in HR-S were correlated, after covarying out the effects of age and IQ, with prodromal disorganisation.
The potential value of OIDs as markers of psychopathological vulnerability in young relatives at risk for schizophrenia deserves further investigation.