The Oxford English Dictionary defines psychopharmacology as ‘the scientific study of the effect of drugs on the mind and behaviour’ (Oxford English Dictionary Online, 2018). The earliest reference to the term was in 1548 when Reinhard Lorichius published the prayer book Psychopharmakon, hoc est Medicina Animae (Lehmann, 1993; Wolman, 1977). Lorichius coined the term ‘psychopharmakon’ to refer to spiritual medicine that could reduce human suffering. The word psychopharmacology was first used in a scientific paper in 1920 by a pharmacologist working at Johns Hopkins University who wrote a short paper entitled Contributions to psychopharmacology (Macht, 1920).
Introduction: Paramedics commonly administer intravenous dextrose to severely hypoglycemic patients. Typically, the treatment provided is a 25 g ampule of 50% dextrose (D50). This dose of D50 is meant to ensure a return to consciousness. However, this dose may be unnecessary and may lead to harm or to difficulties regulating blood glucose post-treatment. We hypothesize that a lower dose, such as dextrose 10% (D10), or titrating the D50 to the desired level of consciousness, may be optimal and avoid adverse events. Methods: We systematically searched Medline, Embase, CINAHL and Cochrane Central on June 5, 2019. PRISMA guidelines were followed. GRADE methods and risk of bias assessments were applied to determine the certainty of the evidence. We included primary literature investigating the use of intravenous dextrose in hypoglycemic diabetic patients presenting to paramedics or the emergency department. Outcomes of interest related to the safe and effective reversal of symptoms and to blood glucose levels (BGL). Results: 660 abstracts were screened and 40 full-text articles reviewed, with eight studies included. Data from three randomized controlled trials and five observational studies were analyzed. A single RCT comparing D10 to D50 was identified. The primary significant finding of that study was a post-treatment glycemic profile higher by 3.2 mmol/L in the D50 group; no other outcomes differed significantly between groups. When comparing pooled data from all the included studies, we found higher symptom resolution in the D10 group than in the D50 group (99.8% vs 94.9%, respectively). However, the mean time to resolution was approximately 4 minutes longer in the D10 group (4.1 minutes (D50) vs 8 minutes (D10)). Subsequent doses were needed more often in the D10 group (23.0% vs 16.5% in the D50 group). The post-treatment glycemic profile was lower in the D10 group, at 5.9 mmol/L versus 8.5 mmol/L in the D50 group.
Both treatments achieved nearly complete resolution of hypoglycemia: 98.7% (D50) and 99.2% (D10). No adverse events were observed in the D10 group (0/871), compared with 12/133 in the D50 group. Conclusion: D10 may be as effective as D50 at resolving symptoms and correcting hypoglycemia. Although the desired effect can take several minutes longer, there appear to be fewer adverse events. The lower post-treatment glycemic profile may make ongoing glucose management less challenging for patients.
We know from neurological diseases that there is more than one way to hallucinate, and this might also be the case in the psychiatric field. During a trial on refractory verbal hallucinations, we rediscovered a subgroup described under several names in France (Délire chronique d’évolution systématique, 1882; Psychose Hallucinatoire Chronique, 1911–1953), England (Late Paraphrenia, 1954) and Germany (Affective Paraphrenia, AP, 1968). Roughly, AP can be viewed as the core of paranoid schizophrenia.
We compared 10 AP patients with refractory hallucinations to 35 healthy controls with structural and functional MRI (fMRI). We looked for regions that presented with both grey matter deficit relative to controls and with hallucination-related activity. The lateral orbito-frontal cortex (LOF) was bilaterally involved both anatomically and functionally.
Using fMRI, we studied whole-brain functional connectivity, both as a trait factor (hallucinators vs controls) and as a state factor (ON vs OFF hallucinations in the same patient). As a trait, functional connectivity was significantly increased between the left and right LOF in patients relative to controls; as a state, however, functional connectivity dropped to zero between the left LOF and the left and right superior temporal sulcus (STS) when ON relative to OFF hallucination.
In a larger group of AP patients without ongoing hallucinations, the LOF was still disconnected from the cingulate and temporal regions, in comparison not only to controls but also to non-AP-type schizophrenias, most of whom also hallucinate during episodes.
We will discuss the “LOF-story hypothesis” for AP patients and their hallucinations.
Use of the herbicide atrazine (ATR) is banned in the European Union; yet it is still widely used in the USA and Australia. ATR is known to alter testosterone and oestrogen production and thus reproductive characteristics in numerous species. In this proof-of-concept study, we examined the effect of ATR exposure at a supra-environmental dose (5 mg/kg bw/day), from embryonic day 9.5 (E9.5) in utero (prior to sexual differentiation of the reproductive tissues) until 26 weeks of age, on the development of the mouse penis. Notably, this is the first study to specifically investigate whether ATR can affect penis characteristics. We show that ATR exposure beginning in utero causes a shortening (demasculinisation) of penis structures and increases the incidence of hypospadias in mice. These data indicate the need for further studies of ATR effects on human reproductive development and fertility, especially considering its continued and widespread use.
Advancements in computer technology have enabled three-dimensional (3D) reconstruction, data-stitching, and manipulation of 3D data obtained on X-ray imaging systems such as micro-computed tomography (μ-CT). Likewise, intuitive evaluation of these 3D datasets can be enhanced by recent advances in virtual reality (VR) hardware and software. Additionally, the generation, viewing, and manipulation of 3D X-ray diffraction datasets, such as pole figures employed for texture analysis, can also benefit from these advanced visualization techniques. We present newly developed protocols for porting 3D data (as TIFF stacks) into the Unity gaming software platform so that data may be toured, manipulated, and evaluated within a more intuitive VR environment through the use of game-like controls and 3D headsets. We demonstrate this capability by rendering μ-CT data of a polymer dogbone test bar at various stages of in situ mechanical strain. An additional experiment is presented showing 3D XRD data collected on an aluminum test block with vias. These 3D XRD data for texture analysis (χ, ϕ, 2θ dimensions) enable the viewer to visually inspect 3D pole figures and detect the presence or absence of in-plane residual macrostrain. These two examples serve to illustrate the benefits of this new methodology for multidimensional analysis.
Introduction: Oxygen is commonly administered to prehospital patients presenting with acute myocardial infarction (AMI). We conducted a systematic review to determine whether oxygen administration in AMI impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, the Cochrane Database of Systematic Reviews, Cochrane Central, clinicaltrials.gov and ISRCTN for relevant randomized controlled trials and observational studies comparing oxygen administration with no oxygen administration. The outcomes of interest were: mortality (≤30 days, in-hospital, and intermediate 2–11 months), infarct size, and major adverse cardiac events (MACE). Risk of bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. A meta-analysis was performed using RevMan 5 software. Results: Our search yielded 1192 citations, of which 48 studies were reviewed as full texts and a total of 8 studies were included in the analysis. All evidence was considered low or very low quality. Five studies reported on mortality, finding low-quality evidence of no benefit or harm from supplemental oxygen administration. Similarly, no benefit or harm was found for MACE or infarct size (very low quality). Normoxia was defined as oxygen saturation measured via pulse oximetry at ≥90% in one recent study and ≥94% in another. Conclusion: We found low- and very-low-quality evidence that the administration of supplemental oxygen to normoxic patients experiencing AMI provides no clear harm or benefit for mortality or MACE. The evidence on infarct size was inconsistent and warrants further prospective examination.
Introduction: Opioids are routinely administered for analgesia to prehospital patients experiencing chest discomfort from acute myocardial infarction (AMI). We conducted a systematic review to determine whether opioid administration impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, the Cochrane Database of Systematic Reviews, Cochrane Central and Clinicaltrials.gov for relevant randomized controlled trials and observational studies comparing opioid administration in AMI patients from 1990 to 2017. The outcomes of interest were: all-cause short-term mortality (≤30 days), major adverse cardiac events (MACE), platelet activity and aggregation, immediate adverse events, infarct size, and analgesia. Included studies were hand searched for additional citations. Risk of bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. Results: Our search yielded 3001 citations, of which 19 studies were reviewed as full texts and a total of 9 studies were included in the analysis. The studies predominantly reported on morphine as the opioid. Five studies reported on mortality (≤30 days), seven on MACE, four on platelet activity and aggregation, two on immediate adverse events, two on infarct size and none on analgesic effect. We found low-quality evidence suggesting no benefit or harm in terms of mortality or MACE. However, low-quality evidence indicates that opioids increase infarct size. Low-quality evidence also shows reduced serum active metabolite levels of P2Y12 inhibitors (e.g. clopidogrel and ticagrelor) and increased platelet reactivity in the first several hours post-administration, along with an increase in vomiting.
Conclusion: We found low- and very-low-quality evidence that the administration of opioids in STEMI may be associated with vomiting and with adverse effects on some surrogate outcomes, including increased infarct size, reduced serum P2Y12 inhibitor active metabolite levels, and increased platelet activity. We found no clear benefit or harm for patient-oriented clinical outcomes, including mortality.
Introduction: The Prehospital Evidence-Based Practice (PEP) program is an online, freely accessible, continuously updated Emergency Medical Services (EMS) evidence repository. This summary describes the research evidence for the identification and management of adult patients suffering from sepsis syndrome or septic shock. Methods: PubMed was searched in a systematic manner. One author reviewed titles and abstracts for relevance and two authors appraised each study selected for inclusion. Primary outcomes were extracted. Studies were scored by trained appraisers on a three-point Level of Evidence (LOE) scale (based on study design and quality) and a three-point Direction of Evidence (DOE) scale (supportive, neutral, or opposing findings based on the study’s primary outcome for each intervention). The LOE and DOE of each intervention were plotted on an evidence matrix (DOE × LOE). Results: Eighty-eight studies were included for 15 interventions listed in PEP. The interventions with the most evidence were related to identification (ID) tools (n = 26, 30%) and early goal-directed therapy (EGDT) (n = 21, 24%). ID tools included the Systemic Inflammatory Response Syndrome (SIRS) criteria, the quick Sequential Organ Failure Assessment (qSOFA) and other unique measures. The most common primary outcomes were related to diagnosis (n = 30, 34%), mortality (n = 40, 45%) and treatment goals (e.g. time to antibiotic) (n = 14, 16%). The evidence rankings for the supported interventions were: supportive-high quality (n = 1, 7%) for crystalloid infusion; supportive-moderate quality (n = 7, 47%) for identification tools, prenotification, point-of-care lactate, titrated oxygen, and temperature monitoring; and supportive-low quality (n = 1, 7%) for vasopressors. The benefit of prehospital antibiotics and EGDT remains inconclusive, with a neutral DOE. There is moderate-level evidence opposing the use of high-flow oxygen.
Conclusion: EMS sepsis interventions are informed primarily by moderate-quality supportive evidence. Several standard treatments are well supported by moderate- to high-quality evidence, as are identification tools. However, some standard in-hospital therapies, such as antibiotics and EGDT, are not supported by evidence in the prehospital setting. Based on primary outcomes, no identification tool appears superior. This evidence analysis can guide selection of appropriate prehospital therapies.
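The DOE × LOE evidence matrix described above is essentially a cross-tabulation of interventions by their two three-point scores. The sketch below illustrates that structure in Python; the intervention names and rankings are illustrative placeholders drawn loosely from the abstract, not the study’s full dataset:

```python
from collections import defaultdict

# Hypothetical interventions scored on the two three-point scales described
# above: Direction of Evidence (supportive/neutral/opposing) and Level of
# Evidence (high/moderate/low). Values are illustrative only.
scores = {
    "crystalloid infusion": ("supportive", "high"),
    "identification tools": ("supportive", "moderate"),
    "vasopressors": ("supportive", "low"),
    "prehospital antibiotics": ("neutral", "moderate"),
    "high flow oxygen": ("opposing", "moderate"),
}

# Build the DOE x LOE matrix: each cell collects the interventions that
# share a (direction, level) ranking.
matrix = defaultdict(list)
for intervention, (doe, loe) in scores.items():
    matrix[(doe, loe)].append(intervention)

for (doe, loe), interventions in sorted(matrix.items()):
    print(f"{doe}-{loe}: {', '.join(interventions)}")
```

Plotting each cell of such a matrix on a grid gives the evidence plot the program describes, with the best-supported interventions in the supportive-high corner.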
Introduction: Long-term immobility has detrimental effects for critically ill patients admitted to the intensive care unit (ICU), including ICU-acquired weakness. Early mobilization of patients admitted to the ICU has been demonstrated to be a safe, feasible and effective strategy to improve patient outcomes. The optimal mobilization of trauma ICU patients has not been extensively studied. Our objective was to determine the impact of an early mobilization protocol on outcomes among trauma patients admitted to the ICU. Methods: We analyzed all adult trauma patients (>18 years old) admitted to the ICU over a 2-year period prior to and following implementation of an early mobilization protocol, allowing for a 1-year transition period. Data were collected from the Nova Scotia Trauma Registry. We compared patient characteristics and outcomes (mortality, length of stay [LOS], ventilator days) between the pre- and post-implementation groups. Associations between early mobilization and clinical outcomes were estimated using binary and linear regression models. Results: Overall, 526 patients were included in the analysis (292 pre-implementation, 234 post-implementation). The study population ranged in age from 18 to 92 years (mean age 49.0 ± 20.4 years) and 74.3% of all patients were male. The pre- and post-implementation groups were similar in age, sex, and injury severity. In-hospital mortality was reduced in the post-implementation group (25.3% vs. 17.5%; p = 0.031). In addition, there was a reduction in ICU mortality in the post-implementation group (21.6% vs. 12.8%; p = 0.009). We did not observe any difference in overall hospital LOS, ICU LOS, or ventilator days between the two groups. Compared to the pre-implementation period, trauma patients admitted to the ICU following protocol implementation were less likely to die in hospital (OR = 0.52, 95% CI 0.30–0.91; p = 0.021) or in the ICU (OR = 0.40, 95% CI 0.21–0.76; p = 0.005).
Results were similar following a sensitivity analysis limited to patients with blunt or penetrating injuries. Conclusion: We found that trauma patients admitted to the ICU during the post-implementation period had decreased odds of both in-hospital and ICU mortality. Ours is the first study to demonstrate a significant reduction in trauma mortality following implementation of an ICU mobility protocol.
Introduction: Previous systematic reviews suggest early mobilization in the intensive care unit (ICU) population is feasible, safe, and may improve outcomes. Only one review investigated mobilization specifically in trauma ICU patients and failed to identify any relevant articles. The objective of the present systematic review was to conduct an up-to-date search of the literature to assess the effect of early mobilization in adult trauma ICU patients on mortality, length of stay (LOS) and duration of mechanical ventilation. Methods: We performed a systematic search of four electronic databases (Ovid MEDLINE, Embase, CINAHL, Cochrane Library) and the grey literature. To be included, studies must have compared early mobilization to delayed or no mobilization among trauma patients admitted to the ICU. Meta-analysis was performed to determine the effect of early mobilization on mortality, hospital LOS, ICU LOS, and duration of mechanical ventilation. Results: The search yielded 2,975 records from the 4 databases and 7 records from grey literature and bibliographic searches; of these, 9 articles met all eligibility criteria and were included in the analysis. There were 7 studies performed in the United States, 1 study from China and 1 study from Norway. Study populations included neurotrauma (3 studies), blunt abdominal trauma (2 studies), mixed injury types (2 studies) and burns (1 study). Cohorts ranged in size from 15 to 1,132 patients (median, 63) and varied in inclusion criteria. Most studies used some form of stepwise progressive mobility protocol. Two studies used simple ambulation as the mobilization measure, and 1 study employed upright sitting as their only intervention. Time to commencement of the intervention was variable across studies, and only 2 studies specified the timing of mobilization initiation. We did not detect a difference in mortality with early mobilization, although the pooled risk ratio (RR) was reduced (RR 0.90, 95% CI 0.74 to 1.09). 
Hospital LOS and ICU LOS were decreased with early mobilization, though these differences did not reach significance. Duration of mechanical ventilation was significantly shorter in the early mobilization group (mean difference −1.18, 95% CI −2.17 to −0.19). Conclusion: Our review identified few studies that examined mobilization of critically ill trauma patients in the ICU. On meta-analysis, early mobilization was found to reduce the duration of mechanical ventilation, but the effects on mortality and LOS were not significant.
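The pooled risk ratio and confidence interval reported above come from a standard meta-analytic calculation. The following is a minimal sketch of fixed-effect inverse-variance pooling of log risk ratios in Python, using hypothetical study counts rather than the review’s actual data:

```python
import math

def risk_ratio_log(events_t, n_t, events_c, n_c):
    """Log risk ratio and its standard error from 2x2 counts."""
    rr = (events_t / n_t) / (events_c / n_c)
    # Standard error of ln(RR), delta-method approximation
    se = math.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)
    return math.log(rr), se

def pool_fixed_effect(studies):
    """Inverse-variance fixed-effect pooling of log risk ratios."""
    num = den = 0.0
    for counts in studies:
        log_rr, se = risk_ratio_log(*counts)
        w = 1.0 / se**2          # inverse-variance weight
        num += w * log_rr
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi))

# Hypothetical (events, n) pairs for early-mobilization vs control arms;
# illustrative numbers only, not data from the included studies.
studies = [
    (10, 100, 12, 100),
    (5, 60, 6, 55),
    (20, 300, 25, 310),
]
rr, (lo, hi) = pool_fixed_effect(studies)
print(f"Pooled RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A random-effects model, often preferred when studies are clinically heterogeneous, would add a between-study variance estimate to each weight before pooling.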
Despite the significant health benefits of breastfeeding for the mother and the infant, economic class and race disparities in breastfeeding rates persist. Support for breastfeeding from the father of the infant is associated with higher rates of breastfeeding initiation. However, little is known about the factors that may promote or deter father support of breastfeeding, especially among fathers exposed to contextual adversity such as poverty and violence. Using a mixed-methods approach, the primary aims of the current work were (1) to elicit, using qualitative methodology, the worries, barriers and promotive factors for breastfeeding that expectant mothers and fathers identify as they prepare to parent a new infant, and (2) to examine, using quantitative methodology, factors that influence the breastfeeding intentions of both mothers and fathers. A sample (N = 95) of expectant, third-trimester mothers and fathers living in a low-income urban environment in the Midwestern USA was interviewed from October 2013 to February 2015 about their infant feeding intentions. Compared with fathers, mothers more often identified the benefits of breastfeeding for the infant’s health and the economic advantage of breastfeeding. Mothers also identified more personal and community breastfeeding support resources. Fathers viewed their own support of breastfeeding as important but expressed a lack of knowledge about the breastfeeding process and often excluded themselves from discussions about infant feeding. The results point to important targets for interventions that aim to increase breastfeeding initiation rates among vulnerable populations in the US by increasing father support for breastfeeding.
Legionnaires’ disease (LD) incidence in the USA has quadrupled since 2000. Health departments must detect LD outbreaks quickly to identify and remediate sources. We tested the performance of a system to prospectively detect simulated LD outbreaks in Allegheny County, Pennsylvania, USA. We generated three simulated LD outbreaks based on published outbreaks. After verifying that no significant clusters existed in surveillance data during 2014–2016, we embedded simulated outbreak-associated cases into 2016, assigning simulated residences and report dates. We mimicked daily analyses in 2016 using the prospective space-time permutation scan statistic to detect clusters of ⩽30 and ⩽180 days using 365-day and 730-day baseline periods, respectively. We used recurrence interval (RI) thresholds of ⩾20, ⩾100 and ⩾365 days to define significant signals. We calculated sensitivity, specificity and positive and negative predictive values for the daily analyses, separately for each embedded outbreak. Two large, simulated cooling tower-associated outbreaks were detected. As the RI threshold was increased, sensitivity and negative predictive value decreased, while positive predictive value and specificity increased. A small, simulated potable water-associated outbreak was not detected. Use of an RI threshold of ⩾100 days minimised time to detection while maximising positive predictive value. Health departments should consider using this system to detect community-acquired LD outbreaks.
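The daily-analysis performance metrics described above reduce to a confusion matrix over days of mimicked surveillance: each day either does or does not produce a signal at a given recurrence-interval threshold, and either does or does not fall within an embedded outbreak. A minimal sketch in Python, with illustrative counts rather than the study’s results:

```python
def detection_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from daily confusion counts."""
    sensitivity = tp / (tp + fn)   # signalled outbreak days / all outbreak days
    specificity = tn / (tn + fp)   # quiet non-outbreak days / all non-outbreak days
    ppv = tp / (tp + fp)           # true signals / all signals
    npv = tn / (tn + fn)           # true quiet days / all quiet days
    return sensitivity, specificity, ppv, npv

# Hypothetical counts over 366 daily analyses for one embedded outbreak.
sens, spec, ppv, npv = detection_metrics(tp=25, fp=10, fn=5, tn=326)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} ppv={ppv:.2f} npv={npv:.2f}")
```

Raising the RI threshold suppresses weak signals: false positives fall (raising PPV and specificity) but some true outbreak days are missed (lowering sensitivity and NPV), which is the trade-off the study reports.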
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
Movement disorders associated with exposure to antipsychotic drugs are common and stigmatising but underdiagnosed.
To develop and evaluate a new clinical procedure, the ScanMove instrument, for use by mental health nurses in screening for antipsychotic-associated movement disorders.
Item selection and content validity assessment for the ScanMove instrument were conducted by a panel of neurologists, psychiatrists and a mental health nurse, who operationalised a 31-item screening procedure. Interrater reliability was measured on ratings for 30 patients with psychosis from ten mental health nurses evaluating video recordings of the procedure. Criterion and concurrent validity were tested by comparing the ScanMove instrument-based ratings of 13 mental health nurses for 635 community patients from mental health services with the diagnostic judgement of a movement disorder neurologist, based on the ScanMove instrument and a reference procedure comprising a selection of commonly used rating scales.
Interrater reliability analysis showed no systematic difference between raters in their prediction of any antipsychotic-associated movement disorder category. On criterion validity testing, the ScanMove instrument showed good sensitivity for parkinsonism (90%) and hyperkinesia (89%), but not for akathisia (38%), whereas specificity was low for parkinsonism and hyperkinesia, and moderate for akathisia.
The ScanMove instrument demonstrated good feasibility and interrater reliability, and acceptable sensitivity as a mental health nurse-administered screening tool for parkinsonism and hyperkinesia.
During the 2009 influenza pandemic, rapid assessment of disease severity was a challenge: a significant proportion of cases did not seek medical care, care-seeking behaviour changed, and the proportion asymptomatic was unknown. A random-digit-dialling telephone survey was undertaken during the 2011/12 winter season in England and Wales to assess the feasibility of answering these questions. A proportional quota sampling strategy was employed based on gender, age group, geographical location, employment status and level of education. Households were recruited pre-season and re-contacted immediately following peak seasonal influenza activity. The pre-peak survey was undertaken in October 2011, with 1061 individuals recruited, and the post-peak telephone survey in March 2012. Eight hundred and thirty-four of the 1061 (78.6%) participants were successfully re-contacted. Their demographic characteristics compared well with national census data. In total, 8.4% of participants self-reported an influenza-like illness (ILI) in the previous 2 weeks, with 3.2% conforming to the World Health Organization (WHO) ILI case definition. In total, 29.6% of the cases reported consulting their general practitioner. Of the 1061 participants, 54.1% agreed to be re-contacted about providing biological samples. A population-based cohort was successfully recruited and followed up. Longitudinal survey methodology provides a practical tool for assessing disease severity during future pandemics.
Significant increases in excess all-cause mortality, particularly in the elderly, were observed during the winter of 2014/15 in England. With influenza A(H3N2) the dominant circulating influenza A subtype, this paper determines the contribution of influenza to this excess while controlling for weather. A standardised multivariable Poisson regression model was employed with weekly all-cause deaths as the dependent variable for the period 2008–2015. Adjusting for extreme temperature, a total of 26 542 (95% CI 25 301–27 804) deaths in those aged 65+ years and 1942 (95% CI 1834–2052) in those aged 15–64 years were associated with influenza from week 40, 2014 to week 20, 2015. This is compatible with the circulation of influenza A(H3N2). It is the largest estimated number of influenza-related deaths in England since before 2008/09. The findings highlight the potential health impact of influenza and the important role of the annual influenza vaccination programme in protecting the population, including the elderly, who are vulnerable to a severe outcome.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
The Molonglo Observatory Synthesis Telescope (MOST) is an 18 000 m² radio telescope located 40 km from Canberra, Australia. Its operating band (820–851 MHz) is partly allocated to telecommunications, making radio astronomy challenging. We describe how the deployment of new digital receivers, Field Programmable Gate Array-based filterbanks, and server-class computers equipped with 43 Graphics Processing Units has transformed the telescope into a versatile new instrument (UTMOST) for studying the radio sky on millisecond timescales. UTMOST has 10 times the bandwidth and double the field of view compared to the MOST, and voltage record and playback capability has facilitated rapid implementation of many new observing modes, most of which operate commensally. UTMOST can simultaneously excise interference, make maps, coherently dedisperse pulsars, and perform real-time searches of coherent fan-beams for dispersed single pulses. UTMOST operates as a robotic facility, deciding how to efficiently target pulsars and how long to stay on source via real-time pulsar folding, while searching for single pulse events. Regular timing of over 300 pulsars has yielded seven pulsar glitches and three Fast Radio Bursts during commissioning. UTMOST demonstrates that if sufficient signal processing is applied to voltage streams, innovative science remains possible even in hostile radio frequency environments.
Childhood trauma is a risk factor for psychosis. Deficits in response inhibition are common to psychosis and trauma-exposed populations, and associated brain functions may be affected by trauma exposure in psychotic disorders. We aimed to identify the influence of trauma-exposure on brain activation and functional connectivity during a response inhibition task.
We used functional magnetic resonance imaging to examine brain function within regions-of-interest [left and right inferior frontal gyrus (IFG), right dorsolateral prefrontal cortex, right supplementary motor area, right inferior parietal lobule and dorsal anterior cingulate cortex], during the performance of a Go/No-Go Flanker task, in 112 clinical cases with psychotic disorders and 53 healthy controls (HCs). Among the participants, 71 clinical cases and 21 HCs reported significant levels of childhood trauma exposure, while 41 clinical cases and 32 HCs did not.
In the absence of effects on response inhibition performance, childhood trauma exposure was associated with increased activation in the left IFG, and with increased connectivity between the left IFG seed region and the cerebellum and calcarine sulcus, in both clinical cases and healthy individuals. There was no main effect of psychosis, and no trauma-by-psychosis interaction, for any other region-of-interest. Within the clinical sample, the effects of trauma exposure on left IFG activation were mediated by symptom severity.
Trauma-related increases in activation of the left IFG were not associated with performance differences, or dependent on clinical diagnostic status; increased IFG functionality may represent a compensatory (overactivation) mechanism required to exert adequate inhibitory control of the motor response.
Healthy adults (n 30) participated in a placebo-controlled, randomised, double-blinded, cross-over study consisting of two 28 d treatments (β2-1 fructan or maltodextrin; 3×5 g/d) separated by a 14-d washout. Subjects provided 1 d faecal collections at days 0 and 28 of each treatment. The ability of faecal bacteria to metabolise β2-1 fructan was common; eighty-seven species (thirty genera and four phyla) were isolated using anaerobic medium containing β2-1 fructan as the sole carbohydrate source. β2-1 fructan altered the faecal community, as determined through analysis of terminal restriction fragment length polymorphisms and 16S rRNA genes. Supplementation with β2-1 fructan reduced faecal community richness, and two patterns of community change were observed. In most subjects, β2-1 fructan reduced the content of phylotypes aligning within the Bacteroides while increasing those aligning within the bifidobacteria, Faecalibacterium and the family Lachnospiraceae. In the remaining subjects, supplementation increased the abundance of Bacteroidetes and, to a lesser extent, bifidobacteria, accompanied by decreases within Faecalibacterium and the family Lachnospiraceae. β2-1 Fructan had no impact on the metagenome or glycoside hydrolase profiles in faeces from four subjects. Few relationships were found between the faecal bacterial community and various host parameters: Bacteroidetes content correlated with faecal propionate; subjects whose faecal community contained more Bacteroidetes produced more caproic acid independent of treatment; and subjects having lower faecal Bacteroidetes exhibited increased concentrations of serum lipopolysaccharide and lipopolysaccharide-binding protein independent of treatment. We found no evidence to support a defined health benefit for the use of β2-1 fructans in healthy subjects.