There is significant interest in the use of angiotensin converting enzyme inhibitors (ACE-I) and angiotensin II receptor blockers (ARB) in coronavirus disease 2019 (COVID-19) and concern over potential adverse effects since these medications upregulate the severe acute respiratory syndrome coronavirus 2 host cell entry receptor ACE2. Recent studies on ACE-I and ARB in COVID-19 were limited by excluding outpatients, excluding patients by age, analyzing ACE-I and ARB together, imputing missing data, and/or diagnosing COVID-19 by chest computed tomography without definitive reverse transcription polymerase chain reaction (RT-PCR), all of which are addressed here.
We performed a retrospective cohort study of 1023 COVID-19 patients diagnosed by RT-PCR at Stanford Hospital through April 8, 2020, with a minimum follow-up time of 14 days, to investigate the association between ACE-I or ARB use and outcomes.
Use of ACE-I or ARB medications was not associated with increased risk of hospitalization, intensive care unit admission, or death. Compared to patients with charted past medical history, there was a lower risk of hospitalization for patients on ACE-I (odds ratio (OR) 0.43; 95% confidence interval (CI) 0.19–0.97; P = 0.0426) and ARB (OR 0.39; 95% CI 0.17–0.90; P = 0.0270). Compared to patients with hypertension not on ACE-I or ARB, patients on ARB medications had a lower risk of hospitalization (OR 0.09; 95% CI 0.01–0.88; P = 0.0381).
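The odds ratios above come from the study's regression models, but the basic arithmetic of an odds ratio and a Wald-type 95% confidence interval can be illustrated from a 2×2 table. A minimal Python sketch — the function name and counts are illustrative, not the study's actual data or code:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Illustrative counts only (not the Stanford cohort data)
or_, lo, hi = odds_ratio_ci(10, 20, 15, 10)
```

An odds ratio whose entire confidence interval lies below 1, as reported for ARB users versus hypertensive non-users, indicates lower odds of the outcome in the exposed group.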
These findings suggest that the use of ACE-I and ARB is not associated with adverse outcomes and may be associated with improved outcomes in COVID-19, which is immediately relevant to care of the many patients on these medications.
Given the rapidly progressing coronavirus disease 2019 (COVID-19) pandemic, this report on a US cohort of 54 COVID-19 patients from Stanford Hospital and data regarding risk factors for severe disease obtained at initial clinical presentation is highly important and immediately clinically relevant. We identified low presenting oxygen saturation as predictive of severe disease outcomes, such as diagnosis of pneumonia, acute respiratory distress syndrome, and admission to the intensive care unit, and also replicated data from China suggesting an association between hypertension and disease severity. Clinicians will benefit from tools to rapidly risk-stratify patients at presentation by likelihood of progression to severe disease.
Early detection and intervention strategies in patients at clinical high-risk (CHR) for syndromal psychosis have the potential to contain the morbidity of schizophrenia and similar conditions. However, research criteria that have relied on severity and number of positive symptoms are limited in their specificity and risk high false-positive rates. Our objective was to examine the degree to which measures of recency of onset or intensification of positive symptoms [a.k.a., new or worsening (NOW) symptoms] contribute to predictive capacity.
We recruited 109 help-seeking individuals whose symptoms met criteria for the Progression Subtype of the Attenuated Positive Symptom Psychosis-Risk Syndrome defined by the Structured Interview for Psychosis-Risk Syndromes; participants were followed every three months for two years or until onset of syndromal psychosis.
Forty-one (40.6%) of 101 participants meeting CHR criteria developed a syndromal psychotic disorder [mostly (80.5%) schizophrenia], with half converting within 142 days (interquartile range: 69–410 days). Patients with more NOW symptoms were more likely to convert (converters: 3.63 ± 0.89; non-converters: 2.90 ± 1.27; p = 0.001). Patients with stable attenuated positive symptoms were less likely to convert than those with NOW symptoms. New symptoms, but not worsening symptoms, also predicted conversion when considered in isolation.
Results suggest that the severity and number of attenuated positive symptoms are less predictive of conversion to syndromal psychosis than the timing of their emergence and intensification. These findings also suggest that the earliest phase of psychotic illness involves a rapid, dynamic process, beginning before the syndromal first episode, with potentially substantial implications for CHR research and understanding the neurobiology of psychosis.
We assessed self-reported drives for alcohol use and their impact on clinical features of alcohol use disorder (AUD) patients. Our prediction was that, in contrast to “affectively” (reward or fear) driven drinking, “habitual” drinking would be associated with worse clinical features in relation to alcohol use and higher occurrence of associated psychiatric symptoms.
Fifty-eight patients with Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) alcohol abuse were assessed with a comprehensive battery measuring reward- and fear-based behavioral tendencies. An 18-item self-report instrument (the Habit, Reward and Fear Scale; HRFS) was employed to quantify affective (fear or reward) and non-affective (habitual) motivations for alcohol use. To characterize clinical and demographic measures associated with habit, reward, and fear, we conducted a partial least squares analysis.
Habitual alcohol use was significantly associated with the severity of alcohol dependence reflected across a range of domains and with a lower number of detoxifications across multiple settings. In contrast, reward-driven alcohol use was associated with a single domain of alcohol dependence, reward-related behavioral tendencies, and a lower number of detoxifications.
These results seem to be consistent with a shift from goal-directed to habit-driven alcohol use with the severity and progression of addiction, complementing preclinical work and informing biological models of addiction. Both reward-related and habit-driven alcohol use were associated with a lower number of detoxifications, perhaps stemming from a more benign course in the reward-driven group and a lack of treatment engagement in the habit-driven group. Future work should further explore the role of habit in this and other addictive disorders, and in obsessive-compulsive related disorders.
Many patients with advanced serious illness or at the end of life experience delirium, a potentially reversible form of acute brain dysfunction, which may impair their ability to participate in medical decision-making and to engage with their loved ones. Screening for delirium provides an opportunity to address modifiable causes. Unfortunately, delirium remains underrecognized. The main objective of this pilot was to validate the brief Confusion Assessment Method (bCAM), a two-minute delirium-screening tool, in a veteran palliative care sample.
This was a pilot prospective, observational study that included hospitalized patients evaluated by the palliative care service at a single Veterans’ Administration Medical Center. The bCAM was compared against the reference standard, the Diagnostic and Statistical Manual of Mental Disorders, fifth edition. Both assessments were blinded and conducted within 30 minutes of each other.
We enrolled 36 patients with a median age of 67 years (interquartile range 63–73). The primary reasons for admission to the hospital were sepsis or severe infection (33%), severe cardiac disease (including heart failure, cardiogenic shock, and myocardial infarction) (17%), or gastrointestinal/liver disease (17%). The bCAM performed well against the Diagnostic and Statistical Manual of Mental Disorders, fifth edition, for detecting delirium, with a sensitivity (95% confidence interval) of 0.80 (0.4, 0.96) and specificity of 0.87 (0.67, 0.96).
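Sensitivity and specificity of this kind are simple functions of confusion-matrix counts. The study's interval method is not stated here, so the Wilson score interval below is just one common choice; the counts are illustrative, not the study's data:

```python
import math

def sensitivity_specificity(tp, fn, tn, fp):
    """Point estimates from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    spread = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - spread, centre + spread

# Illustrative counts only: 10 delirious (8 detected), 26 not delirious (23 ruled out)
sens, spec = sensitivity_specificity(tp=8, fn=2, tn=23, fp=3)
sens_ci = wilson_ci(8, 10)
```

With small samples such as this pilot, the interval around sensitivity is wide, which is why the reported lower bound (0.4) is far from the point estimate (0.80).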
Significance of Results
Delirium was present in 27% of patients enrolled but was never recognized by the palliative care service in routine clinical care. The bCAM provided good sensitivity and specificity in this pilot sample of palliative care patients, offering a method for nonpsychiatrically trained personnel to detect delirium.
Nitrate (NO3−) is an ergogenic nutritional supplement that is widely used to improve physical performance. However, the effectiveness of NO3− supplementation has not been systematically investigated in individuals with different physical fitness levels. The present study analysed whether different fitness levels (non-athletes v. athletes or classification of performance levels), duration of the test used to measure performance (short v. long duration) and the test protocol (time trials v. open-ended tests v. graded-exercise tests) influence the effects of NO3− supplementation on performance. This systematic review and meta-analysis was conducted and reported according to the guidelines outlined in the Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) statement. A systematic search of electronic databases, including PubMed, Web of Science, SPORTDiscus and ProQuest, was performed in August 2017. On the basis of the search and inclusion criteria, fifty-four and fifty-three placebo-controlled studies evaluating the effects of NO3− supplementation on performance in humans were included in the systematic review and meta-analysis, respectively. NO3− supplementation was ergogenic in non-athletes (mean effect size (ES) 0·25; 95 % CI 0·11, 0·38), particularly in evaluations of performance using long-duration open-ended tests (ES 0·47; 95 % CI 0·23, 0·71). In contrast, NO3− supplementation did not enhance the performance of athletes (ES 0·04; 95 % CI −0·05, 0·15). After objectively classifying the participants into different performance levels, the frequency of trials showing ergogenic effects in individuals classified at lower levels was higher than that in individuals classified at higher levels. Thus, the present study indicates that dietary NO3− supplementation improves physical performance in non-athletes, particularly during long-duration open-ended tests.
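The pooled effect sizes (ES with 95% CI) reported above are standardised mean differences combined across studies. As a sketch of the underlying arithmetic only — not the authors' analysis code, and using a fixed-effect inverse-variance pool where the review may have used a random-effects model:

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardised mean difference with small-sample (Hedges) correction."""
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    j = 1 - 3 / (4 * (n_t + n_c) - 9)  # small-sample correction factor
    return d * j

def pool_fixed(effects, variances, z=1.96):
    """Inverse-variance fixed-effect pooled estimate with 95% CI."""
    weights = [1 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return est, est - z * se, est + z * se
```

A pooled CI that excludes zero (as for non-athletes, ES 0.25, CI 0.11–0.38) indicates a statistically significant ergogenic effect; one that spans zero (as for athletes) does not.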
The replacement of fish oil (FO) with vegetable oil (VO) in feed formulations reduces the availability of n-3 long-chain PUFA (LC-PUFA) to marine fish such as gilthead seabream. The aim of this study was to examine compositional and physiological responses to a dietary gradient of n-3 LC-PUFA. Six iso-energetic and iso-nitrogenous diets (D1–D6) were fed to seabream, with the added oil being a blend of FO and VO to achieve a dietary gradient of n-3 LC-PUFA. Fish were sampled after 4 months of feeding to determine biochemical composition, tissue fatty acid concentrations and lipid metabolic gene expression. The results indicated a disturbance to lipid metabolism, with fat in the liver increased and fat deposits in the viscera reduced. Tissue fatty acid profiles were altered towards the fatty acid compositions of the diets. There was evidence of endogenous modification of dietary PUFA in the liver which correlated with the expression of fatty acid desaturase 2 (fads2). Expression of sterol regulatory element binding protein 1 (srebp1), fads2 and fatty acid synthase increased in the liver, whereas PPARα1 pathways appeared to be suppressed by dietary VO in a concentration-dependent manner. The effects on lipogenic genes appear to become measurable in D1–D3, which agrees with the weight gain data, suggesting that disturbances to energy metabolism and lipogenesis may be related to performance differences. These findings suggest that suppression of β-oxidation and stimulation of srebp1-mediated lipogenesis may play a role in contributing toward steatosis in fish fed n-3 LC-PUFA-deficient diets.
Making predictions about aliens is not an easy task. Most previous work has focused on extrapolating from empirical observations and mechanistic understanding of physics, chemistry and biology. Another approach is to utilize theory to make predictions that are not tied to the details of Earth. Here we show how evolutionary theory can be used to make predictions about aliens. We argue that aliens will undergo natural selection – something that should not be taken for granted but that rests on firm theoretical grounds. Given that aliens undergo natural selection, we can say something about their evolution. In particular, we can say something about how complexity will arise in space. Complexity has increased on the Earth as a result of a handful of events, known as the major transitions in individuality. Major transitions occur when groups of individuals come together to form a new higher-level individual, such as when single-celled organisms evolved into multicellular organisms. Both theory and empirical data suggest that extreme conditions are required for major transitions to occur. We suggest that major transitions are likely to be the route to complexity on other planets, and that we should expect them to have been favoured by similarly restrictive conditions. Thus, we can make specific predictions about the biological makeup of complex aliens.
The Taipan galaxy survey (hereafter simply ‘Taipan’) is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≳ 10°), and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative ‘Starbugs’ positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H₀) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo and stellar mass and environment. The final survey, which will be completed within 5 years, will consist of a complete magnitude-limited sample (i ⩽ 17) of about 1.2 × 10⁶ galaxies supplemented by an extension to higher redshifts and fainter magnitudes (i ⩽ 18.1) of a luminous red galaxy sample of about 0.8 × 10⁶ galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using purpose-built automated ‘virtual observer’ software and an automated data reduction pipeline.
The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.
The subsurface exploration of other planetary bodies can be used to unravel their geological history and assess their habitability. On Mars in particular, present-day habitable conditions may be restricted to the subsurface. Using a deep subsurface mine, we carried out a program of extraterrestrial analog research – MINe Analog Research (MINAR). MINAR aims to carry out the scientific study of the deep subsurface and test instrumentation designed for planetary surface exploration by investigating deep subsurface geology, whilst establishing the potential this technology has to be transferred into the mining industry. An integrated multi-instrument suite was used to investigate samples of representative evaporite minerals from a subsurface Permian evaporite sequence, in particular to assess mineral and elemental variations which provide small-scale regions of enhanced habitability. The instruments used were the Panoramic Camera emulator, Close-Up Imager, Raman spectrometer, Small Planetary Linear Impulse Tool, Ultrasonic drill and handheld X-ray diffraction (XRD). We present science results from the analog research and show that these instruments can be used to investigate in situ the geological context and mineralogical variations of a deep subsurface environment, and thus habitability, from millimetre to metre scales. We also show that these instruments are complementary. For example, the identification of primary evaporite minerals such as NaCl and KCl, which are difficult to detect by portable Raman spectrometers, can be accomplished with XRD. By contrast, Raman is highly effective at locating and detecting mineral inclusions in primary evaporite minerals. MINAR demonstrates the effective use of a deep subsurface environment for planetary instrument development, understanding the habitability of extreme deep subsurface environments on Earth and other planetary bodies, and advancing the use of space technology in economic mining.
Gene expression profiling of in vivo- and in vitro-matured bovine oocytes can identify transcripts related to the developmental potential of oocytes. Nonetheless, the effects of in vitro culturing oocytes are yet to be fully understood. We tested the effects of in vitro maturation on the transcript profile of oocytes collected from Bos taurus indicus cows. We quantified the expression of 1488 genes in in vivo- and in vitro-matured oocytes. Of these, 51 genes were up-regulated, whereas 56 were down-regulated (≥2-fold) in in vivo-matured oocytes in comparison with in vitro-matured oocytes. Quantitative real-time polymerase chain reaction (PCR) of nine genes confirmed the microarray results of differential expression between in vivo- and in vitro-matured oocytes (EZR, EPN1, PSEN2, FST, IGFBP3, RBBP4, STAT3, FDPS and IRS1). We interrogated the results for enrichment of Gene Ontology categories and overlap with protein–protein interactions. The results revealed that the genes altered by in vitro maturation are mostly related to the regulation of oocyte metabolism. Additionally, analysis of protein–protein interactions uncovered two regulatory networks affected by the in vitro culture system. We propose that the differentially expressed genes are candidates for biomarkers of oocyte competence. In vitro oocyte maturation can affect the abundance of specific transcripts and is likely to reduce developmental competence.
Depression and anxiety in Parkinson's disease are common and frequently co-morbid, with significant impact on health outcome. Nevertheless, management is complex and often suboptimal. The existence of clinical subtypes would support stratified approaches in both research and treatment.
Five hundred and thirteen patients with Parkinson's disease were assessed annually for up to 4 years. Latent transition analysis (LTA) was used to identify classes that may conform to clinically meaningful subgroups, transitions between those classes over time, and baseline clinical and demographic features that predict common trajectories.
In total, 64.1% of the sample remained in the study at year 4. LTA identified four classes, a ‘Psychologically healthy’ class (approximately 50%), and three classes associated with psychological distress: one with moderate anxiety alone (approximately 20%), and two with moderate levels of depression plus moderate or severe anxiety. Class membership tended to be stable across years, with only about 15% of individuals transitioning between the healthy class and one of the distress classes. Stable distress was predicted by higher baseline depression and psychiatric history and younger age of onset of Parkinson's disease. Those with younger age of onset were also more likely to become distressed over the course of the study.
Psychopathology was characterized by relatively stable anxiety or anxious-depression over the 4-year period. Anxiety, with or without depression, appears to be the prominent psychopathological phenotype in Parkinson's disease, suggesting a pressing need to understand its mechanisms and improve management.
A symptom of mild cognitive impairment (MCI) and Alzheimer’s disease (AD) is a flat learning profile. Learning slope calculation methods vary, and the optimal method for capturing neuroanatomical changes associated with MCI and early AD pathology is unclear. This study cross-sectionally compared four different learning slope measures from the Rey Auditory Verbal Learning Test (simple slope, regression-based slope, two-slope method, peak slope) to structural neuroimaging markers of early AD neurodegeneration (hippocampal volume; cortical thickness in the parahippocampal gyrus, precuneus, and lateral prefrontal cortex) across the cognitive aging spectrum [normal controls (NC; n=198, age=76±5), MCI (n=370, age=75±7), and AD (n=171, age=76±7)] in ADNI. Within each diagnostic group, general linear models related the slope methods individually to neuroimaging variables, adjusting for age, sex, education, and APOE4 status. Among MCI participants, better learning performance on the simple slope, regression-based slope, and late slope (Trials 2–5) from the two-slope method was related to larger parahippocampal thickness (all p-values<.01) and hippocampal volume (p<.01). Better regression-based slope (p<.01) and late slope (p<.01) were also related to larger ventrolateral prefrontal cortex in MCI. No significant associations emerged between any slope and neuroimaging variables for NC (p-values ≥.05) or AD (p-values ≥.02). Thus, better learning performance was related to larger medial temporal lobe (i.e., hippocampal volume, parahippocampal gyrus thickness) and ventrolateral prefrontal cortex in MCI only. Regression-based and late slope were most highly correlated with neuroimaging markers and explained more variance above and beyond other common memory indices, such as total learning. Simple slope may offer an acceptable alternative given its ease of calculation. (JINS, 2015)
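The four slope measures compared above are typically simple functions of per-trial recall scores. The study's exact formulas are not reproduced here, so the following Python definitions are common operationalizations offered only as an illustration:

```python
def simple_slope(trials):
    """(last trial - first trial) / number of inter-trial intervals."""
    return (trials[-1] - trials[0]) / (len(trials) - 1)

def regression_slope(trials):
    """Ordinary least-squares slope of recall score on trial number."""
    n = len(trials)
    xs = range(1, n + 1)
    x_bar = sum(xs) / n
    y_bar = sum(trials) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, trials))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

def two_slope(trials):
    """Early slope (Trial 1 to 2) and late slope (Trial 2 to last)."""
    early = trials[1] - trials[0]
    late = (trials[-1] - trials[1]) / (len(trials) - 2)
    return early, late

def peak_slope(trials):
    """Largest single trial-to-trial gain."""
    return max(b - a for a, b in zip(trials, trials[1:]))

# Example: words recalled on RAVLT Trials 1-5 (hypothetical scores)
scores = [5, 7, 8, 9, 10]
```

The regression-based and late slopes down-weight the large Trial 1-to-2 jump that dominates simple slope, which may be one reason they tracked neuroimaging markers more closely in this study.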
The objective was to study the effect of maternal supplementation with a yeast cell wall-based product containing a mannan-rich fraction (MRF) during gestation and lactation on piglet intestinal gene expression. First-parity sows were fed experimental gestation and lactation diets with or without MRF (900 mg/kg). After farrowing, piglets were fostered within treatment as necessary. Sow and litter production performance data were collected until weaning. On day 10 post farrowing, jejunum samples from piglets were collected for gene expression analysis using the Affymetrix Porcine GeneChip array. Most performance parameters did not differ between the treatments. However, protein (P<0.01), total solids less fat (P<0.03) and the concentration of immunoglobulin G (IgG) in milk were greater (P<0.05) in the MRF-supplemented group. Gene expression results using hierarchical clustering revealed an overall dietary effect. Further analysis elucidated activation of pathways involved in tissue development, functioning and immunity, as well as greater cell proliferation and reduced cell migration in the jejunal tissue. In conclusion, feeding sows MRF during pregnancy and lactation was an effective nutritional strategy to bolster colostrum and milk IgG, which are essential for the development of the piglet immune system and gut. In addition, the gene expression patterns affected by passive immunity transfer showed indicators that could benefit animal performance in the long term.
Dietary glutamine (Gln) supplementation improves intestinal function in several stressful conditions. Therefore, in the present study, the effects of dietary Gln supplementation on the core body temperature (Tcore), bacterial translocation (BT) and intestinal permeability of mice subjected to acute heat stress were evaluated. Male Swiss mice (4 weeks old) were implanted with an abdominal temperature sensor and randomly assigned to one of the following groups fed isoenergetic and isoproteic diets for 7 d before the experimental trials: group fed the standard AIN-93G diet and exposed to a high ambient temperature (39°C) for 2 h (H-NS); group fed the AIN-93G diet supplemented with l-Gln and exposed to a high temperature (H-Gln); group fed the standard AIN-93G diet and not exposed to a high temperature (control, C-NS). Mice were orally administered diethylenetriaminepentaacetic acid radiolabelled with technetium (99mTc) for the assessment of intestinal permeability or 99mTc-Escherichia coli for the assessment of BT. Heat exposure increased Tcore (approximately 41°C during the experimental trial), intestinal permeability and BT to the blood and liver (3 h after the experimental trial) in mice from the H-NS group relative to those from the C-NS group. Dietary Gln supplementation attenuated hyperthermia and prevented the increases in intestinal permeability and BT induced by heat exposure. No correlations were observed between the improvements in gastrointestinal function and the attenuation of hyperthermia by Gln. Our findings indicate that dietary Gln supplementation preserved the integrity of the intestinal barrier and reduced the severity of hyperthermia during heat exposure. The findings also indicate that these Gln-mediated effects occurred through independent mechanisms.