In response to advancing clinical practice guidelines regarding concussion management, service members, like athletes, complete a baseline assessment prior to participating in high-risk activities. While several studies have established test stability in athletes, no investigation to date has examined the stability of baseline assessment scores in military cadets. The objective of this study was to assess the test–retest reliability of a baseline concussion test battery in cadets at U.S. Service Academies.
All cadets participating in the Concussion Assessment, Research, and Education (CARE) Consortium investigation completed a standard baseline battery that included memory, balance, symptom, and neurocognitive assessments. Annual baseline testing was completed during the first 3 years of the study. A two-way mixed-model analysis of variance (intraclass correlation coefficient, ICC(3,1)) and Kappa statistics were used to assess the stability of the metrics at 1-year and 2-year time intervals.
ICC values for the 1-year test interval ranged from 0.28 to 0.67 and from 0.15 to 0.57 for the 2-year interval. Kappa values ranged from 0.16 to 0.21 for the 1-year interval and from 0.29 to 0.31 for the 2-year test interval. Across all measures, the observed effects were small, ranging from 0.01 to 0.44.
This investigation noted less than optimal reliability for the most common concussion baseline assessments. While none of the assessments met or exceeded the accepted clinical threshold, the effect sizes were relatively small, suggesting an overlap in performance from year to year. As such, baseline assessments beyond the initial evaluation in cadets are not essential but could aid concussion diagnosis.
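For reference, the consistency form of the intraclass correlation used above, ICC(3,1), can be computed directly from the two-way ANOVA mean squares. A minimal sketch follows; the function `icc_3_1` is illustrative only (not the consortium's analysis code) and assumes complete data with no missing sessions:

```python
import numpy as np

def icc_3_1(ratings: np.ndarray) -> float:
    """ICC(3,1): two-way mixed-effects model, consistency, single measurement.

    ratings has shape (n_subjects, k_sessions); values near 1 indicate
    highly stable scores across test sessions.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-session means

    # Two-way ANOVA decomposition of the total sum of squares
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((ratings - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)
```

A common (though not universal) convention reads ICC values of roughly 0.75 and above as clinically acceptable reliability, which the reported range of 0.15 to 0.67 falls short of.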
In 2019, a 42-year-old African man who works as an Ebola virus disease (EVD) researcher traveled from the Democratic Republic of Congo (DRC), near an ongoing EVD epidemic, to Philadelphia and presented to the Hospital of the University of Pennsylvania Emergency Department with altered mental status, vomiting, diarrhea, and fever. He was classified as a “wet” person under investigation for EVD, and his arrival activated our hospital emergency management command center and bioresponse teams. He was found to be in septic shock with multisystem organ dysfunction, including circulatory dysfunction, encephalopathy, metabolic lactic acidosis, acute kidney injury, acute liver injury, and disseminated intravascular coagulation. Critical care was delivered within high-risk pathogen isolation in the ED and in our Special Treatment Unit until a diagnosis of severe cerebral malaria was confirmed and EVD was definitively excluded.
This report discusses our experience activating a longitudinal preparedness program designed for rare, resource-intensive events at hospitals physically remote from any active epidemic but serving a high-volume international air travel port-of-entry.
Purpose – This study reports on a project to monitor deliberate self-poisoning in a rural area of Northern Ireland over a 20-year period. Comparison is made with reports from large urban centres. In addition, a local prescribing database allows assessment of any association between psychotropic drug prescription and use for deliberate self-poisoning.
Materials and methods – Frequency of self-poisoning, demographic details and drugs used were recorded for all episodes of deliberate self-poisoning occurring at Craigavon Area Hospital for the years 1976, 1986, 1991 and 1996. It was possible to compare prescriptions of psychotropic drugs with their use for deliberate self-poisoning between the years 1991 and 1996 in the region served by the hospital, using the Defined Daily Dose (DDD) system.
Results – In this rural area the pattern of deliberate self-poisoning has changed, as in urban centres, with a rise in frequency and the male/female ratio approaching unity. The pattern of drug use has altered, with paracetamol overtaking benzodiazepines as the most commonly used agent. More recently, antidepressants have become the second most frequently used drug class for this purpose. Psychotropic medications used for self-poisoning altered in proportion to their prescription between the years 1991 and 1996.
Conclusions – In the face of a continuing rise in deliberate self-poisoning, which is affecting both urban and rural areas, care should be taken to prescribe the least toxic agent available, as prescription levels are associated with the likely frequency of use for self-poisoning for most classes of psychotropic drug.
There is strong evidence that foods containing dietary fibre protect against colorectal cancer, resulting at least in part from its anti-proliferative properties. This study aimed to investigate the effects of supplementation with two non-digestible carbohydrates, resistant starch (RS) and polydextrose (PD), on crypt cell proliferative state (CCPS) in the macroscopically normal rectal mucosa of healthy individuals. We also investigated relationships between expression of regulators of apoptosis and of the cell cycle on markers of CCPS. Seventy-five healthy participants were supplemented with RS and/or PD or placebo for 50 d in a 2 × 2 factorial design in a randomised, double-blind, placebo-controlled trial (the Dietary Intervention, Stem cells and Colorectal Cancer (DISC) Study). CCPS was assessed, and the expression of regulators of the cell cycle and of apoptosis was measured by quantitative PCR in rectal mucosal biopsies. SCFA concentrations were quantified in faecal samples collected pre- and post-intervention. Supplementation with RS increased the total number of mitotic cells within the crypt by 60 % (P = 0·001) compared with placebo. This effect was limited to older participants (aged ≥50 years). No other differences were observed for the treatments with PD or RS as compared with their respective controls. PD did not influence any of the measured variables. RS, however, increased cell proliferation in the crypts of the macroscopically-normal rectum of older adults. Our findings suggest that the effects of RS on CCPS are not only dose, type of RS and health status-specific but are also influenced by age.
Lactogenesis stage II, also known as when a mother's milk “comes in”, is characterised by copious milk production. Delayed lactogenesis II, when onset occurs after 72 hours post-partum, has been linked to early breastfeeding cessation. It has been suggested that caesarean section is a risk factor for late onset of lactogenesis II. It is unknown why lactogenesis II may be delayed after caesarean section, but there are several potential reasons, such as volume of blood loss, maternal stress, delayed breastfeeding initiation and difficulties with mobility and positioning. Analysis of timing of lactogenesis and breastfeeding frequency was carried out on data from the PROMESA and IMPRINT studies, which examined supplementation of breast milk with the probiotic Bifidobacterium infantis. IMPRINT was carried out in California and enrolled eighty women prior to birth or before postnatal day 4. The PROMESA study in the UK recruited only women booked for elective caesarean sections, and also enrolled eighty mother-baby dyads. As part of both studies, mothers completed a variety of surveys and daily logs, including a daily feeding log, along with self-reported lactogenesis. We used logistic regression to examine whether mode of birth (spontaneous vaginal delivery, emergency and elective caesarean section) was associated with the timing of onset of lactogenesis, and linear regression to assess differences in breastfeeding frequency between modes of birth. Mode of birth was significantly associated with delayed onset of lactogenesis > 3 days (OR 3.38, 95% CI 2.48–4.61). There was also a reduced frequency of breastfeeding in the first week post-partum in mother-baby dyads who underwent an elective caesarean section.
These findings suggest that mothers who give birth by elective caesarean section may need additional support with breastfeeding in the early days post-partum, as well as ongoing support long-term to reduce the likelihood of early cessation of breastfeeding.
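The association reported above (OR 3.38, 95% CI 2.48–4.61) is the usual transform of a fitted logistic-regression coefficient: the odds ratio is the exponentiated log-odds coefficient, and the Wald confidence interval exponentiates the coefficient plus or minus 1.96 standard errors. A minimal sketch, with a hypothetical helper name (not the studies' analysis code):

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Turn a logistic-regression coefficient (log-odds scale) and its
    standard error into an odds ratio with a Wald 95% CI."""
    return (math.exp(beta),            # point estimate
            math.exp(beta - z * se),   # lower bound
            math.exp(beta + z * se))   # upper bound
```

For example, a fitted coefficient of ln(3.38) ≈ 1.218 corresponds to an odds ratio of 3.38; the width of the interval then depends only on the coefficient's standard error.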
Stimulating appetite in the sick or elderly remains a challenge, with few safe therapeutic options. Ghrelin is an orexigenic hormone, increasing appetite and subsequent food intake. It has received considerable attention as a therapeutic target to stimulate food intake in patients with anorexia. The identification of food-grade bioactives with proven orexigenic effects would mark significant progress in the treatment of disease-related malnutrition. This study therefore investigated the effects of two milk-derived ghrelinergic peptides on appetite and energy intake in healthy humans.
A single-blind, placebo-controlled, 3-arm (placebo, casein bioactive MF1145 and whey bioactive UL-2-141) cross-over trial was conducted in healthy male volunteers. Participants received 26 mg/kg of both the bioactives and placebo. The main outcome measures were energy and protein intake from a set breakfast and an ad libitum lunch, and subjective appetite sensations as assessed by visual analogue scale (VAS). Basal and postprandial levels of active ghrelin (AG) were measured. Dietary intakes were analysed using Nutritics software. Statistical analyses were performed in R.
Overall, 22 male participants (mean age 27 years) were included; average BMI was 24.6 kg/m2 (range 19.8 to 30.2 kg/m2). Mean energy and protein intakes at lunch when treated with placebo were 1343 kcal (95% CI: 1215–1471 kcal) and 74 g (95% CI: 66–81 g), respectively. Energy and protein intakes were not significantly different from placebo for either treatment (p = 0.918, p = 0.319 for UL-2-141 and p = 0.889, p = 0.959 for MF1145, respectively). Similarly, appetite, hunger and satiety responses on VAS were not significantly different from placebo for either treatment. AG peak post-lunch on placebo was 653 pg/ml (95% CI: 511–794 pg/ml). Treatment with UL-2-141 resulted in a 139 pg/ml reduction in post-prandial AG compared to placebo, and treatment with MF1145 resulted in a 114 pg/ml reduction compared to placebo. This pattern was significant for both treatments (p = 0.021 and p = 0.045, respectively); however, when controlling for fasting AG, the pattern was no longer significant (p = 0.590 and p = 0.877, respectively). Pre-prandial AG peaks were not significantly different across treatments.
While these peptides have previously demonstrated ghrelinergic effects in rats, no effect on appetite or food intake in humans was identified by this study. This may be attributable to the small sample size or low dose. However, since healthy adults are often not in tune with their own physiological hunger, they may not respond strongly to simple physiological modulators and repeating the study in subjects with established anorexia may be prudent.
Alcohol and cannabis remain the substances most widely used by adolescents. Better understanding of the dynamic relationship between trajectories of substance use in relation to neuropsychological functioning is needed. The aim of this study was to examine the different impacts of within- and between-person changes in alcohol and cannabis use on neuropsychological functioning over multiple time points.
Hierarchical linear modeling examined the effects of alcohol and cannabis use on neuropsychological functioning over the course of 14 years in a sample of 175 adolescents (aged 12–15 years at baseline).
Time-specific fluctuations in alcohol use (within-person effect) predicted worse performance across time on the Wechsler Abbreviated Scale of Intelligence Block Design subtest (B = −.05, SE = .02, p = .01). Greater mean levels of percent days of cannabis use across time (between-person effect) were associated with an increased contrast score between Delis–Kaplan Executive Function System Color Word Inhibition and Color Naming conditions (B = .52, SE = .14, p < .0001) and poorer performance over time on Block Design (B = −.08, SE = .04, p = .03). Neither alcohol nor cannabis use over time was associated with performance in the verbal memory and processing speed domains.
Greater cumulative cannabis use over adolescence may be linked to poorer inhibitory control and visuospatial functioning performance, whereas more proximal increases in alcohol consumption during adolescence may drive alcohol-related performance decrements in visuospatial functioning. Results from this prospective study add to the growing body of literature on the impact of alcohol and cannabis use on cognition from adolescence to young adulthood.
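The within- versus between-person distinction in the hierarchical model above corresponds to person-mean centering: each person's average use across time carries the between-person effect, and occasion-specific deviations from that average carry the within-person effect. A schematic sketch of that decomposition (illustrative only, not the study's code):

```python
import numpy as np

def within_between_split(values, person_ids):
    """Person-mean centering for a repeated measure.

    Returns (between, within): `between` repeats each person's mean
    across their observations; `within` is each observation's deviation
    from that person's own mean (so it sums to zero within person).
    """
    values = np.asarray(values, dtype=float)
    person_ids = np.asarray(person_ids)
    between = np.empty_like(values)
    for pid in np.unique(person_ids):
        mask = person_ids == pid
        between[mask] = values[mask].mean()
    within = values - between
    return between, within
```

Both components then enter the model as separate predictors, which is what lets time-specific fluctuations and stable individual differences take different coefficients.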
We study star formation and metallicity enrichment histories of 24 massive galaxies at 1.6 < z < 2.5. A deep slitless spectroscopy and imaging data set collected from multiple HST surveys allows robust determination of their SEDs. Our new SED modeling, with no functional assumptions on star formation histories, reveals that (1) most of the sample galaxies had already formed >50% of their extant masses ∼1.5 Gyr before the time of the observed redshifts, with a trend for more massive galaxies to form earlier; (2) most of our galaxies already have stellar metallicities compatible with those of local early-type galaxies; and (3) inferred metallicities are on average ∼0.25 dex higher than the observed gas-phase metallicities of star-forming galaxies at the time of their formation. Continuation of low-level star formation, rather than abrupt termination of star forming activity, may explain the observed gap in metallicities.
In experimental and clinical studies, green or black tea consumption has been shown to reduce oxidative stress. However, these studies involved high levels of tea consumption and may not reflect patterns in the general population. Here, we examined the association between black or green tea consumption and oxidative stress in a cross-sectional study of 889 premenopausal US women aged 35–54 years. Tea consumption was measured using the Block-98 FFQ. Urinary 8-iso-PGF2α (F2-IsoP) and 2,3-dinor-5,6-dihydro-15-F2t-isoprostane (15-F2t-IsoP-M) were used as biomarkers of oxidative stress. These compounds were measured by MS and normalised to creatinine. Linear regression was used to calculate the geometric mean differences (GMD) and 95% CI for log-transformed urinary F2-IsoP or 15-F2t-IsoP-M in relation to black or green tea consumption. We further examined whether adjusting for caffeine impacted associations between tea and oxidative stress. Geometric means of urinary F2-IsoP and 15-F2t-IsoP-M were 1·44 (95% CI 1·39, 1·49) and 0·71 (95% CI 0·69, 0·73) ng/mg creatinine, respectively. Overall, green tea consumption was not associated with urinary F2-IsoP or 15-F2t-IsoP-M. High-level black tea consumption (≥5 cups/week compared with 0) was associated with higher 15-F2t-IsoP-M concentrations (adjusted GMD=0·10, 95% CI 0·02, 0·19) but not F2-IsoP. Adjusting for caffeine nullified the association between black tea and 15-F2t-IsoP-M. Our findings do not support the hypothesis that dietary tea consumption is inversely associated with oxidative stress.
Psychotropic medications are frequently co-prescribed with antiretroviral therapy (ART), owing to a high prevalence of psychiatric illness within the population living with HIV, as well as a 7-fold increased risk of HIV infection among patients with psychiatric illness. While ART has been notoriously associated with a multitude of pharmacokinetic drug interactions involving the cytochrome P450 enzyme system, the magnitude and clinical impact of these interactions with psychotropics may range from negligible effects on plasma concentrations to life-threatening torsades de pointes or respiratory depression. This comprehensive review summarizes the currently available information regarding drug–drug interactions between antiretrovirals and pharmacologic agents utilized in the treatment of psychiatric disorders—antidepressants, stimulants, antipsychotics, anxiolytics, mood stabilizers, and treatments for opioid use disorder and alcohol use disorder—and provides recommendations for their management. Additionally, overlapping toxicities between antiretrovirals and the psychotropic classes are highlighted. Knowledge of the interaction and adverse effect potential of specific antiretrovirals and psychotropics will allow clinicians to make informed prescribing decisions to better promote the health and wellness of this high-risk population.
Objectives: Down syndrome (DS) is a population with known hippocampal impairment, with studies showing that individuals with DS display difficulties in spatial navigation and remembering arbitrary bindings. Recent research has also demonstrated the importance of the hippocampus for novel word-learning. Based on these data, we aimed to determine whether individuals with DS show deficits in learning new labels and if they may benefit from encoding conditions thought to be less reliant on hippocampal function (i.e., through fast mapping). Methods: In the current study, we examined immediate, 5-min, and 1-week delayed word-learning across two learning conditions (i.e., explicit encoding vs. fast mapping). These conditions were examined across groups (twenty-six 3- to 5-year-old typically developing children and twenty-six 11- to 28-year-old individuals with DS with comparable verbal and nonverbal scores on the Kaufman Brief Intelligence Test – second edition) and in reference to sleep quality. Results: Both individuals with and without DS showed retention after a 1-week delay, and, contrary to our expectations, the current study found no benefit of the fast mapping condition in either group. Eye tracking data showed that preferential eye movements to target words were not present immediately but emerged after 1 week in both groups. Furthermore, sleep measures collected via actigraphy did not relate to retention in either group. Conclusions: This study presents novel data on long-term knowledge retention in reference to sleep patterns in DS and adds to a body of knowledge helping us to understand the processes of word-learning in typical and atypically developing populations. (JINS, 2018, 24, 955–965)
This study examined the effectiveness of a formal postdoctoral education program designed to teach skills in clinical and translational science, using scholar publication rates as a measure of research productivity.
Participants included 70 clinical fellows who were admitted to a master’s or certificate training program in clinical and translational science from 1999 to 2015 and 70 matched control peers. The primary outcomes were the number of publications 5 years post-fellowship matriculation and time to publishing 15 peer-reviewed manuscripts post-matriculation.
Clinical and translational science program graduates published significantly more peer-reviewed manuscripts at 5 years post-matriculation (median 8 vs 5, p=0.041) and had a faster time to publication of 15 peer-reviewed manuscripts (matched hazard ratio = 2.91, p=0.002). Additionally, program graduates had a significantly higher average H-index (11 vs. 7, p=0.013).
These findings support the effectiveness of formal training programs in clinical and translational science in increasing academic productivity.
Bowel cancer risk is strongly influenced by lifestyle factors including diet and physical activity. Several studies have investigated the effects of adherence to the World Cancer Research Fund (WCRF)/American Institute for Cancer Research (AICR) cancer prevention recommendations on outcomes such as all-cause and cancer-specific mortality, but the relationships with molecular mechanisms that underlie the effects on bowel cancer risk are unknown. This study aimed to investigate the relationships between adherence to the WCRF/AICR cancer prevention recommendations and wingless/integrated (WNT)-pathway-related markers of bowel cancer risk, including the expression of WNT pathway genes and regulatory microRNA (miRNA), secreted frizzled-related protein 1 (SFRP1) methylation and colonic crypt proliferative state in colorectal mucosal biopsies. Dietary and lifestyle data from seventy-five healthy participants recruited as part of the DISC Study were used. A scoring system was devised including seven of the cancer prevention recommendations and smoking status. The effects of total adherence score and scores for individual recommendations on the measured outcomes were assessed using Spearman’s rank correlation analysis and unpaired t tests, respectively. Total adherence score correlated negatively with expression of Myc proto-oncogene (c-MYC) (P=0·039) and WNT11 (P=0·025), and high adherers had significantly reduced expression of cyclin D1 (CCND1) (P=0·042), WNT11 (P=0·012) and c-MYC (P=0·048). Expression of axis inhibition protein 2 (AXIN2), glycogen synthase kinase (GSK3β), catenin β1 (CTNNB1) and WNT11 and of the oncogenic miRNA miR-17 and colonic crypt kinetics correlated significantly with scores for individual recommendations, including body fatness, red meat intake, plant food intake and smoking status. 
The findings from this study provide evidence for positive effects of adherence to the WCRF/AICR cancer prevention recommendations on WNT-pathway-related markers of bowel cancer risk.
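Spearman's rank correlation, used above for the adherence-score analyses, is simply the Pearson correlation computed on ranks, with tied values assigned their average rank. A minimal pure-Python sketch (illustrative only; in practice a library routine such as SciPy's would be used):

```python
def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # Extend j over any run of tied values
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1   # average rank for the tied run
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(list(x)), ranks(list(y))
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because only ranks enter the calculation, the statistic captures any monotonic association between adherence score and gene expression, not just a linear one.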
Ischemic stroke treatment is time-sensitive, and barriers to providing prehospital care encountered by Emergency Medical Services (EMS) providers have been under-studied.
This study described barriers to providing prehospital care, identified predictors of these barriers, and assessed the impact of these barriers on EMS on-scene time and administration of tissue plasminogen activator (tPA) in the emergency department (ED).
A retrospective cohort study was performed using the Get With The Guidelines-Stroke (GWTG-S; American Heart Association [AHA]; Dallas, Texas USA) registry at two hospitals to identify ischemic stroke patients arriving by EMS. Variables were abstracted from prehospital and hospital medical records and merged with registry data. Barriers to care were grouped into themes. Logistic regression was used to identify predictors of barriers to care, and bi-variate tests were used to assess differences in EMS on-scene time and the proportion of patients receiving tPA between patients with and without barriers.
Barriers to providing prehospital care were documented for 15.5% of patients: 29.6% related to access, 26.7% communication, 23.0% extrication and transportation, 20.0% refusal, and 14.1% assessment/management. Non-white and non-black race (OR: 3.69; 95% CI, 1.63-8.36) and living alone (OR: 1.53; 95% CI, 1.05-2.23) were associated with greater odds of barriers to providing care. The EMS on-scene time was ≥15 minutes for 70.4% of patients who had a barrier to care, compared with 49.0% of patients who did not (P<.001). There was no significant difference in the proportion of patients who were administered tPA between those with and without barriers to care (14.1% vs 19.2%; P=.159).
Barriers to providing prehospital care were documented for a sizable proportion of ischemic stroke patients, with the majority related to patient access and communication, and occurred more frequently among non-white and non-black patients and those living alone. Although EMS on-scene time was longer for patients with barriers to care, the proportion of patients receiving tPA in the ED did not differ.
Li T, Cushman JT, Shah MN, Kelly AG, Rich DQ, Jones CMC. Barriers to Providing Prehospital Care to Ischemic Stroke Patients: Predictors and Impact on Care. Prehosp Disaster Med. 2018;33(5):501–507.