It is unclear whether mild-to-moderate dehydration independently affects mood without confounders like heat exposure or exercise. This study examined the acute effect of cellular dehydration on mood. Forty-nine adults (55 % female, age 39 (sd 8) years) were assigned to counterbalanced, crossover trials. Intracellular dehydration was induced with a 2-h (0·1 ml/kg per min) infusion of 3 % hypertonic saline (HYPER) or 0·9 % isotonic saline (ISO) as a control. Plasma osmolality increased in HYPER (pre 285 (sd 3), post 305 (sd 4) mmol/kg; P < 0·05) but remained unchanged in ISO (pre 285 (sd 3), post 288 (sd 3) mmol/kg; P > 0·05). Mood was assessed with the short version of the Profile of Mood States Questionnaire (POMS). Scores on the POMS sub-scales of confusion-bewilderment, depression-dejection and fatigue-inertia increased in HYPER compared with ISO (P < 0·05). Total mood disturbance score (TMD) assessed by POMS increased from 10·3 (sd 0·9) to 16·6 (sd 1·7) in HYPER (P < 0·01), but not in ISO (P > 0·05). When TMD was stratified by sex, the increase in the HYPER trial was significant in females (P < 0·01) but not in males (P > 0·05). Following infusion, thirst and copeptin (a surrogate for vasopressin) were also higher in females than in males (copeptin 21·3 (sd 2·0) v. 14·1 (sd 1·4) pmol/l; P < 0·01) during HYPER. In conclusion, cellular dehydration acutely degraded specific aspects of mood, mainly in women. The mechanisms underlying the sex differences may be related to elevated thirst and vasopressin.
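The TMD statistic referenced above is conventionally derived from the POMS sub-scales: the five negative sub-scales summed, minus vigour. A minimal sketch of that scoring rule, using made-up sub-scale values rather than the study's data:

```python
def poms_tmd(tension, depression, anger, fatigue, confusion, vigour):
    """Total mood disturbance (TMD) as conventionally scored for the
    Profile of Mood States: the five negative sub-scales summed,
    minus the vigour sub-scale. Input values here are hypothetical."""
    return (tension + depression + anger + fatigue + confusion) - vigour

# Hypothetical pre/post sub-scale scores (not the study's data):
pre = poms_tmd(3, 2, 1, 4, 2, 6)    # -> 6
post = poms_tmd(4, 4, 2, 7, 5, 4)   # -> 18
print(pre, post)
```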
To characterize associations between exposures within and outside the medical workplace with healthcare personnel (HCP) SARS-CoV-2 infection, including the effect of various forms of respiratory protection.
We collected data from international participants via an online survey.
In total, 1,130 HCP (244 cases with laboratory-confirmed COVID-19 and 886 controls who remained healthy throughout the pandemic) from 67 countries who did not meet prespecified exclusion criteria (ie, healthy but not working, missing workplace exposure data, or COVID-19 symptoms without laboratory confirmation) were included in this study.
Respondents were queried regarding workplace exposures, respiratory protection, and extra-occupational activities. Odds ratios for HCP infection were calculated using multivariable logistic regression and sensitivity analyses controlling for confounders and known biases.
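As an illustration of the odds-ratio logic behind this analysis (the study itself used multivariable logistic regression with confounder adjustment, which this sketch does not reproduce), an unadjusted odds ratio with a Woolf-type 95% CI can be computed from a 2x2 exposure table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio from a 2x2 table with a Woolf (log-scale) 95% CI.
    a: exposed cases, b: exposed controls, c: unexposed cases, d: unexposed
    controls. Counts below are hypothetical, not the survey's data."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

print(odds_ratio_ci(20, 80, 10, 90))
```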
HCP infection was associated with non–aerosol-generating contact with COVID-19 patients (adjusted OR, 1.4; 95% CI, 1.04–1.9; P = .03) and extra-occupational exposures including gatherings of ≥10 people, patronizing restaurants or bars, and public transportation (adjusted OR range, 3.1–16.2). Respirator use during aerosol-generating procedures (AGPs) was associated with lower odds of HCP infection (adjusted OR, 0.4; 95% CI, 0.2–0.8, P = .005), as was exposure to intensive care and dedicated COVID units, negative pressure rooms, and personal protective equipment (PPE) observers (adjusted OR range, 0.4–0.7).
COVID-19 transmission to HCP was associated with medical exposures currently considered lower-risk and multiple extra-occupational exposures, and exposures associated with proper use of appropriate PPE were protective. Closer scrutiny of infection control measures surrounding healthcare activities and medical settings considered lower risk, and continued awareness of the risks of public congregation, may reduce the incidence of HCP infection.
Cognitive impairment associated with lifetime major depressive disorder (MDD) is well-supported by meta-analytic studies, but population-based estimates remain scarce. Previous UK Biobank studies have only shown limited evidence of cognitive differences related to probable MDD. Using updated cognitive and clinical assessments in UK Biobank, this study investigated population-level differences in cognitive functioning associated with lifetime MDD.
Associations between lifetime MDD and cognition (performance on six tasks and general cognitive functioning [g-factor]) were investigated in UK Biobank (N-range 7,457–14,836, age 45–81 years, 52% female), adjusting for demographics, education, and lifestyle. Lifetime MDD classifications were based on the Composite International Diagnostic Interview. Within the lifetime MDD group, we additionally investigated relationships between cognition and (a) recurrence, (b) current symptoms, (c) severity of psychosocial impairment (while symptomatic), and (d) concurrent psychotropic medication use.
Lifetime MDD was robustly associated with a lower g-factor (β = −0.10, PFDR = 4.7 × 10−5), with impairments in attention, processing speed, and executive functioning (β ≥ 0.06). Clinical characteristics revealed differential profiles of cognitive impairment among case individuals; those who reported severe psychosocial impairment and use of psychotropic medication performed worse on cognitive tests. Severe psychosocial impairment and reasoning showed the strongest association (β = −0.18, PFDR = 7.5 × 10−5).
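The PFDR values above imply a multiple-testing correction; a common choice is the Benjamini-Hochberg procedure, sketched here on made-up p-values (the abstract does not specify the study's exact correction pipeline):

```python
def bh_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values (FDR q-values).
    Illustrative sketch only; input p-values are hypothetical."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    running_min = 1.0
    for rank in range(m, 0, -1):        # walk from largest p to smallest
        i = order[rank - 1]
        running_min = min(running_min, pvals[i] * m / rank)
        adj[i] = running_min            # enforce monotonicity
    return adj

print(bh_adjust([0.001, 0.03, 0.01, 0.05]))
```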
Findings describe small but robust associations between lifetime MDD and lower cognitive performance within a population-based sample. Overall effect sizes were modest, suggesting limited clinical relevance. However, deficits within specific cognitive domains were more pronounced in relation to clinical characteristics, particularly severe psychosocial impairment.
Epidemiological studies indicate that individuals with one type of mental disorder have an increased risk of subsequently developing other types of mental disorders. This study aimed to undertake a comprehensive analysis of pair-wise lifetime comorbidity across a range of common mental disorders based on a diverse range of population-based surveys.
The WHO World Mental Health (WMH) surveys assessed 145 990 adult respondents from 27 countries. Based on retrospectively-reported age-of-onset for 24 DSM-IV mental disorders, associations were examined between all 548 logically possible temporally-ordered disorder pairs. Overall and time-dependent hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated using Cox proportional hazards models. Absolute risks were estimated using the product-limit method. Estimates were generated separately for men and women.
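The product-limit method cited for absolute risks is the Kaplan-Meier estimator; a minimal sketch on hypothetical onset data:

```python
def kaplan_meier(durations, observed):
    """Product-limit (Kaplan-Meier) survival estimate. observed[i] is 1 for
    an onset event, 0 for censoring. Returns [(time, S(t))] at event times.
    Data below are hypothetical, not WMH survey data."""
    at_risk = len(durations)
    surv, curve = 1.0, []
    for t in sorted(set(durations)):
        d = sum(1 for dur, e in zip(durations, observed) if dur == t and e)
        if d:
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= sum(1 for dur in durations if dur == t)
    return curve

# Hypothetical years-to-onset, 0 in `observed` = censored at that time:
print(kaplan_meier([2, 3, 4, 5, 6], [1, 1, 0, 1, 0]))
```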
Each prior lifetime mental disorder was associated with an increased risk of subsequent first onset of each other disorder. The median HR was 12.1 (mean = 14.4; range 5.2–110.8, interquartile range = 6.0–19.4). The HRs were most prominent between closely-related mental disorder types and in the first 1–2 years after the onset of the prior disorder. Although HRs declined with time since prior disorder, significantly elevated risk of subsequent comorbidity persisted for at least 15 years. Appreciable absolute risks of secondary disorders were found over time for many pairs.
Survey data from a range of sites confirms that comorbidity between mental disorders is common. Understanding the risks of temporally secondary disorders may help design practical programs for primary prevention of secondary disorders.
This paper summarizes a multi-state, multi-year study assessing the potential for local agriculture in northern New England. While largely rural, this region's agricultural sector differs greatly from that of the rest of the United States, and demand for locally produced food has been increasing. To assess this unique economic landscape, researchers and Cooperative Extension staff at the Universities of Maine, New Hampshire, and Vermont investigated four key areas: (1) local food capacities, (2) constraints to agricultural expansion, (3) consumer preferences for local and organic produce, and (4) the role of intermediaries as alternative local food outlets. The project included input from local farmers, Extension members, restaurants, and the general public. We present the four research areas in a sequential, overlapping fashion: each step in the process informed the next, and the sequence can be used as a template for assessing a region's potential for local agricultural production.
In electron beams where space charge plays an important role in the beam transport, the beams’ transverse and longitudinal properties will become coupled. One example of this is the transverse–longitudinal correlation produced in a current-modulated beam generated in a DC electron gun, formed through the competition between the time-dependent radial space charge force and the time-independent radial focusing force. This correlation will cause both the slice radius and divergence of the beam extracted from the gun to depend on the slice current. Here we consider the transport of such a beam in a linearly tapered solenoid focusing channel. Transport performance was generally improved with longer taper lengths, minimal initial correlation between slice divergence and slice current, and moderate degrees of initial correlation between initial slice radius and slice current. Performance was also generally improved with lower slice emittances, although surprisingly transport was improved by slightly increasing the assumed slice emittance in certain limited circumstances.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
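As a toy illustration of how twin data yield heritability estimates, the classical Falconer decomposition contrasts monozygotic and dizygotic intra-pair correlations (the CODATwins analyses use fuller variance-component models; the correlations below are hypothetical):

```python
def falconer(r_mz, r_dz):
    """Classical twin-model decomposition (Falconer's formulas):
    h2 = additive genetic, c2 = shared environment, e2 = unique environment.
    A simplification of the structural models used in twin studies."""
    h2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return h2, c2, e2

# Hypothetical intra-pair correlations for height (not CODATwins estimates):
print(falconer(0.85, 0.50))
```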
Substantial clinical heterogeneity of major depressive disorder (MDD) suggests it may group together individuals with diverse aetiologies. Identifying distinct subtypes should lead to more effective diagnosis and treatment, while providing more useful targets for further research. Genetic and clinical overlap between MDD and schizophrenia (SCZ) suggests an MDD subtype may share underlying mechanisms with SCZ.
The present study investigated whether a neurobiologically distinct subtype of MDD could be identified by SCZ polygenic risk score (PRS). We explored interactive effects between SCZ PRS and MDD case/control status on a range of cortical, subcortical and white matter metrics among 2370 male and 2574 female UK Biobank participants.
There was a significant SCZ PRS by MDD interaction for rostral anterior cingulate cortex (RACC) thickness (β = 0.191, q = 0.043). This was driven by a positive association between SCZ PRS and RACC thickness among MDD cases (β = 0.098, p = 0.026), compared to a negative association among controls (β = −0.087, p = 0.002). MDD cases with low SCZ PRS showed thinner RACC, although the opposite difference for high-SCZ-PRS cases was not significant. There were nominal interactions for other brain metrics, but none remained significant after correcting for multiple comparisons.
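The reported interaction can be read as a difference in PRS slopes between cases and controls: in the saturated model y = b0 + b1*PRS + b2*MDD + b3*PRS*MDD, the interaction coefficient b3 equals the case slope minus the control slope. A sketch with synthetic data chosen to echo the reported directions (not the study's data):

```python
def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Synthetic, noise-free PRS-thickness data mimicking the reported directions:
prs = [-2, -1, 0, 1, 2]
cases = [0.098 * x for x in prs]      # positive slope among MDD cases
controls = [-0.087 * x for x in prs]  # negative slope among controls
b3 = slope(prs, cases) - slope(prs, controls)
print(round(b3, 3))  # 0.185
```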
Our significant results indicate that MDD case-control differences in RACC thickness vary as a function of SCZ PRS. Although this was not the case for most other brain measures assessed, our specific findings still provide some further evidence that MDD in the presence of high genetic risk for SCZ is subtly neurobiologically distinct from MDD in general.
Objective: Few studies have investigated the assessment and functional impact of egocentric and allocentric neglect among stroke patients. This pilot study aimed to determine (1) whether allocentric and egocentric neglect could be dissociated among a sample of stroke patients using eye tracking; (2) the specific patterns of attention associated with each subtype; and (3) the nature of the relationship between neglect subtype and functional outcome. Method: Twenty acute stroke patients were administered neuropsychological assessment batteries, a pencil-and-paper Apples Test to measure neglect subtype, and an adaptation of the Apples Test with an eye tracking measure. To test clinical discriminability, twenty age- and education-matched control participants were administered the eye tracking measure of neglect. Results: The eye tracking measure identified a greater number of individuals as having egocentric and/or allocentric neglect than the pencil-and-paper Apples Test. Classification of neglect subtype based on eye tracking performance was a significant predictor of functional outcome beyond that accounted for by the neuropsychological test performance and Apples Test neglect classification. Preliminary evidence suggests that patients with no neglect symptoms had superior functional outcomes compared with patients with neglect. Patients with combined egocentric and allocentric neglect had poorer functional outcomes than those with either subtype. Functional outcomes of patients with either allocentric or egocentric neglect did not differ significantly. The applications of our findings, to improve neglect detection, are discussed. Conclusion: Results highlight the potential clinical utility of eye tracking for the assessment and identification of neglect subtype among stroke patients to predict functional outcomes. (JINS, 2019, 25, 479–489)
This replication study examined protective effects of positive childhood memories with caregivers (“angels in the nursery”) against lifespan and intergenerational transmission of trauma. More positive, elaborated angel memories were hypothesized to buffer associations between mothers’ childhood maltreatment and their adulthood posttraumatic stress disorder (PTSD) and depression symptoms, comorbid psychopathology, and children's trauma exposure. Participants were 185 mothers (M age = 30.67 years, SD = 6.44, range = 17–46 years, 54.6% Latina, 17.8% White, 10.3% African American, 17.3% other; 24% Spanish speaking) and children (M age = 42.51 months; SD = 15.95, range = 3–72 months; 51.4% male). Mothers completed the Angels in the Nursery Interview (Van Horn, Lieberman, & Harris, 2008), and assessments of childhood maltreatment, adulthood psychopathology, children's trauma exposure, and demographics. Angel memories significantly moderated associations between maltreatment and PTSD (but not depression) symptoms, comorbid psychopathology, and children's trauma exposure. For mothers with less positive, elaborated angel memories, higher levels of maltreatment predicted higher levels of psychopathology and children's trauma exposure. For mothers with more positive, elaborated memories, however, predictive associations were not significant, reflecting protective effects. Furthermore, protective effects against children's trauma exposure were significant only for female children, suggesting that angel memories may specifically buffer against intergenerational trauma from mothers to daughters.
To identify potential participants for clinical trials, electronic health records (EHRs) are searched at potential sites. As an alternative, we investigated using medical devices used for real-time diagnostic decisions for trial enrollment.
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST-elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph (the e-ACI-TIPI), using the same data sets as the original ACI-TIPI (development set n = 3,453; test set n = 2,315). We also tested both instruments on data from emergency department electrocardiographs from across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared their performance to cohorts identified from EHR data at the same hospitals.
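ACI-TIPI-style instruments are logistic-regression models, so cohort screening reduces to thresholding a predicted probability. A sketch with hypothetical coefficients and record IDs (not the published ACI-TIPI weights):

```python
import math

def acs_probability(features, coefs, intercept):
    """Logistic-model probability, the general form of instruments like
    ACI-TIPI. Coefficients here are hypothetical, not the published ones."""
    z = intercept + sum(c * x for c, x in zip(coefs, features))
    return 1 / (1 + math.exp(-z))

# Screen ECG records for the trial criterion: probability of ACS > 0.75.
coefs, intercept = [1.2, 0.8], -1.0                      # hypothetical
records = {"ecg01": [2.0, 1.0], "ecg02": [0.1, 0.2]}     # hypothetical
eligible = [rid for rid, x in records.items()
            if acs_probability(x, coefs, intercept) > 0.75]
print(eligible)  # ['ecg01']
```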
Receiver-operating characteristic (ROC) curve areas on the test set were excellent, 0.89 for the ACI-TIPI and 0.84 for the e-ACI-TIPI, as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, again with very good calibration. When tested for detection of patients with > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than did EHRs.
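The ROC curve area reported here equals the probability that a randomly chosen case scores above a randomly chosen non-case (the Mann-Whitney interpretation); a minimal sketch on hypothetical scores:

```python
def roc_auc(pos_scores, neg_scores):
    """ROC curve area via pairwise comparison (equivalent to the
    Mann-Whitney U statistic): the fraction of case/non-case pairs in
    which the case scores higher, counting ties as half."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical ACS-probability outputs for cases vs non-cases:
print(roc_auc([0.9, 0.8, 0.4], [0.5, 0.3, 0.2]))
```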
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
Internal gravity wave energy contributes significantly to the energy budget of the oceans, affecting mixing and the thermohaline circulation. Hence it is important to determine the internal wave energy flux J = pv, where p is the pressure perturbation field and v is the velocity perturbation field. However, the pressure perturbation field is not directly accessible in laboratory or field observations. Previously, a Green’s function based method was developed to calculate the instantaneous energy flux field from a measured density perturbation field, given a constant buoyancy frequency N. Here we present methods for computing the instantaneous energy flux J for an internal wave field with vertically varying background N(z), as in the oceans, where N(z) typically decreases by two orders of magnitude from the pycnocline to the deep ocean. Analytic methods are presented for computing J from a density perturbation field for N(z) varying linearly with z. To generalize this approach to arbitrary N(z), we present a computational method for obtaining J. The results for J for the different cases agree well with results from direct numerical simulations of the Navier–Stokes equations. Our computational method can be applied to any density perturbation data using the MATLAB graphical user interface ‘EnergyFlux’.
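For a single monochromatic wave component, the period average of the instantaneous flux pv reduces to (1/2) p0 v0 cos(phi), where phi is the pressure-velocity phase difference. A numerical check of that identity (an illustrative simplification, not the paper's Green's-function method):

```python
import math

def mean_energy_flux(p0, v0, phase, samples=10000):
    """Numerically average the instantaneous flux J = p*v over one wave
    period for p = p0*cos(wt) and v = v0*cos(wt - phase); the analytic
    result is 0.5*p0*v0*cos(phase). Illustrative sketch only."""
    total = 0.0
    for k in range(samples):
        wt = 2 * math.pi * k / samples
        total += p0 * math.cos(wt) * v0 * math.cos(wt - phase)
    return total / samples

print(mean_energy_flux(2.0, 3.0, 0.0))          # ~ 3.0 (in phase)
print(mean_energy_flux(2.0, 3.0, math.pi / 2))  # ~ 0.0 (in quadrature)
```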
A robust biomedical informatics infrastructure is essential for academic health centers engaged in translational research. There are no templates for what such an infrastructure encompasses or how it is funded. An informatics workgroup within the Clinical and Translational Science Awards network conducted an analysis to identify the scope, governance, and funding of this infrastructure. After we identified the essential components of an informatics infrastructure, we surveyed informatics leaders at network institutions about the governance and sustainability of the different components. Results from 42 survey respondents showed significant variations in governance and sustainability; however, some trends also emerged. Core informatics components such as electronic data capture systems, electronic health records data repositories, and related tools had mixed models of funding including fee-for-service, extramural grants, and institutional support. Several key components such as regulatory systems (e.g., electronic Institutional Review Board [IRB] systems, grants, and contracts), security systems, data warehouses, and clinical trials management systems were overwhelmingly supported as institutional infrastructure. The findings highlighted in this report are worth noting for academic health centers and funding agencies involved in planning current and future informatics infrastructure, which provides the foundation for a robust, data-driven clinical and translational research program.
Grommet insertion is a common surgical procedure in children. Long waiting times for grommet insertion are not unusual. This project aimed to streamline the process by introducing a pathway for audiologists to directly schedule children meeting National Institute for Health and Care Excellence Clinical Guideline 60 (‘CG60’) for grommet insertion.
Method and results
A period from June to November 2014 was retrospectively audited. Mean duration between the first audiology appointment and grommet insertion was 294.5 days (median = 310 days). Implementing the direct-listing pathway reduced the duration between first audiology appointment and grommet insertion (mean = 232 days; median = 231 days). There has been a reduction in the time between the first audiology appointment and surgery (mean difference of 62.5 days; p = 0.024), and a reduction in the time between second audiology appointment and surgery (28 days; p = 0.009).
Direct-listing pathways for grommet insertion can reduce waiting times and expedite surgery. Implementation involves a simple alteration of current practice, adhering to National Institute for Health and Care Excellence Clinical Guideline 60. The ultimate decision regarding surgery still rests with ENT specialists.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for ancestry difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for ancestry difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
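The fixed-effects pooling described above is inverse-variance weighting; a minimal sketch with hypothetical per-cohort estimates (not the consortium's data):

```python
import math

def fixed_effects_meta(betas, ses):
    """Inverse-variance-weighted fixed-effects pooling: each estimate is
    weighted by 1/SE^2. Returns the pooled estimate and its standard error.
    Inputs below are hypothetical per-cohort effects, not consortium data."""
    weights = [1 / se ** 2 for se in ses]
    pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se_pooled = 1 / math.sqrt(sum(weights))
    return pooled, se_pooled

# Hypothetical per-cohort FEV1 effects (ml per nmol/l) and SEs:
print(fixed_effects_meta([1.1, 1.8], [0.1, 0.35]))
```

Note how the more precise cohort (smaller SE) dominates the pooled estimate.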
Potential participants seek information about clinical trials for many reasons, but the process can be challenging. We analyzed 101,249 searches in ResearchMatch Trials Today, a free interface to recruiting trials from ClinicalTrials.gov. Searches from March 2015 to November 2016 included a broad range of conditions and healthy volunteer concepts, including 12,649 unique topics. Trials Today data indicate that it is being used to identify trials on a variety of topics.
Introduction: Emergency Department Overcrowding (EDOC) is a multifactorial issue that leads to Access Block for patients needing emergency care. Identified as a national problem, EDOC is associated with higher rates of hospital admission and increased seven-day mortality among patients presenting to a Canadian Emergency Department (ED) at a time of overcrowding. Using the well-accepted input-throughput-output model to study EDOC, current research has focused on throughput as a measure of patient flow, reported as ED length of stay (LOS). In fact, ED LOS and ED beds occupied by inpatients were two “extremely important” indicators of EDOC identified by a 2005 survey of Canadian ED directors. One proposed solution to improve ED throughput is to utilize a physician at triage (PAT) to rapidly assess newly arriving patients. In 2017, a pilot PAT program was trialed at Kelowna General Hospital (KGH), a tertiary care hospital, as part of a PDSA cycle. The aim was to mitigate EDOC by improving ED throughput by the end of 2018, to meet the national targets for ED LOS suggested in the 2013 CAEP position statement. Methods: During fiscal periods 1-6 (April 1 to September 7, 2017), a PAT shift ran from 1000-2200 daily over four long weekends. ED LOS, time to inpatient bed, time to physician initial assessment (PIA), number of British Columbia Ambulance Service (BCAS) offload delays, and number of patients who left without being seen (LWBS) were extracted from an administrative database. Results were retrospectively analyzed and compared to data from 1000-2200 on non-PAT days during the trial periods. Results: Median ED LOS decreased from 3.8 to 3.4 hours for high-acuity patients (CTAS 1-3), from 2.1 to 1.8 hours for low-acuity patients (CTAS 4-5), and from 9.3 to 8.0 hours for all admitted patients.
During PAT trial weekends, the average time to PIA decreased by 65% (from 73 to 26 minutes for CTAS 2-5), the average number of daily BCAS offload delays decreased by 39% (from 2.3 to 1.4 delays per day), and the proportion of patients who LWBS fell from 2.4% to 1.7%. Conclusion: The implementation of PAT was associated with improvements in all five measures of ED throughput, providing a potential solution for EDOC at KGH. ED LOS was reduced compared to non-PAT control days, successfully meeting the suggested national targets. PAT could improve efficiency, resulting in the ability to see more patients in the ED, and increase the quality and safety of ED practice. Next, we hope to prospectively evaluate PAT, continuing to analyze these process measures, perform a cost-benefit analysis, and formally assess ED staff and patient perceptions of the program.
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, whose existence is theoretically assured when on-farm resource allocation is perfect. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with the environmental performance of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in southwest United Kingdom, this paper proposes a novel, information-driven approach to carry out comprehensive assessments of economic-environmental trade-offs inherent within pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with better animal performance and lower nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for developing scientifically sound and biologically informative metrics of agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.