Research among non-industrial societies suggests that the body kinematics adopted during running vary between groups according to the cultural importance of running. Among groups in which running is common and an important part of cultural identity, runners tend to adopt what exercise scientists and coaches consider good technique for avoiding injury and maximising performance. In contrast, among groups in which running is not particularly culturally important, people tend to adopt suboptimal technique. This paper begins by describing key elements of good running technique, including landing with a forefoot or midfoot strike pattern and with the leg oriented roughly vertically. Next, we review evidence from non-industrial societies that cultural attitudes about running are associated with variation in running technique. Then, we present new data from Tsimane forager–horticulturalists in Bolivia. Our findings suggest that running is neither a common activity among the Tsimane nor considered an important part of their cultural identity. We also demonstrate that when Tsimane do run, they tend to use suboptimal technique, specifically landing with a rearfoot strike pattern and with the leg protracted ahead of the knee (called overstriding). Finally, we discuss processes by which culture might influence variation in running technique among non-industrial societies, including self-optimisation and social learning.
The inaugural data from the first systematic program of sea-ice observations in Kotzebue Sound, Alaska, in 2018 coincided with the first winter in living memory when the Sound was not choked with ice. The following winter of 2018–19 was even warmer and characterized by even less ice. Here we discuss the mass balance of landfast ice near Kotzebue (Qikiqtaġruk) during these two anomalously warm winters. We use in situ observations and a 1-D thermodynamic model to address three research questions developed in partnership with an Indigenous Advisory Council. In doing so, we improve our understanding of connections between landfast ice mass balance, marine mammals and subsistence hunting. Specifically, we show: (i) ice growth stopped unusually early due to a strong vertical ocean heat flux, which also likely contributed to an early start to bearded seal hunting; (ii) unusually thin ice contributed to widespread surface flooding. The associated snow ice formation partly offset the reduced ice growth, but the flooding likely had a negative impact on ringed seal habitat; (iii) sea ice near Kotzebue during the winters of 2017–18 and 2018–19 was likely the thinnest since at least 1945, driven by a combination of warm air temperatures and a persistent ocean heat flux.
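The 1-D thermodynamic reasoning above can be illustrated with a classic Stefan-type mass balance, in which ice thickens only while heat conducted up through the ice exceeds the ocean heat flux at its base. The sketch below is not the authors' model; it is a minimal illustration with assumed constants and constant forcing:

```python
RHO_ICE = 917.0    # ice density, kg/m^3 (assumed)
L_FUSION = 3.34e5  # latent heat of fusion, J/kg (assumed)
K_ICE = 2.2        # thermal conductivity of ice, W/(m K) (assumed)

def grow_ice(h0, delta_t, ocean_flux, days, dt=3600.0):
    """Euler-integrate ice thickness (m) under a constant temperature
    difference delta_t (freezing point minus surface temperature, K)
    and a constant vertical ocean heat flux (W/m^2) at the ice base."""
    h = h0
    for _ in range(int(days * 86400 / dt)):
        conductive = K_ICE * delta_t / h  # heat conducted up through the ice
        dh_dt = (conductive - ocean_flux) / (RHO_ICE * L_FUSION)
        h = max(h + dh_dt * dt, 0.01)     # keep a thin residual layer
    return h

# Illustrative forcing only: same winter, weak vs strong ocean heat flux
quiet = grow_ice(0.3, 15.0, 5.0, 60)
warm = grow_ice(0.3, 15.0, 40.0, 60)
print(f"after 60 d: {quiet:.2f} m vs {warm:.2f} m")
```

In this toy balance, growth under a strong ocean heat flux asymptotes toward the equilibrium thickness K_ICE·ΔT/F_w, mirroring how a persistent ocean heat flux can shut down ice growth unusually early in the season.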
A goosegrass [Eleusine indica (L.) Gaertn.] population uncontrolled by paraquat (R) in a vegetable production field in St. Clair County, AL, was collected in summer 2019. Research was conducted to assess the level of resistance of the suspected resistant population compared with three populations with no suspected paraquat resistance (S1, S2, and S3). Visual injury at all rating dates and biomass reduction at 28 d after treatment (DAT) of S populations increased exponentially with paraquat rate. S biotypes were injured more than R at 3 DAT, with biomass recovery at 28 DAT only occurring at rates <0.28 kg ha⁻¹. Plant death or biomass reduction did not occur for any rate at any date for R. Paraquat rates inducing 50% or 90% injury or biomass reduction relative to the non-treated control (I50 and I90, respectively) were 10 to 124 times higher (I50) and 54 to 116 times higher (I90) for the R biotype than for S biotypes. These data confirm a paraquat-resistant E. indica biotype in Alabama, providing additional germplasm for the study of photosystem I electron-diverting (PSI-ED) resistance mechanisms.
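I50 and I90 values like those above typically come from fitting a log-logistic dose-response curve to injury or biomass data. As a hedged sketch (the slope and I50 parameters below are invented for illustration, not taken from this study), the effective dose for any response fraction follows directly from the fitted slope b and I50:

```python
def log_logistic(dose, d, b, i50):
    """Three-parameter log-logistic dose-response curve:
    predicted response (e.g. biomass as % of non-treated) at a given dose."""
    return d if dose == 0 else d / (1.0 + (dose / i50) ** b)

def effective_dose(frac, b, i50):
    """Dose producing a fractional reduction `frac` of the upper limit,
    e.g. frac=0.5 gives I50 and frac=0.9 gives I90."""
    return i50 * (frac / (1.0 - frac)) ** (1.0 / b)

# Invented parameters for susceptible (S) and resistant (R) biotypes
b_s, i50_s = 2.0, 0.05  # slope; I50 in kg ai/ha (illustrative only)
b_r, i50_r = 2.0, 2.50

print(f"R/S ratio at I50: {i50_r / i50_s:.0f}x")  # → 50x
print(f"R/S ratio at I90: "
      f"{effective_dose(0.9, b_r, i50_r) / effective_dose(0.9, b_s, i50_s):.0f}x")  # → 50x
```

Note that with equal slopes the R/S ratio is identical at I50 and I90; the study's differing I50 and I90 ratios therefore imply that the fitted slopes differed between biotypes.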
Objective: Iron and/or iodine deficiencies can have multiple serious adverse health outcomes, but examination of incidence rates of these deficiencies has rarely been conducted in any large population. This study examined incidence rates, temporal trends and demographic factors associated with medically diagnosed iron and iodine deficiencies/disorders in US military service members (SM).
Design: The Defense Medical Epidemiological Database (DMED) was queried for medical visits of active-duty SM to obtain specific International Classification of Diseases, 9th Revision (ICD-9) codes for clinically diagnosed iron and iodine deficiencies/disorders.
Setting: Analysis of an existing database (DMED).
Participants: Entire population of US military SM from 1997 to 2015 (average n per year = 1 382 266, 15 % women).
Results: Overall incidence rates for iron and iodine disorders were 104 and 36 cases/100 000 person-years, respectively. Over the 19-year period, rates for iron disorders increased steadily (108 % for men, 177 % for women). Rates for iodine disorders also increased steadily for men (91 %), but, for women, there was an initial rise followed by a later decline. Overall, women’s rates were 12 and 10 times higher than men’s for iron and iodine, respectively. Compared with whites, blacks and those of other races had higher rates of deficiencies of both minerals. Incidence rates for iodine deficiency increased substantially with age.
Conclusions: The overall incidence of clinically diagnosed iron and iodine deficiency among SM was low, but increased over the 19 years examined, and certain demographic groups were at significantly greater risk. Given the unexpected increases in incidence of these mineral disorders, increased surveillance may be appropriate.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19), with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in supplementary materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise and diverse geographic locations will be critical.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Children are important transmitters of infection. Within schools they encounter large numbers of contacts, and infections can spread easily, causing outbreaks. However, not all schools are affected equally. We conducted a retrospective analysis of school outbreaks to identify factors associated with the risk of gastroenteritis, influenza, rash or other outbreaks. Data on reported school outbreaks in England were obtained from Public Health England and linked with data from the Department for Education and the Office for Standards in Education, Children's Services and Skills (Ofsted). Primary and all-through schools were found to be at increased risk of outbreaks compared with secondary schools (odds ratio (OR) 5.82, 95% confidence interval (CI) 4.50–7.58 and OR 4.66, 95% CI 3.27–6.61, respectively). School size was also significantly associated with the risk of outbreaks, with larger schools at higher odds. Attack rates were higher in gastroenteritis and influenza outbreaks and lower in rash outbreaks (relative risk 0.17, 95% CI 0.15–0.20). Deprivation and Ofsted rating were associated with neither outbreak occurrence nor the subsequent attack rate. This study identifies primary and all-through schools as key settings for health protection interventions. Public health teams need to work closely with these schools to encourage early identification and reporting of outbreaks.
It is unclear whether mild-to-moderate dehydration independently affects mood without confounders like heat exposure or exercise. This study examined the acute effect of cellular dehydration on mood. Forty-nine adults (55 % female, age 39 (sd 8) years) were assigned to counterbalanced, crossover trials. Intracellular dehydration was induced with a 2-h (0·1 ml/kg per min) infusion of 3 % hypertonic saline (HYPER) or 0·9 % isotonic saline (ISO) as a control. Plasma osmolality increased in HYPER (pre 285 (sd 3), post 305 (sd 4) mmol/kg; P < 0·05) but remained unchanged in ISO (pre 285 (sd 3), post 288 (sd 3) mmol/kg; P > 0·05). Mood was assessed with the short version of the Profile of Mood States questionnaire (POMS). The POMS sub-scale scores for confusion–bewilderment, depression–dejection and fatigue–inertia increased in HYPER compared with ISO (P < 0·05). Total mood disturbance (TMD) score assessed by POMS increased from 10·3 (sd 0·9) to 16·6 (sd 1·7) in HYPER (P < 0·01), but not in ISO (P > 0·05). When TMD was stratified by sex, the increase in the HYPER trial was significant in females (P < 0·01) but not in males (P > 0·05). Following infusion, thirst and copeptin (a surrogate for vasopressin) were also higher in females than in males (21·3 (sd 2·0) v. 14·1 (sd 1·4) pmol/l; P < 0·01) during HYPER. In conclusion, cellular dehydration acutely degraded specific aspects of mood, mainly in women. The mechanisms underlying these sex differences may be related to elevated thirst and vasopressin.
Objective: To characterize associations between exposures within and outside the medical workplace and healthcare personnel (HCP) SARS-CoV-2 infection, including the effect of various forms of respiratory protection.
Design: We collected data from international participants via an online survey.
Participants: In total, 1,130 HCP (244 cases with laboratory-confirmed COVID-19 and 886 controls who remained healthy throughout the pandemic) from 67 countries who did not meet prespecified exclusion criteria (ie, healthy but not working, missing workplace exposure data, or COVID-19 symptoms without laboratory confirmation) were included in this study.
Methods: Respondents were queried regarding workplace exposures, respiratory protection, and extra-occupational activities. Odds ratios for HCP infection were calculated using multivariable logistic regression and sensitivity analyses controlling for confounders and known biases.
Results: HCP infection was associated with non–aerosol-generating contact with COVID-19 patients (adjusted OR, 1.4; 95% CI, 1.04–1.9; P = .03) and extra-occupational exposures including gatherings of ≥10 people, patronizing restaurants or bars, and public transportation (adjusted OR range, 3.1–16.2). Respirator use during aerosol-generating procedures (AGPs) was associated with lower odds of HCP infection (adjusted OR, 0.4; 95% CI, 0.2–0.8; P = .005), as was exposure to intensive care and dedicated COVID-19 units, negative-pressure rooms, and personal protective equipment (PPE) observers (adjusted OR range, 0.4–0.7).
Conclusions: COVID-19 transmission to HCP was associated with medical exposures currently considered lower risk and with multiple extra-occupational exposures, whereas exposures associated with proper use of appropriate PPE were protective. Closer scrutiny of infection control measures surrounding healthcare activities and medical settings considered lower risk, and continued awareness of the risks of public congregation, may reduce the incidence of HCP infection.
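The adjusted odds ratios above come from multivariable logistic regression; the unadjusted version of the same quantity can be sketched directly from a 2×2 exposure-by-outcome table (the counts below are invented for illustration, not the study's data):

```python
import math

def odds_ratio_ci(exp_case, exp_ctrl, unexp_case, unexp_ctrl, z=1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) confidence interval
    from a 2x2 table of exposed/unexposed cases and controls."""
    or_ = (exp_case * unexp_ctrl) / (exp_ctrl * unexp_case)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts
    se_log = math.sqrt(1/exp_case + 1/exp_ctrl + 1/unexp_case + 1/unexp_ctrl)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts (illustrative only): exposed/unexposed cases and controls
or_, lo, hi = odds_ratio_ci(120, 300, 124, 586)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A multivariable model generalizes this by adjusting the exposure coefficient for confounders; the exponentiated coefficient is then the adjusted OR reported in the abstract.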
Background: Cognitive impairment associated with lifetime major depressive disorder (MDD) is well supported by meta-analytic studies, but population-based estimates remain scarce. Previous UK Biobank studies have only shown limited evidence of cognitive differences related to probable MDD. Using updated cognitive and clinical assessments in UK Biobank, this study investigated population-level differences in cognitive functioning associated with lifetime MDD.
Methods: Associations between lifetime MDD and cognition (performance on six tasks and general cognitive functioning [g-factor]) were investigated in UK Biobank (N-range 7,457–14,836, age 45–81 years, 52% female), adjusting for demographics, education, and lifestyle. Lifetime MDD classifications were based on the Composite International Diagnostic Interview. Within the lifetime MDD group, we additionally investigated relationships between cognition and (a) recurrence, (b) current symptoms, (c) severity of psychosocial impairment (while symptomatic), and (d) concurrent psychotropic medication use.
Results: Lifetime MDD was robustly associated with a lower g-factor (β = −0.10, P_FDR = 4.7 × 10⁻⁵), with impairments in attention, processing speed, and executive functioning (β ≥ 0.06). Clinical characteristics revealed differential profiles of cognitive impairment among case individuals; those who reported severe psychosocial impairment and use of psychotropic medication performed worse on cognitive tests. Severe psychosocial impairment and reasoning showed the strongest association (β = −0.18, P_FDR = 7.5 × 10⁻⁵).
Conclusions: Findings describe small but robust associations between lifetime MDD and lower cognitive performance in a population-based sample. Overall effects were modest in size, suggesting limited clinical relevance. However, deficits within specific cognitive domains were more pronounced in relation to clinical characteristics, particularly severe psychosocial impairment.
Background: Epidemiological studies indicate that individuals with one type of mental disorder have an increased risk of subsequently developing other types of mental disorders. This study aimed to undertake a comprehensive analysis of pair-wise lifetime comorbidity across a range of common mental disorders based on a diverse range of population-based surveys.
Methods: The WHO World Mental Health (WMH) surveys assessed 145 990 adult respondents from 27 countries. Based on retrospectively reported age-of-onset for 24 DSM-IV mental disorders, associations were examined between all 548 logically possible temporally ordered disorder pairs. Overall and time-dependent hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated using Cox proportional hazards models. Absolute risks were estimated using the product-limit method. Estimates were generated separately for men and women.
Results: Each prior lifetime mental disorder was associated with an increased risk of subsequent first onset of each other disorder. The median HR was 12.1 (mean = 14.4; range 5.2–110.8, interquartile range = 6.0–19.4). The HRs were most prominent between closely related mental disorder types and in the first 1–2 years after the onset of the prior disorder. Although HRs declined with time since the prior disorder, significantly elevated risk of subsequent comorbidity persisted for at least 15 years. Appreciable absolute risks of secondary disorders were found over time for many pairs.
Conclusions: Survey data from a range of sites confirm that comorbidity between mental disorders is common. Understanding the risks of temporally secondary disorders may help design practical programs for primary prevention of secondary disorders.
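The product-limit method mentioned in the methods above is the Kaplan-Meier estimator: at each observed onset time, the running survival probability is multiplied by the fraction of at-risk respondents who did not experience onset. A minimal sketch (with made-up onset times, not WMH data):

```python
def product_limit(times, events):
    """Kaplan-Meier (product-limit) estimate of the survival function.

    times  : time to onset or censoring for each respondent
    events : 1 if onset was observed, 0 if censored at that time
    Returns (time, survival probability) pairs at each onset time."""
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for tt, e in zip(times, events) if tt == t and e == 1)
        n = sum(1 for tt in times if tt >= t)  # number still at risk at t
        if d > 0:
            surv *= 1.0 - d / n
            curve.append((t, surv))
    return curve

# Made-up years-to-onset for five respondents (illustrative only);
# survival drops only at observed onsets, not at censoring times
print(product_limit([1, 2, 3, 4, 5], [1, 1, 0, 1, 0]))
```

One minus the product-limit survival curve gives the cumulative absolute risk of the secondary disorder over time, the quantity the abstract reports.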
In electron beams where space charge plays an important role in the beam transport, the beams’ transverse and longitudinal properties become coupled. One example of this is the transverse–longitudinal correlation produced in a current-modulated beam generated in a DC electron gun, formed through the competition between the time-dependent radial space charge force and the time-independent radial focusing force. This correlation causes both the slice radius and divergence of the beam extracted from the gun to depend on the slice current. Here we consider the transport of such a beam in a linearly tapered solenoid focusing channel. Transport performance was generally improved with longer taper lengths, minimal initial correlation between slice divergence and slice current, and moderate degrees of initial correlation between initial slice radius and slice current. Performance was also generally improved with lower slice emittances, although, surprisingly, in certain limited circumstances transport was improved by slightly increasing the assumed slice emittance.
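The slice-dependent focusing described here is commonly analysed with the standard round-beam envelope equation (a textbook result, not an equation quoted from this paper). For envelope radius $R(z)$ in a solenoid channel with focusing function $k(z)$, emittance $\varepsilon$ and generalized perveance $K$ (proportional to slice current):

$$R''(z) + k(z)\,R(z) - \frac{\varepsilon^2}{R(z)^3} - \frac{K}{R(z)} = 0$$

Because $K$ scales with slice current, slices of different current require different matched focusing, which is why the taper profile and the initial radius-current correlation control how well all slices can be transported simultaneously.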
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
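Twin-based heritability estimates of the kind reported above are often introduced via Falconer's formulas, which decompose trait variance from monozygotic (MZ) and dizygotic (DZ) twin-pair correlations. The correlations below are invented, illustrative values; the CODATwins analyses themselves use more sophisticated variance-component models:

```python
def falconer(r_mz, r_dz):
    """Falconer decomposition from twin-pair correlations:
    h2 (additive genetic), c2 (shared environment), e2 (unique environment).
    The three components sum to 1 by construction."""
    h2 = 2 * (r_mz - r_dz)  # heritability
    c2 = 2 * r_dz - r_mz    # shared environment
    e2 = 1 - r_mz           # unique environment + measurement error
    return h2, c2, e2

# Hypothetical correlations for a height-like trait (illustrative only)
h2, c2, e2 = falconer(r_mz=0.86, r_dz=0.45)
print(f"h2={h2:.2f}, c2={c2:.2f}, e2={e2:.2f}")  # → h2=0.82, c2=0.04, e2=0.14
```

Comparing such decompositions across age groups and regions is, in essence, how a study can ask whether heritability changes from infancy to old age or between cultural-geographic regions.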
Background: Substantial clinical heterogeneity of major depressive disorder (MDD) suggests it may group together individuals with diverse aetiologies. Identifying distinct subtypes should lead to more effective diagnosis and treatment, while providing more useful targets for further research. Genetic and clinical overlap between MDD and schizophrenia (SCZ) suggests an MDD subtype may share underlying mechanisms with SCZ.
Methods: The present study investigated whether a neurobiologically distinct subtype of MDD could be identified by SCZ polygenic risk score (PRS). We explored interactive effects between SCZ PRS and MDD case/control status on a range of cortical, subcortical and white matter metrics among 2370 male and 2574 female UK Biobank participants.
Results: There was a significant SCZ PRS by MDD interaction for rostral anterior cingulate cortex (RACC) thickness (β = 0.191, q = 0.043). This was driven by a positive association between SCZ PRS and RACC thickness among MDD cases (β = 0.098, p = 0.026), compared to a negative association among controls (β = −0.087, p = 0.002). MDD cases with low SCZ PRS showed thinner RACC, although the opposite difference for high-SCZ-PRS cases was not significant. There were nominal interactions for other brain metrics, but none remained significant after correcting for multiple comparisons.
Conclusions: Our significant results indicate that MDD case-control differences in RACC thickness vary as a function of SCZ PRS. Although this was not the case for most other brain measures assessed, our specific findings still provide some further evidence that MDD in the presence of high genetic risk for SCZ is subtly neurobiologically distinct from MDD in general.
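A PRS-by-diagnosis interaction of the kind tested above has a simple interpretation: in a model thickness = β0 + β1·PRS + β2·case + β3·(PRS × case) with no other covariates, β3 equals the difference between the group-wise PRS slopes. A sketch with invented numbers (not UK Biobank data):

```python
def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Hypothetical standardized PRS and cortical thickness values (illustrative only)
prs_cases, thick_cases = [-1, 0, 1, 2], [2.4, 2.5, 2.6, 2.7]    # positive slope
prs_ctrls, thick_ctrls = [-1, 0, 1, 2], [2.6, 2.55, 2.5, 2.45]  # negative slope

b_case = slope(prs_cases, thick_cases)
b_ctrl = slope(prs_ctrls, thick_ctrls)
interaction = b_case - b_ctrl  # the PRS-by-status interaction coefficient
print(f"case slope {b_case:+.3f}, control slope {b_ctrl:+.3f}, "
      f"interaction {interaction:+.3f}")
```

Opposite-signed slopes in cases and controls, as reported for RACC thickness, are exactly the pattern that produces a large interaction term even when neither main effect is dramatic.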
Objective: Few studies have investigated the assessment and functional impact of egocentric and allocentric neglect among stroke patients. This pilot study aimed to determine (1) whether allocentric and egocentric neglect could be dissociated among a sample of stroke patients using eye tracking; (2) the specific patterns of attention associated with each subtype; and (3) the nature of the relationship between neglect subtype and functional outcome. Method: Twenty acute stroke patients were administered neuropsychological assessment batteries, a pencil-and-paper Apples Test to measure neglect subtype, and an adaptation of the Apples Test with an eye tracking measure. To test clinical discriminability, twenty age- and education-matched control participants were administered the eye tracking measure of neglect. Results: The eye tracking measure identified a greater number of individuals as having egocentric and/or allocentric neglect than the pencil-and-paper Apples Test. Classification of neglect subtype based on eye tracking performance was a significant predictor of functional outcome beyond that accounted for by the neuropsychological test performance and Apples Test neglect classification. Preliminary evidence suggests that patients with no neglect symptoms had superior functional outcomes compared with patients with neglect. Patients with combined egocentric and allocentric neglect had poorer functional outcomes than those with either subtype. Functional outcomes of patients with either allocentric or egocentric neglect did not differ significantly. Applications of our findings to improving neglect detection are discussed. Conclusion: Results highlight the potential clinical utility of eye tracking for the assessment and identification of neglect subtype among stroke patients to predict functional outcomes. (JINS, 2019, 25, 479–489)
Objective: To identify potential participants for clinical trials, electronic health records (EHRs) are searched at potential sites. As an alternative, we investigated the use of medical devices that support real-time diagnostic decisions to identify patients for trial enrollment.
Methods: To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST-elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph, creating the e-ACI-TIPI, using the same data sets used for the original ACI-TIPI (development set, n = 3,453; test set, n = 2,315). We also tested both instruments on data from emergency department electrocardiographs from across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared their performance to cohorts identified from EHR data at the same hospitals.
Results: Receiver operating characteristic (ROC) curve areas on the test set were excellent (0.89 for the ACI-TIPI and 0.84 for the e-ACI-TIPI), as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, again with very good calibration. When tested for detection of patients with > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than did EHRs.
Conclusions: Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
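ROC areas like those reported above can be computed without plotting any curve, via the Mann-Whitney formulation: the AUC is the probability that a randomly chosen positive case scores higher than a randomly chosen negative. A minimal sketch with invented scores (not ACI-TIPI output):

```python
def roc_auc(scores, labels):
    """ROC curve area via the Mann-Whitney statistic:
    P(score of a random positive > score of a random negative),
    counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical model probabilities and true outcomes (illustrative only)
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,    0,   1,   0]
print(f"AUC = {roc_auc(scores, labels):.2f}")  # → AUC = 0.75
```

Note that AUC measures ranking only; the calibration the abstract also reports (agreement between predicted probabilities and observed event rates) must be checked separately.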