Invasive species drive biodiversity loss and lead to changes in parasite–host associations. Parasites are linked to invasions and can mediate invasion success and outcomes. We review theoretical and empirical research into parasites in biological invasions, focusing on a freshwater invertebrate study system. We examine the effects of parasitic infection on host traits (behaviour and life history) that can mediate native/invader trophic interactions, and review evidence from the field and laboratory of parasite-driven changes in predation, intraguild predation and cannibalism. Theoretical work shows that the trait-mediated effects of parasites can be as strong as classical density effects, and their impact on the host’s trophic interactions merits more consideration. We also report on evidence of broader cascading effects warranting deeper study. Biological invasion can lead to altered parasite–host associations. Focusing on amphipod invasions, we find patterns of parasite introduction and loss that mirror host invasion pathways, but also highlight the risks of introducing invasive parasites. Horizon scanning and impact predictions are vital in identifying future disease risks, potential pathways of introduction and suitable management measures for mitigation.
This study investigates the pressure–strain tensor in Langmuir turbulence. The pressure–strain tensor is determined from large-eddy simulations (LES), and is partitioned into components associated with the mean current shear (rapid), the Stokes shear and the turbulent–turbulent (slow) interactions. The rapid component can be parameterized using existing closure models, although the coefficients in the closure models are particular to Langmuir turbulence. A closure model for the Stokes component is proposed, and it is shown to agree with results from the LES. The slow component of the pressure–strain tensor does not agree with existing ‘return-to-isotropy’ closure models for five of the six components of the Reynolds stress tensor, and a new closure model is proposed that accounts for these deviations, which vary systematically with Langmuir number and depth. The implications of these results for second- and first-order closures of Langmuir turbulence are discussed.
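For readers unfamiliar with the closure being tested, the classic ‘return-to-isotropy’ model for the slow pressure–strain component has the standard Rotta form (conventional textbook notation; these are not equations reproduced from the paper):

```latex
% Rotta-type return-to-isotropy closure for the slow pressure–strain term
\Pi_{ij}^{(\mathrm{slow})} = -C_R\,\varepsilon\, b_{ij},
\qquad
b_{ij} = \frac{\langle u_i u_j \rangle}{2k} - \frac{\delta_{ij}}{3},
```

where \(b_{ij}\) is the Reynolds-stress anisotropy tensor, \(k\) the turbulent kinetic energy, \(\varepsilon\) its dissipation rate, and \(C_R\) a model constant.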
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that Frascati criteria are too liberal, resulting in a high false-positive rate. Meyer et al. recommended more conservative revisions to HAND criteria and suggested exploring other commonly used methods of defining neurocognitive impairment (NCI) in HIV, such as the global deficit score (GDS). This study compares NCI classifications by the Frascati, Meyer, and GDS methods in relation to neuroimaging markers of brain integrity in HIV.
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing, structural brain magnetic resonance imaging, and magnetic resonance spectroscopy. Participants were classified using Frascati versus Meyer criteria as concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired by Frascati criteria but unimpaired by Meyer criteria. To compare the GDS with the Meyer criteria, the same groupings were formed using GDS criteria in place of Frascati criteria.
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of both CBRNE science as a distinct competency and the establishment of the CBRNE medical operations science support expert informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, employment, and health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
There are no estimates of the heritability of phenotypic udder traits in suckler sheep, which produce meat lambs, or of whether these traits are associated with resilience to mastitis. Mastitis is a common disease which damages the mammary gland and reduces productivity. The aims of this study were to investigate the feasibility of collecting udder phenotypes, their heritability and their association with mastitis in suckler ewes. Udder and teat conformation, teat lesions, intramammary masses (IMM) and litter size were recorded from 10 Texel flocks in Great Britain between 2012 and 2014; 968 records were collected. Pedigree data were obtained from an online pedigree recording system. Univariate quantitative genetic parameters were estimated using animal and sire models. Linear mixed models were used to analyse continuous traits and generalised linear mixed models were used to analyse binary traits. Continuous traits had higher heritabilities than binary traits, with teat placement and teat length heritabilities (h2) highest at 0.35 (SD 0.04) and 0.42 (SD 0.04), respectively. Udder width, drop and separation heritabilities were lower and varied with udder volume. The heritabilities of IMM and teat lesions (sire model) were 0.18 (SD 0.12) and 0.17 (SD 0.11), respectively. All heritabilities were sufficiently high for inclusion in a selection programme to increase resilience to mastitis in the population of Texel sheep. Further studies are required to investigate genetic relationships between traits, to determine whether udder traits predict IMM, and to assess the potential benefits of including these traits in a selection programme to increase resilience to chronic mastitis.
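For context, the heritability being estimated here is the standard narrow-sense quantity, and a sire-model estimator exploits the fact that paternal half-sibs share one quarter of the additive genetic variance. These are textbook quantitative-genetics formulas, not equations taken from the paper:

```latex
h^2 = \frac{\sigma^2_A}{\sigma^2_P}
    = \frac{\sigma^2_A}{\sigma^2_A + \sigma^2_E},
\qquad
\hat{h}^2_{\mathrm{sire}} = \frac{4\,\sigma^2_s}{\sigma^2_s + \sigma^2_e},
```

where \(\sigma^2_A\) is the additive genetic variance, \(\sigma^2_P\) the phenotypic variance, and \(\sigma^2_s\) the between-sire variance component (expected to equal \(\tfrac{1}{4}\sigma^2_A\)).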
Significant ethnic and socio-economic disparities exist in infectious diseases (IDs) rates in New Zealand, so accurate measures of these characteristics are required. This study compared methods of ascribing ethnicity and socio-economic status. Children in the Growing Up in New Zealand longitudinal cohort were ascribed to self-prioritised, total response and single-combined ethnic groups. Socio-economic status was measured using household income, and both census-derived and survey-derived deprivation indices. Rates of ID hospitalisation were compared using linked administrative data. Self-prioritised ethnicity was simplest to use. Total response accounted for mixed ethnicity and allowed overlap between groups. Single-combined ethnicity required aggregation of small groups to maintain power but offered greater detail. Regardless of the method used, Māori and Pacific children, and children in the most socio-economically deprived households had a greater risk of ID hospitalisation. Risk differences between self-prioritised and total response methods were not significant for Māori and Pacific children but single-combined ethnicity revealed a diversity of risk within these groups. Household income was affected by non-random missing data. The census-derived deprivation index offered a high level of completeness with some risk of multicollinearity and concerns regarding the ecological fallacy. The survey-derived index required extra questions but was acceptable to participants and provided individualised data. Based on these results, the use of single-combined ethnicity and an individualised survey-derived index of deprivation are recommended where sample size and data structure allow it.
Racial/ethnic minorities are more vulnerable to mental and physical health problems, but we know little about the psychobiological underpinnings of these disparities. In this study, we examined racial/ethnic differences in cortisol diurnal patterns and affect as initial steps toward elucidating long-term health disparities. A racially/ethnically diverse (39.5% White, 60.5% minority) sample of 370 adolescents (57.3% female) between the ages of 11.9 and 18 years (M = 14.65 years, SD = 1.39) participated in this study. These adolescents provided 16 cortisol samples (4 samples per day across 4 days), allowing the computation of diurnal cortisol slopes, the cortisol awakening response, and diurnal cortisol output (area under the curve), as well as daily diary ratings of high-arousal and low-arousal positive and negative affect. Consistent with prior research, we found that racial/ethnic minorities (particularly African American and Latino youth) exhibited flatter diurnal cortisol slopes compared to White youth, F (1, 344.7) = 5.26, p = .02, effect size g = 0.25. Furthermore, African American and Asian American youth reported lower levels of positive affect (both high arousal and low arousal) compared to White youth. Racial/ethnic differences in affect did not explain differences in cortisol patterns, suggesting a need to refine our models of relations between affect and hypothalamic–pituitary–adrenocortical activity. We conclude by proposing that a deeper understanding of cultural development may help elucidate the complex associations between affect and hypothalamic–pituitary–adrenocortical functioning and how they explain racial/ethnic differences in both affect and stress biology.
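A minimal sketch of how diurnal cortisol output as ‘area under the curve’ is commonly computed from repeated daily samples, using the trapezoidal rule over sampling times. The sampling times and cortisol values below are purely hypothetical; this is the conventional AUC-with-respect-to-ground calculation, not the study's own code.

```python
def cortisol_auc(times_h, values):
    """Area under the diurnal cortisol curve (with respect to ground),
    computed with the trapezoidal rule; times are clock hours."""
    auc = 0.0
    for (t0, v0), (t1, v1) in zip(zip(times_h, values),
                                  zip(times_h[1:], values[1:])):
        auc += (v0 + v1) / 2 * (t1 - t0)
    return auc

# Four hypothetical samples in one day: waking, +30 min, afternoon, bedtime
times = [7.0, 7.5, 15.0, 21.0]   # clock hours
cort = [12.0, 15.0, 6.0, 3.0]    # illustrative concentrations (e.g. nmol/L)
auc_g = cortisol_auc(times, cort)
```

Diurnal slope and the awakening response are computed from the same samples (e.g. the waking-to-evening decline and the waking-to-30-min rise, respectively).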
A total of 592 people reported gastrointestinal illness following attendance at Street Spice, a food festival held in Newcastle-upon-Tyne, North East England in February/March 2013. Epidemiological, microbiological and environmental investigations were undertaken to identify the source and prevent further cases. Several epidemiological analyses were conducted: a cohort study, a follow-up survey of cases and a capture–recapture analysis to estimate the true burden of cases. Indistinguishable isolates of Salmonella Agona phage type 40 were identified in cases and on fresh curry leaves used in one of the accompaniments served at the event. Molecular testing indicated entero-aggregative Escherichia coli and Shigella also contributed to the burden of illness. Analytical studies found strong associations between illness and eating food from a particular stall and with food items including coconut chutney which contained fresh curry leaves. Further investigation of the food supply chain and food preparation techniques identified a lack of clear instruction on the use of fresh uncooked curry leaves in finished dishes and uncertainty about their status as a ready-to-eat product. We describe the investigation of one of the largest outbreaks of food poisoning in England, involving several gastrointestinal pathogens including a strain of Salmonella Agona not previously seen in the UK.
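The capture–recapture approach mentioned above is typically based on the two-source Lincoln–Petersen estimator (here with the Chapman small-sample correction). The counts below are hypothetical, chosen only to illustrate the arithmetic; they are not the outbreak's actual figures.

```python
def chapman_estimate(n1: int, n2: int, m: int) -> float:
    """Chapman-corrected Lincoln-Petersen estimate of total cases:
    n1 cases identified by source 1, n2 by source 2, m found by both."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical: 400 cases via the cohort study, 300 via the follow-up
# survey, 250 appearing in both sources
estimated_total = chapman_estimate(400, 300, 250)
```

The smaller the overlap between sources, the larger the estimated number of cases missed by both, which is how the ‘true burden’ exceeds the reported count.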
We investigated risk factors for severe acute lower respiratory infections (ALRI) among hospitalised children <2 years, with a focus on the interactions between virus and age. Statistical interactions between age and respiratory syncytial virus (RSV), influenza, adenovirus (ADV) and rhinovirus on the risk of ALRI outcomes were investigated. Of 1780 hospitalisations, 228 (12.8%) were admitted to the intensive care unit (ICU). The median (range) length of stay (LOS) in hospital was 3 (1–27) days. An increase of 1 month of age was associated with a decreased risk of ICU admission (rate ratio (RR) 0.94; 95% confidence intervals (CI) 0.91–0.98) and with a decrease in LOS (RR 0.96; 95% CI 0.95–0.97). Associations between RSV, influenza, ADV positivity and ICU admission and LOS were significantly modified by age. Children <5 months old were at the highest risk from RSV-associated severe outcomes, while children >8 months were at greater risk from influenza-associated ICU admissions and long hospital stay. Children with ADV had increased LOS across all ages. In the first 2 years of life, the effects of different viruses on ALRI severity varies with age. Our findings help to identify specific ages that would most benefit from virus-specific interventions such as vaccines and antivirals.
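As a purely arithmetic illustration of the per-month rate ratio reported above (not an additional result from the study): a constant RR of 0.94 per month of age compounds multiplicatively, so the implied relative risk of ICU admission at 6 months versus birth is 0.94 raised to the sixth power, roughly 0.69.

```python
def cumulative_rate_ratio(rr_per_month: float, months: int) -> float:
    """Compound a constant per-month rate ratio over a number of months."""
    return rr_per_month ** months

rr_at_6_months = cumulative_rate_ratio(0.94, 6)   # roughly a 31% lower rate
```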
After the public outcry over backdating, many firms began scheduling option grants. This eliminates backdating but creates other agency problems: Chief executive officers (CEOs) aware of upcoming option grants have an incentive to temporarily depress stock prices to obtain lower strike prices. We show that some CEOs have manipulated stock prices to increase option compensation, documenting negative abnormal returns before scheduled option grants and positive abnormal returns afterward. These returns are explained by measures of CEOs’ incentives and ability to influence stock prices. We document several mechanisms used to lower stock price, including changing the substance and timing of disclosures.
Some centres favour early intervention for ureteral colic while others prefer trial of spontaneous passage, and relative outcomes are poorly described. Calgary and Vancouver have similar populations and physician expertise, but differing approaches to ureteral colic. We studied 60-day hospitalization and intervention rates for patients having a first emergency department (ED) visit for ureteral colic in these diverse systems.
We used administrative data and structured chart review to study all Vancouver and Calgary patients with an index visit for ureteral colic during 2014. Patient demographics, arrival characteristics and triage category were captured from ED information systems, while ED visits and admissions were captured from linked regional hospital databases. Laboratory results were obtained from electronic health records and stone characteristics were abstracted from diagnostic imaging reports. Our primary outcome was hospitalization or urological intervention from 0 to 60 days. Secondary outcomes included ED revisits, readmissions and rescue interventions. Time to event analysis was conducted and Cox Proportional Hazards modelling was performed to adjust for covariate imbalance.
We studied 3283 patients with CT-defined stones. Patient and stone characteristics were similar for the cities. Hospitalization or intervention occurred in 60.9% of Calgary patients and 31.3% of Vancouver patients (p<0.001). Calgary patients had higher index intervention rates (52.1% v. 7.5%), and experienced more ED revisits and hospital readmissions during follow-up. The data suggest that outcome events were associated with overtreatment of small stones in one city and undertreatment of large stones in the other.
An early interventional approach was associated with higher ED revisit, hospitalization and intervention rates. If these events are markers of patient disability, then a less interventional approach to small stones and earlier definitive management of large stones may reduce system utilization and improve outcomes for patients with acute ureteral colic.
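The Cox proportional-hazards model used to adjust for covariate imbalance has the standard form (generic notation, not symbols from the paper):

```latex
h(t \mid \mathbf{x}) = h_0(t)\,\exp\!\left(\beta_1 x_1 + \cdots + \beta_p x_p\right),
```

where \(h_0(t)\) is an unspecified baseline hazard and the \(x_j\) are covariates (e.g. city, stone size, demographics). The exponentiated coefficient for the city indicator is then the adjusted hazard ratio for hospitalization or intervention between the two systems.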
This study was a randomised, double-blind, placebo-controlled cross-over trial examining the effects of β-hydroxy β-methylbutyrate free acid (HMB-FA) supplementation on muscle protein breakdown, cortisol, testosterone and resting energy expenditure (REE) during acute fasting. Conditions consisted of supplementation with 3 g/d HMB-FA or placebo during a 3-d meat-free diet followed by a 24-h fast. Urine was collected before and during the 24-h fast for analysis of the 3-methylhistidine:creatinine ratio (3MH:CR). Salivary cortisol, testosterone, their ratio (T:C), and the cortisol awakening response were assessed. ANOVA was used to analyse all dependent variables, and linear mixed models were used to confirm the absence of carryover effects. Eleven participants (six females, five males) completed the study. Urinary HMB concentrations confirmed compliance with supplementation. 3MH:CR was unaffected by fasting and supplementation, but the cortisol awakening response differed between conditions. In both conditions, cortisol increased from awakening to 30 min post-awakening (P=0·01). Cortisol was reduced from 30 to 45 min post-awakening with HMB-FA (−32 %, d=−1·0, P=0·04), but not placebo (−6 %, d=−0·2, P=0·14). In males, T:C increased from 0 to 24 h of fasting with HMB-FA (+162 %, d=3·0, P=0·001), but not placebo (+13 %, d=0·4, P=0·60), due to reductions in cortisol. REE was higher at 24 h of fasting than at 16 h of fasting, independent of supplementation (+4·0 %, d=0·3, P=0·04). In conclusion, HMB-FA may affect cortisol responses, but not myofibrillar proteolysis, during acute 24-h fasting.
Neurocognitive deficits are often seen as core features of schizophrenia, and as primary determinants of poor functioning. Yet, our clinical observations suggest that individuals who score within the impaired range on standardized tests can reliably perform better in complex real-world situations, especially when performance is embedded within a positive socio-affective context.
We analyzed the literature on the influence of non-neurocognitive factors on neurocognitive test performance in order to clarify their contributions.
We identified seven non-neurocognitive factors that significantly contribute to neurocognitive test performance: avolition, dysfunctional attitudes, effort, stress, negative emotions, asociality, and disorganized symptoms. We then proposed an alternative model based on dysfunctional (e.g. defeatist) attitudes and their consequences for motivation and sustained task engagement. We demonstrated that these factors account for substantial variance in negative symptoms, neurocognitive test performance, and functional outcomes. We then demonstrated that recovery-oriented cognitive therapy – which is derived from this alternative model and primarily targets dysfunctional beliefs – has been successful in the treatment of low functioning individuals with schizophrenia.
The contributions of neurocognitive impairments to poor real-world functioning in people with schizophrenia may be overstated in the literature, and may even be limited relative to those of non-neurocognitive factors. We offer suggestions for further research to more precisely quantify the contributions of attitudinal/motivational v. neurocognitive factors in schizophrenia.
Field studies of grazing management have frequently concluded that the magnitude and direction of vegetation response depend on initial vegetation condition. On upland heath, this dependence reflects the importance of small-scale ecological processes (e.g. plant competition) and local neighbourhood effects (e.g. the spatial distribution of plant species) in driving vegetation dynamics. These small-scale effects, together with variation in grazing patterns, make it difficult to derive general rules about the effect of grazing on vegetation change from field studies. However, we need to determine the impacts of such grazing-related vegetation change upon biodiversity (e.g. birds). For many bird species it is impractical to use experimental approaches because of low breeding densities and the influence of other site and management effects (e.g. predator control). Predicting the effect of management changes on them requires an accurate assessment of the large-scale effects of grazing management on the ecological landscape using data from small-scale field studies. This paper sets out an approach that integrates field studies with theoretical models to investigate the large-scale effects of grazing management on plant and bird communities on upland heath.
Lameness is a major economic and welfare issue in horses and cattle. Many factors contribute to the cause of lameness. The composition of hoof horn is one of the factors that determine the integrity and strength of the cornified epidermis of the wall and sole. Although keratins are an important component of hoof horn, other constituents such as lipids also contribute to the functional and structural integrity of the horn. Only one previous study has attempted to quantify the major lipid classes in the equine hoof (Wertz and Downing, 1984). The objective of this study was to characterise the lipid composition of equine hoof horn.