The diet of most adults is low in fish and, therefore, provides limited quantities of the long-chain, omega-3 fatty acids (LCn-3FAs), eicosapentaenoic and docosahexaenoic acids (EPA, DHA). Since these compounds serve important roles in the brain, we sought to determine whether healthy adults with low LCn-3FA consumption would exhibit improvements in neuropsychological performance and parallel changes in brain morphology following repletion through fish oil supplementation.
In a randomized, controlled trial, 271 mid-life adults (30–54 years of age, 118 men, 153 women) consuming ⩽300 mg/day of LCn-3FAs received 18 weeks of supplementation with fish oil capsules (1400 mg/day of EPA and DHA) or matching placebo. All participants completed a neuropsychological test battery examining four cognitive domains: psychomotor speed, executive function, learning/episodic memory, and fluid intelligence. A subset of 122 underwent neuroimaging before and after supplementation to measure whole-brain and subcortical tissue volumes.
Capsule adherence was over 95%, participant blinding was verified, and red blood cell EPA and DHA levels increased as expected. Supplementation did not affect performance in any of the four cognitive domains. Exploratory analyses revealed that, compared to placebo, fish oil supplementation improved executive function in participants with low baseline DHA levels. No changes were observed in any indicator of brain morphology.
In healthy mid-life adults reporting low dietary intake, supplementation with LCn-3FAs at a moderate dose for a moderate duration did not affect neuropsychological performance or brain morphology. Whether salutary effects occur in individuals with particularly low DHA exposure requires further study.
Background: Quality of life (QOL) is of great importance in dementia. We examined QOL across types of dementia in patients presenting to a rural and remote memory clinic (RRMC). Methods: This analysis included 343 RRMC patients seen between 2004 and 2016. Patients were diagnosed with mild cognitive impairment (MCI, n=74), frontotemporal dementia (FTD, n=42), Alzheimer’s disease (AD, n=187), vascular dementia (VD, n=22), or dementia with Lewy bodies (DLB, n=18). Patients and caregivers completed questionnaires at their initial visit. Data collection included patient-rated patient QOL (QOL-PT), caregiver-rated patient QOL (QOL-CG), MMSE score, age, and other patient demographics. Statistical analysis assessed patient variables and differences in QOL across types of dementia using one-way ANOVA, χ2 tests, and t-tests. Results: QOL-PT did not differ by diagnosis, whereas QOL-CG did. QOL-CG was significantly higher in MCI (34.6±7.1) compared to FTD (30.9±5.2) and AD (31.7±5.9). QOL-PT and QOL-CG differed in certain dementia types. QOL-PT was significantly higher than QOL-CG in MCI (QOL-PT=37.3±5.0, QOL-CG=35.3±7.3), FTD (QOL-PT=37.2±6.1, QOL-CG=31.7±5.5), and AD (QOL-PT=37.0±9.7, QOL-CG=32.1±5.9). Conclusions: We found that QOL-PT does not differ across dementia types, QOL-CG is higher in MCI compared to FTD and AD, and patients rate their own QOL higher than their caregivers do in MCI, FTD, and AD.
Background: To determine whether there is a difference in the average annual rate of decline in Mini-Mental State Examination (MMSE) scores between those with Alzheimer’s disease, vascular dementia, frontotemporal dementia and dementia with Lewy bodies. Methods: We conducted a retrospective chart review of 225 consecutive patients with dementia who attended the Rural and Remote Memory Clinic in Saskatoon, Saskatchewan. The data collected included MMSE scores and demographic information. Statistical analysis with ANOVA compared the average annual rate of decline in MMSE score between patients with different types of dementia. Results: There was no statistically significant difference in the rate of MMSE score decline between these groups. Patients with frontotemporal dementia and vascular dementia were referred to the clinic at younger ages than those with Alzheimer’s disease and dementia with Lewy bodies. Conclusions: The rate of decline in MMSE did not differ between these four types of dementia. Patients with frontotemporal dementia and vascular dementia often experience cognitive decline earlier in life than those with Alzheimer’s disease and dementia with Lewy bodies.
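As an illustration of the comparison described above, a minimal sketch (in Python, using hypothetical per-patient annual decline rates rather than the clinic's data) of a one-way ANOVA across dementia types might look like this:

```python
# One-way ANOVA on per-patient annual rates of MMSE decline, grouped by diagnosis.
# The decline rates below are hypothetical values for illustration only.
from scipy import stats

decline_rates = {
    "Alzheimer's disease":       [2.1, 3.4, 1.8, 2.9, 2.5],
    "vascular dementia":         [1.9, 2.2, 3.1, 2.4],
    "frontotemporal dementia":   [2.8, 3.0, 2.2, 3.5],
    "dementia with Lewy bodies": [2.0, 2.6, 3.3, 2.1],
}

f_stat, p_value = stats.f_oneway(*decline_rates.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")
# A non-significant p-value corresponds to the reported finding of no
# difference in the rate of MMSE decline between dementia types.
```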
Background: Safety behaviours are ubiquitous across anxiety disorders and are associated with the aetiology, maintenance and exacerbation of anxiety. Cognitive behavioural models posit that beliefs about safety behaviours directly influence their use. Therefore, beliefs about safety behaviours may be an important component in decreasing safety behaviour use. Unfortunately, little empirical research has evaluated this theorized relationship.
Aims: The present study aimed to examine the predictive relationship between beliefs about safety behaviours and safety behaviour use while controlling for anxiety severity.
Method: Adults with clinically elevated levels of social anxiety (n = 145) and anxiety sensitivity (n = 109) completed an online survey that included established measures of safety behaviour use, quality of life, and anxiety severity. Participants also completed the Safety Behaviour Scale (SBS), a measure created for the current study that includes a transdiagnostic checklist of safety behaviours, as well as questions related to safety behaviour use and beliefs about safety behaviours.
Results: Within both the social anxiety and anxiety sensitivity groups, positive beliefs about safety behaviours predicted greater safety behaviour use, even when controlling for anxiety severity. Certain beliefs were particularly relevant in predicting safety behaviour use within each of the clinical analogue groups.
Conclusions: Findings suggest that efforts to decrease safety behaviour use during anxiety treatment may benefit from identifying and modifying positive beliefs about safety behaviours.
Clostridium difficile infections (CDIs) affect patients in hospitals and in the community, but the relative importance of transmission in each setting is unknown. We developed a mathematical model of C. difficile transmission in a hospital and surrounding community that included infants, adults and transmission from animal reservoirs. We assessed the role of these transmission routes in maintaining disease and evaluated the recommended classification system for hospital- and community-acquired CDIs. The reproduction number in the hospital was <1 (range: 0.16–0.46) for all scenarios. Outside the hospital, the reproduction number was >1 for nearly all scenarios without transmission from animal reservoirs (range: 1.0–1.34). However, the reproduction number for the human population was <1 if a minority (>3.5–26.0%) of human exposures originated from animal reservoirs. Symptomatic adults accounted for <10% of transmission in the community. Under conservative assumptions, infants accounted for 17% of community transmission. An estimated 33–40% of community-acquired cases were reported but 28–39% of these reported cases were misclassified as hospital-acquired by recommended definitions. Transmission could be plausibly sustained by asymptomatically colonised adults and infants in the community or exposure to animal reservoirs, but not hospital transmission alone. Under-reporting of community-onset cases and systematic misclassification underplay the role of community transmission.
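The animal-reservoir threshold reported above can be illustrated with a back-of-the-envelope sketch: if a fraction f of human exposures comes from animal reservoirs, the human-to-human component of the community reproduction number is roughly R_c(1 − f), which drops below 1 once f exceeds 1 − 1/R_c. The snippet below implements only this simplification, not the study's full hospital–community compartmental model:

```python
# Threshold fraction of zoonotic exposures that pushes the human-to-human
# reproduction number below 1, assuming exposures scale proportionally.
def human_to_human_R(R_community: float, animal_fraction: float) -> float:
    return R_community * (1.0 - animal_fraction)

def threshold_animal_fraction(R_community: float) -> float:
    return max(0.0, 1.0 - 1.0 / R_community)

for R_c in (1.04, 1.34):  # roughly the range of community reproduction numbers above
    f_star = threshold_animal_fraction(R_c)
    print(f"R_c = {R_c:.2f}: human-to-human R < 1 once ~{f_star:.1%} of exposures are zoonotic")
# -> roughly 4% and 25%, in line with the 3.5-26.0% range quoted above.
```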
OBJECTIVES/SPECIFIC AIMS: Persons living with HIV (PLWH) are at increased risk for fragility bone disease. Current osteoporosis screening guidelines do not account for HIV status, and clinical risk assessment tools are not sensitive in PLWH. We examined the value of traditional osteoporosis risk factors, HIV-specific indices, and bone turnover biomarkers in predicting low bone mineral density (BMD) in PLWH. METHODS/STUDY POPULATION: Demographic and clinical characteristics, dual energy x-ray absorptiometry (DXA)-derived BMD, HIV indices (viral load, CD4 count, antiretroviral therapy [ART]), and biomarkers of bone turnover (C-terminal telopeptide of collagen [CTx], osteocalcin [OCN]) were evaluated in a cross-sectional analysis of PLWH (n=248) and HIV-negative controls (n=183). The primary outcome was low BMD, defined as osteopenia or osteoporosis by WHO criteria. Multivariable logistic and modified Poisson regression models were used to assess associations between low BMD and covariates of interest. RESULTS/ANTICIPATED RESULTS: Overall, median age was 44 years, 48% were male, 88% were black, median body mass index (BMI) was 28 kg/m², 72% smoked cigarettes, and 53% used alcohol; characteristics did not differ by HIV status. PLWH had a mean CD4 of 408 cells/mm³, 55% were ART-naïve, and 45% had viral suppression on ART. Overall, 25% (109/431) had low BMD, including 31% of PLWH compared to 16% of HIV-negative controls. In multivariable models, HIV was significantly associated with low BMD (aOR 2.46, 95% CI 1.39-4.34; aRR 1.90, 95% CI 1.18-3.07). Adjusting for HIV, three traditional risk factors (age, race, and BMI) were independently associated with low BMD in the full cohort. However, bone turnover markers, CTx and OCN, were better able to discriminate low vs. normal BMD in PLWH compared to HIV-negative controls. In PLWH, mean serum CTx was 23% higher in low vs. normal BMD (mean CTx difference=0.06 µg/mL); in HIV-negative controls, no association with BMD was observed (mean CTx difference=0 µg/mL). In PLWH, mean serum OCN was 38% higher in those with low vs. normal BMD (mean OCN difference=2.48 µg/mL); in HIV-negative controls, mean serum OCN was only 16% higher in those with low vs. normal BMD (mean OCN difference=1.08 µg/mL). DISCUSSION/SIGNIFICANCE OF IMPACT: In PLWH, as opposed to HIV-negative controls, serum biomarkers reflecting a high bone turnover state may discriminate individuals with low versus normal BMD. Because changes in biomarkers precede changes in BMD, these markers should be explored further either alone or in combination with traditional risk assessment tools to improve early screening for osteoporosis in PLWH.
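A minimal sketch of the two regression approaches mentioned above (multivariable logistic regression for adjusted odds ratios, and modified Poisson regression with robust variance for adjusted risk ratios) is shown below; the data frame, column names, and simulated values are assumptions for illustration, not the study's dataset:

```python
# Logistic regression (aOR) and modified Poisson regression (aRR) for low BMD.
# All data below are simulated; only the modelling pattern is illustrated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "low_bmd": rng.integers(0, 2, n),   # 1 = osteopenia/osteoporosis on DXA
    "hiv":     rng.integers(0, 2, n),   # 1 = PLWH, 0 = HIV-negative
    "age":     rng.normal(44, 10, n),
    "bmi":     rng.normal(28, 5, n),
})

# Adjusted odds ratio from a multivariable logistic model.
logit_fit = smf.logit("low_bmd ~ hiv + age + bmi", data=df).fit(disp=False)
print("aOR (HIV):", np.exp(logit_fit.params["hiv"]))

# Adjusted risk ratio from a modified Poisson model (robust sandwich errors).
poisson_fit = smf.glm("low_bmd ~ hiv + age + bmi", data=df,
                      family=sm.families.Poisson()).fit(cov_type="HC1")
print("aRR (HIV):", np.exp(poisson_fit.params["hiv"]))
```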
Many seed quality tests are conducted by first randomly assigning seeds into replicates of a given size. The replicate results are then used to check whether any problems occurred in the conduct of the test. The two main tools developed for this verification are the ratio of the observed variance of the replicate results to a theoretical variance and the tolerance for the range of the results. In this paper, we derive the theoretical distribution, and its related properties, of the sequence of numbers of seeds with a given quality attribute present in the replicates. From these theoretical results, we revisit the two quality-checking tools widely used for the germination test. We highlight a precaution to be taken when relying on the variance ratio to check for under- or over-dispersion of the replicate results. This has led to the development of tables providing credible intervals for the variance ratio. The International Seed Testing Association tolerance tables for the range of the results are also compared with tolerances computed from the exact theoretical distribution of the range, leading us to recommend a revision of these tables.
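For the germination test specifically, the variance-ratio check described above amounts to comparing the observed variance of replicate counts with the binomial variance implied by the overall germination proportion; a short sketch (assuming an ISTA-style layout of four replicates of 100 seeds, with made-up counts) is:

```python
# Variance-ratio check for germination-test replicates (illustrative values).
import numpy as np

counts = np.array([92, 88, 95, 90])          # germinated seeds per replicate (hypothetical)
n = 100                                       # seeds per replicate
p_hat = counts.sum() / (n * counts.size)      # overall germination proportion

observed_var = counts.var(ddof=1)             # sample variance of the replicate counts
theoretical_var = n * p_hat * (1 - p_hat)     # binomial variance for one replicate

print(f"variance ratio = {observed_var / theoretical_var:.2f}")
# Ratios well above 1 suggest over-dispersion and ratios well below 1
# under-dispersion; deciding what counts as "well above/below" requires the
# distribution of the ratio itself, which motivates the tables discussed above.
```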
Eight ruminally-fistulated wethers were used to examine the temporal effects of afternoon (PM; 1600 h) v. morning (AM; 0800 h) allocation of fresh spring herbage from a perennial ryegrass (Lolium perenne L.)-based pasture on fermentation and microbial community dynamics. Herbage chemical composition was minimally affected by time of allocation, but daily mean ammonia concentrations were greater for the PM group. The 24-h pattern of ruminal fermentation (i.e. time of sampling relative to time of allocation), however, varied considerably for all fermentation variables (P⩽0.001). Most notably amongst ruminal fermentation characteristics, ammonia concentrations showed a substantial temporal variation; concentrations of ammonia were 1.7-, 2.0- and 2.2-fold greater in rumens of PM wethers at 4, 6 and 8 h after allocation, respectively, compared with AM wethers. The relative abundances of archaeal and ciliate protozoal taxa were similar across allocation groups. In contrast, the relative abundances of members of the rumen bacterial community, such as Prevotella 1 (P=0.04), Bacteroidales RF16 group (P=0.005) and Fibrobacter spp. (P=0.008), were greater for the AM group, whereas the relative abundance of Kandleria spp. was greater (P=0.04) for the PM group. Of these taxa, only Prevotella 1 (P=0.04) and Kandleria (P<0.001) showed a significant interaction between time of allocation and time of sampling relative to feed allocation. Relative abundances of Prevotella 1 were greater at 2 h (P=0.05), 4 h (P=0.003) and 6 h (P=0.01) after AM allocation of new herbage, whereas relative abundances of Kandleria were greater at 2 h (P=0.003) and 4 h (P<0.001) after PM allocation. The early post-allocation rise in ammonia concentrations in PM rumens occurred simultaneously with sharp increases in the relative abundance of Kandleria spp. and with a decline in the relative abundance of Prevotella. All measures of fermentation and most microbial community composition data showed highly dynamic changes in concentrations and genus abundances, respectively, with substantial temporal changes occurring within the first 8 h of allocating a new strip of herbage. The dynamic changes in the relative abundances of certain bacterial groups, in synchrony with a substantial diurnal variation in ammonia concentrations, have potential effects on the efficiency with which N is utilised by the grazing ruminant.
Chemical weed control remains a widely used component of integrated weed management strategies because of its cost-effectiveness and rapid removal of crop pests. Additionally, dicamba-plus-glyphosate mixtures are a commonly recommended herbicide combination to combat herbicide resistance, specifically in recently commercially released dicamba-tolerant soybean and cotton. However, increased spray drift concerns and antagonistic interactions require that the application process be optimized to maximize biological efficacy while minimizing environmental contamination potential. Field research was conducted in 2016, 2017, and 2018 across three locations (Mississippi, Nebraska, and North Dakota) for a total of six site-years. The objectives were to characterize the efficacy of a range of droplet sizes [150 µm (Fine) to 900 µm (Ultra Coarse)] using a dicamba-plus-glyphosate mixture and to create novel weed management recommendations utilizing pulse-width modulation (PWM) sprayer technology. Results across pooled site-years indicated that a droplet size of 395 µm (Coarse) maximized weed mortality from a dicamba-plus-glyphosate mixture at 94 L ha⁻¹. However, droplet size could be increased to 620 µm (Extremely Coarse) to maintain 90% of the maximum weed mortality while further mitigating particle drift potential. Although generalized droplet size recommendations could be created across site-years, optimum droplet sizes within each site-year varied considerably and may be dependent on weed species, geographic location, weather conditions, and herbicide resistance(s) present in the field. The precise, site-specific application of a dicamba-plus-glyphosate mixture using the results of this research will allow applicators to more effectively utilize PWM sprayers, reduce particle drift potential, maintain biological efficacy, and reduce the selection pressure for the evolution of herbicide-resistant weeds.
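One way to picture the optimisation step described above is to fit a simple dose-response curve to mortality as a function of droplet size and read off both the optimum and the largest size that keeps at least 90% of the maximum response. The quadratic fit and the numbers below are purely illustrative assumptions, not the study's model or data:

```python
# Illustrative droplet-size optimisation: fit a quadratic dose-response,
# locate the maximum, and find the largest size keeping >= 90% of that maximum.
import numpy as np

droplet_um = np.array([150, 300, 395, 500, 620, 750, 900], dtype=float)
mortality  = np.array([78, 88, 94, 92, 87, 80, 71], dtype=float)   # % control, made up

a, b, c = np.polyfit(droplet_um, mortality, deg=2)   # mortality ~ a*d^2 + b*d + c
d_opt = -b / (2 * a)                                 # vertex of the fitted parabola
m_max = np.polyval([a, b, c], d_opt)

roots = np.roots([a, b, c - 0.9 * m_max])            # sizes where mortality = 90% of max
d_90 = roots.max()                                   # the larger (drift-reducing) root
print(f"optimum ~ {d_opt:.0f} um; 90% of maximum maintained up to ~ {d_90:.0f} um")
```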
The importance of parasites as a selective force in host evolution is a topic of current interest. However, short-term ecological studies of host–parasite systems, on which such studies are usually based, provide only snapshots of what may be dynamic systems. We report here on four surveys, carried out over a period of 12 years, of helminths of spiny mice (Acomys dimidiatus), the numerically dominant rodents inhabiting dry montane wadis in the Sinai Peninsula. With host age (age-dependent effects on prevalence and abundance were prominent) and sex (female bias in abundance, in helminth diversity and in several taxa including Cestoda) taken into consideration, we focus on the relative importance of temporal and spatial effects on helminth infracommunities. We show that site of capture is the major determinant of prevalence and abundance of the species (and higher taxa) contributing to helminth community structure, the only exceptions being Streptopharagus spp. and Dentostomella kuntzi. We provide evidence that most species (notably the Spiruroidea, Protospirura muricola, Mastophorus muris and Gongylonema aegypti, but with exceptions among the Oxyuroidea, e.g. Syphacia minuta) show elements of temporal-site stability, with a rank order of measures among sites remaining similar over successive surveys. Hence, there are some elements of predictability in these systems.
Background: A will, power of attorney and advance healthcare directive are critical to guide decision-making in people with cognitive decline. We identified characteristics that are associated with the existence of these documents in patients who presented to a rural and remote memory clinic (RRMC). Methods: A total of 95 consecutive patients were included in this study. Patients and caregivers completed questionnaires on initial presentation to the RRMC and patients were asked whether they had these legal documents. Patients also completed neuropsychological testing. Statistical analysis (t-test and χ2 test) was performed to identify significant variables. Results: 70 patients had a will, 62 had a power of attorney and 21 had an advance healthcare directive. Having a will was associated with good quality of life (p=0.001), living alone (p=0.034), poor verbal fluency (p=0.055) and European ethnicity (p=0.028). Factors associated with having a power of attorney included good quality of life (p=0.031), living alone (p=0.053) and poor verbal fluency (p=0.015). Older age (p=0.015), poor verbal fluency (p=0.023) and severity of cognitive and functional impairment (p=0.023) were associated with having an advance healthcare directive. Conclusions: Our results indicate that poor quality of life, good verbal fluency, non-European ethnicity and living with others are associated with a lower likelihood of creating legal documents in patients with cognitive decline.
While parasite infection can have substantial fitness consequences in organisms, the predictors of parasite prevalence and intensity are often complex and vary depending on the host species. Here, we examined correlates of Haemoproteus (a common malaria parasite) prevalence and intensity in an opportunistically breeding songbird, the red crossbill (Loxia curvirostra). Specifically, we quantified Haemoproteus prevalence and intensity in crossbills caught in Grand Teton National Park from 2010 to 2013. We found that parasite prevalence varied seasonally and across years, with the highest number of infected individuals occurring in the summer (although there was variation across the summers sampled), and that prevalence was positively related to annual mean cone crop sizes (a measure of crossbill food abundance) and daily ambient temperature (a correlate of vector abundance). Parasite intensity was significantly and positively related to one measure of innate immunity, leucocyte counts per blood volume. Finally, crossbill age, ecomorph and sex had no significant effects on parasite infection intensity; however, parasite prevalence did vary significantly among ecomorphs and age classes. These results support the interpretation that a combination of physiological (specifically immune activity) and environmental factors affects parasite prevalence and infection intensity in this opportunistically breeding avian species.
Tonsillectomy is a common procedure with significant post-operative pain. This study was designed to compare post-operative pain, return to a normal diet and normal activity, and duration of regular analgesic use in Coblation and bipolar tonsillectomy patients.
A total of 137 patients, aged 2–50 years, presenting to a single institution for tonsillectomy or adenotonsillectomy were recruited. Pain level, diet, analgesic use, return to normal activity and haemorrhage data were collected.
Coblation tonsillectomy was associated with significantly less pain than bipolar tonsillectomy on post-operative days 1 (p = 0.005), 2 (p = 0.006) and 3 (p = 0.010). Mean pain scores were also significantly lower in the Coblation group (p = 0.039). Coblation patients had a significantly faster return to normal activity than bipolar tonsillectomy patients (p < 0.001).
Coblation tonsillectomy is a less painful technique compared to bipolar tonsillectomy in the immediate post-operative period and in the overall post-operative period. This allows a faster return to normal activity and decreased analgesic requirements.
Policy-makers and practitioners have a need to assess community resilience in disasters. Prior efforts conflated resilience with community functioning, combined resistance and recovery (the components of resilience), and relied on a static model for what is inherently a dynamic process. We sought to develop linked conceptual and computational models of community functioning and resilience after a disaster.
We developed a system dynamics computational model that predicts community functioning after a disaster. The computational model outputted the time course of community functioning before, during, and after a disaster, which was used to calculate resistance, recovery, and resilience for all US counties.
The conceptual model explicitly separated resilience from community functioning and identified all key components for each, which were translated into a system dynamics computational model with connections and feedbacks. The components were represented by publicly available measures at the county level. Baseline community functioning, resistance, recovery, and resilience evidenced a range of values and geographic clustering, consistent with hypotheses based on the disaster literature.
The work is transparent, motivates ongoing refinements, and identifies areas for improved measurements. After validation, such a model can be used to identify effective investments to enhance community resilience. (Disaster Med Public Health Preparedness. 2018;12:127–137)
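To make the separation between community functioning and resilience concrete, a toy system-dynamics-style sketch is shown below: it simulates a functioning time course through a disaster and then derives resistance (depth of the drop), recovery (time to regain near-baseline functioning), and resilience (functioning preserved over the whole window). The functional form and parameter values are illustrative assumptions, not the published county-level model:

```python
# Toy time course of community functioning with derived resistance,
# recovery, and resilience metrics (all parameter values are assumptions).
import numpy as np

def simulate_functioning(baseline=1.0, shock=0.4, t_event=10,
                         recovery_rate=0.08, t_max=120):
    """Community functioning before, during, and after a disaster."""
    f = np.full(t_max, baseline)
    f[t_event] = baseline - shock                 # immediate loss when the disaster hits
    for t in range(t_event + 1, t_max):           # gradual return toward baseline
        f[t] = f[t - 1] + recovery_rate * (baseline - f[t - 1])
    return f

baseline, t_event = 1.0, 10
f = simulate_functioning(baseline=baseline, t_event=t_event)

resistance = f.min() / baseline                   # share of functioning retained at the nadir
regained = np.where(f[t_event:] >= 0.95 * baseline)[0]
recovery_time = int(regained[0]) if regained.size else None   # steps to regain 95% of baseline
resilience = f.mean() / baseline                  # average functioning over the whole window

print(f"resistance={resistance:.2f}, recovery={recovery_time} steps, resilience={resilience:.2f}")
```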
Campylobacter spp. are a globally significant cause of gastroenteritis. Although rates of infection in Australia are among the highest in the industrialized world, studies describing campylobacteriosis incidence in Australia are lacking. Using national disease notification data between 1998 and 2013, we examined Campylobacter infections by gender, age group, season and state and territory. Negative binomial regression was used to estimate incidence rate ratios (IRRs), including trends by age group over time, with post-estimation commands used to obtain adjusted incidence rates. The incidence rate for males was significantly higher than for females [IRR 1·20, 95% confidence interval (CI) 1·18–1·21], while a distinct seasonality was demonstrated with higher rates in both spring (IRR 1·18, 95% CI 1·16–1·20) and summer (IRR 1·17, 95% CI 1·16–1·19). Examination of trends in age-specific incidence over time showed declines in incidence in those aged <40 years combined with contemporaneous increases in older age groups, notably those aged 70–79 years (IRR 1998–2013: 1·75, 95% CI 1·63–1·88). While crude rates continue to be highest in children, our findings suggest the age structure for campylobacteriosis in Australia is changing, carrying significant public health implications for older Australians.
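The IRR estimation described above can be sketched with a negative binomial GLM that uses the population at risk as an offset, so that exponentiated coefficients are incidence rate ratios. The data frame, column names, and counts below are simulated placeholders, not the national notification data:

```python
# Negative binomial regression for campylobacteriosis IRRs (simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = [(sex, season, year)
        for year in (2012, 2013)
        for season in ("summer", "autumn", "winter", "spring")
        for sex in ("male", "female")]
df = pd.DataFrame(rows, columns=["sex", "season", "year"])
df["population"] = 1.2e6                                   # person-years at risk (assumed)
df["cases"] = rng.poisson(300, len(df)) + np.where(df["sex"] == "male", 60, 0)

model = smf.glm(
    "cases ~ C(sex, Treatment('female')) + C(season, Treatment('winter'))",
    data=df,
    family=sm.families.NegativeBinomial(),
    offset=np.log(df["population"]),
).fit()

print(np.exp(model.params))   # incidence rate ratios vs. the female and winter reference levels
```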
The chromosphere is a complex region that acts as an intermediary between the magnetic flux emergence in the photosphere and the magnetic features seen in the corona. Large eruptions of flares and filaments in the chromosphere are often accompanied by ejections of coronal mass from the Sun. Several studies have observed fast-moving progressive trains of compact bright points (called Sequential Chromospheric Brightenings or SCBs) streaming away from chromospheric flares that also produce a coronal mass ejection (CME). In this work, we review studies of SCBs and search for commonalities between them. We place these findings into a larger context with contemporary chromospheric and coronal observations. SCBs are fleeting indicators of the solar atmospheric environment as it existed before their associated eruption. Since they appear at the very outset of a flare eruption, SCBs are a good early indication, measured in the chromosphere, of a CME.
A brief summary is presented of our current knowledge of the structure of cold molecular cloud cores that do not contain protostars, sometimes known as starless cores. The most centrally condensed starless cores are known as pre-stellar cores. These cores probably represent, observationally, the initial conditions for protostellar collapse that must be input into all models of star formation. The current debate over the nature of core density profiles is summarised. A cautionary note is sounded over the use of such profiles to ascertain the equilibrium status of cores. The magnetic field structure of pre-stellar cores is also briefly discussed.