The aim of this study was to test the hypotheses that differences in residual feed intake (RFI) of beef steers are related to diet sorting, diet nutrient composition, energy intake and apparent digestibility. To phenotype steers for RFI, 69 weaned Angus × Hereford steers were fed individually for 56 days. A finishing diet was fed twice daily on an ad libitum basis to maintain approximately 0.5 to 1.0 kg refusals. Diet offered and refused was measured daily, and DM intakes (DMI) were calculated by difference. Body weights were recorded at 14-day intervals following an 18-h solid feed withdrawal. Residual feed intake was determined as the residual of the regression of DMI on mid-test metabolic BW (BW0.75) and average daily gain (ADG). Particle size distributions of diet and refusals were determined using the Penn State Particle Separator to quantify diet sorting. Sampling of diet, refusals and feces was repeated in four sampling periods, which occurred during weeks 2, 4, 6 and 8 of the study. Particle size distributions of refusals and diet were analyzed in weeks 2, 4 and 6, and sampling for chemical analysis of refusals and feces occurred in all four periods. Indigestible neutral detergent fiber (288 h in situ) was used as an internal marker of apparent digestibility. We conclude that preferential intake of particles > 19 mm and of particles 4 to 8 mm was negatively correlated with RFI and ADG, respectively. Although steers did sort to consume a different diet composition than offered, diet sorting did not affect energy intake, digestible energy or DM digestibility.
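The RFI calculation described above (the residual from regressing DMI on mid-test metabolic body weight and ADG) can be sketched numerically. This is an illustrative outline only; the function name and all data values below are invented for demonstration and are not from the study:

```python
import numpy as np

def residual_feed_intake(dmi, mid_bw, adg):
    """Return per-animal RFI: the residual of an ordinary least-squares fit
    of DMI on metabolic body weight (BW**0.75) and average daily gain."""
    # Design matrix: intercept, metabolic BW, ADG
    X = np.column_stack([np.ones_like(dmi), mid_bw ** 0.75, adg])
    beta, *_ = np.linalg.lstsq(X, dmi, rcond=None)
    return dmi - X @ beta  # residuals = observed intake - predicted intake

# Made-up example values (DMI in kg/day, BW in kg, ADG in kg/day)
dmi = np.array([9.1, 10.4, 8.7, 11.2, 9.8])
bw = np.array([380.0, 410.0, 365.0, 430.0, 395.0])
adg = np.array([1.4, 1.6, 1.3, 1.7, 1.5])
rfi = residual_feed_intake(dmi, bw, adg)
# With an intercept in the model, the residuals sum to ~0 by construction;
# a negative RFI indicates an animal eating less than predicted (more efficient).
```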
A wide margin of crop safety is a desirable trait of POST herbicides, and investigation of crop tolerance is a key step in evaluation of new herbicides. Six field experiments were conducted in Ontario, Canada, from 2017 to 2018 to examine the influence of corn (Zea mays L.) hybrid (DKC42-60RIB, DKC43-47RIB, P0094AM, and P9840AM), application rate (1X and 2X), and application timing (PRE, V1, V3, and V5) on the tolerance of field corn to tolpyralate, a new 4-hydroxyphenyl pyruvate dioxygenase inhibitor, co-applied with atrazine. Two corn hybrids (DKC42-60RIB and DKC43-47RIB) exhibited slightly greater visible injury from tolpyralate + atrazine, applied POST, than P0094AM and P9840AM at 1 to 2 wk after application (WAA); hybrids responded similarly with respect to height, grain moisture, and yield. Applications of tolpyralate + atrazine at a 2X rate (80 + 2,000 g ai ha−1) induced greater injury (≤31.6%) than the field rate (40 + 1,000 g ha−1) (≤11.6%); the 2X rate applied at V1 or V3 decreased corn height and slightly increased grain moisture at harvest. On average, field rates resulted in marginally higher grain yields than 2X rates. Based on mixed-model multiple stepwise regression analysis, the air temperature at application, time of day, temperature range in the 24 h before application, and precipitation following application were useful predictor variables in estimating crop injury with tolpyralate + atrazine; however, additional environmental variables also affected crop injury. These results demonstrate the margin of corn tolerance with tolpyralate + atrazine, which provides a basis for optimization of application timing, rate, and corn hybrid selection to mitigate the risk of crop injury with this herbicide tank mixture.
Apolipoprotein E (APOE) E4 is the main genetic risk factor for Alzheimer’s disease (AD). Due to the consistent association, there is interest as to whether E4 influences the risk of other neurodegenerative diseases. Further, there is a constant search for other genetic biomarkers contributing to these phenotypes, such as microtubule-associated protein tau (MAPT) haplotypes. Here, participants from the Ontario Neurodegenerative Disease Research Initiative were genotyped to investigate whether the APOE E4 allele or MAPT H1 haplotype are associated with five neurodegenerative diseases: (1) AD and mild cognitive impairment (MCI), (2) amyotrophic lateral sclerosis, (3) frontotemporal dementia (FTD), (4) Parkinson’s disease, and (5) vascular cognitive impairment.
Genotypes were defined for their respective APOE allele and MAPT haplotype calls for each participant, and logistic regression analyses were performed to identify the associations with the presentations of neurodegenerative diseases.
Our work confirmed the association of the E4 allele with a dose-dependent increased presentation of AD, and an association between the E4 allele alone and MCI; however, the other four diseases were not associated with E4. Further, the APOE E2 allele was associated with decreased presentation of both AD and MCI. No associations were identified between MAPT haplotype and the neurodegenerative disease cohorts; however, following subtyping of the FTD cohort, the H1 haplotype was significantly associated with progressive supranuclear palsy.
This is the first study to concurrently analyze the association of APOE isoforms and MAPT haplotypes with five neurodegenerative diseases using consistent enrollment criteria and broad phenotypic analysis.
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13) suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
In benefit-cost analysis, fatality risk reductions are usually valued based on estimates of adults’ willingness to pay for changes in their own risks, regardless of whether the risk reduction accrues to adults or children. This approach reflects the relatively large number of valuation studies that address adults; however, the literature on children is growing. We review these studies, focusing on those that estimate values for both adults and children using a consistent approach to limit the effects of between-study variability. We rely on explicit selection criteria to identify studies that measure reasonably comparable outcomes and are candidates for application to analyses of U.S. policies. The ratio of values for children to values for adults ranges from 0.6 to 2.9; however, most estimates are greater than 1.5. Although some studies suggest that the divergence between child and adult values decreases as the child ages, this finding is not universal. We conclude that analysts should test the sensitivity of their results to the use of higher values for children than adults. Additional empirical research is needed to support more precise estimates of the variation in values by age that can be featured in the primary analysis.
Effective POST herbicides and herbicide mixtures are key components of integrated weed management in corn; however, herbicides vary in their efficacy based on application timing. Six field experiments were conducted over 2 yr (2017–2018) in southwestern Ontario, Canada, to determine the effects of herbicide application timing and rate on the efficacy of tolpyralate, a new 4-hydroxyphenyl pyruvate dioxygenase inhibitor. Tolpyralate at 15, 30, or 40 g ai ha−1 in combination with atrazine at 500 or 1,000 g ai ha−1 was applied PRE, early POST, mid-POST, or late POST. Tolpyralate + atrazine at rates ≥30 + 1,000 g ha−1 provided equivalent control of common lambsquarters and Powell amaranth applied PRE or POST, whereas no rate applied PRE controlled common ragweed, velvetleaf, barnyardgrass, or green foxtail. Common ragweed, common lambsquarters, velvetleaf, and Powell amaranth were controlled equally regardless of POST timing. In contrast, control of barnyardgrass and green foxtail declined when herbicide application was delayed to the late-POST timing, irrespective of herbicide rate. Similarly, corn grain yield declined within each tolpyralate + atrazine rate when herbicide applications were delayed to late-POST timing. Overall, the results of this study indicate that several monocot and dicot weed species can be controlled with tolpyralate + atrazine with an early to mid-POST herbicide application timing, before weeds reach 30 cm in height, and Powell amaranth and common lambsquarters can also be controlled PRE. Additionally, this study provides further evidence highlighting the importance of effective, early-season weed control in corn.
To describe snacking characteristics and patterns in children and examine associations with diet quality and BMI.
Children’s weight and height were measured. Participants/adult proxies completed multiple 24 h dietary recalls. Snack occasions were self-identified. Snack patterns were derived for each sample using exploratory factor analysis. Associations of snacking characteristics and patterns with Healthy Eating Index-2010 (HEI-2010) score and BMI were examined using multivariable linear regression models.
Childhood Obesity Prevention and Treatment Research (COPTR) Consortium, USA: NET-Works, GROW, GOALS and IMPACT studies.
Two snack patterns were derived for three studies: a meal-like pattern and a beverage pattern. The IMPACT study had a similar meal-like pattern and a dairy/grains pattern. A positive association was observed between meal-like pattern adherence and HEI-2010 score (P for trend < 0⋅01) and snack occasion frequency and HEI-2010 score (β coefficient (95 % CI): NET-Works, 0⋅14 (0⋅04, 0⋅23); GROW, 0⋅12 (0⋅02, 0⋅21)) among younger children. A preference for snacking while using a screen was inversely associated with HEI-2010 score in all studies except IMPACT (β coefficient (95 % CI): NET-Works, −3⋅15 (−5⋅37, −0⋅92); GROW, −2⋅44 (−4⋅27, −0⋅61); GOALS, −5⋅80 (−8⋅74, −2⋅86)). Associations with BMI were almost all null.
Meal-like and beverage patterns described most children’s snack intake, although patterns for non-Hispanic Blacks or adolescents may differ. Diets of 2–5-year-olds may benefit from frequent meal-like pattern snack consumption and diets of all children may benefit from decreasing screen use during eating occasions.
Working memory (WM) deficits are seen as a core deficit in schizophrenia, implicated in the broad cognitive impairment seen in the illness. Here we examine the impact of WM storage of a single item on the operation of other cognitive systems.
We studied 37 healthy controls (HCS) and 43 people with schizophrenia (PSZ). Each trial consisted of a sequence of two potential target stimuli, T1 and T2. T1 was a letter presented for 100 ms. After delays of 100–800 ms, T2 was presented. T2 was a 1 or a 2 and required a speeded response. In one condition, subjects were instructed to ignore T1 but respond to T2. In another condition, they were required to report T1 after making their speeded response to T2 (i.e. to make a speeded T2 response while holding T1 in WM).
PSZ were dramatically slowed at responding to T2 when T1 was held in WM. A repeated measures ANOVA yielded main effects of group, delay, and condition with a group by condition interaction (p's < 0.001). Across delays, the slowing of the T2 response when required to hold T1 in memory, relative to ignoring T1, was nearly 3 times higher in PSZ than HCS (633 v. 219 ms).
Whereas previous studies have focused on reduced storage capacity, the present study found that PSZ are impaired at performing tasks while they are successfully maintaining a single item in WM. This may play a role in the broad cognitive impairment seen in PSZ.
Introduction: Emergency department (ED) staff carry a high risk for the burnout syndrome of increased emotional exhaustion, depersonalization and decreased personal accomplishment. Previous research has shown that task-oriented coping skills are associated with lower levels of burnout than emotion-oriented coping. ED staff at one hospital participated in an intervention to teach task-oriented coping skills. We hypothesized that the intervention would alter staff coping behaviors and ultimately reduce burnout. Methods: ED physicians, nurses and support staff at two regional hospitals were surveyed using the Maslach Burnout Inventory (MBI) and the Coping Inventory for Stressful Situations (CISS). Surveys were administered before and after the implementation of communication and conflict resolution skills training at the intervention facility (I), consisting of a one-day course and a small-group refresher 6 to 15 months later. Descriptive statistics and multivariate analysis assessed differences in staff burnout and coping styles compared with the control facility (C) and over time. Results: 85/143 (I) and 42/110 (C) ED staff responded to the initial survey. Post-intervention, 46 (I) and 23 (C) responded. During the two-year study period there was no statistically significant difference in CISS or MBI scores between hospitals (CISS: Pillai's trace = .02, F(3,63) = .47, p = .71, partial η2 = .02; MBI: Pillai's trace = .01, F(3,63) = .11, p = .95, partial η2 = .01) or between pre- and post-intervention groups (CISS: Pillai's trace = .01, F(3,63) = .22, p = .88, partial η2 = .01; MBI: Pillai's trace = .09, F(3,63) = 2.15, p = .10, partial η2 = .01). Conclusion: We were not able to measure improvement in coping or burnout among ED staff receiving the communication skills intervention over the two-year period. Burnout is a multifactorial problem, and environmental rather than individual factors may be more important to address. Alternatively, demonstrating a measurable effect on burnout may require more robust or inclusive interventions.
Introduction: In Nova Scotia, under the Paramedics Providing Palliative Care program, paramedics can now manage symptom crises in patients with palliative care goals, often at home and without the need to transport to hospital. Growing recognition that non-cancer conditions benefit from a palliative approach is expanding the program. Our team previously found that treatment of pain and breathlessness is not optimized, that pain scores are underutilized, and that paramedics were more comfortable (pre-launch) with a palliative approach in cancer versus non-cancer conditions. Our objective was to compare symptom management in the cancer versus non-cancer subgroups. Methods: We conducted a retrospective cohort study. The Electronic Patient Care Record and Special Patient Program were queried for patients with palliative goals from July 1, 2015 to July 1, 2016. Descriptive analysis was conducted and results were compared with a t-test and Bonferroni correction (alpha = 0.007). Results: There were 1909 unique patients: 765/1909 (40.1%) cancer and 1144/1909 (59.9%) non-cancer. Female sex: cancer 357/765 (46.7%), non-cancer 538/1144 (47.0%). Mean age: cancer 73.3 (11.65), non-cancer 77.7 (12.80). Top non-cancer conditions: COPD (495/1144, 43.3%), CHF (322/1144, 28.1%), stroke (172/1144, 15.0%) and dementia (149/1144, 13.0%). Comorbidities (range): cancer 0 to 3; non-cancer 0 to 5. The most common chief complaint (CC) for both cancer and non-cancer patients was respiratory distress (10.8% vs 21.5%). Overall, there was no difference in the proportion treated, cancer vs non-cancer: 11.5% vs 10.1%, p = 0.35. Individual therapies differed: morphine 83/765 (10.8%) vs 55/1144 (4.8%), p < 0.001; hydromorphone 9/765 (1.2%) vs 2/1144 (0.2%), p = 0.014; salbutamol 38/765 (5.0%) vs 5/1144 (0.4%), p < 0.001; and ipratropium 27/765 (3.5%) vs 134/1144 (11.7%), p < 0.001; in addition to any support with home medication, which is not queryable. Pre-treatment pain scores were documented more often than post-treatment scores in both groups (58.7% vs 25.6%, p < 0.001; 57.4% vs 26.9%, p < 0.001). Conclusion: Non-cancer patients represent an important proportion of palliative care calls for paramedics. Cancer and non-cancer patients had very similar CCs and received similar treatment, although in low proportions, despite pre-launch findings that non-cancer conditions were likely to be undertreated. Pain scores remain underutilized. Further research into the underlying reason(s) is required to improve the support of non-cancer patients by paramedics.
Transgenic crops are being developed with herbicide resistance traits to expand innovative weed management solutions for crop producers. Soybean with traits that confer resistance to the hydroxyphenylpyruvate dioxygenase herbicide isoxaflutole is under development and will provide a novel herbicide mode of action for weed management in soybean. Ten field experiments were conducted over 2 years (2017 and 2018) on five soil textures with isoxaflutole-resistant soybean to evaluate annual weed control using one- and two-pass herbicide programs. The one-pass weed control programs included isoxaflutole plus metribuzin, applied PRE, at a low rate (52.5 + 210 g ai ha−1), medium rate (79 + 316 g ai ha−1), and high rate (105 + 420 g ai ha−1); and glyphosate applied early postemergence (EPOST) or late postemergence (LPOST). The two-pass weed control programs included isoxaflutole plus metribuzin, applied PRE, followed by glyphosate applied LPOST, and glyphosate applied EPOST followed by LPOST. At 4 weeks after the LPOST application, control of common lambsquarters, pigweed species, common ragweed, and velvetleaf was variable at 25% to 69%, 49% to 86%, and 71% to 95% at the low, medium, and high rates of isoxaflutole plus metribuzin, respectively. Isoxaflutole plus metribuzin at the low, medium, and high rates controlled grass species evaluated (i.e., barnyardgrass, foxtail, crabgrass, and witchgrass) 85% to 97%, 75% to 99%, and 86% to 100%, respectively. All two-pass weed management programs provided 98% to 100% control of all species. Weed control improved as the rate of isoxaflutole plus metribuzin increased. Two-pass programs provided excellent, full-season annual grass and broadleaf weed control in isoxaflutole-resistant soybean.
The second-order nonlinear optical (NLO) properties of two different ionic self-assembled multilayer (ISAM) films combined with Ag nanoparticles have been investigated. The plasmon resonances in the Ag particles concentrate the incident light, markedly increasing the NLO efficiencies of the films. We find that the efficiency enhancement is significantly larger in conventional ISAM films than in films made using a hybrid covalent ISAM technique (HCISAM), even though the intrinsic bulk second-order nonlinear susceptibility (χ(2)) is much larger for HCISAM films. We attribute this to the interfaces in HCISAM films being much easier to disrupt by external perturbations, such as the metal deposition by which the nanoparticles are fabricated. We conclude that because the plasmon decay length is very short, the plasmonic enhancement of NLO effects occurs primarily at and near the film-particle interface. To discern the importance of the interfaces, we surrounded thin ISAM and HCISAM films with NLO-inactive buffer layers, which confirmed this hypothesis, particularly in the case of HCISAM films.
Horseweed biotypes resistant to glyphosate and ALS-inhibiting herbicides are becoming more prevalent in Canada and the United States and present a significant management challenge in field crops. Tolpyralate is a recently commercialized herbicide for use in corn that inhibits 4-hydroxyphenylpyruvate dioxygenase (HPPD), and there is little information regarding its efficacy on horseweed. Six field experiments were conducted in 2017 and 2018 at four locations in Ontario, Canada, to determine the biologically effective dose of tolpyralate and tolpyralate + atrazine and to compare label rates of tolpyralate and tolpyralate + atrazine to currently accepted herbicide standards for POST control of glyphosate- and cloransulam-methyl-resistant (MR) horseweed. At 8 wk after application (WAA), tolpyralate at 4.8 and 22.6 g ha–1 provided 50% and 80% control, respectively. When applied with atrazine at a 1:33.3 tank-mix ratio, 22.3 + 741.7 g ha–1 provided 95% control of MR horseweed. The addition of atrazine to tolpyralate at label rates improved control of MR horseweed to 98%, which was similar to the control provided by dicamba:atrazine and bromoxynil + atrazine. The results of this study indicate that tolpyralate + atrazine provides excellent control of MR horseweed POST in corn.
Recent infection testing algorithms (RITA) for HIV combine serological assays with epidemiological data to determine likely recent infections, indicators of ongoing transmission. In 2016, we integrated RITA into national HIV surveillance in Ireland to better inform HIV prevention interventions. We determined the avidity index (AI) of new HIV diagnoses and linked the results with data captured in the national infectious disease reporting system. RITA classified a diagnosis as recent based on an AI < 1.5, unless epidemiological criteria (CD4 count <200 cells/mm3; viral load <400 copies/ml; the presence of AIDS-defining illness; prior antiretroviral therapy use) indicated a potential false-recent result. Of 508 diagnoses in 2016, we linked 448 (88.1%) to an avidity test result. RITA classified 12.5% of diagnoses as recent, with the highest proportion (26.3%) amongst people who inject drugs. On multivariable logistic regression recent infection was more likely with a concurrent sexually transmitted infection (aOR 2.59; 95% CI 1.04–6.45). Data were incomplete for at least one RITA criterion in 48% of cases. The study demonstrated the feasibility of integrating RITA into routine surveillance and showed some ongoing HIV transmission. To improve the interpretation of RITA, further efforts are required to improve completeness of the required epidemiological data.
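The RITA decision rule described above (recent if AI < 1.5, unless an epidemiological criterion flags a potential false-recent result) can be expressed as a small classifier. This is a hedged sketch: the function name, parameter names, and the handling of missing values are illustrative assumptions, not the surveillance system's actual implementation:

```python
def rita_classify(ai, cd4=None, viral_load=None,
                  aids_defining_illness=False, prior_art=False):
    """Classify an HIV diagnosis as 'recent' or 'long-standing' following
    the RITA criteria described in the abstract."""
    # Any of these epidemiological criteria indicates a potential
    # false-recent result, overriding a low avidity index.
    false_recent = (
        (cd4 is not None and cd4 < 200)                   # CD4 < 200 cells/mm3
        or (viral_load is not None and viral_load < 400)  # VL < 400 copies/ml
        or aids_defining_illness                          # AIDS-defining illness
        or prior_art                                      # prior antiretroviral use
    )
    if ai < 1.5 and not false_recent:
        return "recent"
    return "long-standing"
```

Note that the abstract reports incomplete data for at least one criterion in 48% of cases; the sketch treats a missing criterion as not met, which is one possible (conservative-toward-recency) convention.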
OBJECTIVES/SPECIFIC AIMS: The Life’s Simple 7 (LS7) metric was created by the American Heart Association with the goal of educating the public on seven modifiable factors that contribute to heart health. While it is well documented that these ideal health behaviors lower risk of cardiovascular disease (CVD) in the general population, the association between the LS7 ideal health metrics and end stage renal disease (ESRD) risk has not been examined in a lower socioeconomic population at high risk for both ESRD and CVD. Our objective is to examine the association between the LS7 score and incident ESRD in a cohort of white and black men and women in the southeastern US, where rates of CVD and ESRD are high. METHODS/STUDY POPULATION: The Southern Community Cohort Study recruited ~86,000 low-income blacks and whites in the southeastern US (2002-2009). Utilizing a nested case-control design, our analysis included 1628 incident cases of ESRD identified via linkage of the cohort with the United States Renal Data System (USRDS) from January 1, 2002 to March 31, 2015. Controls (n = 4884) were individually matched 3:1 with ESRD cases based on age, sex, and race. Demographic, medical, and lifestyle information were obtained via baseline questionnaire. The AHA definitions for ideal health were used for non-smoking (never or quit >12 months), body mass index (BMI<25kg/m2) and physical activity (>75 min/week of vigorous physical activity or >150min/week of moderate/vigorous activity). Modified definitions were used for consuming a healthy diet [Healthy Eating Index (HEI10) score>70] and for blood pressure, fasting plasma glucose, and total cholesterol, based on self-reported no history of diagnosis of hypertension, diabetes, and hypercholesterolemia, respectively. The number of ideal health parameters were summed to generate the LS7 score, which ranged from 0-7 with higher scores indicating more ideal health. 
Adjusted odds ratios (95% confidence intervals) for incident ESRD associated with LS7 score were calculated using conditional logistic regression models, adjusting for income and education. The SCCS ESRD case-cohort dataset will be available by TS 2019 and analyses will be completed to adjust for baseline estimated glomerular filtration rate (eGFR) as a marker of kidney function and to examine whether eGFR modifies the relationship between LS7 and incident ESRD. RESULTS/ANTICIPATED RESULTS: At baseline, mean age was 54 years, 55% (3600) of participants were women, and 87% (5656) were black. A total of 58% (943) of ESRD cases were non-smokers compared to 54% (2633) of controls. ESRD cases had higher prevalence of BMI>25 kg/m2 (81% vs. 74%), hypertension (84% vs. 59%), hypercholesterolemia (48% vs. 34%), and diabetes (66% vs. 22%) compared to controls. A total of 18% (839) of controls and 12% (194) of ESRD cases met ideal exercise recommendations, and 20% of either cases (302) or controls (916) had a HEI10 score above 70. The median LS7 score for controls and ESRD cases was 3 and 2, respectively, and 17% (983) of participants had a low score (0-1) while 2% (105) met 6 or 7 ideal health metrics. Higher LS7 score was associated with lower odds of ESRD (P-trend<0.001). Participants with LS7 score >3 (above median) had 75% reduced odds of ESRD (OR 0.25; 95% CI 0.22, 0.29) compared to those with a score of 2 or less. DISCUSSION/SIGNIFICANCE OF IMPACT: In the SCCS population, the presence of any 3 or more ideal health behaviors is associated with reduced odds of developing ESRD. The components of the LS7 represent important modifiable risk factors that may be targets for future interventions driven by the patient. The attributable risk due to each factor is needed to dissect which ideal behaviors are the most beneficial.
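The LS7 scoring described above (one point per ideal health parameter, summed to a 0–7 score) can be sketched as a simple scoring function. The thresholds follow the abstract's stated (partly modified) definitions; the dictionary keys and defaults are illustrative assumptions only:

```python
def ls7_score(p):
    """Sum the number of ideal Life's Simple 7 parameters met (0-7),
    using the (modified) definitions given in the abstract."""
    ideal = [
        p.get("nonsmoker", False),                  # never smoked or quit >12 months
        p.get("bmi", 99) < 25,                      # BMI < 25 kg/m2
        p.get("vigorous_min_wk", 0) > 75            # >75 min/week vigorous, or
            or p.get("mod_vig_min_wk", 0) > 150,    # >150 min/week moderate/vigorous
        p.get("hei10", 0) > 70,                     # Healthy Eating Index score > 70
        not p.get("hypertension", True),            # no reported hypertension dx
        not p.get("diabetes", True),                # no reported diabetes dx
        not p.get("hypercholesterolemia", True),    # no reported high-cholesterol dx
    ]
    return sum(ideal)
```

Higher scores indicate more ideal health; the abstract reports that a score above the median (>3) was associated with markedly lower odds of ESRD.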
In the behavioral sciences, it is common to explain behavior in terms of what was learned in a task, as if any subsequent change in performance had to denote a change in learning. However, learning alone cannot account for variability in performance. Instead, incentive motivation plays a direct role, and is more effective than the learning process, in controlling moment-to-moment changes in an individual's responses. After briefly introducing the history of the study of incentive motivation, we explain that incentive motivation consists of a dopamine-dependent process that does not require consciousness to influence responding to a task. We analyze two Pavlovian situations in which incentive motivation can modulate performance, irrespective of additional learning: the instant transformation of disgust into attraction for salt and the invigoration of responses under reward uncertainty. Finally, we consider drug addiction as an example of motivational dysregulation rather than as a consequence of the habit to consume substances of abuse.
Mood and anxiety disorders are ubiquitous but current treatment options are ineffective for many sufferers. Moreover, a number of promising pre-clinical interventions have failed to translate into clinical efficacy in humans. Improved treatments are unlikely without better animal–human translational pipelines. Here, we translate a rodent measure of negative affective bias into humans, exploring its relationship with (1) pathological mood and anxiety symptoms and (2) transient induced anxiety.
Adult participants (age = 29 ± 11) who met criteria for mood or anxiety disorder symptomatology according to a face-to-face neuropsychiatric interview were included in the symptomatic group. Study 1 included N = 77 (47 = asymptomatic [female = 21]; 30 = symptomatic [female = 25]); Study 2 included N = 47 asymptomatic participants (25 = female). Outcome measures were choice ratios, reaction times and parameters recovered from a computational model of reaction time – the drift diffusion model (DDM) – from a two-alternative-forced-choice task in which ambiguous and unambiguous auditory stimuli were paired with high and low rewards.
Both groups showed over 93% accuracy on unambiguous tones indicating intact discrimination, but symptomatic individuals demonstrated increased negative affective bias on ambiguous tones [proportion high reward = 0.42 (s.d. = 0.14)] relative to asymptomatic individuals [0.53 (s.d. = 0.17)] as well as a significantly reduced DDM drift rate. No significant effects were observed for the within-subjects anxiety-induction.
Humans with pathological anxiety symptoms directly mimic rodents undergoing anxiogenic manipulation. The lack of sensitivity to transient anxiety suggests the paradigm might be more sensitive to clinically relevant symptoms. Our results establish a direct translational pipeline (and candidate therapeutics screen) from negative affective bias in rodents to pathological mood and anxiety symptoms in humans.
Our aim was to investigate patterns of change in public knowledge, attitudes, desire for social distance and reporting having contact with people with mental health problems in England during the Time to Change (TTC) programme to reduce stigma and discrimination 2009–2017.
Using data from an annual face-to-face survey of a nationally representative quota sample of adults, we evaluated longitudinal trends of the outcome measures with regression analyses and made assumptions on the basis of a simple random sample. We tested interactions between year and demographic subgroups.
There were improvements in all outcomes in 2017 compared with baseline measures (2008 or 2009). Reported in s.d. units [95% confidence interval (CI)], the improvement for knowledge was 0.17 (0.10–0.23); for attitudes 0.25 (0.18–0.31); and for social distance 0.29 (0.23–0.35). A higher likelihood of reporting contact was also associated with most recent survey year (odds ratio 1.47, 95% CI 1.27–1.71). Statistically significant interactions between year and region of England suggest greatest improvements in attitudes and intended behaviour in London, where both outcomes were significantly worse in the early years of the survey. However, for attitudes, this interaction was only significant among women. Other significant interactions suggest that attitudes improved most in the target age group (25–44).
The results provide support for the effectiveness of TTC across demographic groups. However, other societal changes may influence the results, such as the increasing prevalence of common mental disorder in young women.