This study compared the level of education and tests from multiple cognitive domains as proxies for cognitive reserve.
The participants were educationally, ethnically, and cognitively diverse older adults enrolled in a longitudinal aging study. We examined independent and interactive effects of education, baseline cognitive scores, and MRI measures of cortical gray matter change on longitudinal cognitive change.
Baseline episodic memory was related to cognitive decline independent of brain and demographic variables and moderated (weakened) the impact of gray matter change. Education moderated (strengthened) the gray matter change effect. Non-memory cognitive measures did not incrementally explain cognitive decline or moderate gray matter change effects.
Episodic memory showed strong construct validity as a measure of cognitive reserve. Education effects on cognitive decline were dependent upon the rate of atrophy, indicating education effectively measures cognitive reserve only when atrophy rate is low. Results indicate that episodic memory has clinical utility as a predictor of future cognitive decline and better represents the neural basis of cognitive reserve than other cognitive abilities or static proxies like education.
During the last fifteen years there has been a paradigm shift in the continuum modelling of granular materials; most notably with the development of rheological models, such as the $\mu (I)$-rheology (where $\mu$ is the friction and I is the inertial number), but also with significant advances in theories for particle segregation. This paper details theoretical and numerical frameworks (based on OpenFOAM) which unify these currently disconnected endeavours. Coupling the segregation with the flow, and vice versa, is not only vital for a complete theory of granular materials, but is also beneficial for developing numerical methods to handle evolving free surfaces. This general approach is based on the partially regularized incompressible $\mu (I)$-rheology, which is coupled to the gravity-driven segregation theory of Gray & Ancey (J. Fluid Mech., vol. 678, 2011, pp. 535–588). These advection–diffusion–segregation equations describe the evolving concentrations of the constituents, which then couple back to the variable viscosity in the incompressible Navier–Stokes equations. A novel feature of this approach is that any number of differently sized phases may be included, which may have disparate frictional properties. Further inclusion of an excess air phase, which segregates away from the granular material, then allows the complex evolution of the free surface to be captured simultaneously. Three primary coupling mechanisms are identified: (i) advection of the particle concentrations by the bulk velocity, (ii) feedback of the particle-size and/or frictional properties on the bulk flow field and (iii) influence of the shear rate, pressure, gravity, particle size and particle-size ratio on the locally evolving segregation and diffusion rates. The numerical method is extensively tested in one-way coupled computations, before the fully coupled model is compared with the discrete element method simulations of Tripathi & Khakhar (Phys. Fluids, vol. 23, 2011, 113302) and used to compute the petal-like segregation pattern that spontaneously develops in a square rotating drum.
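For readers unfamiliar with the $\mu (I)$-rheology named above, a minimal sketch of the standard friction law is given below. The functional form is the widely used one from the granular-rheology literature; the parameter values ($\mu_s$, $\mu_2$, $I_0$) are typical literature values for glass beads, not values taken from this paper.

```python
import math

# Typical literature parameter values for the mu(I) law (assumed, not
# taken from this paper).
MU_S = 0.38   # quasi-static friction coefficient
MU_2 = 0.64   # limiting dynamic friction coefficient
I_0 = 0.279   # rheological constant

def inertial_number(shear_rate, pressure, grain_diameter, grain_density):
    """Inertial number I = gamma_dot * d / sqrt(p / rho_s)."""
    return shear_rate * grain_diameter / math.sqrt(pressure / grain_density)

def mu_of_I(I):
    """Friction law mu(I) = mu_s + (mu_2 - mu_s) / (1 + I_0 / I)."""
    if I == 0.0:
        return MU_S  # quasi-static limit
    return MU_S + (MU_2 - MU_S) / (1.0 + I_0 / I)

# Example: slow shearing of 1 mm grains under 1 kPa confining pressure.
I = inertial_number(shear_rate=10.0, pressure=1000.0,
                    grain_diameter=1e-3, grain_density=2500.0)
mu = mu_of_I(I)  # friction lies between MU_S and MU_2
```

In a flow solver the effective viscosity then follows as $\eta = \mu(I)\,p/\dot{\gamma}$, which is the quantity that couples back into the Navier–Stokes equations; the paper's "partially regularized" variant additionally bounds this viscosity to keep the equations well posed.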
Submesoscale processes along coastal boundaries provide a potential mechanism for the dissipation of mesoscale kinetic energy in the ocean. Since these processes occur on scales not generally resolved by global ocean models, a physically motivated parametrisation is required to accurately describe their effects. Submesoscale dynamics is characterised by strong turbulent mixing, nonlinearity and topographic effects; all of which significantly modify the flow. A major component of the submesoscale boundary response to mesoscale forcing is the Kelvin – or coastally trapped – wave field, which has been shown to transport energy over large distances. This paper thus examines the influence of vertical mixing, nonlinearity and steep-slope topography on baroclinic Kelvin waves with the aim of assessing the importance of these effects. We consider the limit of a steep coastal boundary, weak mixing and weak nonlinearity and perform an asymptotic analysis to determine the modification of the classical Kelvin wave solution by these effects. Linear and nonlinear solutions are given and different mixing limits are discussed and compared with previous work. We find that vertical mixing acts to damp slowly propagating Kelvin waves while nonlinearity can cause wave breaking which may be important for fast waves. Steep-slope topography acts to modify the wave speed and structure consistent with previous work.
Cognitive behavior therapy (CBT) is effective for most patients with social anxiety disorder (SAD), but a substantial proportion fails to remit. Experimental and clinical research suggests that enhancing CBT using imagery-based techniques could improve outcomes. It was hypothesized that imagery-enhanced CBT (IE-CBT) would be superior to verbally-based CBT (VB-CBT) on pre-registered outcomes.
A randomized controlled trial of IE-CBT v. VB-CBT for social anxiety was completed in a community mental health clinic setting. Participants were randomized to IE (n = 53) or VB (n = 54) CBT, with 1-month (primary end point) and 6-month follow-up assessments. Participants completed 12, 2-hour, weekly sessions of IE-CBT or VB-CBT plus 1-month follow-up.
Intention-to-treat analyses showed very large within-treatment effect sizes on the social interaction anxiety measure at all time points (ds = 2.09–2.62), with no between-treatment differences on this outcome or clinician-rated severity [1-month OR = 1.45 (0.45, 4.62), p = 0.53; 6-month OR = 1.31 (0.42, 4.08), p = 0.65], SAD remission (1-month: IE = 61.04%, VB = 55.09%, p = 0.59; 6-month: IE = 58.73%, VB = 61.89%, p = 0.77), or secondary outcomes. Three adverse events were noted (substance abuse, n = 1 in IE-CBT; temporary increase in suicide risk, n = 1 in each condition, with one being withdrawn at 1-month follow-up).
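As a hypothetical illustration of how an odds ratio compares two proportions like the remission rates quoted above, the sketch below reuses the 1-month SAD remission figures (IE = 61.04%, VB = 55.09%). Note this is not the paper's own analysis; the ORs reported in the abstract refer to clinician-rated severity.

```python
# Odds ratio from two proportions: odds(p) = p / (1 - p),
# OR = odds(p1) / odds(p2). Values below reuse the 1-month remission
# rates quoted in the abstract, purely for illustration.
def odds(p):
    return p / (1.0 - p)

p_ie = 0.6104  # IE-CBT 1-month remission proportion
p_vb = 0.5509  # VB-CBT 1-month remission proportion

odds_ratio = odds(p_ie) / odds(p_vb)  # ~1.28; OR near 1 means little difference
```

An OR close to 1, as here, is consistent with the abstract's report of no significant between-treatment difference in remission.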
Group IE-CBT and VB-CBT were safe and there were no significant differences in outcomes. Both treatments were associated with very large within-group effect sizes and the majority of patients remitted following treatment.
Most clinical microbiology laboratories have replaced toxin immunoassay (EIA) alone with multistep testing (MST) protocols or nucleic acid amplification testing (NAAT) alone for the detection of C. difficile.
To study the effect of changing testing strategies on C. difficile detection and strain diversity.
A Veterans’ Affairs hospital.
Initially, toxin EIA testing was replaced by an MST approach utilizing a glutamate dehydrogenase (GDH) and toxin EIA followed by tcdB NAAT for discordant results. After 18 months, MST was replaced by a NAAT-only strategy. Available patient stool specimens were cultured for C. difficile. Restriction endonuclease analysis (REA) strain typing and quantitative in vitro toxin testing were performed on recovered isolates.
Before MST (toxin EIA), 79 of 708 specimens (11%) were positive, and after MST (MST-A), 121 of 517 specimens (23%) were positive (P < .0001). Prior to NAAT-only testing (MST-B), 80 of the 490 specimens (16%) were positive by MST, and after NAAT-only testing was implemented, 67 of the 368 specimens (18%) were positive (P = nonsignificant). After replacing toxin EIA testing, REA strain group diversity increased (8, 13, 13, and 10 REA groups in the toxin EIA, MST-A, MST-B, and NAAT-only periods, respectively) and in vitro toxin concentration decreased. The average log10 toxin concentrations of the isolates were 2.08, 1.88, 1.20, and 1.55 ng/mL for the same periods, respectively.
MST and NAAT had similar detection rates for C. difficile. Compared to toxin testing alone, they detected increased diversity of C. difficile strains, many of which were low toxin producing.
In a European, phase 3 study (SPD489-325), lisdexamfetamine dimesylate (LDX) and osmotic-release oral system methylphenidate (OROS-MPH) were more effective than placebo in improving core symptoms in children and adolescents with attention-deficit/hyperactivity disorder (ADHD).
Objectives and aims
To compare post hoc the efficacy of LDX and OROS-MPH in study SPD489-325.
This 7-week, randomized, double-blind, parallel-group, dose-optimized, placebo-controlled trial enrolled patients aged 6-17 years with ADHD of at least moderate severity. Patients were randomized (1:1:1) to receive a once-daily dose of LDX (30, 50, 70 mg/day), OROS-MPH (18, 36, 54 mg/day) or placebo. Efficacy was assessed using the ADHD Rating Scale version IV (ADHD-RS-IV) and the Clinical Global Impression-Improvement (CGI-I) scale. Endpoint was defined as the last on-therapy treatment visit with a valid assessment.
The full analysis set comprised 317 patients (LDX, n=104; placebo, n=106; OROS-MPH, n=107). The difference between LDX and OROS-MPH in least squares mean change (95% confidence interval [CI]) in ADHD-RS-IV total score from baseline to endpoint was statistically significant in favour of LDX (-5.6 [-8.4, -2.7]; p < 0.001; effect size, 0.541). The difference (LDX minus OROS-MPH) in the percentage of patients (95% CI) with an improved CGI-I score at endpoint was also statistically significant in favour of LDX (17.4 [5.0, 29.8]; p < 0.05).
This post hoc analysis indicated that LDX is significantly more effective than OROS-MPH in improving core symptoms and global functioning in children and adolescents with ADHD.
In a European, phase 3 study (SPD489-325), lisdexamfetamine dimesylate (LDX) was more effective than placebo in improving symptoms and global functioning in children and adolescents with attention-deficit/hyperactivity disorder (ADHD).
Objectives and aims
To evaluate the impact of age, sex and baseline disease severity on efficacy outcomes in SPD489-325.
This 7-week, double-blind, parallel-group, dose-optimized study enrolled patients aged 6-17 years with ADHD. Patients were randomized (1:1:1) to once-daily LDX (30, 50 or 70 mg/day), osmotic-release oral system methylphenidate (OROS-MPH; 18, 36 or 54 mg/day) or placebo. Efficacy outcomes were analysed in patients dichotomized by age (6-12 years [n=229] or 13-17 years [n=88]), sex (male [n=255] or female [n=62]) and baseline ADHD Rating Scale version IV (ADHD-RS-IV) total score (28-41 [n=161] or 42-54 [n=152]). Endpoint was the last on-treatment visit with a valid assessment.
At endpoint, differences (active minus placebo) in least-squares mean changes from baseline in ADHD-RS-IV total scores were statistically significant in all age, sex and ADHD-RS-IV total score subgroups for LDX (p < 0.001; effect sizes, 1.68-2.26) and OROS-MPH (p < 0.01; effect sizes, 0.88-1.46). Proportions of patients with a Clinical Global Impressions-Improvement rating of 1 (very much improved) or 2 (much improved) were statistically significantly greater than placebo at endpoint in all subgroups receiving LDX (p < 0.01) and in all subgroups except females receiving OROS-MPH (p < 0.05).
LDX showed greater efficacy than placebo in children and adolescents with ADHD, regardless of their age, sex or baseline disease severity.
Evaluate the efficacy and long-term safety of investigational aripiprazole once-monthly (ARI-OM) for maintenance treatment in schizophrenia.
Patients requiring chronic treatment for schizophrenia, not on aripiprazole monotherapy, were cross-titrated from other antipsychotic(s) to aripiprazole in an oral conversion phase (Phase 1). All patients entered an oral aripiprazole stabilization phase (Phase 2). Patients meeting stability criteria entered an ARI-OM stabilization phase (Phase 3), with coadministration of oral aripiprazole for the first 2 weeks. Patients meeting stability criteria were randomized to ARI-OM or placebo once-monthly (placebo-OM) during a 52-week, double-blind maintenance phase (Phase 4). Primary endpoint was time-to-impending relapse. Safety and tolerability were also assessed.
710 patients entered Phase 2, 576 Phase 3 and 403 Phase 4 (ARI-OM=269, placebo-OM=134). The study was terminated early because efficacy was demonstrated by a pre-planned interim analysis. Time-to-impending relapse was significantly delayed with ARI-OM vs. placebo-OM (p < 0.0001, log-rank test). Discontinuations due to treatment-emergent adverse events (AEs) were: Phase 1, 3.8% (n=24/632); Phase 2, 3.0% (n=21/709); Phase 3, 4.9% (n=28/576); Phase 4, 7.1% (n=19/269). Most AEs were mild or moderate. Insomnia was the only AE with >5% incidence in any phase. Headache, somnolence, and nausea had a peak first onset within the first 4 weeks of treatment. There were no unusual shifts in any phase in laboratory values, fasting metabolic parameters, weight, or objective scales of movement disorders.
ARI-OM significantly delayed time-to-impending relapse compared with placebo-OM and was well tolerated as maintenance treatment in schizophrenia.
Prevalence of violent behaviour within acute psychiatric services is about 10%.
To assess compliance of management of acutely disturbed patients with the National Institute for Health and Clinical Excellence (NICE) guidance for use of Rapid Tranquilisation (RT).
How did we assess practice?
A sample of 24 patients admitted to local Psychiatric Intensive Care Unit (PICU) receiving RT during 2011 was assessed using retrospective analysis of records.
Areas of Good Practice:
100% compliance was achieved in many of the criteria assessed, including recording the risk assessment and management plan appropriately.
Areas of concern:
None of the patients had an up-to-date advance directive detailing their preferred strategies in case of violent incidents.
50% of patients had their baseline blood pressure, pulse, temperature and respiratory rate recorded, and only 33% had these recorded at regular intervals.
25% were debriefed, and none were offered an opportunity to write their own account of the RT.
38% had their medication reviewed following RT.
46% were suffering from psychotic disorders; manic episode accounted for 25% of all patients.
54% of all patients (65% of men) received zuclopenthixol acetate for RT and 46% received the combination of haloperidol plus lorazepam; however, this combination was used in 71.4% of women compared with 35% of men.
12.5% required a short period of seclusion.
All patients were detained under the Mental Health Act.
How can we make changes and improve practice?
Training of the PICU staff in the NICE Guidance: The short-term management of disturbed/violent behaviour in psychiatric inpatient settings and emergency departments.
The National Institutes of Health requires data and safety monitoring boards (DSMBs) for all phase III clinical trials. The National Heart, Lung, and Blood Institute requires DSMBs for all clinical trials involving more than one site and those involving cooperative agreements and contracts. These policies have resulted in the establishment of DSMBs for many implementation trials, with little consideration of whether DSMBs are appropriate for such trials and/or what key adaptations DSMBs need in order to monitor data quality and participant safety in them. In this perspective, we review the unique features of implementation trials and reflect on key questions regarding the justification for DSMBs and their potential role and monitoring targets within implementation trials.
Wild oat (Avena fatua L.) is one of the most problematic weed species in western Canada due to widespread populations, herbicide resistance, and seed dormancy. In wheat (Triticum aestivum L.), and especially in shorter crops such as lentil (Lens culinaris Medik.), A. fatua seed panicles elongate above the crop canopy, which can facilitate physical cutting of the panicles (clipping) to reduce viable seed return to the seedbank. However, the viability of A. fatua seed at the time of panicle elongation is not known. The objective of this study was to determine the viability of A. fatua seed at successive time intervals after elongation above a wheat or lentil crop canopy. A 2-yr panicle clipping and removal study in wheat and lentil was conducted in Lacombe, AB, and Saskatoon, SK, in 2015 and 2016 to determine the onset of viability in A. fatua seeds at successive clipping intervals. Manual panicle clipping of A. fatua panicles above each crop canopy began when the majority of panicles were visible above respective crop canopies and continued weekly until seed shed began. At the initiation of panicle clipping, A. fatua seed viability was between 0% and 10%. By the last clipping treatment (approximately 6 to 7 wk after elongation), 95% of the A. fatua seeds were viable. Seed moisture and awn angle were not good predictors of A. fatua viability, and therefore were unlikely to provide effective tools to estimate appropriate timing for implementation of A. fatua clipping as a management technique. Based on A. fatua seed viability, earlier clipping of A. fatua is likely to be more effective in terms of population management and easier to implement in shorter crops such as lentil. Investigations into long-term effects of clipping on A. fatua populations are needed to evaluate the efficacy of this management strategy on A. fatua.
Glyphosate-resistant (GR) canola is a widely grown crop across western Canada and has quickly become a prolific volunteer weed. Glyphosate-resistant soybean is rapidly gaining acreage in western Canada. Thus, there is a need to evaluate herbicide options to manage volunteer GR canola in GR soybean crops. We conducted an experiment to evaluate the efficacy of various PRE and POST herbicides applied sequentially to volunteer GR canola and to evaluate soybean injury caused by these herbicides. Trials were conducted across Saskatchewan and Manitoba in 2014 and 2015. All treatments provided a range of suppression (>70%) to control (>80%) of volunteer canola. All treatments with the exception of the glyphosate-treated control reduced aboveground canola biomass by an average of 96%. As well, canola seed contamination was reduced from 36% to less than 5% when a PRE and POST herbicide were both used. Moreover, all combinations of herbicides used had excellent crop safety (<10%). All PRE and POST herbicide combinations provided better control of volunteer canola compared with the glyphosate-only control, but tribenuron followed by bentazon and tribenuron followed by imazamox plus bentazon provided solutions that were low cost, currently available (registered in western Canada), and had the potential to minimize development of herbicide resistance in other weeds.
The role of silicon (Si) in alleviating the effects of biotic and abiotic stresses, including defence against insect herbivores, in plants is widely reported. Si defence against insect herbivores is overwhelmingly studied in grasses (especially the cereals), many of which are hyper-accumulators of Si. Despite being neglected, legumes such as soybean (Glycine max) have the capacity to control Si accumulation and benefit from increased Si supply. We tested how Si supplementation via potassium, sodium or calcium silicate affected a soybean pest, the native budworm Helicoverpa punctigera Wallengren (Lepidoptera: Noctuidae). Herbivory reduced leaf biomass similarly in Si-supplemented (+Si) and non-supplemented (–Si) plants (c. 29 and 23%, respectively) relative to herbivore-free plants. Both Si supplementation and herbivory increased leaf Si concentrations. In relative terms, herbivores induced Si uptake by c. 19% in both +Si and –Si plants. All Si treatments reduced H. punctigera relative growth rates (RGR) to a similar extent for potassium (−41%), sodium (−49%) and calcium (−48%) silicate. Moreover, there was a strong negative correlation between Si accumulation in leaves and herbivore RGR. To our knowledge, this is only the second report of Si-based herbivore defence in soybean; the rapid increase in leaf Si following herbivory being indicative of an induced defence. Taken together with the other benefits of Si supplementation of legumes, Si could prove an effective herbivore defence in legumes as well as grasses.
In recent years, soybean acreage has increased significantly in western Canada. One of the challenges associated with growing soybean in western Canada is the control of volunteer glyphosate-resistant (GR) canola, because most soybean cultivars are also glyphosate resistant. The objective of this research was to determine the impact of soybean seeding rate and planting date on competition with volunteer canola. We also attempted to determine how high the seeding rate could be raised while remaining economically feasible for producers. Soybean was seeded at five different seeding rates (targeted 10, 20, 40, 80, and 160 plants m−2) and three planting dates (targeted mid-May, late May, and early June) at four sites across western Canada in 2014 and 2015. Soybean yield consistently increased with higher seeding rates, whereas volunteer canola biomass decreased. Planting date generally produced variable results across site-years. An economic analysis determined that the optimal rate was 40 to 60 plants m−2, depending on market price, and the optimal planting date range was from May 20 to June 1.
To: (i) understand facilitators and barriers to healthy eating practices and physical activity in younger and older urban adolescent South African boys and girls; and (ii) understand how the views of caregivers interact with, and influence, adolescent behaviours.
Semi-structured focus group discussions (FGD) were conducted in July 2018. Data were analysed using thematic analysis.
Soweto, Johannesburg, South Africa.
Seventy-five participants were stratified into eight FGD as follows: two for young boys and girls (10–12 years); two for older boys and girls (15–17 years); two for caregivers of young adolescents (boys and girls); and two for caregivers of older adolescents (boys and girls).
Unlike their caregivers, adolescents were not motivated to eat healthily and failed to appreciate the need to develop consistent patterns of both healthy eating and physical activity for their long-term health. Although adolescents gained independence with age, they commonly attributed unhealthy food choices to a lack of autonomy and, thereby, to the influence of their caregivers. Adolescents and caregivers perceived their engagement in physical activity according to distinct siloes of recreational and routine activity, respectively. Both similarities and differences in the drivers of healthy eating and physical activity exist in adolescents and caregivers, and should be targeted in future interventions.
Our study identified a complex paradigm of eating practices and physical activity in South African adolescents and their caregivers. We also highlighted the need for a new narrative in addressing the multifaceted and interrelated determinants of adolescent health within urban poor settings.