Evaluation of a mandatory immunization program to increase and sustain high immunization coverage for healthcare personnel (HCP).
Descriptive study with before-and-after analysis.
Tertiary-care academic medical center.
Medical center HCP.
A comprehensive mandatory immunization initiative was implemented in 2 phases, starting in July 2014. Key facets of the initiative included a formalized exemption review process, incorporation into institutional quality goals, data feedback, and accountability to support compliance.
Both immunization and overall compliance rates with targeted immunizations increased significantly in the years after the implementation period. The influenza immunization rate increased from 80% in the year prior to the initiative to >97% for the 3 subsequent influenza seasons (P < .0001). Mumps, measles, and varicella vaccination compliance increased from 94% in January 2014 to >99% by January 2017, rubella vaccination compliance increased from 93% to 99.5%, and hepatitis B vaccination compliance increased from 95% to 99% (P < .0001 for all comparisons). An associated positive effect was also noted on TB testing compliance, which was not included in the mandatory program; it increased from 76% to 92% over the same period (P < .0001).
Thoughtful, step-wise implementation of a mandatory immunization program linked to professional accountability can be successful in increasing immunization rates as well as overall compliance with policy requirements to cover all recommended HCP immunizations.
As herbicide-resistant weeds become more problematic, producers will consider the use of cover crops to suppress weeds. Weed suppression from cover crops may be especially valuable in the label-mandated buffer areas of dicamba-resistant soybean where dicamba use is not allowed. Three cover crops terminated at three timings with three herbicide strategies were evaluated for their effect on weed suppression in dicamba-resistant soybean. Delaying termination until soybean planting or later, and using cereal rye or a cereal rye + crimson clover mix, increased cover crop biomass by at least 40% compared with terminating early or using a crimson clover-only cover crop. Densities of problematic weed species were evaluated in early summer prior to a blanket POST application. Plots with cereal rye had 75% less horseweed than plots with crimson clover at two of four site-years. Cereal rye or the mix terminated at or after soybean planting reduced waterhemp densities by 87% compared with early termination of crimson clover and the earliest termination of the mix at one of two site-years. Cover crops were not as effective in reducing waterhemp densities as they were in reducing horseweed densities. This difference is due to a divergence in emergence patterns: waterhemp emergence generally peaks after termination of the cover crop, whereas horseweed emergence coincides with establishment and rapid vegetative growth of cereal rye. Cover crops alone were generally not as effective as a high-biomass cover crop combined with a herbicide strategy that contained dicamba and residual herbicides. However, within label-mandated buffer areas where dicamba cannot be used, a cover crop containing cereal rye, with termination delayed until soybean planting and combined with residual herbicides, could be utilized to improve suppression of horseweed and waterhemp.
To test the feasibility of targeted gown and glove use by healthcare personnel caring for high-risk nursing-home residents to prevent Staphylococcus aureus acquisition in short-stay residents.
Uncontrolled clinical trial.
This study was conducted in 2 community-based nursing homes in Maryland.
The study included 322 residents on mixed short- and long-stay units.
During a 2-month baseline period, all residents had nose and inguinal fold swabs taken to estimate S. aureus acquisition. The intervention was iteratively developed using a participatory human factors engineering approach. During a 2-month intervention period, healthcare personnel wore gowns and gloves for high-risk care activities while caring for residents with wounds or medical devices, and S. aureus acquisition was measured again. Whole-genome sequencing was used to assess whether the acquisition represented resident-to-resident transmission.
Among short-stay residents, the methicillin-resistant S. aureus acquisition rate decreased from 11.9% during the baseline period to 3.6% during the intervention period (odds ratio [OR], 0.28; 95% CI, 0.08–0.92; P = .026). The methicillin-susceptible S. aureus acquisition rate decreased from 9.1% during the baseline period to 4.0% during the intervention period (OR, 0.41; 95% CI, 0.12–1.42; P = .15). The S. aureus resident-to-resident transmission rate decreased from 5.9% during the baseline period to 0.8% during the intervention period.
Targeted gown and glove use by healthcare personnel for high-risk care activities while caring for residents with wounds or medical devices, regardless of their S. aureus colonization status, is feasible and potentially decreases S. aureus acquisition and transmission in short-stay community-based nursing-home residents.
Acanthocephalans are parasites with complex lifecycles that are important components of aquatic systems and are often model species for parasite-mediated host manipulation. Genetic characterization has recently resurrected Pomphorhynchus tereticollis as a distinct species from Pomphorhynchus laevis, with potential implications for fisheries management and host manipulation research. Morphological and molecular examinations of parasites from 7 English rivers across 9 fish species revealed that P. tereticollis was the only Pomphorhynchus parasite present in Britain, rather than P. laevis as previously recorded. Molecular analyses used two non-overlapping regions of the mitochondrial cytochrome oxidase gene, generating 62 sequences for the shorter fragment (295 bp) and 74 for the larger fragment (583 bp). These were combined with 61 and 13 sequences, respectively, from GenBank. A phylogenetic analysis using the two genetic regions and all the DNA sequences available for P. tereticollis identified two distinct genetic lineages in Britain. One lineage, possibly associated with cold-water-tolerant fish, potentially spread to the northern parts of Britain from the Baltic region via a northern route across the estuarine area of what is now the North Sea during the last glaciation. The other lineage, associated with temperate freshwater fish, may have arrived later via the Rhine/Thames fluvial connection during the last glaciation or early Holocene, when sea levels were low. These results raise important questions about this generalist parasite and its variously environmentally adapted hosts, especially in relation to the consequences for parasite vicariance.
To evaluate the effect of the burden of Staphylococcus aureus colonization of nursing home residents on the risk of S. aureus transmission to healthcare worker (HCW) gowns and gloves.
Multicenter prospective cohort study.
Setting and participants:
Residents and HCWs from 13 community-based nursing homes in Maryland and Michigan.
Residents were cultured for S. aureus at the anterior nares and perianal skin. The S. aureus burden was estimated by quantitative polymerase chain reaction detecting the nuc gene. HCWs wore gowns and gloves during usual care activities; gowns and gloves were swabbed and then cultured for the presence of S. aureus.
In total, 403 residents were enrolled; 169 were colonized with methicillin-resistant S. aureus (MRSA) or methicillin-sensitive S. aureus (MSSA) and comprised the study population; 232 were not colonized and thus were excluded from this analysis; and 2 were withdrawn prior to being swabbed. After multivariable analysis, perianal colonization with S. aureus conferred the greatest odds for transmission to HCW gowns and gloves, and the odds increased with increasing burden of colonization: adjusted odds ratio (aOR), 2.1 (95% CI, 1.3–3.5) for low-level colonization and aOR, 5.2 (95% CI, 3.1–8.7) for high-level colonization.
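The adjusted odds ratios above come from multivariable regression, but the underlying calculation can be illustrated with the standard unadjusted odds ratio and Wald confidence interval from a 2×2 table. A minimal sketch, using hypothetical counts that are illustrative only and not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and 95% Wald CI from a 2x2 table.
    a: exposed with transmission,   b: exposed without,
    c: unexposed with transmission, d: unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1.0, as in both intervals reported above, indicates a statistically significant association at the 5% level.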
Among nursing home patients colonized with S. aureus, the risk of transmission to HCW gowns and gloves was greater from those colonized with greater quantities of S. aureus on the perianal skin. Our findings inform future infection control practices for both MRSA and MSSA in nursing homes.
Artificial microswimmers, or ‘microbots’, have the potential to revolutionise non-invasive medicine and microfluidics. Microbots that are powered by self-phoretic mechanisms, such as Janus particles, often harness a solute fuel in their environment. Traditionally, self-phoretic particles are point-like, but slender phoretic rods have become an increasingly prevalent design. While there has been substantial interest in creating efficient asymptotic theories for slender phoretic rods, hitherto such theories have been restricted to straight rods with axisymmetric patterning. However, modern manufacturing methods will soon allow fabrication of slender phoretic filaments with complex three-dimensional shapes. In this paper, we develop a slender-body theory for the solute of self-diffusiophoretic filaments of arbitrary three-dimensional shape and patterning. We demonstrate analytically that, unlike other slender-body theories, first-order azimuthal variations arising from curvature and confinement can make a leading-order contribution to the swimming kinematics.
The study of planning in second language (L2) writing research is heavily influenced by two research domains: (a) early research on cognition in first language (L1) composing processes and (b) second language acquisition (SLA) research. The first research domain has been instrumental in determining the specific systems and processes involved in composing and has led to widely accepted models of L1 writing (Bereiter & Scardamalia, 1987*; Flower & Hayes, 1980*; Hayes, 1996, 2012) as well as a widely accepted model of the interaction between working memory and L1 writing systems (Kellogg, 1996*; Kellogg, Whiteford, Turner, Cahill, & Mertens, 2013). The influence of these early studies is still felt in process approaches to composition instruction commonly implemented in L1 and L2 writing classes. The second research domain—SLA and more specifically task-based language teaching/learning—has come to view planning as a feature of task complexity that can be manipulated to facilitate the production of language that is complex (syntactically and/or lexically), accurate, and/or fluent (Robinson, 2011*; Skehan, 1998*; Skehan & Foster, 2001). This research timeline traces the study of planning in L2 writing in each of these domains by reviewing key L1 and L2 writing research over the last 30-plus years and by highlighting each study's findings. Prior to presenting the timeline, the following sections provide backgrounds in each of the domains noted above and situate planning within those domains.
Southeastern Appalachian Ohio has more than double the national average of diabetes and a critical shortage of healthcare providers. Paradoxically, there is limited research focused on primary care providers’ experiences treating people with diabetes in this region. This study explored providers’ perceived barriers to and facilitators for treating patients with diabetes in southeastern Appalachian Ohio.
We conducted in-depth interviews with healthcare providers who treat people with diabetes in rural southeastern Ohio. Interviews were transcribed, coded, and analyzed via content and thematic analyses using NVivo 12 software (QSR International, Chadstone, VIC, Australia).
Qualitative analysis revealed four themes: (1) patients’ diabetes fatalism and helplessness: providers recounted story after story of patients believing that their diabetes was inevitable and that they were helpless to prevent or delay diabetes complications. (2) Comorbid psychosocial issues: providers described high rates of depression, anxiety, incest, abuse, and post-traumatic stress disorder among people with diabetes in this region. (3) Inter-connected social determinants interfering with diabetes care: providers identified major barriers including lack of access to providers, lack of access to transportation, food insecurity, housing insecurity, and financial insecurity. (4) Providers’ cultural understanding and recommendations: providers emphasized the importance of understanding the values central to Appalachian culture and gave culturally attuned clinical suggestions for how to use these values when working with this population.
Evidence-based interventions tailored to Appalachian culture and training designed to increase the cultural competency and cultural humility of primary care providers may be effective approaches to reduce barriers to diabetes care in Appalachian Ohio.
Field experiments were conducted in 2017 and 2018 at two locations in Indiana to evaluate the influence of cover crop species, termination timing, and herbicide treatment on winter and summer annual weed suppression and corn yield. Cereal rye and canola cover crops were terminated early or late (2 wk before or after corn planting) with a glyphosate- or glufosinate-based herbicide program. Canola and cereal rye reduced total weed biomass collected at termination by up to 74% and 91%, respectively, in comparison to fallow. Canola reduced horseweed density by up to 56% at termination and 57% at POST application compared to fallow. Cereal rye reduced horseweed density by up to 59% at termination and 87% at POST application compared to fallow. Canola did not reduce giant ragweed density at termination in comparison to fallow. Cereal rye reduced giant ragweed density by up to 66% at termination and 62% at POST application. Termination timing had little to no effect on weed biomass and density reduction in comparison to the effect of cover crop species. Cereal rye reduced corn grain yield at both locations in comparison to fallow, especially for the late-termination timing. A corn grain yield reduction of up to 49% (4,770 kg ha–1) was recorded for cereal rye terminated late in comparison to fallow terminated late. Canola did not reduce corn grain yield in comparison to fallow within termination timing; however, late-terminated canola reduced corn grain yield by up to 21% (2,980 kg ha–1) in comparison to early-terminated fallow. Cereal rye can suppress giant ragweed emergence, whereas canola is not as effective at suppressing large-seeded broadleaves such as giant ragweed. These results also indicate that early-terminated cover crops can often result in higher corn grain yields than late-terminated cover crops in an integrated weed management program.
The 2017 solar eclipse was associated with mass gatherings in many of the 14 states along the path of totality. The Kentucky Department for Public Health implemented an enhanced syndromic surveillance system to detect increases in emergency department (ED) visits and other health care needs near Hopkinsville, Kentucky, where the point of greatest eclipse occurred.
EDs flagged visits of patients who participated in eclipse events from August 17–22. Data from 14 area emergency medical services and 26 first-aid stations were also monitored to detect health-related events occurring during the eclipse period.
Forty-four potential eclipse event-related visits were identified, primarily injuries, gastrointestinal illness, and heat-related illness. First-aid stations and emergency medical services commonly attended to patients with pain and heat-related illness.
Kentucky’s experience during the eclipse demonstrated the value of patient visit flagging to describe the disease burden during a mass gathering and to investigate epidemiological links between cases. A close collaboration between public health authorities within and across jurisdictions, health information exchanges, hospitals, and other first-response care providers will optimize health surveillance activities before, during, and after mass gatherings.
To examine the efficacy and tolerability of quetiapine SR in patients with schizophrenia switched from quetiapine IR.
Randomised, double-blind study (D1444C00146) using dual-matched placebo. Patients clinically stable on fixed doses of quetiapine IR received twice-daily quetiapine IR 400, 600 or 800 mg/day for 4 weeks. Stable patients were then randomised (1:2) to continue taking quetiapine IR or switch to the same total dose of quetiapine SR (active dose once-daily in the evening) for 6 weeks. Primary analysis: % of patients (modified ITT population) discontinuing due to lack of efficacy or with PANSS total increase ≥20% at any visit, using a 6% non-inferiority margin for the upper 95% CI of the treatment difference. Per-protocol (PP) analysis was also performed.
497 patients were randomised (quetiapine SR 331, IR 166); completion rates were 91.5% and 94.0%, respectively. Few patients discontinued due to lack of efficacy or had a PANSS increase ≥20% in either the MITT (n=496) or PP (n=393) population: 9.1% and 5.3% for quetiapine SR and 7.2% and 6.2% for quetiapine IR, respectively. Quetiapine SR was non-inferior to quetiapine IR in the PP population (treatment difference: -0.83% [95% CI -6.75, 3.71]; p=0.017) but not in the MITT population (treatment difference: 1.86% [95% CI -3.78, 6.57]; p=0.0431). The incidence (quetiapine SR 38.7%; IR 35.5%) and profile of AEs were similar in both groups.
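The non-inferiority decision rule described above reduces to comparing the upper 95% confidence limit of the treatment difference (SR minus IR) against the prespecified 6% margin. A minimal sketch of that check, using the upper limits reported above:

```python
def non_inferior(upper_ci_pct, margin_pct=6.0):
    """Declare non-inferiority when the upper 95% confidence limit
    of the treatment difference lies below the prespecified margin."""
    return upper_ci_pct < margin_pct

# Upper 95% confidence limits from the reported analyses
print(non_inferior(3.71))  # per-protocol population: non-inferior
print(non_inferior(6.57))  # modified ITT population: not non-inferior
```

This is why the two populations yield different conclusions despite broadly similar discontinuation rates: the MITT interval simply extends past the 6% margin.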
Clinically-stable patients receiving quetiapine IR can be switched, without titration, to an equivalent once-daily dose of quetiapine SR without any clinical deterioration or compromise in tolerability.
Improving the quality of care on psychiatric inpatient wards has been a major focus in recent mental health policy, a recurrent criticism being that contact between staff and patients is limited in time and therapeutic value. Change is unlikely to be achieved without recruitment and retention of a high quality and well-motivated work force.
The NHS commissioned national inpatient mental health staff morale study is intended to inform service planning and policy by delivering evidence on the morale of the inpatient mental health workforce and the clinical, organisational, architectural and human resources factors that influence it.
100 wards in 17 area ‘Trusts’ are participating in the study, in addition to 40 community teams. The study will take place over two years, and has 6 modules:
1. A quantitative questionnaire for all staff in participating wards.
2. A comparison-group questionnaire for staff in 20 community mental health teams and 20 crisis teams.
3. Case studies of 10 wards scoring in the top and bottom quartile for indicators of morale.
4. Repeated questionnaires for 20 wards in the second year to investigate how morale changes over time.
5. Staff who leave the wards in the course of the first year will be asked their reasons for leaving.
6. Links between rates of staff sickness and morale will be investigated.
Questionnaires have been distributed to 3,500 staff with a response rate of 65%, results from which will be presented in 2009.
In a European, phase 3 study (SPD489-325), lisdexamfetamine dimesylate (LDX) and osmotic-release oral system methylphenidate (OROS-MPH) were more effective than placebo in improving core symptoms in children and adolescents with attention-deficit/hyperactivity disorder (ADHD).
Objectives and aims
To compare post hoc the efficacy of LDX and OROS-MPH in study SPD489-325.
This 7-week, randomized, double-blind, parallel-group, dose-optimized, placebo-controlled trial enrolled patients aged 6-17 years with ADHD of at least moderate severity. Patients were randomized (1:1:1) to receive a once-daily dose of LDX (30, 50, 70 mg/day), OROS-MPH (18, 36, 54 mg/day) or placebo. Efficacy was assessed using the ADHD Rating Scale version IV (ADHD-RS-IV) and the Clinical Global Impression-Improvement (CGI-I) scale. Endpoint was defined as the last on-therapy treatment visit with a valid assessment.
The full analysis set comprised 317 patients (LDX, n=104; placebo, n=106; OROS-MPH, n=107). The difference between LDX and OROS-MPH in least squares mean change (95% confidence interval [CI]) in ADHD-RS-IV total score from baseline to endpoint was statistically significant in favour of LDX (-5.6 [-8.4, -2.7]; p < 0.001; effect size, 0.541). The difference (LDX minus OROS-MPH) in the percentage of patients (95% CI) with an improved CGI-I score at endpoint was also statistically significant in favour of LDX (17.4 [5.0, 29.8]; p < 0.05).
This post hoc analysis indicated that LDX is significantly more effective than OROS-MPH in improving core symptoms and global functioning in children and adolescents with ADHD.
In a European, phase 3 study (SPD489-325), lisdexamfetamine dimesylate (LDX) was more effective than placebo in improving symptoms and global functioning in children and adolescents with attention-deficit/hyperactivity disorder (ADHD).
Objectives and aims
To evaluate the impact of age, sex and baseline disease severity on efficacy outcomes in SPD489-325.
This 7-week, double-blind, parallel-group, dose-optimized study enrolled patients aged 6-17 years with ADHD. Patients were randomized (1:1:1) to once-daily LDX (30, 50 or 70 mg/day), osmotic-release oral system methylphenidate (OROS-MPH; 18, 36 or 54 mg/day) or placebo. Efficacy outcomes were analysed in patients dichotomized by age (6-12 years [n=229] or 13-17 years [n=88]), sex (male [n=255] or female [n=62]) and baseline ADHD Rating Scale version IV (ADHD-RS-IV) total score (28-41 [n=161] or 42-54 [n=152]). Endpoint was the last on-treatment visit with a valid assessment.
At endpoint, differences (active minus placebo) in least-squares mean changes from baseline in ADHD-RS-IV total scores were statistically significant in all age, sex and ADHD-RS-IV total score subgroups for LDX (p < 0.001; effect sizes, 1.68-2.26) and OROS-MPH (p < 0.01; effect sizes, 0.88-1.46). Proportions of patients with a Clinical Global Impressions-Improvement rating of 1 (very much improved) or 2 (much improved) were statistically significantly greater than placebo at endpoint in all subgroups receiving LDX (p < 0.01) and in all subgroups except females receiving OROS-MPH (p < 0.05).
LDX showed greater efficacy than placebo in children and adolescents with ADHD, regardless of their age, sex or baseline disease severity.
There is strong evidence that foods containing dietary fibre protect against colorectal cancer, resulting at least in part from its anti-proliferative properties. This study aimed to investigate the effects of supplementation with two non-digestible carbohydrates, resistant starch (RS) and polydextrose (PD), on crypt cell proliferative state (CCPS) in the macroscopically normal rectal mucosa of healthy individuals. We also investigated relationships between expression of regulators of apoptosis and of the cell cycle on markers of CCPS. Seventy-five healthy participants were supplemented with RS and/or PD or placebo for 50 d in a 2 × 2 factorial design in a randomised, double-blind, placebo-controlled trial (the Dietary Intervention, Stem cells and Colorectal Cancer (DISC) Study). CCPS was assessed, and the expression of regulators of the cell cycle and of apoptosis was measured by quantitative PCR in rectal mucosal biopsies. SCFA concentrations were quantified in faecal samples collected pre- and post-intervention. Supplementation with RS increased the total number of mitotic cells within the crypt by 60 % (P = 0·001) compared with placebo. This effect was limited to older participants (aged ≥50 years). No other differences were observed for the treatments with PD or RS as compared with their respective controls. PD did not influence any of the measured variables. RS, however, increased cell proliferation in the crypts of the macroscopically normal rectum of older adults. Our findings suggest that the effects of RS on CCPS depend not only on the dose and type of RS and on health status but also on age.
Recent research has begun to investigate implicit learning at the level of meaning. The general consensus is that implicitly linking a word with a meaning is constrained by existing linguistic knowledge. However, another factor to consider is the extent to which attention is drawn to the relevant meanings in implicit learning paradigms. We manipulated the presence of cue saliency during implicit rule learning for a grammatical form (i.e., articles) linked to meaning (i.e., animacy vs. varying notions of size). In a series of experiments, participants learned four novel words but did not know that article usage also depended on a hidden rule, creating an opportunity for implicit rule learning. We found implicit learning through the use of a highly salient meaning (Experiment 1) or if image size was made salient by being explicitly cued (Experiment 3), but not in a low-salience paradigm for intrinsic object size (Experiment 2). The findings suggest that implicit learning of semantic information might not be as constrained as previously argued. Instead, implicit learning might be additionally influenced by feature-focusing cues that make the meaning contrasts more salient and thereby more readily available to learning.
Wild oat (Avena fatua L.) is one of the most problematic weed species in western Canada due to widespread populations, herbicide resistance, and seed dormancy. In wheat (Triticum aestivum L.), and especially in shorter crops such as lentil (Lens culinaris Medik.), A. fatua seed panicles elongate above the crop canopy, which can facilitate physical cutting of the panicles (clipping) to reduce viable seed return to the seedbank. However, the viability of A. fatua seed at the time of panicle elongation is not known. The objective of this study was to determine the viability of A. fatua seed at successive time intervals after elongation above a wheat or lentil crop canopy. A 2-yr panicle clipping and removal study in wheat and lentil was conducted in Lacombe, AB, and Saskatoon, SK, in 2015 and 2016 to determine the onset of viability in A. fatua seeds at successive clipping intervals. Manual clipping of A. fatua panicles above each crop canopy began when the majority of panicles were visible above respective crop canopies and continued weekly until seed shed began. At the initiation of panicle clipping, A. fatua seed viability was between 0% and 10%. By the last clipping treatment (approximately 6 to 7 wk after elongation), 95% of the A. fatua seeds were viable. Seed moisture and awn angle were not good predictors of A. fatua viability, and therefore were unlikely to provide effective tools to estimate appropriate timing for implementation of A. fatua clipping as a management technique. Based on A. fatua seed viability, earlier clipping of A. fatua is likely to be more effective in terms of population management and easier to implement in shorter crops such as lentil. Investigations into long-term effects of clipping on A. fatua populations are needed to evaluate the efficacy of this management strategy on A. fatua.