For centuries, paleontologists have sought functional explanations for the uniquely complex internal walls (septa) of ammonoids, extinct shelled cephalopods. Ammonoid septa developed increasingly complex fractal margins, unlike any modern shell morphologies, throughout more than 300 million years of evolution. Some have suggested these morphologies provided increased resistance to shell-crushing predators. We perform the first physical compression experiments on model ammonoid septa using controlled, theoretical morphologies generated by computer-aided design and 3D printing. These biomechanical experiments reveal that increasing complexity of septal margins does not increase compression resistance. Our results raise the question of whether the evolution of septal shape may be tied closely to the placement of the siphuncle foramen (anatomic septal hole). Our tests demonstrate weakness in the centers of uniformly thick septa, supporting work suggesting reinforcement by shell thickening at the center of septa. These experiments highlight the importance of 3D reconstruction using idealized theoretical morphologies that permit the testing of long-held hypotheses of functional evolutionary drivers by recreating extinct morphologies once rendered physically untestable by the fossil record.
In premodern economic systems where the social embedding of exchange provided actors with the ability to control or monopolize trade, including the goods that enter and leave a marketplace, “restricted markets” formed. These markets produced external revenues that could be used to achieve political goals. Conversely, commercialized systems required investment in public goods that incentivize the development of market cooperation and “open markets,” where buyers and sellers from across social sectors and diverse communities could engage in exchange as economic equals within marketplaces. In this article, we compare market development at the Late Postclassic sites of Chetumal, Belize, and Tlaxcallan, Mexico. We identified a restricted market at Chetumal, using the distribution of exotic goods, particularly militarily and ritually charged obsidian projectile points; in contrast, an open market was built at Tlaxcallan. Collective action theory provides a useful framework to understand these differences in market development. We argue that Tlaxcaltecan political architects adopted more collective strategies, in which open markets figured, to encourage cooperation among an ethnically diverse population.
Objective: Evaluation of a mandatory immunization program to increase and sustain high immunization coverage for healthcare personnel (HCP).
Design: Descriptive study with before-and-after analysis.
Setting: Tertiary-care academic medical center.
Participants: Medical center HCP.
Intervention: A comprehensive mandatory immunization initiative was implemented in 2 phases, starting in July 2014. Key facets of the initiative included a formalized exemption review process, incorporation into institutional quality goals, data feedback, and accountability to support compliance.
Results: Both immunization and overall compliance rates with targeted immunizations increased significantly in the years after the implementation period. The influenza immunization rate increased from 80% in the year prior to the initiative to >97% for the 3 subsequent influenza seasons (P < .0001). Mumps, measles, and varicella vaccination compliance increased from 94% in January 2014 to >99% by January 2017; rubella vaccination compliance increased from 93% to 99.5%; and hepatitis B vaccination compliance increased from 95% to 99% (P < .0001 for all comparisons). An associated positive effect on TB testing compliance, which was not included in the mandatory program, was also noted; it increased from 76% to 92% over the same period (P < .0001).
Conclusions: Thoughtful, stepwise implementation of a mandatory immunization program linked to professional accountability can be successful in increasing immunization rates as well as overall compliance with policy requirements covering all recommended HCP immunizations.
As herbicide-resistant weeds become more problematic, producers will consider the use of cover crops to suppress weeds. Weed suppression from cover crops may be especially valuable in the label-mandated buffer areas of dicamba-resistant soybean where dicamba use is not allowed. Three cover crops terminated at three timings with three herbicide strategies were evaluated for their effect on weed suppression in dicamba-resistant soybean. Delaying termination until soybean planting or after and using cereal rye or cereal rye + crimson clover increased cover-crop biomass by at least 40% compared with terminating early or using a crimson clover–only cover crop. Densities of problematic weed species were evaluated in early summer before a blanket POST application. Plots with cereal rye had 75% less horseweed compared with crimson clover at two of four site-years. Cereal rye or the mixed cover crop terminated at or after soybean planting reduced waterhemp densities by 87% compared with early termination timings of crimson clover and the earliest termination timing of the mix at one of two site-years. Cover crops were not as effective in reducing waterhemp densities as they were in reducing horseweed densities. This difference was due to a divergence in emergence patterns: waterhemp emergence generally peaks after termination of the cover crop, whereas horseweed emergence coincides with establishment and rapid vegetative growth of cereal rye. Cover crops alone were generally not as effective as a high-biomass cover crop combined with an herbicide strategy that contained dicamba and residual herbicides. However, within label-mandated buffer areas where dicamba cannot be used, a cover crop containing cereal rye with termination delayed until soybean planting, combined with residual herbicides, could be used to improve suppression of horseweed and waterhemp.
Objective: To test the feasibility of targeted gown and glove use by healthcare personnel caring for high-risk nursing-home residents to prevent Staphylococcus aureus acquisition in short-stay residents.
Design: Uncontrolled clinical trial.
Setting: This study was conducted in 2 community-based nursing homes in Maryland.
Participants: The study included 322 residents on mixed short- and long-stay units.
Methods: During a 2-month baseline period, all residents had nose and inguinal fold swabs taken to estimate S. aureus acquisition. The intervention was iteratively developed using a participatory human factors engineering approach. During a 2-month intervention period, healthcare personnel wore gowns and gloves for high-risk care activities while caring for residents with wounds or medical devices, and S. aureus acquisition was measured again. Whole-genome sequencing was used to assess whether the acquisition represented resident-to-resident transmission.
Results: Among short-stay residents, the methicillin-resistant S. aureus acquisition rate decreased from 11.9% during the baseline period to 3.6% during the intervention period (odds ratio [OR], 0.28; 95% CI, 0.08–0.92; P = .026). The methicillin-susceptible S. aureus acquisition rate decreased from 9.1% during the baseline period to 4.0% during the intervention period (OR, 0.41; 95% CI, 0.12–1.42; P = .15). The S. aureus resident-to-resident transmission rate decreased from 5.9% during the baseline period to 0.8% during the intervention period.
Conclusions: Targeted gown and glove use by healthcare personnel for high-risk care activities while caring for residents with wounds or medical devices, regardless of their S. aureus colonization status, is feasible and potentially decreases S. aureus acquisition and transmission in short-stay community-based nursing-home residents.
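As a sanity check, the unadjusted odds ratios reported above can be reproduced from the acquisition percentages alone; a minimal sketch in Python (the percentages are taken from the abstract, and no raw counts are assumed):

```python
def odds_ratio(p_intervention: float, p_baseline: float) -> float:
    """Unadjusted odds ratio comparing intervention vs baseline proportions (0-1 scale)."""
    return (p_intervention / (1 - p_intervention)) / (p_baseline / (1 - p_baseline))

# MRSA acquisition: 11.9% at baseline vs 3.6% during the intervention
or_mrsa = odds_ratio(0.036, 0.119)
# MSSA acquisition: 9.1% at baseline vs 4.0% during the intervention
or_mssa = odds_ratio(0.040, 0.091)

print(f"MRSA OR ~ {or_mrsa:.2f}")  # close to the reported 0.28
print(f"MSSA OR ~ {or_mssa:.2f}")  # close to the reported 0.41
```

Both values agree with the reported point estimates to within rounding of the published percentages (the MSSA ratio comes out nearer 0.42 than 0.41, consistent with the inputs themselves being rounded).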
Acanthocephalans are parasites with complex lifecycles that are important components of aquatic systems and are often model species for parasite-mediated host manipulation. Genetic characterization has recently resurrected Pomphorhynchus tereticollis as a species distinct from Pomphorhynchus laevis, with potential implications for fisheries management and host-manipulation research. Morphological and molecular examination of parasites from 7 English rivers across 9 fish species revealed that P. tereticollis was the only Pomphorhynchus parasite present in Britain, rather than P. laevis as previously recorded. Molecular analyses used two non-overlapping regions of the mitochondrial cytochrome oxidase gene, generating 62 sequences for the shorter fragment (295 bp) and 74 for the longer fragment (583 bp); these were combined with 61 and 13 sequences, respectively, from GenBank. A phylogenetic analysis using the two genetic regions and all the DNA sequences available for P. tereticollis identified two distinct genetic lineages in Britain. One lineage, possibly associated with cold-water-tolerant fish, potentially spread to the northern parts of Britain from the Baltic region via a northern route across the estuarine area of what is now the North Sea during the last glaciation. The other lineage, associated with temperate freshwater fish, may have arrived later via the Rhine/Thames fluvial connection during the last glaciation or early Holocene, when sea levels were low. These results raise important questions about this generalist parasite and its variously environmentally adapted hosts, especially regarding the consequences for parasite vicariance.
Objective: To evaluate the effect of the burden of Staphylococcus aureus colonization of nursing home residents on the risk of S. aureus transmission to healthcare worker (HCW) gowns and gloves.
Design: Multicenter prospective cohort study.
Setting and participants:
Residents and HCWs from 13 community-based nursing homes in Maryland and Michigan.
Methods: Residents were cultured for S. aureus at the anterior nares and perianal skin. The S. aureus burden was estimated by quantitative polymerase chain reaction detecting the nuc gene. HCWs wore gowns and gloves during usual care activities; gowns and gloves were swabbed and then cultured for the presence of S. aureus.
Results: In total, 403 residents were enrolled; 169 were colonized with methicillin-resistant S. aureus (MRSA) or methicillin-susceptible S. aureus (MSSA) and comprised the study population; 232 were not colonized and thus were excluded from this analysis; and 2 were withdrawn prior to being swabbed. In multivariable analysis, perianal colonization with S. aureus conferred the greatest odds of transmission to HCW gowns and gloves, and the odds increased with increasing burden of colonization: adjusted odds ratio (aOR), 2.1 (95% CI, 1.3–3.5) for low-level colonization and aOR, 5.2 (95% CI, 3.1–8.7) for high-level colonization.
Conclusions: Among nursing home patients colonized with S. aureus, the risk of transmission to HCW gowns and gloves was greater from those colonized with greater quantities of S. aureus on the perianal skin. Our findings inform future infection control practices for both MRSA and MSSA in nursing homes.
Artificial microswimmers, or ‘microbots’, have the potential to revolutionise non-invasive medicine and microfluidics. Microbots that are powered by self-phoretic mechanisms, such as Janus particles, often harness a solute fuel in their environment. Traditionally, self-phoretic particles are point-like, but slender phoretic rods have become an increasingly prevalent design. While there has been substantial interest in creating efficient asymptotic theories for slender phoretic rods, hitherto such theories have been restricted to straight rods with axisymmetric patterning. However, modern manufacturing methods will soon allow fabrication of slender phoretic filaments with complex three-dimensional shapes. In this paper, we develop a slender-body theory for the solute field of self-diffusiophoretic filaments of arbitrary three-dimensional shape and patterning. We demonstrate analytically that, unlike in other slender-body theories, first-order azimuthal variations arising from curvature and confinement can make a leading-order contribution to the swimming kinematics.
The study of planning in second language (L2) writing research is heavily influenced by two research domains: (a) early research on cognition in first language (L1) composing processes and (b) second language acquisition (SLA) research. The first research domain has been instrumental in determining the specific systems and processes involved in composing and has led to widely accepted models of L1 writing (Bereiter & Scardamalia, 1987*; Flower & Hayes, 1980*; Hayes, 1996, 2012) as well as a widely accepted model of the interaction between working memory and L1 writing systems (Kellogg, 1996*; Kellogg, Whiteford, Turner, Cahill, & Mertens, 2013). The influence of these early studies is still felt in process approaches to composition instruction commonly implemented in L1 and L2 writing classes. The second research domain—SLA and more specifically task-based language teaching/learning—has come to view planning as a feature of task complexity that can be manipulated to facilitate the production of language that is complex (syntactically and/or lexically), accurate, and/or fluent (Robinson, 2011*; Skehan, 1998*; Skehan & Foster, 2001). This research timeline traces the study of planning in L2 writing in each of these domains by reviewing key L1 and L2 writing research over the last 30-plus years and by highlighting each study's findings. Prior to presenting the timeline, the following sections provide backgrounds in each of the domains noted above and situate planning within those domains.
Southeastern Appalachian Ohio has more than double the national average of diabetes and a critical shortage of healthcare providers. Paradoxically, there is limited research focused on primary care providers’ experiences treating people with diabetes in this region. This study explored providers’ perceived barriers to and facilitators for treating patients with diabetes in southeastern Appalachian Ohio.
We conducted in-depth interviews with healthcare providers who treat people with diabetes in rural southeastern Ohio. Interviews were transcribed, coded, and analyzed via content and thematic analyses using NVivo 12 software (QSR International, Chadstone, VIC, Australia).
Qualitative analysis revealed four themes: (1) Patients’ diabetes fatalism and helplessness: providers recounted story after story of patients believing that their diabetes was inevitable and that they were helpless to prevent or delay diabetes complications. (2) Comorbid psychosocial issues: providers described high rates of depression, anxiety, incest, abuse, and post-traumatic stress disorder among people with diabetes in this region. (3) Interconnected social determinants interfering with diabetes care: providers identified major barriers, including lack of access to providers, lack of access to transportation, food insecurity, housing insecurity, and financial insecurity. (4) Providers’ cultural understanding and recommendations: providers emphasized the importance of understanding the values central to Appalachian culture and gave culturally attuned clinical suggestions for how to draw on these values when working with this population.
Evidence-based interventions tailored to Appalachian culture and training designed to increase the cultural competency and cultural humility of primary care providers may be effective approaches to reduce barriers to diabetes care in Appalachian Ohio.
Field experiments were conducted in 2017 and 2018 at two locations in Indiana to evaluate the influence of cover crop species, termination timing, and herbicide treatment on winter and summer annual weed suppression and corn yield. Cereal rye and canola cover crops were terminated early or late (2 wk before or after corn planting) with a glyphosate- or glufosinate-based herbicide program. Canola and cereal rye reduced total weed biomass collected at termination by up to 74% and 91%, respectively, compared with fallow. Canola reduced horseweed density by up to 56% at termination and 57% at POST application compared with fallow. Cereal rye reduced horseweed density by up to 59% at termination and 87% at POST application compared with fallow. Canola did not reduce giant ragweed density at termination in comparison with fallow. Cereal rye reduced giant ragweed density by up to 66% at termination and 62% at POST application. Termination timing had little to no effect on weed biomass and density reduction in comparison with the effect of cover crop species. Cereal rye reduced corn grain yield at both locations in comparison with fallow, especially at the late-termination timing. A corn grain yield reduction of up to 49% (4,770 kg ha–1) was recorded for cereal rye terminated late in comparison with fallow terminated late. Canola did not reduce corn grain yield in comparison with fallow within termination timing; however, late-terminated canola reduced corn grain yield by up to 21% (2,980 kg ha–1) in comparison with early-terminated fallow. Cereal rye can suppress giant ragweed emergence, whereas canola is not as effective at suppressing large-seeded broadleaves such as giant ragweed. These results also indicate that early-terminated cover crops can often result in higher corn grain yields than late-terminated cover crops in an integrated weed management program.
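The paired percentage and absolute (kg ha–1) yield losses reported above jointly imply the underlying fallow yields; a quick back-of-the-envelope check in Python (the fallow yields below are back-calculated for illustration, not reported in the abstract):

```python
# Late-terminated cereal rye: up to 49% reduction = 4,770 kg/ha lost,
# so the implied late-terminated fallow yield is 4770 / 0.49.
implied_fallow_late = 4770 / 0.49

# Late-terminated canola vs early-terminated fallow: 21% = 2,980 kg/ha,
# implying an early-terminated fallow yield of 2980 / 0.21.
implied_fallow_early_term = 2980 / 0.21

print(round(implied_fallow_late))        # ~9,735 kg/ha
print(round(implied_fallow_early_term))  # ~14,190 kg/ha
```

The two implied baselines differ because they refer to different fallow treatments (late- vs early-terminated), which is consistent with the abstract's finding that earlier termination favored higher corn yields.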
The 2017 solar eclipse was associated with mass gatherings in many of the 14 states along the path of totality. The Kentucky Department for Public Health implemented an enhanced syndromic surveillance system to detect increases in emergency department (ED) visits and other health care needs near Hopkinsville, Kentucky, where the point of greatest eclipse occurred.
EDs flagged visits of patients who participated in eclipse events from August 17–22. Data from 14 area emergency medical services and 26 first-aid stations were also monitored to detect health-related events occurring during the eclipse period.
Forty-four potential eclipse event-related visits were identified, primarily injuries, gastrointestinal illness, and heat-related illness. First-aid stations and emergency medical services commonly attended to patients with pain and heat-related illness.
Kentucky’s experience during the eclipse demonstrated the value of patient visit flagging to describe the disease burden during a mass gathering and to investigate epidemiological links between cases. A close collaboration between public health authorities within and across jurisdictions, health information exchanges, hospitals, and other first-response care providers will optimize health surveillance activities before, during, and after mass gatherings.
Objective: To examine the efficacy and tolerability of quetiapine SR in patients with schizophrenia switched from quetiapine IR.
Methods: Randomised, double-blind study (D1444C00146) using dual-matched placebo. Patients clinically stable on fixed doses of quetiapine IR received twice-daily quetiapine IR 400, 600, or 800 mg/day for 4 weeks. Stable patients were then randomised (1:2) to continue taking quetiapine IR or to switch to the same total dose of quetiapine SR (active dose once daily in the evening) for 6 weeks. Primary analysis: percentage of patients (modified ITT population) discontinuing due to lack of efficacy or with a PANSS total score increase ≥20% at any visit, using a 6% non-inferiority margin for the upper 95% CI of the treatment difference. A per-protocol (PP) analysis was also performed.
Results: 497 patients were randomised (quetiapine SR, 331; IR, 166); completion rates were 91.5% and 94.0%, respectively. Few patients discontinued due to lack of efficacy or had a PANSS increase ≥20% in both the MITT (n=496) and PP (n=393) populations: 9.1% and 5.3% for quetiapine SR and 7.2% and 6.2% for quetiapine IR, respectively. Quetiapine SR was non-inferior to quetiapine IR in the PP population (treatment difference: -0.83% [95% CI -6.75, 3.71]; p=0.017) but not in the MITT population (treatment difference: 1.86% [95% CI -3.78, 6.57]; p=0.0431). The incidence (quetiapine SR, 38.7%; IR, 35.5%) and profile of AEs were similar in both groups.
Conclusions: Clinically stable patients receiving quetiapine IR can be switched, without titration, to an equivalent once-daily dose of quetiapine SR without clinical deterioration or compromised tolerability.
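The non-inferiority logic of the primary analysis reduces to comparing the upper 95% confidence limit of the SR-minus-IR treatment difference against the prespecified 6% margin; a minimal sketch using the intervals reported above:

```python
MARGIN = 6.0  # prespecified non-inferiority margin, percentage points

def non_inferior(upper_95ci: float, margin: float = MARGIN) -> bool:
    """Non-inferiority is concluded when the upper 95% confidence limit
    of the (SR minus IR) treatment difference falls below the margin."""
    return upper_95ci < margin

# Per-protocol population: difference -0.83%, 95% CI [-6.75, 3.71]
pp_result = non_inferior(3.71)    # True: 3.71 < 6, non-inferiority shown
# Modified ITT population: difference 1.86%, 95% CI [-3.78, 6.57]
mitt_result = non_inferior(6.57)  # False: 6.57 > 6, criterion not met

print(pp_result, mitt_result)
```

This reproduces the abstract's split conclusion: non-inferiority was demonstrated in the PP population but not in the MITT population, where the confidence interval crosses the margin.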
Improving the quality of care on psychiatric inpatient wards has been a major focus in recent mental health policy, a recurrent criticism being that contact between staff and patients is limited in time and therapeutic value. Change is unlikely to be achieved without recruitment and retention of a high quality and well-motivated work force.
The NHS-commissioned national inpatient mental health staff morale study is intended to inform service planning and policy by delivering evidence on the morale of the inpatient mental health workforce and on the clinical, organisational, architectural, and human resources factors that influence it.
100 wards in 17 area ‘Trusts’ are participating in the study, in addition to 40 community teams. The study will take place over two years, and has 6 modules:
1. A quantitative questionnaire for all staff in participating wards.
2. A comparison group of 20 community mental health teams and 20 crisis teams.
3. Case studies of 10 wards scoring in the top and bottom quartile for indicators of morale.
4. Repeated questionnaires for 20 wards in the second year to investigate how morale changes over time.
5. Staff who leave the wards in the course of the first year will be asked their reasons for leaving.
6. Links between rates of staff sickness and morale will be investigated.
Questionnaires have been distributed to 3,500 staff, with a response rate of 65%; results will be presented in 2009.